"I was really interested to see what changes have been made with the latest prototype. The short answer is that ventilation has improved (less fogging up of the glasses), the display resolution is now higher, the screen refreshes faster, and combined with the VR Audio the experience is more immersive than ever."
The msot significant changes with Crescent bay are the increased refresh rate (touched on later in the article), new lenses (a lens/fresnel hybrid doublet), a light diffusion filter to improve perceived fill factor, but above all; huge improvements to position tracking. Not only is there an increase in fidelity and further reduction in latency and jitter, but the headset is now visible to track from all angles (due to the rear marker coverage), and the camera's minimum tracking distance has been reduced, increasing usable tracking volume and moving it closer to the camera.
There's also the removal the the 'black smear' effect the DK2's panel had (where screen area at 0 brightness would under/overshot when adjacent to bright objects, whereas screen area at 1 level above black would not), which was a hardware limitation of how the Note 3 panel was driven.
I don't understand how the Rift would be of no interest as of yet. It only depends on the games you wanna play. Everything that relys on a seated experience (in reality) would be much better of by using VR headset than monitors. Take sim-racing for instance. The Rift is made for this. Even 3 monitors setup won't give you the immersion of seating in the car. Same goes for plane sims etc. I think the general idea is that it's made for playing FPS's while it's actually a perfect match for sims.
As for the consumer model - 4K would be great but what graphic card would we use to drive this thing in stereo? I always hoped that the res on this will be something reasonable (1440p max) so that a setup allowing the use of Rift would be semi-affordable. I agree with the Oculus founders - res is not as important as is ghosting, latency, refresh rates etc.
"I don't understand how the Rift would be of no interest as of yet. It only depends on the games you wanna play. "
Because software- and configuration-wise, it's a mess. You need to often switch between extended mode and direct modes depending on what game/demo you want to play, many games are built on older versions of the SDK, there's a lot of people who can't run direct mode at all due to crashes, etc.
Having not used Crescent Bay, I'm not 100% sure, but it sounds like the hardware is pretty much ready (or at least as ready as they can get given current display technology), but on the software side it's just nowhere near consumer-ready.
A 4k screen could work, users with lower end hardware would just upscale 1080p to 4k. By the time the rift releases either late this year/early next year going by what they have said, AMD and Nvidia will have newer gpu's out. Nvidia also did announce they are working on VR SLI so maybe when the rift releases those who want to run 4k@90hz will have SLI gtx 980ti's or whatever the next high end nvidia card is.
The big issue Oculus is facing isn't hardware though it's going to the the availability of quality content on release day.
Even with current resolutions, reliable 90Hz operation will require pretty beefy graphics hardware, way more than the cost of the Rift itself.
So what I want to see is foveal tracking integrated into the Rift.
Only a tiny proportion of your field of view is 'high resolution', the further you get from the fovea, the lower the resolution resolution gets. If you track the movements of the eye, you can partition your high resolution display into high and low resolution zones. When the fovea is positioned such that you are looking at one part of the display, that area is rendered at full detail while the rest of the screen is rendered at a lower detail level, saving you huge amounts of processing. It would also be really easy to tune, the faster your hardware, the larger high detail zone would be, and you could dynamically scale the size of the high detail zone of the next frame in response to length of time it took to render the previous frame.
Using this technique you could get the performance you need with far less computation (I can imagine driving a 4k Rift display with a GTX970 for instance). This technique could be used for monitors too, but there you would need to track both head and eye simultaneously, and the area the fovea would cover an arc of many more pixels.
This technique would, however, require high speed/accuracy eyeball tracking hardware to be integrated into the Rift, along with SDK support, graphics API support and probably game support., so it will be a long way off. I can see that it might be more likely in Mobile Rift, where making efficient use of GPU resources is even more important.
Anyway, roll on eye tracking, that's what I say...
Don't think this will work. Our eyes/brain would clearly notice the transition between high and low resolution. It would be like a tunnel vision. Besides that, I don't think there should a current rendering technique where you could slowly fade to lower resolutions. You could make different zones with circular clipping but our eyes would probably notice a very hard border.
You undersell just how high speed that tracking would have to be. Our eyes saccade (move and fixate) extremely fast. It's much harder than the head tracking Oculus is just managing to resolve now. To realize any benefit without gross artifacts you would have to reliably render a new frame probably ten time faster than needed on today's Oculus for head tracking. It turns out tracking and responding with a relevant new frame fast enough is much harder than just rendering the whole frame at high resolution.
There are cool things that can be done if we get really good eye tracking, eg. automated and accurate eye calibration, perspective adjustment for the movement of the pupil, simulated focal adjustments. Performance optimization, however, may be the hardest to achieve.
Ha, I was thinking the same thing this morning. Well-designed games will already render things in lower detail when they're further away, so it would make sense to do the same thing when they're further away from the fovea.
I think you're missing a crucial point. While our eyes only see high resolution in a small area, we can still distinguish changes in our periphery rather easily. You can do your own experiments at home if you have the proper hardware and know how to code a bit, but what you suggest would easily be perceptible. Our eyes are simply too fast and can recognize such changes. You're asking a sensor to track the eye with such precision, but also with such speed that it can then send that data to the computer for processing, process it, and then change the output before the next frame is drawn. For this technique to work, you need to do it extremely fast (much faster than the refresh rate of the display device) for it to appear normal, but then if you have to do it that much faster, you're asking your hardware to do all that work that I mentioned in the previous sentence that much faster.
High res graphics with VR were/are never meant to reach all users of gaming. This is one of those areas where you have to pay to play. It's not cheap, nor will it ever be. Current hardware is always playing catch up to the requirements of games at the best settings, and that's for monitor displays. VR requires so much more processing power than a monitor. I'll say it again. Playing the newest and visually stunning games on VR is not, nor will it be, for everyone. It inherently will always require expensive hardware.
The sad thing is, many games that are a few years old play wonderfully on the Rift and other VR devices, albeit with current hardware. Unfortunately, most people don't want to play "old" games. They want the new games. If people were willing to play older titles, I think they'd have fantastic experiences. It's amazing, even at 75Hz, what a game which is able to be played at 75fps constant, even with older (2, 3, 4 etc years old), looks like in VR. While you always know you're in VR, it is so easy to get lost in the environment of the game world when the experience is absolutely fluid.
"on the way there we passed by the Oculus booth that had a line of people waiting to experience DevKit 2. After Crescent Bay, I can’t help but feel a bit sorry for them"
Hmmm, maybe it was on a different day, but on Tuesday when I was checking out Oculus the line was to check out Crescent Bay. Now, I can't comment on exact specs but I was blown away by the demo. I even had a very cool moment during that last demo (slow mo on-rails sequence). The sequence had bullets flying around and some would pass through "you". Seconds into the demo one such bullet was going towards me and just as hit my imaginary body I felt a buzz/vibration - and for a moment I freaked out, I was thinking - how the heck did they just integrate haptic feedback into this thing!?! Then a second later I realized that it was my friend texting me and the phone in my pocket vibrated. :) But it was such an awesome coincidence and with me so immersed into the demo the vibration really did feel for a moment like it was coming from the area of my torso where the bullet was passing through, not my pants pocket.
The biggest thing I would love to see with Oculus is something that would track my hands and put them into the VR environment. I think with legs there would be too many challenges, but hands are totally doable and that would add soooo much to the feeling of immersivness.
Going from a kickstarter charity project to millionaires and facebook employees probably hasn't done much to add a sense of urgency for Oculus developers, but I can't recall the last product that took so long to go from development prototype to consumer product. I guess time will tell if the Rift ever shows up on store shelves.
Pretty sure Facebook management and shareholders will provide that sense of urgency. It's not a one-man show, and for all of Facebook's problems, it is professionally managed.
Lots of products take more than two years to go from prototype to commercial release. Off the top of my head: iPhone, Xbox One / PS4, Google's cars + Glass + other stuff. There are lots more.
Really? Cars? Thing that can kill you when things go awry are in the same category as XBONE and PS4. p.s.:I didn't put iPhones in there, because "they explode" and Glass is social suicide, so these to may be in the same category as cars.
That is a silly statement - even a "refresh" of existing product lines, such as annual smartphone updates, likely have 18-24 month lead times, if not longer. I would suspect Apple and Samsung already have teams working on the iPhone 8 and Note 6. And those are "established" products where the updates are iterative (not to mention the resources of two of the largest tech companies in the world). This is a new product that has no existing manufacture of technology base/sector - it has to build most of the tech it needs from scratch.
A better question would be to name a new technology that went from hypothesis to consumer availability in two years or less. I can't think of any.
There's always the constant commentary of the consumer Rift being "two years away", but I played Elite: Dangerous with the Rift DK2 (and a HOTAS controller) for maybe 10 hours, and for that particular game, which closely aligns with the "seated VR experience" that Oculus likes to talk about, I feel like all the pieces are nearly there. Improve the pixel resolution, sort out the driver situation ("direct mode" is still wonk), and improve the weight/ergonomics and then it's a totally marketable consumer product for a category of PC gaming. I hope Oculus doesn't get caught up in any scope creep over the consumer Rift needing to be perfect for every conceivable VR experience.
(It's not totally ideal for first-person action games, but I have stomached hours of Minecraft on it and it's pretty dang cool reshaping its blocky wilderness in realistically scaled VR.)
I've done hours of Minecraft on DK1, even! I think they should release as soon as possible (once software is all fixed up and games have been updated), and then save all the other features for later models.
21 Comments
edzieba - Tuesday, January 13, 2015 - link
"I was really interested to see what changes have been made with the latest prototype. The short answer is that ventilation has improved (less fogging up of the glasses), the display resolution is now higher, the screen refreshes faster, and combined with the VR Audio the experience is more immersive than ever."The msot significant changes with Crescent bay are the increased refresh rate (touched on later in the article), new lenses (a lens/fresnel hybrid doublet), a light diffusion filter to improve perceived fill factor, but above all; huge improvements to position tracking. Not only is there an increase in fidelity and further reduction in latency and jitter, but the headset is now visible to track from all angles (due to the rear marker coverage), and the camera's minimum tracking distance has been reduced, increasing usable tracking volume and moving it closer to the camera.
edzieba - Tuesday, January 13, 2015 - link
There's also the removal of the 'black smear' effect the DK2's panel had (where screen area at 0 brightness would under/overshoot when adjacent to bright objects, whereas screen area at 1 level above black would not), which was a hardware limitation of how the Note 3 panel was driven.

bilago - Tuesday, January 13, 2015 - link
I used the CB prototype at Oculus Connect and there was still black smearing.

slatanek - Tuesday, January 13, 2015 - link
I don't understand how the Rift would be of no interest as of yet. It only depends on the games you wanna play. Everything that relies on a seated experience (in reality) would be much better off with a VR headset than with monitors. Take sim-racing, for instance. The Rift is made for this. Even a three-monitor setup won't give you the immersion of sitting in the car. Same goes for flight sims, etc. I think the general idea is that it's made for playing FPSes, while it's actually a perfect match for sims.

As for the consumer model - 4K would be great, but what graphics card would we use to drive this thing in stereo? I always hoped that the resolution on this would be something reasonable (1440p max) so that a setup allowing the use of the Rift would be semi-affordable. I agree with the Oculus founders - resolution is not as important as ghosting, latency, refresh rate, etc.
jhoff80 - Tuesday, January 13, 2015 - link
"I don't understand how the Rift would be of no interest as of yet. It only depends on the games you wanna play. "Because software- and configuration-wise, it's a mess. You need to often switch between extended mode and direct modes depending on what game/demo you want to play, many games are built on older versions of the SDK, there's a lot of people who can't run direct mode at all due to crashes, etc.
Having not used Crescent Bay, I'm not 100% sure, but it sounds like the hardware is pretty much ready (or at least as ready as they can get given current display technology), but on the software side it's just nowhere near consumer-ready.
anthill - Tuesday, January 13, 2015 - link
A 4K screen could work; users with lower-end hardware would just upscale 1080p to 4K. By the time the Rift releases, either late this year or early next year going by what they have said, AMD and Nvidia will have newer GPUs out. Nvidia also announced they are working on VR SLI, so maybe by the time the Rift releases, those who want to run 4K@90Hz will have SLI GTX 980 Tis or whatever the next high-end Nvidia card is.

The big issue Oculus is facing isn't hardware, though - it's going to be the availability of quality content on release day.
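As a rough back-of-the-envelope sketch (the numbers are purely illustrative, and the 1440p/90Hz line is just a guess at a Crescent Bay-class panel, not a published spec), the raw pixel throughput comparison looks something like this:

```python
# Rough pixel-throughput comparison; illustrative numbers only.

def pixels_per_second(width, height, refresh_hz):
    """Raw pixels the GPU must deliver per second for one full-panel stream."""
    return width * height * refresh_hz

dk2  = pixels_per_second(1920, 1080, 75)   # DK2: 1080p panel at 75 Hz
qhd  = pixels_per_second(2560, 1440, 90)   # guessed Crescent Bay-class panel: 1440p at 90 Hz
uhd4 = pixels_per_second(3840, 2160, 90)   # hypothetical 4K consumer panel at 90 Hz

print(f"DK2 (1080p@75Hz): {dk2 / 1e6:6.0f} Mpix/s")
print(f"1440p@90Hz      : {qhd / 1e6:6.0f} Mpix/s ({qhd / dk2:.1f}x DK2)")
print(f"4K@90Hz         : {uhd4 / 1e6:6.0f} Mpix/s ({uhd4 / dk2:.1f}x DK2)")
```

Even before counting the supersampling and distortion-pass overhead that VR rendering adds, a 4K/90Hz panel is roughly five times the DK2's raw pixel rate, which is why SLI or next-generation cards keep coming up in that discussion.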
cmdrdredd - Tuesday, January 13, 2015 - link
Cause flicking my mouse is faster to get a shot off than turning my head.

markbanang - Wednesday, January 14, 2015 - link
Even with current resolutions, reliable 90Hz operation will require pretty beefy graphics hardware, costing way more than the Rift itself.

So what I want to see is foveal tracking integrated into the Rift.
Only a tiny proportion of your field of view is 'high resolution'; the further you get from the fovea, the lower the resolution gets. If you track the movements of the eye, you can partition your high-resolution display into high- and low-detail zones. When the fovea is positioned such that you are looking at one part of the display, that area is rendered at full detail while the rest of the screen is rendered at a lower detail level, saving you huge amounts of processing. It would also be really easy to tune: the faster your hardware, the larger the high-detail zone would be, and you could dynamically scale the size of the high-detail zone for the next frame in response to how long the previous frame took to render (a toy sketch of that feedback loop follows below).
Using this technique you could get the performance you need with far less computation (I can imagine driving a 4K Rift display with a GTX 970, for instance). This technique could be used for monitors too, but there you would need to track both head and eye simultaneously, and the arc covered by the fovea would span many more pixels.
This technique would, however, require high-speed, high-accuracy eye-tracking hardware to be integrated into the Rift, along with SDK support, graphics API support, and probably game support, so it will be a long way off. I can see it being more likely in a mobile Rift, where making efficient use of GPU resources is even more important.
Anyway, roll on eye tracking, that's what I say...
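A toy sketch of that feedback loop, with a simulated renderer standing in for the real engine (all timing numbers and thresholds here are made up for illustration, not anything from the Oculus SDK):

```python
import random

FRAME_BUDGET_S = 1.0 / 90.0          # target frame time for a 90 Hz headset
MIN_RADIUS, MAX_RADIUS = 0.10, 0.45  # foveal zone radius as a fraction of screen height

def simulated_render_time(fovea_radius):
    """Stand-in for a real renderer: cost grows with the area drawn at full
    detail, plus a fixed base cost for the low-detail remainder and some noise."""
    base_cost, full_detail_cost = 0.006, 0.030
    return base_cost + full_detail_cost * fovea_radius ** 2 + random.uniform(0.0, 0.002)

fovea_radius = 0.25
for frame in range(30):
    elapsed = simulated_render_time(fovea_radius)

    # The tuning idea described above: grow the high-detail zone while there is
    # headroom, shrink it whenever the previous frame blew the 90 Hz budget.
    if elapsed < 0.8 * FRAME_BUDGET_S:
        fovea_radius = min(MAX_RADIUS, fovea_radius * 1.05)
    elif elapsed > FRAME_BUDGET_S:
        fovea_radius = max(MIN_RADIUS, fovea_radius * 0.90)

    print(f"frame {frame:2d}: {elapsed * 1000:5.2f} ms, fovea radius {fovea_radius:.2f}")
```

The same controller could just as easily react to GPU load or scene complexity; the point is only that the high-detail zone becomes a single knob that can be adjusted frame to frame.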
OG84 - Wednesday, January 14, 2015 - link
Don't think this will work. Our eyes/brain would clearly notice the transition between high and low resolution. It would be like tunnel vision. Besides that, I don't think there's currently a rendering technique that lets you gradually fade to lower resolutions. You could make different zones with circular clipping, but our eyes would probably notice a very hard border.

Lindorfer - Wednesday, January 14, 2015 - link
You undersell just how high-speed that tracking would have to be. Our eyes saccade (move and fixate) extremely fast. It's much harder than the head tracking Oculus is only just managing to resolve now. To realize any benefit without gross artifacts, you would have to reliably render a new frame probably ten times faster than needed on today's Oculus for head tracking. It turns out that tracking and responding with a relevant new frame fast enough is much harder than just rendering the whole frame at high resolution.

There are cool things that can be done if we get really good eye tracking, e.g. automated and accurate eye calibration, perspective adjustment for the movement of the pupil, and simulated focal adjustments. Performance optimization, however, may be the hardest to achieve.
If you're interested in a deep dive, these videos with Palmer Luckey and John Carmack address some of the difficulties:
https://www.youtube.com/watch?v=gn8m5d74fk8&fe...
https://www.youtube.com/watch?v=8CRdRc8CcGY
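For a rough sense of why the tracking would have to be so fast, here's a back-of-the-envelope calculation; the ~500 deg/s saccade speed and ~2 deg fovea are approximate textbook figures, not anything Oculus has published:

```python
# How far can the gaze move within a single displayed frame?
SACCADE_SPEED_DEG_PER_S = 500.0  # rough peak saccade velocity (large saccades go higher)
FOVEA_DIAMETER_DEG = 2.0         # the truly high-acuity region is only about 2 degrees wide

for refresh_hz in (75, 90, 1000):
    frame_time_s = 1.0 / refresh_hz
    travel_deg = SACCADE_SPEED_DEG_PER_S * frame_time_s
    print(f"{refresh_hz:4d} Hz: frame = {frame_time_s * 1000:5.2f} ms, "
          f"gaze can travel ~{travel_deg:4.1f} deg ({travel_deg / FOVEA_DIAMETER_DEG:.1f}x the fovea)")
```

At 75-90 Hz, a single frame gives the gaze enough time to land several fovea-widths away from wherever the high-detail zone was drawn, which is roughly where the "ten times faster" estimate above comes from.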
mkozakewich - Wednesday, January 14, 2015 - link
Ha, I was thinking the same thing this morning. Well-designed games already render things in lower detail when they're further away, so it would make sense to do the same thing when they're further from the fovea.

JohnnyBoBells - Friday, January 16, 2015 - link
I think you're missing a crucial point. While our eyes only see high resolution in a small area, we can still distinguish changes in our periphery rather easily. You can do your own experiments at home if you have the proper hardware and know how to code a bit, but what you suggest would easily be perceptible. Our eyes are simply too fast and can recognize such changes. You're asking a sensor to track the eye with enough precision, and enough speed, that it can send that data to the computer, have it processed, and change the output before the next frame is drawn. For this technique to work, all of that has to happen extremely fast (much faster than the refresh rate of the display) for the result to appear normal - and if it has to happen that much faster, you're asking your hardware to do all of the work I just described that much faster too.

High-res graphics with VR were never meant to reach all users of gaming. This is one of those areas where you have to pay to play. It's not cheap, nor will it ever be. Current hardware is always playing catch-up to the requirements of games at the best settings, and that's for monitor displays. VR requires far more processing power than a monitor. I'll say it again: playing the newest and most visually stunning games in VR is not, nor will it be, for everyone. It will always require expensive hardware.
The sad thing is, many games that are a few years old play wonderfully on the Rift and other VR devices, albeit with current hardware. Unfortunately, most people don't want to play "old" games; they want the new ones. If people were willing to play older titles, I think they'd have fantastic experiences. It's amazing what a game that can hold a constant 75fps at 75Hz looks like in VR, even a title that's 2, 3, or 4 years old. While you always know you're in VR, it is so easy to get lost in the game world when the experience is absolutely fluid.
2late2die - Tuesday, January 13, 2015 - link
"on the way there we passed by the Oculus booth that had a line of people waiting to experience DevKit 2. After Crescent Bay, I can’t help but feel a bit sorry for them"Hmmm, maybe it was on a different day, but on Tuesday when I was checking out Oculus the line was to check out Crescent Bay.
Now, I can't comment on exact specs, but I was blown away by the demo. I even had a very cool moment during that last demo (the slow-mo on-rails sequence). The sequence had bullets flying around, and some would pass through "you". Seconds into the demo one such bullet was coming towards me, and just as it hit my imaginary body I felt a buzz/vibration - and for a moment I freaked out, thinking: how the heck did they just integrate haptic feedback into this thing!?! Then a second later I realized that it was my friend texting me and the phone in my pocket vibrating. :)
But it was such an awesome coincidence, and with me so immersed in the demo, the vibration really did feel for a moment like it was coming from the area of my torso the bullet was passing through, not my pants pocket.
The biggest thing I would love to see with Oculus is something that would track my hands and put them into the VR environment. I think with legs there would be too many challenges, but hands are totally doable, and that would add soooo much to the feeling of immersiveness.
mkozakewich - Wednesday, January 14, 2015 - link
Ooh, wire up a Kinect! They can be used synergistically!

sorten - Tuesday, January 13, 2015 - link
Going from a Kickstarter charity project to millionaires and Facebook employees probably hasn't done much to add a sense of urgency for Oculus developers, but I can't recall the last product that took so long to go from development prototype to consumer product. I guess time will tell if the Rift ever shows up on store shelves.
Pretty sure Facebook management and shareholders will provide that sense of urgency. It's not a one-man show, and for all of Facebook's problems, it is professionally managed.

Lots of products take more than two years to go from prototype to commercial release. Off the top of my head: iPhone, Xbox One / PS4, Google's cars + Glass + other stuff. There are lots more.
SleepyFE - Tuesday, January 13, 2015 - link
Really? Cars? Things that can kill you when things go awry are in the same category as the XBONE and PS4?

P.S.: I didn't put iPhones in there, because "they explode", and Glass is social suicide, so these two may be in the same category as cars.
HammerStrike - Wednesday, January 14, 2015 - link
That is a silly statement - even a "refresh" of an existing product line, such as an annual smartphone update, likely has an 18-24 month lead time, if not longer. I would suspect Apple and Samsung already have teams working on the iPhone 8 and Note 6. And those are "established" products where the updates are iterative (not to mention the resources of two of the largest tech companies in the world). This is a new product with no existing manufacturing or technology base to draw on - it has to build most of the tech it needs from scratch.

A better question would be to name a new technology that went from hypothesis to consumer availability in two years or less. I can't think of any.
SeannyB - Tuesday, January 13, 2015 - link
There's always the constant commentary about the consumer Rift being "two years away", but I played Elite: Dangerous with the Rift DK2 (and a HOTAS controller) for maybe 10 hours, and for that particular game, which closely aligns with the "seated VR experience" that Oculus likes to talk about, I feel like all the pieces are nearly there. Improve the pixel resolution, sort out the driver situation ("direct mode" is still wonky), and improve the weight/ergonomics, and then it's a totally marketable consumer product for a category of PC gaming. I hope Oculus doesn't get caught up in any scope creep over the consumer Rift needing to be perfect for every conceivable VR experience.

(It's not totally ideal for first-person action games, but I have stomached hours of Minecraft on it, and it's pretty dang cool reshaping its blocky wilderness in realistically scaled VR.)
mkozakewich - Wednesday, January 14, 2015 - link
I've done hours of Minecraft on DK1, even! I think they should release as soon as possible (once the software is all fixed up and games have been updated), and then save all the other features for later models.

AnnonymousCoward - Sunday, January 18, 2015 - link
Those built-in headphones look like they suck. I hope they're easy to get out of the way, so I can wear my Sony MDR.