"In the underwater boat demo, my eyes were always focusing on the display, which appeared to be distant. However, fish swam by extremely close to my eyes staying perfectly in focus with no double vision effects. When I tried to do something similar after the demo, I was disoriented because the real world didn’t work the same as Vive."
Great Article, I just wanted to clarify on that one thing. It's actually the accommodation cue that is missing that you noticed. Double vision doesn't really have to do with it though, the vergence/convergence works exactly as it would in real life on the Vive and Rift. Accommodation, which is missing, is the blur when you aren't directly focused on an objects that is not there yet, basically depth of field (what I think you are describing).
So if you focus on something near with even only one eye, things behind it blur and vice-versa. (It's weak visual cue anyways though) The Vive and rift can emulate all of the other visual cues, I think there are 7 or 8 of them in total? It may eventually benefit from variable focus lenses also I had read, but I don't know what that would add exactly.
Happy to hear you enjoyed it btw. I tried it on the HTC Vive tour and it was amazing. Very dream-like as you describe.
Sorry, I forgot to mention if the fish got really close to your face like inches away it would affect vergence and give you double vision due to the accommodation/vergence conflict. But the overall scene shouldn't be affected by double vision.
I know of some VR tech coming down the pipe (can't recall by exactly who), that has eye tracking built in, and then can simulate depth of focus blur in software cause it knows exactly what you're looking at. Hopefully we'll see this soon.
The most promising technology for this is light field displays, which were demo'd a few times before already. They can reconstruct a good approximation of the full light field (position and direction) instead of merely two flat planes, so you get accommodation naturally.
The Fove tracks your sight to blur the image accordingly, but there is no real eye lenses accommodation unlike with light field displays refered by Friendly0Fire
I've been following VR development closely for some time now and I think I've read just about every review that's been posted for these devices. To be frank, this review was extremely light on detail and interest compared to so many other reviews that go into much greater detail about the experience of the Vive and its features. I encourage anyone interested in this tech to search for more thorough reviews if you're truly interested.
That being said, the one unique aspect to this review that I found very interesting was the description at the end about the perceptual shift that you underwent when taking the headset off and your description of the experience as seeming "dream-like" on reflection afterwards. That is an insight that I had not read before. I have not used the Vive myself but I have wondered about the lack of focal blurring/double-imaging as you described and couldn't understand why more wasn't said about it in other reviews. Is it really unnoticeable? was my continual question. But I think you've really provided a key insight here: it's not unnoticeable, but you can adapt to it pretty quickly. The result being that there is a perceptual shift when going into and out of this VR experience that can be disconcerting. Very interesting.
Unless the fish got really really close to his face he should still have seen things exactly as they were in real life throughout the entire scene. But due to the lack of accommodation it's a little hard to focus things right in front of your face, it's minor. So it's accommodation he is describing, but it only has an impact on vergence when things are really close like that. I couldn't tell exactly you why though, something to do with accommodation/vergence conflict. There are papers on it.
Lack of accommodation is overall extremely subtle. Some argue that in VR it can be better than real life vision for the overall scene because nothing blurs in the foreground or background when you focus on something.
I think if it were added with fast enough eye tracking and some sort of layered or lightfield type display it would add a subtle immerssive factor.
Hi bji, this wasn't a review. Hence why it's a single pager in the pipeline section of the website. For Josh this was his first proper VR experience (rather than the Gear) and he wanted to put a lot of his subjective thoughts on paper, and should be read as such in an anecdotal format. I did the same thing back when I tested the Vive back at MWC. Until we get a device in-house and devise an applicable testing strategy that is actually meaningful and beneficial beyond a user experience analysis, we won't be publishing a full, AnandTech class review.
Thank you for the info and the link. I find it interesting that the demos always are limited to around 20 minutes ... for Valve. Oculus, and Morpheus. I have this gut feeling that they believe that 20 minutes is all that the uninitiated to VR can handle before having a significant chance of developing VR sickness. Also I have to wonder if there isn't any kind of wrangling with whatever government agency would have the responsiblity for ensuring the safety of these devices, to give those agencies time to test the units to ensure that they don't pose an unnecessary risk to eyesight or some other health risk. Could this be why these devices are taking so long to get to market?
For my Vive demo, it was purely because at MWC they had two devices for the whole show and it was invite-only for press - one person per publication. The schedule was tight and they needed everyone to be in and out on time in order to process everyone through the show.
Plus, there's the argument of best foot forward. As HTC are the only company with their prototypes, they get to dictate the demo and focus the experience on what they deem to be their best foot forward.
With VR like this, there are two types - free movement, or sitting. For most home users, sitting will be the majority use model, whereas business is more likely to embrace the free movement aspect in order to accelerate their workflow. Of course in the business side of things, you have to be insured and regulated up the wazoo. Long term effects, there have been studies about altered vision perception (beyond glasses/lenses) which they can draw on, if it differs much from watching a monitor with glasses. Standard government suggestions apply I would assume - a break every hour etc.
It may be an artefact of having worked with the DK1 and DK2 for quite some time, but using the Vive I experienced LESS visual adaptation(/re-adaptation) after removing it. This could either be a result of the Vive's improved tracking and higher refresh rate, or simply that with habitual (or even regular) VR use the brain adapts to switch between real-world accommodation/vergence matching and VR accommodation/vergence mismatch (fixed accommodation) more quickly and easily.
It'll be interesting to see what solution for replicating accomodation becomes viable first: Lightfield displays (with their hit to effective resolution, something VR is already starved for by several orders of magnitude) which offer the 'perfect' implementation, or adaptive lenses (e.g. oil-filled lenses) combined with high-speed eye-tracking. While adaptive lenses can only replicate accomodation cues correctly for the foveal region, that's the only region that can reliably DETECT accommodation changes, so is of minimal issue. You also have all the other useful things eye-tracking can do, like foveated rendering, lens axis offset compensation, depth-correct DoF blur, saccade blanking tricks, etc.
You propose some tech that I had not heard discussed before with regards to better emulation of focal depth and accomodation etc. I don't know if any of them are practical but what I do know is that over the next 5 - 10 years we're going to see an amazing evolution of VR tech with ideas like these, or similar to them, being incorporated into the designs. I have no doubt that within the next 10 years we're going to get "close enough" to perfect visual fidelity in the VR experience so as to be a completely "solved" problem.
Kinesthetic sense is going to be the real difficult nut to crack ...
Haptics is the "we don't even know where to practically start" of consumer VR. Actuated exoskeletons (e.g. Cybergrasp) aren't going to reduce in price due to sheer part complexity, and have significant liability concerns. Plus all current systems need one or more extra people to assist the suer getting into and out of the device safely. Combianations of tactice actuators and TENS stimulators are mechancially simpler, but still have the liability concerns (and issues with correct electrode placement for untrained users). And even these only stimulate a limited number of your proprioceptive senses.
In the near-term, limited haptic devices that use clever tricks to fool your senses in a small number of scenarios are probably going to be the norm. Tactical Haptics have a neat system of moving handgrips that can effective simulate some of the effects of a mass moving in your hand. They stimulate your tactile skin senses of a weight causing an object to shift in your grip, but does not impart any force the the muscle spindles in your forearm.
When a lot of people in the field are of the opinion "it'll probably end up being easier to invectively stimulate nerves directly than replicate the stimuli externally" you know it's not going to be as easy a problem to solve as vision or audio.
Imagine 2 weight lifting resistance ropes attached to the ceiling or outer wall with wraps around your hand. Program the resistance with the game/simulation to control movement. It's a limited concept, but add enough ropes... Bondage style, anything like that out there?
There's a '3D force feedback mouse' design that uses a single controlled point and a set of cables, one to each corner attached to a motor/encoder, with the controlled point in the centre. e.g. The Inca 6D (https://www.youtube.com/watch?v=MWbFMP6rcSs).
Unfortunately, this is no good if you want to effectively walk around blindfolded while using it. Even worse if you want two or more points. You're more likely to strangle yourself than build an effective system.
>Almost every mobile VR headset like Gear VR or Google Cardboard just >doesn’t work with glasses very well
Cardboard works fine with glasses, and Gear VR isn't supposed to be used with glasses. You just adjust the dial until the screen is in focus. If you're too near or far-sighted to be in the range it can adjust to, you may have to wear contacts to use it. I'm near-sighted, and Gear VR focuses fine for me.
Gear VR is a lot of fun despite only being able to play phone games. Vive and OR should be fantastic.
Has the author tried the newest Oculus Rift demo with motion controls? I'm curious how Vive compares. Not sure if the Oculus motion controls have been demoed yet to press
Oculus Touch has been demoed at several events to the press, but usually the demos are invite-only due to the more involved setup (two rooms, one person in each with a CV1 + touch setup, networked in the same virtual space) and longer duration.
This is a subject that I am really excited about since I truly hope it revivals my interest in gaming, which I lost some years ago and I really miss. I believe I will be one of the early adopters and I hope AnandTech continues to analyze and write about VR, hopefully more often.
I was the QA Manager for HTC Vive and worked with content producers to deliver the best VR user experience. Even with HTC's disappointing financial results which turned to work force reduction, I still have to say Vive delivers the best full VR experience that truly is the game changer.
My team spent hours running demos and comparing user experience between Vive and other VR HMDs, mostly Oculus Rift. The ability to walk around VR world and engage objects in 360 degree view creates a environment not only for gaming but also other contents like walking around a art museum and examining artifacts up-close. The job training game is the other great example the concept can be used in rehabilitation.
One thing to keep in mind is OpenVR SDK is open source and as soon as Valve decides to open source the technology for HMD, Lighthouse and controller, there will be many Vive clones out in the market. Just like when HTC made the first Android smart phone G1 and may other companies followed.
Wow sorry to hear about your job loss due to HTC downsizing. Sounds like an exciting time to have been at the company, I wish it could have lasted longer for you.
Can you describe any ways in which the Vive was inferior to other headsets? Weight, comfort, display quality, lens quality, etc? Was there anything? Or was the Vive just superior in every way.
I personally find the Vive the most compelling of all of the VR headsets that have been announced and just cannot wait until they are released. I am in the process of building a new PC just for the purpose of being able to drive a Vive. For the first time in my life I'm buying an expensive ($500) video card, it's a brave new world for me for sure!
bji, thank you for the kind word. Yes it was a really exciting job to be able to test different VR setups everyday. Too bad, HTC is in financial trouble but the good news is we transferred all testing knowledge to QA teams in Taiwan. They will continue our works and ensure HTC delivers the best VR experience by release Vive end of this year.
Here is the PC spec. from our VR Test Lab: • NVIDIA GTX 970 / AMD 290 equivalent or greater • Intel i5-4590 equivalent or greater • 16GB+ RAM • Compatible HDMI 1.3 video output • USB 3.0 or greater
Compares with other VR setups, Vive requires higher spec. PC to drive it. We calculated the minimum spec. will be around $1500 price range. Vive HMD v1 is heavier and not as comfortable compare to Oculus Rift. In addition, setting up the lighthouse correctly requires some trial and error. However after everything setup correctly, Vive is the best in its own class and it makes all other VR experience feels like Virtual Boy!
27 Comments
Shadowmaster625 - Monday, September 14, 2015 - link
That oscilloscope smile is beyond creepy.
user3311 - Monday, September 14, 2015 - link
"In the underwater boat demo, my eyes were always focusing on the display, which appeared to be distant. However, fish swam by extremely close to my eyes staying perfectly in focus with no double vision effects. When I tried to do something similar after the demo, I was disoriented because the real world didn’t work the same as Vive."Great Article, I just wanted to clarify on that one thing. It's actually the accommodation cue that is missing that you noticed. Double vision doesn't really have to do with it though, the vergence/convergence works exactly as it would in real life on the Vive and Rift. Accommodation, which is missing, is the blur when you aren't directly focused on an objects that is not there yet, basically depth of field (what I think you are describing).
So if you focus on something near, even with only one eye, things behind it blur, and vice versa. (It's a weak visual cue anyway.) The Vive and Rift can emulate all of the other visual cues; I think there are 7 or 8 of them in total? I had read it may eventually benefit from variable-focus lenses too, but I don't know exactly what that would add.
Happy to hear you enjoyed it btw. I tried it on the HTC Vive tour and it was amazing. Very dream-like as you describe.
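To put rough numbers on the vergence side of this, here is a back-of-the-envelope sketch (the 63 mm interpupillary distance and the ~1.4 m virtual screen distance are illustrative assumptions, not figures from the article or the headset specs):

```python
import math

def vergence_deg(dist_m, ipd_m=0.063):
    """Total convergence angle (degrees) both eyes need to fixate a
    target at dist_m, assuming a typical 63 mm interpupillary distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / dist_m))

# In an HMD the eyes stay accommodated to the fixed virtual screen
# (assume ~1.4 m here), so the accommodation/vergence mismatch grows
# sharply only when a target gets very close to the face:
screen = vergence_deg(1.4)
for target_m in (2.0, 0.5, 0.1):
    print(f"{target_m} m: mismatch {vergence_deg(target_m) - screen:+.1f} deg")
```

At 2 m the mismatch is under a degree, while at 10 cm it is tens of degrees, which is consistent with double vision only appearing when the fish gets inches from your face.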
user3311 - Monday, September 14, 2015 - link
Oops, typos and grammar errors in parts of my post but I couldn't edit.
user3311 - Monday, September 14, 2015 - link
Sorry, I forgot to mention: if the fish got really close to your face, like inches away, it would affect vergence and give you double vision due to the accommodation/vergence conflict. But the overall scene shouldn't be affected by double vision.
scbundy - Monday, September 14, 2015 - link
I know of some VR tech coming down the pipe (can't recall exactly by whom) that has eye tracking built in and can then simulate depth-of-focus blur in software, because it knows exactly what you're looking at. Hopefully we'll see this soon.
Friendly0Fire - Monday, September 14, 2015 - link
The most promising technology for this is light field displays, which have been demoed a few times already. They can reconstruct a good approximation of the full light field (position and direction) instead of merely two flat planes, so you get accommodation naturally.
CryingCyclops - Monday, September 14, 2015 - link
I think you might be thinking of the Fove. http://www.getfove.com/
crim3 - Wednesday, September 16, 2015 - link
The Fove tracks your gaze to blur the image accordingly, but there is no real eye-lens accommodation, unlike with the light field displays referred to by Friendly0Fire.
bji - Monday, September 14, 2015 - link
I've been following VR development closely for some time now and I think I've read just about every review that's been posted for these devices. To be frank, this review was extremely light on detail and interest compared to so many other reviews that go into much greater detail about the experience of the Vive and its features. I encourage anyone interested in this tech to search for more thorough reviews if you're truly interested.
That being said, the one unique aspect to this review that I found very interesting was the description at the end about the perceptual shift that you underwent when taking the headset off, and your description of the experience as seeming "dream-like" on reflection afterwards. That is an insight that I had not read before. I have not used the Vive myself but I have wondered about the lack of focal blurring/double-imaging as you described and couldn't understand why more wasn't said about it in other reviews. Is it really unnoticeable? was my continual question. But I think you've really provided a key insight here: it's not unnoticeable, but you can adapt to it pretty quickly. The result is that there is a perceptual shift when going into and out of this VR experience that can be disconcerting. Very interesting.
user3311 - Monday, September 14, 2015 - link
Unless the fish got really, really close to his face, he should still have seen things exactly as they were in real life throughout the entire scene. But due to the lack of accommodation it's a little hard to focus on things right in front of your face; it's minor. So it's accommodation he is describing, but it only has an impact on vergence when things are really close like that. I couldn't tell you exactly why, though; something to do with the accommodation/vergence conflict. There are papers on it.
Lack of accommodation is overall extremely subtle. Some argue that in VR it can be better than real-life vision for the overall scene, because nothing blurs in the foreground or background when you focus on something.
I think if it were added with fast enough eye tracking and some sort of layered or light-field-type display, it would add a subtle immersive factor.
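As a sketch of how the software side of that could work once you have eye tracking: sample the depth buffer at the gaze point, then derive a per-pixel blur radius from the thin-lens circle-of-confusion formula. The aperture and focal-length values below are hypothetical placeholders, not from any shipping headset.

```python
import numpy as np

def circle_of_confusion(depth_m, focus_m, aperture_m=0.004, focal_m=0.05):
    """Thin-lens blur (circle of confusion) diameter for scene points at
    depth_m when the eye/camera is focused at focus_m. All units metres."""
    return aperture_m * focal_m * np.abs(depth_m - focus_m) / (
        depth_m * (focus_m - focal_m))

def gaze_dof_blur(depth_buffer, gaze_px):
    """Pick the focus distance from the depth buffer at the tracked gaze
    point, and return a blur-radius map a renderer could feed into a
    depth-of-field filter."""
    gy, gx = gaze_px
    return circle_of_confusion(depth_buffer, depth_buffer[gy, gx])
```

The blur is zero exactly at the fixated depth and grows for anything nearer or farther, which is the depth-of-field cue the headsets currently leave out.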
Ian Cutress - Monday, September 14, 2015 - link
Hi bji, this wasn't a review, hence why it's a single pager in the pipeline section of the website. For Josh this was his first proper VR experience (rather than the Gear), and he wanted to put a lot of his subjective thoughts on paper; it should be read as such, in an anecdotal format. I did the same thing when I tested the Vive back at MWC. Until we get a device in-house and devise an applicable testing strategy that is actually meaningful and beneficial beyond a user experience analysis, we won't be publishing a full, AnandTech-class review.
My thoughts on the demo I had:
http://anandtech.com/show/9048
bji - Tuesday, September 15, 2015 - link
Thank you for the info and the link. I find it interesting that the demos are always limited to around 20 minutes for Valve, Oculus, and Morpheus. I have this gut feeling that they believe 20 minutes is all that the uninitiated to VR can handle before having a significant chance of developing VR sickness. Also, I have to wonder if there isn't some kind of wrangling with whatever government agency would have the responsibility for ensuring the safety of these devices, to give those agencies time to test the units to ensure that they don't pose an unnecessary risk to eyesight or some other health risk. Could this be why these devices are taking so long to get to market?
IanCutress - Tuesday, September 15, 2015 - link
For my Vive demo, it was purely because at MWC they had two devices for the whole show and it was invite-only for press - one person per publication. The schedule was tight and they needed everyone to be in and out on time in order to process everyone through the show.
Plus, there's the argument of best foot forward. As HTC are the only company with their prototypes, they get to dictate the demo and focus the experience on what they deem to be their best foot forward.
With VR like this, there are two types - free movement, or sitting. For most home users, sitting will be the majority use model, whereas business is more likely to embrace the free movement aspect in order to accelerate their workflow. Of course, on the business side of things, you have to be insured and regulated up the wazoo. For long-term effects, there have been studies about altered vision perception (beyond glasses/lenses) which they can draw on, if it differs much from watching a monitor with glasses. Standard government suggestions apply, I would assume - a break every hour, etc.
edzieba - Monday, September 14, 2015 - link
It may be an artefact of having worked with the DK1 and DK2 for quite some time, but using the Vive I experienced LESS visual adaptation(/re-adaptation) after removing it. This could either be a result of the Vive's improved tracking and higher refresh rate, or simply that with habitual (or even regular) VR use the brain adapts to switch between real-world accommodation/vergence matching and VR accommodation/vergence mismatch (fixed accommodation) more quickly and easily.
It'll be interesting to see which solution for replicating accommodation becomes viable first: lightfield displays (with their hit to effective resolution, something VR is already starved for by several orders of magnitude), which offer the 'perfect' implementation, or adaptive lenses (e.g. oil-filled lenses) combined with high-speed eye-tracking. While adaptive lenses can only replicate accommodation cues correctly for the foveal region, that's the only region that can reliably DETECT accommodation changes, so this is of minimal issue. You also get all the other useful things eye-tracking can do, like foveated rendering, lens axis offset compensation, depth-correct DoF blur, saccade blanking tricks, etc.
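The foveated-rendering part is simple enough to sketch: with a gaze estimate you can drop the shading rate with angular eccentricity. The 5-degree/15-degree thresholds and the rate values below are hypothetical; a real system would tune them against perceptual studies.

```python
import numpy as np

def shading_rates(tile_centers_deg, gaze_deg, fovea=5.0, periphery=15.0):
    """Assign each screen tile a shading rate from its angular distance
    to the tracked gaze point: full rate inside the fovea, 1/4 rate in
    the near periphery, 1/16 rate beyond."""
    ecc = np.linalg.norm(np.asarray(tile_centers_deg) - gaze_deg, axis=-1)
    return np.where(ecc < fovea, 1.0, np.where(ecc < periphery, 0.25, 0.0625))
```

Since the fovea covers only a few degrees of a ~110-degree headset field of view, most tiles land in the cheapest band, which is where the large rendering savings come from.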
bji - Monday, September 14, 2015 - link
You propose some tech that I had not heard discussed before with regard to better emulation of focal depth and accommodation etc. I don't know if any of them are practical, but what I do know is that over the next 5 - 10 years we're going to see an amazing evolution of VR tech, with ideas like these, or similar to them, being incorporated into the designs. I have no doubt that within the next 10 years we're going to get "close enough" to perfect visual fidelity in the VR experience for it to be a completely "solved" problem.
Kinesthetic sense is going to be the real difficult nut to crack ...
edzieba - Monday, September 14, 2015 - link
Haptics is the "we don't even know where to practically start" of consumer VR. Actuated exoskeletons (e.g. Cybergrasp) aren't going to come down in price due to sheer part complexity, and have significant liability concerns. Plus, all current systems need one or more extra people to assist the user in getting into and out of the device safely. Combinations of tactile actuators and TENS stimulators are mechanically simpler, but still have the liability concerns (and issues with correct electrode placement for untrained users). And even these only stimulate a limited number of your proprioceptive senses.
In the near term, limited haptic devices that use clever tricks to fool your senses in a small number of scenarios are probably going to be the norm. Tactical Haptics have a neat system of moving handgrips that can effectively simulate some of the effects of a mass moving in your hand. They stimulate your tactile skin senses with a weight causing an object to shift in your grip, but impart no force to the muscle spindles in your forearm.
When a lot of people in the field are of the opinion that "it'll probably end up being easier to invasively stimulate nerves directly than replicate the stimuli externally", you know it's not going to be as easy a problem to solve as vision or audio.
TheFuzz77 - Monday, September 14, 2015 - link
Imagine 2 weight lifting resistance ropes attached to the ceiling or outer wall with wraps around your hand. Program the resistance with the game/simulation to control movement. It's a limited concept, but add enough ropes... Bondage style, anything like that out there?
Yaldabaoth - Monday, September 14, 2015 - link
There are plenty of awesome tension/predicament rope bondage sites......wait.... VR? Never mind.
edzieba - Monday, September 14, 2015 - link
There's a '3D force feedback mouse' design that uses a single controlled point and a set of cables, one to each corner attached to a motor/encoder, with the controlled point in the centre. e.g. The Inca 6D (https://www.youtube.com/watch?v=MWbFMP6rcSs).
Unfortunately, this is no good if you want to effectively walk around blindfolded while using it. Even worse if you want two or more points. You're more likely to strangle yourself than build an effective system.
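To illustrate the control problem for a cable rig like that (a hypothetical sketch, not how the Inca 6D actually does it): cables can only pull, so you solve a least-squares problem for tensions matching the desired force, then add a uniform pretension to keep every cable taut. The pretension trick preserves the net force only when the unit anchor directions sum to roughly zero, as in a symmetric rig.

```python
import numpy as np

def cable_tensions(anchors, point, force, min_tension=0.0):
    """Tensions (one per cable) so the net pull on `point` equals `force`.
    Assumes a symmetric rig: the unit anchor directions sum to ~zero, so
    adding the same pretension to every cable leaves the force unchanged."""
    dirs = np.asarray(anchors, float) - point
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit pull directions
    D = dirs.T                                    # maps tensions -> net force
    t, *_ = np.linalg.lstsq(D, force, rcond=None)  # minimum-norm tensions
    t += max(0.0, min_tension - t.min())           # uniform pretension
    return t
```

With eight cables to the corners of a cube and the point at the centre, half the raw least-squares tensions come out negative (pushing); the pretension shifts them all up so no cable goes slack while the commanded force is preserved.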
JeffFlanagan - Monday, September 14, 2015 - link
> Almost every mobile VR headset like Gear VR or Google Cardboard just doesn’t work with glasses very well
Cardboard works fine with glasses, and Gear VR isn't supposed to be used with glasses. You just adjust the dial until the screen is in focus. If you're too near or far-sighted to be in the range it can adjust to, you may have to wear contacts to use it. I'm near-sighted, and Gear VR focuses fine for me.
Gear VR is a lot of fun despite only being able to play phone games. Vive and OR should be fantastic.
tonyoramos1 - Monday, September 14, 2015 - link
Has the author tried the newest Oculus Rift demo with motion controls? I'm curious how Vive compares. Not sure if the Oculus motion controls have been demoed yet to press.
edzieba - Monday, September 14, 2015 - link
Oculus Touch has been demoed at several events to the press, but usually the demos are invite-only due to the more involved setup (two rooms, one person in each with a CV1 + Touch setup, networked in the same virtual space) and longer duration.
Badelhas - Tuesday, September 15, 2015 - link
This is a subject that I am really excited about, since I truly hope it revives my interest in gaming, which I lost some years ago and really miss. I believe I will be one of the early adopters, and I hope AnandTech continues to analyze and write about VR, hopefully more often.
akbisw - Tuesday, September 15, 2015 - link
"Two headset itself contains " Should be The headset. #typoBu11dog - Tuesday, September 15, 2015 - link
I was the QA Manager for HTC Vive and worked with content producers to deliver the best VR user experience. Even with HTC's disappointing financial results, which led to workforce reductions, I still have to say Vive delivers the best full VR experience, one that truly is a game changer.
My team spent hours running demos and comparing user experience between Vive and other VR HMDs, mostly the Oculus Rift. The ability to walk around the VR world and engage objects in a 360-degree view creates an environment not only for gaming but also for other content, like walking around an art museum and examining artifacts up close. The job training game is another great example; the concept can be used in rehabilitation.
One thing to keep in mind is that the OpenVR SDK is open source, and as soon as Valve decides to open source the technology for the HMD, Lighthouse, and controller, there will be many Vive clones out in the market. Just like when HTC made the first Android smartphone, the G1, and many other companies followed.
bji - Wednesday, September 16, 2015 - link
Wow, sorry to hear about your job loss due to HTC downsizing. Sounds like an exciting time to have been at the company; I wish it could have lasted longer for you.
Can you describe any ways in which the Vive was inferior to other headsets? Weight, comfort, display quality, lens quality, etc? Was there anything? Or was the Vive just superior in every way?
I personally find the Vive the most compelling of all of the VR headsets that have been announced and just cannot wait until they are released. I am in the process of building a new PC just for the purpose of being able to drive a Vive. For the first time in my life I'm buying an expensive ($500) video card, it's a brave new world for me for sure!
Bu11dog - Wednesday, September 16, 2015 - link
bji, thank you for the kind words. Yes, it was a really exciting job to be able to test different VR setups every day. Too bad HTC is in financial trouble, but the good news is we transferred all testing knowledge to the QA teams in Taiwan. They will continue our work and ensure HTC delivers the best VR experience by releasing Vive at the end of this year.
Here is the PC spec from our VR Test Lab:
• NVIDIA GTX 970 / AMD 290 equivalent or greater
• Intel i5-4590 equivalent or greater
• 16GB+ RAM
• Compatible HDMI 1.3 video output
• USB 3.0 or greater
Compared with other VR setups, Vive requires a higher-spec PC to drive it. We calculated the minimum spec will be in the $1500 price range. The Vive HMD v1 is heavier and not as comfortable compared to the Oculus Rift. In addition, setting up the Lighthouse correctly requires some trial and error. However, after everything is set up correctly, Vive is the best in its class and it makes every other VR experience feel like the Virtual Boy!
Here is the Vive setup document if you like to start preparing for it: https://dl.dropboxusercontent.com/u/9855505/vr_set...