9 Comments
boeush - Tuesday, May 30, 2017 - link
So, here's what I don't get. If these wireless VR headsets are intended to display video content rendered on an external graphics card, then let's see: 90 [FPS - 'meh', I'd rather see closer to 120, but OK] x (1080 x 1200 x 2 [kinda 'meh', IMO, but OK]) x 24 [bpp - no HDR, sorry] =~ 5.6 Gbps

WiGig is only capable of up to 7 Gbps (under ideal conditions, and only by simultaneously using the 60, 5, and 2.4 GHz bands in a tri-band configuration). So how are they planning on "supporting multiple users sharing the same space"? (And what is 'multiple' - do they seriously mean more than 2?!?)
Sure, they can try to compress the hell out of the video stream. But intensive decompression workloads would help kill the battery of the headset even faster than the constant-on WiGig streaming, video display to both eyes, and audio to both ears would be doing already...
Also, what is 'XR'?
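For what it's worth, the back-of-the-envelope arithmetic above can be reproduced as a quick sketch (the resolution, frame-rate, and bit-depth figures are the ones quoted in the comment, not official specs):

```python
# Raw (uncompressed) video bitrate for a wireless VR headset, using the
# figures from the comment above (1080x1200 per eye, 24 bpp, 90 FPS).
def raw_bitrate_gbps(width, height, eyes, bpp, fps):
    """Uncompressed video bitrate in gigabits per second."""
    return width * height * eyes * bpp * fps / 1e9

vr = raw_bitrate_gbps(1080, 1200, eyes=2, bpp=24, fps=90)
print(f"Per-headset raw stream: {vr:.2f} Gbps")  # ~5.60 Gbps, vs WiGig's ~7 Gbps ideal-case peak
```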
MrHollow - Wednesday, May 31, 2017 - link
Well, I think they meant multiple users, each with their own PC and HMD - not one PC driving multiple HMDs.

boeush - Wednesday, May 31, 2017 - link
That wasn't the point: the point is that all those users would be sharing the same spectrum in the same room, and within that room there are only so many bits that can be flying around at any given time (any more, and packets would be getting dropped due to signal interference).

yhselp - Wednesday, May 31, 2017 - link
Isn't there some compression happening along the way? HDMI 2.0 supports 4K@60fps with HDR, which by your calculations would require 28.8 Gbps, and yet HDMI 2.0 only allows for 18 Gbps. Is there something I'm missing?

boeush - Wednesday, May 31, 2017 - link
Not sure how you get to the 28.8 number. 3840 x 2160 (4K) x 60 (Hz) x 30 (bpp, at 10 bpc) =~ 15 Gbps. At 12 bpc, it would still be only ~18 Gbps...

BoyBawang - Wednesday, May 31, 2017 - link
There is one technique they call Foveated Rendering. The resolution is increased only in the small area where your eyeballs are focused; the rest is rendered at low resolution. Big bandwidth & GPU saver. Needs an eyeball tracker to work.

boeush - Wednesday, May 31, 2017 - link
A very rapid eyeball tracker, at that... (saccades are super-fast, the eyes are constantly jittering on a small angular scale at a very rapid rate that we aren't consciously aware of, and then there's the whole issue of head motion on top of it all...) Insofar as foveated rendering goes, my impression is that the point was mostly to reduce load on the GPU -- though I can certainly see how it might be leveraged for video signal compression (still, this kind of non-uniform compression scheme would be pretty novel, as far as I know). And I'd still question the ultimate efficiency of any such compression vs. the impact on visual quality (and attendant nausea-inducing lag or compression artifacts...)

Luminair - Wednesday, May 31, 2017 - link
They can compress it however they want, including with AV1: http://aomedia.org/about-us/

Video compression/decompression is very resource intensive, which is why Intel and Nvidia like to talk about how well they can do it in every new product they try to sell you. Which codec they pick is a matter of cost and market size.
I assume they'll pick H.264 for the demo because common off-the-shelf parts can decode it. But if this ships in a year, it could use anything.
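To put the codec question in perspective, here is a rough sketch using the thread's own numbers (WiGig's 7 Gbps ideal-case ceiling and the ~5.6 Gbps raw per-headset stream) of the minimum compression ratio needed as headsets are added to the same room; it deliberately ignores real-world airtime sharing and protocol overhead, which would make things worse:

```python
# Rough sketch: minimum compression ratio needed so that N simultaneous
# headsets fit into WiGig's shared capacity. A ratio below 1 means the
# raw streams already fit without compression. Numbers come from the
# thread, not from any official spec.
RAW_GBPS = 1080 * 1200 * 2 * 24 * 90 / 1e9  # ~5.6 Gbps per headset, uncompressed
WIGIG_GBPS = 7.0                            # ideal-case tri-band peak

def required_ratio(users):
    """Compression ratio needed for `users` raw streams to fit in WIGIG_GBPS."""
    return users * RAW_GBPS / WIGIG_GBPS

for n in (1, 2, 4):
    print(f"{n} headset(s): at least {required_ratio(n):.1f}:1 compression")
```

Even four headsets only demand roughly 3.2:1, which is modest by modern codec standards on natural video - so the real constraint is arguably the decode power and latency budget on the headset, as discussed above, rather than the achievable ratio.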
Pat78 - Thursday, June 1, 2017 - link
XR is Extended Reality. It is an umbrella term encapsulating Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), and everything in between.

https://www.qualcomm.com/invention/cognitive-techn...