
32 Comments


  • p1esk - Friday, September 13, 2019 - link

    Why are the resolution and refresh rate so low for a top-of-the-line tethered headset in 2019?
  • Death666Angel - Friday, September 13, 2019 - link

    It has a higher resolution than the competing Valve and Oculus HMDs. And only the Index has a higher refresh rate. Only things with even higher resolution are Pimax and the HP Reverb.
  • p1esk - Friday, September 13, 2019 - link

    Are there any technical reasons why this thing can't be OLED 4K×4K @ 120 Hz per eye? I'm not asking about video card limitations. Why is the headset itself so limited?
  • SeannyB - Saturday, September 14, 2019 - link

    My mostly wild guess is panel supply and price point, judging from the fact that the original HMDs went to LCD for their successor models, including the $1K Valve Index. The original Vive/Rift used a custom made-for-VR OLED panel. Maybe the asking price for an incremental improvement was too much, and the market for VR proved to be a low-volume niche rather than the next hottest thing.
  • p1esk - Saturday, September 14, 2019 - link

    Seems like the VR market is in a tough spot: no one is buying because the technology sucks, but they can't improve the technology because no one is buying. I thought Facebook getting behind Oculus would break this vicious cycle. The VR industry needs an 'iPhone moment'. And it's increasingly likely it will be Apple again to do it.
  • brunis.dk - Sunday, September 15, 2019 - link

    It isn't the software in VR that sucks, so it won't really benefit from a pricier headset with the same software.
  • Oxford Guy - Sunday, September 15, 2019 - link

    OLED isn't better than LCD in every respect, as far as I know. Blue OLED subpixels can have fading issues, which is why white subpixels were added to televisions (to mask the problem more). Motion blur may be higher. Image retention may be a factor.

    OLED does have the advantage of much better contrast than LCD.
  • plonk420 - Sunday, September 15, 2019 - link

    Nah, OLED has the best motion/persistence (IIRC the term) of any current tech. It used to be the big thing Carmack would talk about first (I highly suggest his talks at dev conferences). Now it's about reducing latency to make it not suck.
  • Santoval - Monday, September 16, 2019 - link

    If microLED based panels really have all the strengths of OLED panels and almost none of their weaknesses (including much higher endurance and roughly equal lifetimes for the red, green and blue subpixels) then perhaps that's the technology that will prevail in the mid to long term. The question is if microLEDs can scale easily at 4K-class resolutions and reasonable panel sizes of 30 to 50 inches. I don't know if that has been mastered yet, I keep hearing about them being used for huge panels (where the individual microLEDs do not have to be very small).
  • Reflex - Monday, September 16, 2019 - link

    TVs don't work that way, at least not LG panels. They use white OLEDs entirely and filter them through a colored film to produce colors. This results in even brightness across all colors and avoids the color degradation issue of each color degrading at a different rate.

    As to the other parts, having had both an LCD and an OLED TV, motion blur is considerably worse on LCD, and OLED is quick enough that even mismatched framerates produce far less stutter than they do on LCDs.

    Image retention certainly can be an issue, although in nearly two years I haven't seen it. That said, I don't live in front of my TV, and I wouldn't recommend an OLED for, say, a bar TV constantly tuned to a sports channel with a logo in a corner.
  • Beaver M. - Tuesday, September 17, 2019 - link

    Problem is that most OLEDs still have massive motion blur. I still remember CRT "motion blur". It was a night and day difference compared to LCDs. But most OLEDs act exactly the same way as LCDs in that regard, because they haven't been built to get rid of motion blur. It's a feature even OLEDs have to have (which I can't recall the term for right now), and most simply don't offer it.

    Why wouldn't you recommend an OLED as a bar TV? People there wouldn't care much about the picture quality anyway. Nor would you really notice it from as far away as you normally sit from bar TVs. I would very much recommend it as a bar TV, because it saves power.

    I wouldn't recommend it to people who use it to play console games.
  • Beaver M. - Tuesday, September 17, 2019 - link

    But this wouldn't be the case if there were video cards supporting this properly, because then panels like this would be cheaper, since more people would want one.
  • Skeptical123 - Sunday, September 15, 2019 - link

    There are manufacturing limitations, and basic common sense says no computer on the planet could render games at anything close to two 4K feeds at 120 Hz in 2019, or until at least ~2022. The big point I have not seen anyone mention is that these basic stats mean very little. As usual, there are other factors at play that matter a lot more. For VR headsets the screen Hz is effectively a myth, as the early Oculus team discovered and shared with the industry around 2013, if memory serves me. Yes, of course you want more than 60 fps, but past ~75 fps it does not matter much; the main factor is the pixel response time, in the sense of black to black. They pinned that down as the main cause of motion sickness. So the short version is that the screen resolution is "set" around 2K by panel production limits and the computer resources required to drive it. And a stable frame rate above 60 only matters if the other specs of the panel meet the special needs of VR. "4K OLED @ 120" is used more by marketers than engineers for a reason...
  • nandnandnand - Monday, September 16, 2019 - link

    No.

    https://www.blurbusters.com/blur-busters-law-amazi...

    We should be looking to hit at least 240 Hz, if not 1000 Hz.

    Resolution should top out at 16K (or ~132 megapixels) over a wide field of view (200-220° horizontal, 150-180° vertical).

    Foveated rendering can massively reduce the GPU requirements, possibly by over 95%. The VR headset is obviously a good place for eye tracking, but maybe gaming monitors could also use it. Add in depth sensing to determine how far away the eyes are from the display.
  • Santoval - Monday, September 16, 2019 - link

    "Resolution should top out at 16K (or ~132 megapixels) over a wide field of view (200-220° horizontal, 150-180° vertical)."
    I assume, since you are quoting FOV degrees, that you are referring to VR panels. A 16K resolution for a tiny 3 - 4" panel is almost certainly never going to happen*. The pixel density is just immensely high, and adding a very high refresh rate on top of that is a further technical hurdle.

    *At least not with LCDs, OLEDs and every other display technology that's currently widely available. MicroLEDs could apparently do it. The tiny microLED display described in the link has 14,000 PPI, or ~17 times the pixel density of the current record holder smartphone, Sony Xperia XZ Premium, which has 807 PPI.

    I did a quick calculation with an online tool and found that with 14,000 PPI you can reach a 16K resolution at just 1.25", so there is plenty of room to spare. That microdisplay is amazingly even denser than the original target I thought was beyond reach:
    https://venturebeat.com/2019/05/30/mojo-vision-rev...

    Driving such a VR headset with a powerful enough PC and graphics card would be another matter, though.
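    The 1.25" figure above can be reproduced directly from the pixel pitch. A short sketch (assuming "16K" means a 15360×8640 grid, which is not stated in the comment, and using the 807 PPI Xperia XZ Premium as the comparison point):

```python
import math

# Diagonal panel size (inches) needed to fit a given pixel grid
# at a given pixel density (PPI).
def diagonal_inches(width_px: int, height_px: int, ppi: float) -> float:
    # Diagonal in pixels divided by pixels-per-inch gives inches.
    return math.hypot(width_px, height_px) / ppi

# 16K-class grid at the 14,000 PPI microLED density quoted above.
print(round(diagonal_inches(15360, 8640, 14000), 2))  # ~1.26 inches

# Sony Xperia XZ Premium (4K phone panel, 807 PPI) for comparison.
print(round(diagonal_inches(3840, 2160, 807), 2))     # ~5.46 inches
```

    So a 14,000 PPI panel would indeed fit a 16K-class grid into well under the 3–4" panel size discussed above.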
  • nandnandnand - Monday, September 16, 2019 - link

    The FOV I'm describing would require 2 panels, or maybe 1 large flexible one.

    StarVR One has 210°h/130°v FOVs and 2x 4.77-inch panels with a 22.5:9 total aspect ratio.

    For my endgame headset, I'm looking for 220°h/(150-180)°v FOVs. PPI should end up somewhere between 2000 and 4000.

    Here's 2,228 PPI AMOLED: https://www.roadtovr.com/int-announces-2228ppi-hig...

    What I'm describing is very achievable. As for driving it, add eye tracking and use foveated rendering. As an example, if you render 1% of the display at 16K at any given moment, 1% at 8K, 1% at 4K, 1% at 1080p, and 96% at 720p, that's just 2,647,296 pixels out of 132.7 megapixels, the equivalent of 2% of 16K or just 127% of 1080p. You can play around with the %s but you get the idea.
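    The foveated-rendering budget in the comment above checks out; a minimal sketch of the arithmetic (the 15360×8640 "16K" grid and the per-tier percentages are the commenter's assumptions, not a real renderer):

```python
# Foveated rendering pixel budget: a few small regions of the display are
# rendered at full density, the periphery at progressively lower density.
TOTAL = 15360 * 8640  # "16K" grid, ~132.7 megapixels

# (fraction of display area, linear downscale factor vs. native 16K)
tiers = [
    (0.01, 1),   # 1% of the display at native 16K density
    (0.01, 2),   # 1% at 8K equivalent   (1/4 the pixels)
    (0.01, 4),   # 1% at 4K equivalent   (1/16)
    (0.01, 8),   # 1% at 1080p equivalent (1/64)
    (0.96, 12),  # 96% at 720p equivalent (1/144)
]

rendered = sum(area * TOTAL / scale**2 for area, scale in tiers)
print(int(rendered))             # 2647296 pixels rendered per frame
print(rendered / TOTAL)          # ~0.02, i.e. ~2% of native 16K
print(rendered / (1920 * 1080))  # ~1.28, i.e. ~127% of 1080p
```

    The rendered total matches the 2,647,296 pixels quoted above: a >97% reduction versus brute-forcing the full panel.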
  • p1esk - Monday, September 16, 2019 - link

    I've seen benchmarks showing 120Hz+ for 4K gaming (I think it was Shadow of the Tomb Raider) on two 2080 Tis in SLI. So yes, you can drive two 4K@120Hz feeds in 2019 if you have four cards.
  • Beaver M. - Tuesday, September 17, 2019 - link

    But the video card limitation is the main reason.
    There simply aren't video cards that can support something like this properly. VR games already have to have inferior graphics compared to normal games just to work fine.

    Blame Nvidia and AMD for not getting their hooves out of their mouths.
    Instead they release cards that are only slightly faster than the last generation, offer the same or less VRAM, and cost MUCH MUCH more than the last generation.

    So it might still take quite a while until we see displays in VR headsets that don't show screen door effects.
  • nandnandnand - Friday, September 13, 2019 - link

    The field of view is also garbage.
  • glnpwl - Monday, September 16, 2019 - link

    It's not; it's 90 Hz, which is the standard for VR headsets. The only headset to fall below this is the Rift S, and the only one so far that has surpassed it is the Valve Index at 120 Hz. Both Cosmos and Index offer something different in my opinion: with Index you get the wider FOV and higher framerate, but with Cosmos you get the option between inside-out tracking and external base stations, plus passthrough that is in color, higher resolution than the Vive Pro, and built-in headphones instead of off-ear speakers. Also, the Cosmos price point for two controllers and a headset is a few hundred cheaper than a Valve Index with Index controllers.
  • DigitalFreak - Friday, September 13, 2019 - link

    I'd wait until the reviews are in. The previews are mixed.
  • konbala - Friday, September 13, 2019 - link

    Wonder if the wireless adapter requires the Pro add-on too.
  • PeachNCream - Friday, September 13, 2019 - link

    That price! Ouch!
  • nandnandnand - Friday, September 13, 2019 - link

    Tethered. Not even once.
  • stephenbrooks - Friday, September 13, 2019 - link

    I've got an original Vive and this one doesn't seem to be a big enough jump to be worth upgrading to, although if there's a Mk.3 with the same rate of progress it will be.
  • Oxford Guy - Sunday, September 15, 2019 - link

    Foveated rendering?

    It seems to me that VR will never be much to speak of until it really has that down. Not only does it greatly reduce the processing demands, it makes the experience more real. I suppose the trouble is getting the eye tracking speed high enough while keeping the headset cost low enough.
  • nandnandnand - Sunday, September 15, 2019 - link

    I think it is achievable. I saw a paper that used FOVE, a headset with 70 Hz frame rate and 120 Hz eye tracking, and that's years old by now.

    You might even get away with slower/crummier tracking if you render a higher % of the displays at high res. Then incrementally improve it.
  • stephenbrooks - Sunday, September 15, 2019 - link

    My guess is that by the time eye tracking etc. is widespread in hardware, PC graphics cards will have advanced to the point they can brute-force render the whole scene anyway.

    What'd really be nice is a wireless video link technology. That would be useful for a lot of things besides VR, but the bandwidth is so big it sounds like we'll need to use lasers rather than wi-fi to do it.
  • nandnandnand - Sunday, September 15, 2019 - link

    Well, your guess is wrong. The target is 16K resolution, 240-1000 FPS. And eye tracking is mostly solved. See FOVE or Vive Pro Eye.

    Standalone headsets are preferred, so it would have to render 16K on a SoC. Foveated rendering could make that possible by decreasing pixels rendered per frame by over 90%.

    If you want your beefy computer to render instead and transmit wirelessly using WiGig 2.0/3.0, guess what? Foveated rendering would dramatically decrease the necessary data rate.

    So it all comes back to foveated rendering. Amazing you didn't make the connection.
  • stephenbrooks - Monday, September 16, 2019 - link

    The Vive just took several years to move from 1.2K to 1.7K pixels, I think you'll be waiting a bit for 16K. Generally, I'd expect screen res, graphics card capability and display link bandwidth to move in sync (reason: the rest of the graphics market).

    So by some time in the 2030s when there are 16K screens, I'd expect to be able to brute force render them on my PC, every pixel.

    Standalone headsets will keep suffering from being either too heavy, too hot, poor battery life or low performance. See how bulky gaming laptops are? It's basically the same problem.
  • nandnandnand - Monday, September 16, 2019 - link

    Google and LG came up with a 1443 PPI display, and INT has gone up to 2228 PPI. That may already be enough for 16K using two larger panels for wide/tall FOV.

    I wouldn't look to how long Vive or Oculus took as an indicator of anything. Their products have basically been dev kits and early adopter junk until recently.

    Brute forcing 16K is completely unwanted if you have foveated rendering. Any extra GPU power can be used to increase scene complexity and other things instead.

    The wireless link may not improve much. There is 802.11ay, aka WiGig 2. That won't be around in products for a while, and you may get one improvement on 60 GHz Wi-Fi speeds by 2030.
  • glnpwl - Monday, September 16, 2019 - link

    I am excited about this headset, but I'm confused why the author of this article says it has Knuckles-style controllers (what the Valve Index uses). These are closer to Oculus Touch-style controllers.
