50 Comments

  • erwos - Tuesday, July 17, 2018 - link

    This is excellent news - the two-plug system absolutely sucks right now if your HDMI and USB ports aren't right next to each other. I'm also pleasantly surprised at how (apparently) future-proofed this particular system is - that's a lot of bandwidth and power they're providing for. IMHO, they'll need it when they go to 6-8 camera systems for inside-out tracking, and/or a larger number of lighthouses.
  • milkywayer - Tuesday, July 17, 2018 - link

    > "VR hasn’t proven to be as immensely popular with consumer"

    I consider myself an over-spender when it comes to computers and games, but I'm not comfortable dipping my toes into VR because previous-gen VR like the original Vive and Oculus isn't as attractive compared to the Vive Pro (and the next Vive that is coming soon), yet the price is too high for the Pro and the upcoming Oculus 2. I hope the latest versions land around the $500 price point; I'd love to play some VR on my 1080 Ti, which currently just powers my MOBA game.
  • ExarKun333 - Tuesday, July 17, 2018 - link

    You purchased a $1,000 GPU for MOBA games that generally only require a ~$200 GPU, but you want cheaper VR. Hmm. Something doesn't add up there...
  • The Chill Blueberry - Wednesday, July 18, 2018 - link

    Spends $4,000 on a gaming setup, too cheap to buy any games.
  • FullmetalTitan - Wednesday, July 18, 2018 - link

    I stayed out of VR this round with a 1080 Ti because there just isn't a lot of developer support for big titles. Sure, there are a few hundred small titles, puzzle games, gimmicky VR theaters/simulators, etc., but none of the big studios are designing with VR as a core element; they just make their flagship title kind of work in VR two years later.
  • Matthmaroo - Wednesday, July 18, 2018 - link

    The Vive Pro is not that great

    All of its features are unavailable or not yet sold

    I too find it odd that you spent $1,000 on a GPU for MOBAs but $349 for a Rift is too much

    VR is amazing and truly the frontier of gaming

    I have a Rift and love it
  • milkywayer - Thursday, July 19, 2018 - link

    I just don't want to spend that much when the next Rift might be 3 months away with much better specs. I tried Samsung VR and hated the screen-door effect. So when I do make the jump, it'll be on a 2nd-gen model with a better experience.
  • Simplex - Thursday, July 19, 2018 - link

    "the next rift might be 3 months away"

    You really believe that if the next Rift were 3 months away there would be zero leaks or info about it?
    Also: https://www.roadtovr.com/oculus-rift-cv1-supersede...
  • Targon - Monday, July 23, 2018 - link

    To be fair, if you look at the VR that is out there for the PC, there aren't a lot of really good titles. You can buy a great CPU, video card, RAM, and SSD, and those will power anything you throw at your computer, but VR with 1-3 good games and the rest being more of a demo of what VR can be like does NOT justify a $500 product.

    I remember when the original 3Dfx Voodoo first came out, and within one year there were already a bunch of good games that took advantage of it. Now, how many really good games are there for the PC with VR? I have not felt that there is a compelling reason to buy into what should be seen as two-year-old technology by now, and unless game developers step up and embrace VR, it won't change.
  • piroroadkill - Wednesday, August 29, 2018 - link

    Lighthouses don't have a connection to the PC. The Vive only requires USB2.
  • lightningz71 - Tuesday, July 17, 2018 - link

    With respect to the cable length requirement of USB 3.1 Gen 2, that can be overcome with higher-quality cables from the manufacturers combined with the alternate mode supporting a reduced data rate. Combining those two things can more than double the length over which you can run a stable data connection. Those cables are NOT going to be cheap, though. While, in my experience, USB-C connectors seem more durable than Micro-USB, they aren't especially rugged in general. Given that VR headsets are designed to be moved around, I would hope that some effort is put into a "rugged" USB-C compatible connector standard.

    As for integrating it into video cards, does anyone but me see that as problematic in the long run? VR headsets would more naturally connect to the front of a PC; integration onto the back connector bracket puts it in the single most inconvenient place it can be. Perhaps the standard could also include some sort of cable that connects to a header on the card PCB and runs to the front panel of the computer? I'm sure a Dell or an HP could work towards that, but something that works in the DIY space would also be useful. While having something like that on the board of an SFF case would be nice, how long will it be before we have iGPUs with enough capability to properly drive VR headsets like these?
  • TechieTommy - Tuesday, July 17, 2018 - link

    The cable length limitation actually cannot be overcome just by improving cable quality, because unlike previous USB specs, it is a hard limit for passive cables. The solution the industry is moving towards is active cables, which include retimers and redrivers in the cable itself.
  • DanNeely - Wednesday, July 18, 2018 - link

    That's not a free lunch, though; it's why Thunderbolt cables are stupidly expensive.
  • repoman27 - Wednesday, July 18, 2018 - link

    Neither the USB 3.2 nor the USB Type-C 1.3 specification puts any hard limit on passive cable length.

    To quote the spec: "The cable lengths listed in the table are informative and represents [sic] the practical length based on cable performance requirements."

    As long as you meet the performance requirements, you're fine. The two major parameters are IR drop for Vbus and ground, and differential insertion loss for the SuperSpeed signaling pairs. For Gen 2 (10 Gbps) cables, those requirements are ≥ −4 dB at 2.5 GHz, ≥ −6 dB at 5 GHz, and ≥ −11 dB at 10 GHz. For Gen 1 only (5 Gbps), they're only ≥ −7.0 dB at 2.5 GHz and > −12 dB at 5 GHz. The extension of the requirements out to a Nyquist frequency of 10 GHz is merely for a possible future 20 Gbps USB data rate, which could be ignored in this instance. Drawing cable stock with thicker micro-coax or even micro-twinax could easily extend the length of passive cables, as long as you're willing to roll with the decreased flexibility / increased outside diameter and mass.
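
    To put rough numbers on that, a minimal sketch: the Gen 2 budget figures are the ones quoted above, but the per-meter loss and connector overhead values are assumptions for a generic micro-coax cable stock, not values from the spec.

```python
# Rough estimate of maximum passive cable length from an insertion-loss budget.
# The per-meter loss and connector overhead values are illustrative assumptions
# for a generic micro-coax cable stock; real figures come from a vendor datasheet.

GEN2_BUDGET_DB = {2.5e9: 4.0, 5.0e9: 6.0, 10.0e9: 11.0}        # cable-assembly loss budget (dB)
ASSUMED_LOSS_DB_PER_M = {2.5e9: 2.0, 5.0e9: 3.0, 10.0e9: 5.0}  # assumed raw cable loss (dB/m)
CONNECTOR_OVERHEAD_DB = 1.5                                    # assumed loss of two mated plugs

def max_length_m(budget_db, loss_db_per_m, overhead_db):
    """Longest cable that still meets the loss budget at every test frequency."""
    return min((budget_db[f] - overhead_db) / loss_db_per_m[f] for f in budget_db)

print(f"~{max_length_m(GEN2_BUDGET_DB, ASSUMED_LOSS_DB_PER_M, CONNECTOR_OVERHEAD_DB):.1f} m")
# Thicker, lower-loss cable stock stretches the same budget further,
# which is the micro-coax / micro-twinax point above.
```

    With these assumed numbers the budget runs out at roughly a meter, in line with the "practical length" the spec table quotes, and the only lever left for a longer passive cable is lower-loss (thicker) cable stock.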
  • edzieba - Wednesday, July 18, 2018 - link

    Both the Rift and Vive are ALREADY using redrivers for both video and data. The Vive has the discrete 'link box', and the Rift's cable assembly has a couple of Spectra7 chips (VR 7050 & 7100 IIRC) at one end.

    By using a subset of Type C DisplayPort Alternate Mode, they can take advantage of existing extended length options, such as fibre transport (though power injection would be necessary).
  • Bp_968 - Monday, August 27, 2018 - link

    In reply to your first paragraph, I agree. I think they will need to either make the cable "breakaway" at the center point so that you just yank the cable loose, or there will be a market for high-quality cable extensions designed to extend it a foot or two away from the PC, so that the cable pops loose at that connection point instead of ripping the port out of the back of a $1,200 video card.

    iGPUs capable of running second-gen VR (2-4K @ 120 Hz or better) are at least a decade away. Probably longer.

    I don't disagree about plugging it into the back of the PC being annoying, but I don't see them putting it on a header. That's just too advanced for your average PC owner, and it's unshielded. I think a breakaway cable or a cable extension are both much more likely ways to handle that particular issue.
  • rtho782 - Tuesday, July 17, 2018 - link

    My understanding was that the length issue is also why HDMI was used on gen-1 headsets - DisplayPort is only rated to 3 metres.
  • masimilianzo - Tuesday, July 17, 2018 - link

    Am I wrong in thinking that Thunderbolt 3 (especially the newest controllers offering DP1.4) would have been equally good for this, if not better?
    And Intel is opening the standard and integrating the controller into its CPUs...
  • jaggedcow - Tuesday, July 17, 2018 - link

    This will likely be a USB-C alternate mode, like Thunderbolt 3 (which also uses the USB-C connector), but without necessitating a Thunderbolt 3 chip, so it can also be implemented on AMD systems or added to GPUs without much extra cost.
    Yes, it would probably be better (although the latency might not be?), but it's also more than what they need in the foreseeable future. Given that VR is already super expensive, it's probably better to go for the cheaper, easier option.
  • masimilianzo - Tuesday, July 17, 2018 - link

    When Intel opens the standard, everyone will be able to have TB3, even AMD. No royalties, so the cost of the controller will go down a lot. TB3 could potentially be the new USB.
  • arashi - Tuesday, July 17, 2018 - link

    "When".
  • mr_tawan - Wednesday, July 18, 2018 - link

    good question...
  • boeush - Tuesday, July 17, 2018 - link

    Still waiting for the optical version originally touted back when Thunderbolt was "Light Peak". An optical data cable would be lighter and more flexible (no need for all that shielding), have no practical length limits, and would allow for virtually unlimited bandwidth/framerates...
  • repoman27 - Tuesday, July 17, 2018 - link

    You’ve always had the option of using optical cables with Thunderbolt, they just moved the optical transceiver from the device to the cable. For short hauls of less than 3m, which are by far the most common scenarios, optical makes no sense. The transceivers take up space, cost more, and use more power, so why include them by default? Plus you can’t do power delivery over glass, so the cable still needs copper.

    Apple sells a passive copper cable that can support 40 Gbit/s Thunderbolt 3 (20.625 Gbit/s signaling) and 5 A power delivery. Optical does not magically get you infinite bandwidth. The transceiver is still the limiting factor, and for simple NRZ they top out around 30 Gbit/s these days. Thunderbolt 4 will likely be ~40 Gbit/s PAM4 signaling over copper, because that’s the most economical and proven path forward.
  • boeush - Wednesday, July 18, 2018 - link

    Well, no duh you need copper for power delivery; however, two thin wires are sufficient for that (given the magnitude of power involved), and again there's no need for heavy shielding, which makes the cable thinner, lighter, and more flexible.

    As far as transceivers: Intel and IBM (among others) have been regularly hyping their various silicon photonics research "breakthroughs" for the last 20 years at least. You'd think by now they should be able to build integrated optical transceivers that add about as much bulk/cost as any other modem (e.g. WiFi, LTE, etc.) on a modern SoC...
  • repoman27 - Wednesday, July 18, 2018 - link

    5 A @ 20 V requires... a sufficient amount of copper (rough numbers in the sketch below). And an optical cable may only require four hair-thin strands of glass to get the signal from point A to point B, but protecting them from the average consumer wearing a VR headset requires quite a bit of cladding. Yes, optical cables do tend to be thinner and lighter, but not enough to make up for the additional cost. Unless you need to go more than a few meters, in which case they're generally the best solution.

    Silicon photonics will hopefully become commonplace sometime, but it sure hasn't happened yet.
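
    To give a sense of scale for the copper point above, a rough IR-drop sketch: copper resistivity is a physical constant, but the wire gauges and the 5 m length are assumptions, and the Type-C spec's actual IR-drop allowance is on the order of a few hundred millivolts.

```python
# Back-of-the-envelope IR (resistive) drop across the Vbus/GND pair of a USB-C cable.
# Copper resistivity is a physical constant; the gauges and 5 m length are assumptions.

RHO_COPPER = 1.72e-8  # ohm*m

def ir_drop_v(current_a, length_m, area_mm2):
    """Round-trip voltage drop: out on Vbus, back on GND, equal gauge both ways."""
    r_one_way = RHO_COPPER * length_m / (area_mm2 * 1e-6)
    return current_a * 2 * r_one_way

# 28 AWG (~0.081 mm^2) vs 20 AWG (~0.52 mm^2), over an assumed 5 m cable
for name, area in (("28 AWG", 0.081), ("20 AWG", 0.52)):
    print(f"{name}: 3 A -> {ir_drop_v(3, 5, area):.2f} V drop, "
          f"5 A -> {ir_drop_v(5, 5, area):.2f} V drop")
```

    Even at 3 A, thin conductors over a VR-length cable blow well past any sane voltage-drop budget, which is why the higher-power profiles need noticeably beefier Vbus wiring.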
  • quorm - Tuesday, July 17, 2018 - link

    ...and cost about an order of magnitude more.
  • Sivar - Tuesday, July 17, 2018 - link

    A VR headset would then need both optical and copper, as power cannot be efficiently transmitted over glass.
  • Valantar - Wednesday, July 18, 2018 - link

    Considering that 2m TB3 cables are ~$50 (and thick and inflexible like you wouldn't believe), you'd need to add $100-150 to the price of whatever headset you're considering for a 4m+ TB3 cable. Not a good idea.
  • DanNeely - Wednesday, July 18, 2018 - link

    The cost of TB3 cables is mostly the electronics in the plugs, not the wiring in between; e.g. this example on Amazon:

    0.5/1 m: $32, 2 m: $38, 3 m: $50.
    https://www.amazon.com/dp/B00822GIA4/ref=twister_B...

    I couldn't find anything in copper above 3 m, though, so that might be a hard limit (or require a signal booster mid-cable for double the price). Optical cables can get a lot longer, but the prices I see are getting into the cost of a headset itself; they're probably not a good choice for this use anyway, since glass fibers can break if bent too hard.
  • repoman27 - Wednesday, July 18, 2018 - link

    2m Thunderbolt 3 3A cable outside diameter: 4.6mm
    2m HDMI Premium High Speed cable outside diameter: 7.2mm

    I'm not sure what you're comparing your Thunderbolt cables to that causes you to arrive at the assessment that they are "thick and inflexible like you wouldn't believe". 5 A cables tend to be slightly thicker than 3 A cables because they require more copper for the Vbus connections, but this spec is only calling for 3 A @ 9 V.

    Also, the costs of the retimers/redrivers embedded in the cable connectors, assembly, testing, etc. all stay the same regardless of the length of the cable, and the cable stock itself isn't that exorbitant. So for a function-specific cable, they can make it the appropriate length and pack one in with the headset. It would probably only add about $30 to the retail price compared to the current harness they're using.
  • mr_tawan - Wednesday, July 18, 2018 - link

    I've seen HDMI cables smaller than 3 mm in diameter in Japan. They're a bit hard to find in my country (the smallest one here would be ~4 mm).
  • repoman27 - Tuesday, July 17, 2018 - link

    "By the standard, DisplayPort alt mode replaces all of the USB 3.1 data channels, leaving only the much slower USB 2.0 baseline channels available."

    DP Alternate Mode fully supports a 2-lane DisplayPort HBR3 main link alongside SuperSpeed USB 10 Gbps signaling. It's only when you go to a full 4-lane main link that you're limited to USB 2.0 for data (rough bandwidth numbers in the sketch below).

    As for cable length, these are function specific and presumably tethered cable assemblies, so they can just go with an active cable design and make it however long they like. Although even for a passive cable, 2.0 m shouldn't be an issue for 5 Gbit/s signaling and 3 A power delivery.
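
    To put rough numbers on the 2-lane vs. 4-lane trade-off above, a minimal sketch; the 5% blanking overhead is an assumed reduced-blanking figure, not an exact video timing.

```python
# DisplayPort HBR3 runs 8.1 Gbit/s per lane with 8b/10b encoding, so the usable
# payload is 6.48 Gbit/s per lane. Compare against an approximate 4K @ 120 Hz stream.

HBR3_PAYLOAD_PER_LANE = 8.1e9 * 8 / 10

def video_bandwidth(h, v, hz, bpp, blanking_overhead=1.05):
    """Approximate link bandwidth needed; 5% overhead is an assumed reduced-blanking figure."""
    return h * v * hz * bpp * blanking_overhead

need = video_bandwidth(3840, 2160, 120, 24)  # 4K, 120 Hz, 8 bpc RGB
print(f"2-lane HBR3 payload: {2 * HBR3_PAYLOAD_PER_LANE / 1e9:.1f} Gbit/s")
print(f"4-lane HBR3 payload: {4 * HBR3_PAYLOAD_PER_LANE / 1e9:.1f} Gbit/s")
print(f"4K @ 120 Hz, 8 bpc needs roughly {need / 1e9:.1f} Gbit/s")
```

    Which is why a headset pushing that kind of resolution would want all four lanes for video alone.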
  • unrulycow - Tuesday, July 17, 2018 - link

    I don't see the point in creating a new standard when this could be done with Thunderbolt 3, which already exists.
  • DanNeely - Tuesday, July 17, 2018 - link

    TB adds a lot to the hardware cost because of its higher signaling rates. This works with the cheaper USB 3 standard instead.
  • doggface - Tuesday, July 17, 2018 - link

    So will this require a specific controller in the laptop/PC?
    I already have a thunderbolt / USB 3.1 G2 capable port in my laptop, will that be enough?
  • Dr. Swag - Tuesday, July 17, 2018 - link

    Because of the use of DP could we potentially see headsets with adaptive sync in the future? That would be neat.
  • tuxfool - Tuesday, July 17, 2018 - link

    You absolutely don't want adaptive sync in headsets. You want games etc. to absolutely hit the display frequency, no ifs or buts.

    However, in the rare instances where that fails, you want temporal warping to create a new synthetic frame, not a frame with random persistence.

    Another thing that goes against adaptive sync is that headsets use low pixel persistence, strobing pixels so they're only lit for half of the frame time (it eliminates image ghosting and smearing). This technique presumes a known frame time.
  • boozed - Tuesday, July 17, 2018 - link

    As they say, the great thing about standards is that there are so many to choose from.

    Also see: xkcd 927
  • blkspade - Wednesday, July 18, 2018 - link

    The display throughput makes sense moving forward; the power is perhaps a little over-specced. I can't, however, imagine needing 10 Gbps of data in/out of the headset. You'd probably barely max out half that with audio and inside-out tracking.
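
    A rough tally for the sort of peripherals in question; the camera counts, resolutions, and frame rates below are illustrative assumptions, not the spec of any shipping headset.

```python
# Rough USB bandwidth tally for headset-side devices. Every device parameter here
# is an illustrative assumption rather than a figure from a real headset.

tracking = 4 * 1280 * 960 * 90 * 8   # four inside-out cameras, 8-bit mono, uncompressed
audio    = 2 * (48_000 * 24 * 2)     # stereo out + stereo mic in, 48 kHz / 24-bit
sensors  = 10_000_000                # IMU + controller radio traffic, generously padded

total = tracking + audio + sensors
print(f"tracking {tracking/1e9:.2f} Gbit/s, audio {audio/1e6:.1f} Mbit/s, "
      f"sensors {sensors/1e6:.0f} Mbit/s -> total ~{total/1e9:.2f} Gbit/s")
```

    Even with fairly generous uncompressed camera feeds, that lands well under half of a 10 Gbit/s link, consistent with the point above.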
  • PeterThorpe81 - Wednesday, July 18, 2018 - link

    I'm not really seeing much point in this over USB-C DisplayPort with Power Delivery; it seems relatively minor. DisplayPort already has a 720 Mbps AUX channel which could be used for tracking, sensors, audio, and other data. Are these devices really going to use more than that? Or am I missing something? It seems to me the mistake was going with HDMI.

    The next version of DisplayPort is meant to double the bandwidth; they could well use the extra pins completely differently, making more "standards".

    Also, the article says DisplayPort 1.4 will do 4K at 120 Hz with 8-bit colour. If they use Display Stream Compression, which is part of the standard, it can be pushed a little higher to support 10-bit with HDR or a slight framerate boost.
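
    A quick check of that framing against DP 1.4's 4-lane HBR3 limit; the 5% blanking overhead and the 2:1 DSC ratio are assumptions for illustration, not figures from the standard.

```python
# Does 4K @ 120 Hz at 10-bit colour fit into a 4-lane DisplayPort 1.4 (HBR3) link,
# with and without DSC? Blanking overhead and DSC ratio below are assumptions.

HBR3_4LANE_PAYLOAD = 4 * 8.1e9 * 8 / 10        # 25.92 Gbit/s after 8b/10b encoding

def link_need(h, v, hz, bpp, overhead=1.05):
    return h * v * hz * bpp * overhead

raw_8bit  = link_need(3840, 2160, 120, 24)     # 8 bpc RGB
raw_10bit = link_need(3840, 2160, 120, 30)     # 10 bpc RGB
dsc_10bit = raw_10bit / 2                      # assume a mild 2:1 DSC compression ratio

for label, bw in (("8-bit raw", raw_8bit), ("10-bit raw", raw_10bit),
                  ("10-bit with 2:1 DSC", dsc_10bit)):
    fits = "fits" if bw <= HBR3_4LANE_PAYLOAD else "does not fit"
    print(f"{label}: {bw/1e9:.1f} Gbit/s -> {fits} in {HBR3_4LANE_PAYLOAD/1e9:.1f} Gbit/s")
```

    Under these assumptions, 8-bit just squeezes in uncompressed while 10-bit only fits once DSC is in play, which is the framerate/bit-depth headroom described above.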
  • edzieba - Wednesday, July 18, 2018 - link

    As well as the extra USB 3.1 bandwidth (and avoiding eating into DP link bandwidth, which is what the AUX channel does), VirtualLink also formalises other things like link negotiation and cable standards. E.g.:

    "Though I’m very curious what the consortium is doing here (if they’re doing anything at all) to combat the fact that USB 3.1 Gen 2 data is normally only rated to run over 1 meter cables due to faster signal attenuation, which is a rather short cable length for a VR headset and the room scale experiences the vendors are pushing."

    The spec demands that both the DP channels and both USB 3.1 directions conform to the same minimum signal characteristics, e.g. dB budget (demonstrated in the spec with a 5 m cable, but it could be longer depending on component selection).

    DSC can be ruled out (too high latency for VR), but Multi-Stream Transport is on the cards for sending video data other than just "a big rectangle" or "a rectangle per eye", e.g. a variant on fixed foveated rendering, or multi-layer chequerboarding.
  • repoman27 - Wednesday, July 18, 2018 - link

    The DisplayPort AUX channel is out of band; it uses the SBU pins of the USB Type-C connector/cable in DisplayPort Alternate Mode.

    And I cannot fathom why DSC would introduce excessive latency, given how it is implemented.
  • PeterThorpe81 - Thursday, July 19, 2018 - link

    The cable stuff is good, but I still don't think it needed the 10 Gbit channel; it just seems like messing with DisplayPort for little reason. This could have been just a cable standard on top of USB-C PD DisplayPort.

    As @repoman27 said, the AUX channel is separate from the rest on USB-C, and I thought the whole point of DSC was that it added virtually no latency? It's all designed to be simple and achievable in a small amount of silicon. Among other things, the plan is to use DSC internally on mobile phones and similar hardware for lower-bandwidth transmission between the GPU and the display chip.
  • edzieba - Thursday, July 19, 2018 - link

    "Virtually no" latency is not zero, just like the "visually lossless" compression of DSC is not actually lossless. When you have a 20 ms motion-to-photons budget (and for current VR systems, after overhead, that gives you 7-11 ms for rendering depending on whether you're using SteamVR or OVR), and all your latency compensation is at the source end, then every millisecond you add as transport latency is a millisecond cut from your render budget. E.g. if you add 'just a mere 5 ms' of transmission latency, you have effectively halved your render budget (the only part of the pipeline whose length you can control) and must drop image quality to compensate.
  • repoman27 - Thursday, July 19, 2018 - link

    But you're not in the right ballpark. For DSC, my understanding is that we're talking about additional latency in the single-digit µs range; 5 ms is off by a factor of 2000. And while DSC is not fully reversible (i.e. not actually lossless), human subjects cannot visually distinguish between the original and compressed versions. DSC is only shooting for a very low (not more than 3:1) compression ratio and is nothing like JPEG or MPEG.
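
    For a rough sanity check of the order of magnitude: the assumption that the codec buffers on the order of one to a few display lines is mine, and exact figures depend on the encoder/decoder implementation.

```python
# DSC works on slices within a scan line, so the added latency is on the order of a
# small number of line times. The number of lines buffered below is an assumption.

refresh_hz = 120
total_lines = 2222                     # ~4K vertical total including blanking (approximate)
line_time_us = 1e6 / (refresh_hz * total_lines)

for lines_buffered in (1, 2, 4):
    print(f"{lines_buffered} line(s) buffered -> ~{lines_buffered * line_time_us:.1f} µs added")
```

    Microseconds either way, i.e. roughly three orders of magnitude below the hypothetical 5 ms transport figure being debated above.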
  • edzieba - Thursday, July 19, 2018 - link

    As an example of "why would you need 10 gigabit?", eye tracking suitable for foveated rendering is a LOT harder than most assume from playing with CoTS eyetrackers. Sticking a PSEye next to your face and doing the stare-to-select demo is worlds away from being able to identify an accurate 3D gaze target (taking into account the pupil is not a dot moving on a plane, but a 3D object physically translating and rotating in 3D space), forward-predict the future location of the current saccade at the frame scanout time, and report that well within time needed for that to be incorporated into the rendering of the current frame. Doing that 'in hardware' is not impossible, but it makes far more sense (and is much cheaper) to feed back the high-res high-framerate video and process it on the host PC.
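
    For a sense of how quickly raw camera feedback adds up, a sketch; all camera parameters here are illustrative assumptions, not the spec of any shipping eye tracker or headset.

```python
# Rough bandwidth for streaming uncompressed camera video from headset to host.
# Resolutions, frame rates, and bit depths are illustrative assumptions.

def raw_video_gbps(cameras, width, height, fps, bits_per_px):
    return cameras * width * height * fps * bits_per_px / 1e9

eye_tracking = raw_video_gbps(2, 640, 480, 500, 8)    # two high-speed eye cameras
inside_out   = raw_video_gbps(4, 1280, 960, 90, 8)    # four inside-out tracking cameras

print(f"eye tracking ~{eye_tracking:.1f} Gbit/s, inside-out ~{inside_out:.1f} Gbit/s, "
      f"combined ~{eye_tracking + inside_out:.1f} Gbit/s")
```

    Uncompressed feeds like these eat most of a 10 Gbit/s link on their own, which is the case for the extra bandwidth.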
  • mr_tawan - Wednesday, July 18, 2018 - link

    Hmm... I think the DP alt mode is quite adequate for current-gen headsets. I don't think the tracking devices and audio consume more bandwidth than what USB 2.0 can provide.

    My guess is they're aiming for cameras on the headset (i.e. for AR, perhaps), which require quite a lot of bandwidth, especially for raw video (I think compressed video is not suitable for AR applications, as it adds a lot of latency).
  • peevee - Wednesday, July 18, 2018 - link

    What is wrong with using existing standards, say, USB 3.2 or Thunderbolt 3?
  • Catchy Title - Monday, June 22, 2020 - link

    USB Type-C is the future for many devices: https://en.wikipedia.org/wiki/USB-C
