jgraham11 - Wednesday, May 15, 2019 - link
Who gives a crap about this? Where is the article on the new Intel bugs: Fallout, RIDL and ZombieLoad? Apparently a new class of speculative execution bugs that only affect Intel CPUs. Intel is even saying: if you care about security, disable Hyper-Threading! Holy crap, that is huge!

JoeyJoJo123 - Wednesday, May 15, 2019 - link
Intel on suicide watch? It just keeps getting worse. Just learned about it after you posted that and googled a bit. And to think we might have actually had hardware-baked fixes for Meltdown/Spectre soon, only to see that yeah, we'd be waiting even longer for Fallout/RIDL/ZombieLoad fixes. Kind of seals the deal that my next CPU will be a Zen 2/Ryzen 3000 chip.

Also, back to the article topic, the Mali-D77 chip is nice. I do think an all-in-one chip that can act as a way to both correct visual aberrations and possibly power VR headsets without a GPU could be cool. Still think that the 3 biggest problems for VR aren't solved by this.
1) Video/audio/power cables feeding content/power to the HMD.
2) Lack of compelling content created for VR-type HMDs.
3) Relatively poor price/performance offered by current-gen HMDs. They're either too expensive for an adequately convincing experience, or they're affordable (e.g., Google Cardboard) but provide a terrible experience overall.
Kamus - Wednesday, May 15, 2019 - link
Thanks for your useless input.

sa666666 - Wednesday, May 15, 2019 - link
Ooooh, you've just triggered HStewart. He'll be showing up to defend Intel soon.

mode_13h - Wednesday, May 22, 2019 - link
Lol.

Ian Cutress - Wednesday, May 15, 2019 - link
As stated in previous comments, we're waiting for answers to our questions before we publish.

ballsystemlord - Thursday, May 16, 2019 - link
Answer from Intel: "Please go easy on us, we're losing to AMD and TSMC." :)

Kamus - Wednesday, May 15, 2019 - link
This is the future of VR... we're heading to an ASIC world. But with that said, I think this SoC is missing a key feature: ASW/PTW (Asynchronous Spacewarp / Positional Timewarp). ATW isn't even all that taxing on current mobile processors. But we need hardware ASW if we're ever going to overcome display interface bandwidth limitations.
Right now, if we want to extrapolate fake frames, the biggest limitation to pushing higher and higher refresh rates is the display interface. But if we had hardware ASW, we could theoretically get a kHz display, and beyond.
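As a back-of-the-envelope sketch of that bandwidth wall (the resolution, bit depth, and blanking overhead below are assumptions picked for illustration; the ~25.92 Gbit/s figure is DP 1.4's HBR3 x4 payload after 8b/10b encoding):

DP14_PAYLOAD_GBPS = 25.92  # usable payload of DisplayPort 1.4

def link_gbps(width, height, hz, eyes=2, bpp=24, blanking=1.2):
    """Raw, uncompressed video bandwidth in Gbit/s."""
    return width * height * eyes * bpp * hz * blanking / 1e9

for hz in (90, 120, 240, 1000):
    need = link_gbps(2160, 2160, hz)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds"
    print(f"{hz:4d} Hz -> {need:6.1f} Gbit/s ({verdict} DP 1.4)")

Even 120 Hz already exceeds the link at that resolution, and 1 kHz needs over 250 Gbit/s, which is exactly why the extrapolation has to happen on the far side of the display interface.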
This is going to be an important point moving forward, because as we move to AR, motion-to-photon latency will become even more important than it is in VR: the real world is moving along with the virtual objects you're seeing in real time, so any lag is immediately visible.
So to combat this latency, we're going to need super-high refresh rates that we won't have enough power, or enough bandwidth, to render. This is where an ASIC that can do PTW/ASW would come in and save the day, and give us extrapolated frames beyond 1 kHz.
Extrapolated frames would also give us the advantage of not having to strobe the backlight, or do black frame insertion (BFI) on an OLED.
This technology already exists on the PC, but it's bandwidth-constrained by DisplayPort, so the only way to do this right is to do it directly in the SoC, bypassing such bandwidth limitations.
Either way, it's good to see that there are people working on this. And I wouldn't be surprised at all if Oculus designed their own SoC for their next VR/AR headsets in a few years. They kind of have to go the Apple route and design their own SoC, because there are a lot of features they need that no one is working on.
webdoctors - Wednesday, May 15, 2019 - link
The problem is all these VR chips ruin people's opinion of VR.

You need a GTX 1070 for a reasonable VR experience, and these vendors are claiming to be able to do it with meager cellphone SoCs? Obviously that's not true.
Unless it's pre-rendered video footage or just vector graphics, it's set up for failure.
Putting some of the brains in the display to reduce GPU power/compute overhead is great, though.
jordanclock - Thursday, May 16, 2019 - link
The D77 isn't an SoC.

I'm not sure what you're asking for out of something like the D77 that it isn't already addressing. It is clearly allowing for headset movement that will update the display without a new frame from the GPU.
mode_13h - Wednesday, May 22, 2019 - link
First, ASW isn't as important as ATW. The point of ATW is to keep users from getting sick. ASW is mainly to hide the lower rendering framerate of the GPU, which certainly improves the experience but isn't as crucial. That said, it would clearly benefit mobile/standalone use-cases and therefore will probably grace a future generation of this DPU.

As for reaching kHz update rates, once you get above a certain point, I doubt you need fancy interpolation for every one of those frames. Maybe use ASW to double the framerate, then simple ATW to fill in any further frames.
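To illustrate why that rotation-only ATW step is so cheap (a minimal sketch, not Arm's or Oculus's actual implementation; the intrinsics and rotation below are made-up numbers): a pure head rotation between render time and scanout maps the entire eye buffer through a single 3x3 homography, H = K * R_delta * K^-1, which a display processor can apply on the fly.

import numpy as np

def atw_homography(K, R_render, R_display):
    """Homography warping render-pose pixels to the latest head pose."""
    R_delta = R_display @ R_render.T   # rotation accumulated since render
    return K @ R_delta @ np.linalg.inv(K)

# Example: 2160x2160 eye buffer, 90-degree FOV, 1 degree of yaw
# accumulated between render and scanout.
f = 2160 / 2  # pinhole focal length in pixels for a 90-degree FOV
K = np.array([[f, 0.0, 1080.0],
              [0.0, f, 1080.0],
              [0.0, 0.0, 1.0]])
yaw = np.radians(1.0)
R_display = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                      [0.0, 1.0, 0.0],
                      [-np.sin(yaw), 0.0, np.cos(yaw)]])
print(atw_homography(K, np.eye(3), R_display))

ASW, by contrast, needs per-frame motion estimation and disocclusion handling, which is why it's the much harder block to put in silicon.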
Finally, Oculus is not Apple or Google. If they go down the road of their own SoC, it will certainly involve using off-the-shelf IP, such as this DPU. Even Microsoft just used Tensilica cores in the HPU of their HoloLens. However, I think current sales volumes and margins are probably too low to justify a custom SoC.
skavi - Wednesday, May 15, 2019 - link
Nice, I've always thought timewarp should be done on the display processor.

ajp_anton - Wednesday, May 15, 2019 - link
A little cheap to use the same image for the lens's chromatic aberration and the computed inverse, considering one would assume they know very well how to produce the inverse...

ballsystemlord - Thursday, May 16, 2019 - link
Spelling and grammar corrections:

"Two years ago, we saw the release of Arm's new Mali-D71 display processor which represented a branch new architecture and foundation for the company upcoming DP IP blocks."
"brand" not "branch":
"Two years ago, we saw the release of Arm's new Mali-D71 display processor which represented a brand new architecture and foundation for the company upcoming DP IP blocks."
"...the display always simply display the last GPU render frame."
Excess words, missing "s". Maybe:
"..the display always displays the last GPU render frame."
"eliminating the resulting experienced image artefacts when viewed through the lens."
i before e... :)
"eliminating the resulting experienced image artifacts when viewed through the lens."
jordanclock - Thursday, May 16, 2019 - link
Artefact is a legitimate spelling of artifact.

mode_13h - Wednesday, May 22, 2019 - link
Correct. I actually prefer that spelling, when I'm referring to a side-effect.

mode_13h - Wednesday, May 22, 2019 - link
> ... the display always simply display the last GPU render frame.

I think it should actually be:
"... the display always simply displays the last GPU-rendered frame."
mode_13h - Wednesday, May 22, 2019 - link
> What this also opens up is a possible new generation of “dumber” HMDs in the future without a GPU, powered by some other external system, yet providing the same latency and optics advantages as described above in a cheaper integrated HMD SoC.

Actually, what I hope to see is a low-cost SoC with this DPU being used in wireless, PC-based HMDs. Not that you couldn't also put it in a bigger SoC so the HMD can also be used standalone, but I think the key to *really* good wireless PC HMDs is doing the ATW in the HMD, to help offset the latency introduced by the wireless transmission.
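To put rough numbers on why HMD-side ATW matters for wireless (every figure below is an assumption for illustration, not a measurement of any real headset):

# Illustrative pose-to-photon budget for a wireless PC HMD, in ms.
budget_ms = {
    "IMU sample + pose": 1.0,
    "PC render @ 90 Hz": 11.1,
    "encode + radio": 8.0,     # assumed codec + wireless link latency
    "decode": 2.0,
    "avg. scanout wait": 5.6,  # half of a 90 Hz refresh interval
}
print(f"pose-to-photon without HMD-side ATW: ~{sum(budget_ms.values()):.1f} ms")

# With ATW applied in the HMD right before scanout, rotational error
# only spans IMU sample + warp + scanout, regardless of radio latency.
print(f"rotational latency with HMD-side ATW: ~{1.0 + 0.5 + 5.6:.1f} ms")

However long the radio path gets, the rotational component, the one that makes people sick, stays pinned to the last few milliseconds.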