4 Comments
Wwhat - Wednesday, June 10, 2015 - link
Without a unified, agreed-upon system or an interface to test the many aspects that need testing, I don't see how this can be done successfully at this time, especially with software alone.

FM-Jarnis - Thursday, June 11, 2015 - link
It is not software only. Quoting the PR: "VRMark is our first major product announcement since we joined forces with UL. Combining the expertise of both companies, we will additionally offer professional lab-based VR testing with precision instruments and verification of performance claims to manufacturers and other customers."
ishould - Wednesday, June 10, 2015 - link
On the mobile side of things, I wonder how they make scoring fairly consistent across devices. I'm sure you can get more efficiency targeting a Kepler, Adreno, Mali, Pascal, etc., but any efficiencies found in the programs would raise the scores across all architectures. How do they do cross-platform benchmarks (ARM vs x86) now?

edzieba - Thursday, June 11, 2015 - link
This either needs pervasive and reliable latency testers built into HMDs (like the DK2's 'pulse the first transmitted pixel' in-display-controller latency timer), or a hardware latency tester included in the kit. IMU testing could be as simple as a small table with a 'thumper' solenoid to provide a nice sharp impulse with a known cycle time (i.e. you know how long the solenoid takes to move, so you know when the IMU should report an impulse).

Volume tracking system linearity and reliability is going to be a LOT harder to test, though. The 'gold standard' is to stick the HMD on a robotic arm, run it through a set of known movements, and compare those to the tracked movements. That's not something most testers have lying around, let alone end-users.
Even without testing the tracking system's performance, this is going to be closer to an FCAT setup than a software-only test.
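The solenoid-impulse idea in the comment above can be sketched roughly as follows. This is a hypothetical illustration only, assuming nothing about VRMark's actual tooling: the travel time, threshold value, and the `imu_report_latency` helper are all made-up for the example. The principle is just that if you know when the solenoid fired and how long it takes to strike, the gap until the IMU's first reported spike is the sensor-to-report latency.

```python
# Hypothetical sketch of a solenoid-impulse IMU latency test.
# All names and numbers here are illustrative assumptions, not real VRMark APIs.

SOLENOID_TRAVEL_S = 0.004   # assumed: seconds for the solenoid to strike after triggering
IMPULSE_THRESHOLD = 3.0     # assumed: accel magnitude (g) that counts as the strike

def imu_report_latency(trigger_time, samples):
    """Return seconds between the physical strike and the first IMU sample
    that reports it, or None if no impulse was detected.

    samples: iterable of (timestamp_seconds, accel_magnitude_g) from the HMD's IMU.
    """
    strike_time = trigger_time + SOLENOID_TRAVEL_S
    for t, accel in samples:
        if t >= strike_time and accel >= IMPULSE_THRESHOLD:
            return t - strike_time
    return None

# Example: solenoid triggered at t=1.000 s; the IMU stream first shows the
# spike at t=1.021 s, so the reported latency is 21 ms minus 4 ms of travel.
samples = [(1.000, 0.1), (1.010, 0.2), (1.021, 5.4), (1.030, 1.2)]
print(round(imu_report_latency(1.000, samples), 4))  # 0.017
```

Averaging over many solenoid cycles would smooth out sample-clock jitter; a single impulse only bounds the latency to within one IMU sample period.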