22 Comments
name99 - Tuesday, May 24, 2022 - link
Is the point of something like this not exactly to display 500 fps, but more to display a new frame the moment it has been calculated? So that the time wasted waiting for the frame to get to the screen is as small as possible, even though the actual frame rate is something varying, but probably around 120 or so?
ZoZo - Tuesday, May 24, 2022 - link
I don't think that it is. With VRR technologies (Adaptive Sync, G-Sync, FreeSync, etc.) the screen should already be able to be driven to refresh immediately after a frame is available for display.
mode_13h - Tuesday, May 24, 2022 - link
> With VRR technologies ... the screen should already be able to be
> driven to refresh immediately after a frame is available for display.
I'm not sure about that, but I *am* sure the *minimum* amount of time between frames, on a 144 Hz monitor, is 1/144 s. So, a panel with a faster max refresh should be able to offer less judder, at the high end of the framerate range.
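To make that frame-interval floor concrete, here is a tiny arithmetic sketch in Python; the refresh rates are just example values:

```python
# Minimum time between frames is simply the reciprocal of the max refresh rate.
for refresh_hz in (144, 240, 500):
    print(f"{refresh_hz} Hz -> minimum {1000 / refresh_hz:.2f} ms between frames")
# 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms, 500 Hz -> 2.00 ms
```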
ingwe - Tuesday, May 24, 2022 - link
I am not the target audience and don't necessarily see the value of this, but that sounds pretty cool. I am curious how long it will be until the 1 kHz mark is broken. I expect refresh rates to stop there, but I think a manufacturer will go there just for the sake of marketing.
mode_13h - Tuesday, May 24, 2022 - link
Nvidia long ago demonstrated a 1700 Hz display. That suggests it's technically feasible.
https://techreport.com/news/29955/nvidia-reveals-a...
AnnonymousCoward - Tuesday, May 24, 2022 - link
While the motivation might be for marketing, the benefit is real: our eyes see MHz.
mode_13h - Wednesday, May 25, 2022 - link
> our eyes see MHz
No, they don't. I don't recall the details, but there are limits to the eye's switching speed from perceiving one color vs. another.
I think there are benefits to high-refresh displays, however. When your eye is trying to track a moving object across the screen, your eye is moving (more or less) continuously. If the object is jumping by too much between screen refreshes, you get judder or blur, because the eye can't center the object correctly.
That said, I don't claim to know how high it makes sense to go. 500 Hz *feels* excessive to me.
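To put rough numbers on the eye-tracking argument, here is a small back-of-the-envelope sketch; the resolution and crossing time are assumed figures, not measurements:

```python
# Assume a 1920-px-wide screen and an object that crosses it in 0.5 s
# while the eye tracks it continuously.
screen_width_px = 1920
crossing_time_s = 0.5
speed_px_per_s = screen_width_px / crossing_time_s  # 3840 px/s

for refresh_hz in (60, 144, 500):
    jump_px = speed_px_per_s / refresh_hz  # displacement between consecutive refreshes
    print(f"{refresh_hz:>3} Hz: the object jumps {jump_px:.1f} px per frame")
# 60 Hz: 64.0 px, 144 Hz: 26.7 px, 500 Hz: 7.7 px -- smaller jumps mean
# less smear against a smoothly moving eye.
```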
AnnonymousCoward - Saturday, May 28, 2022 - link
This source says people have seen 10 ns light pulses: https://www.quora.com/Vision-eyesight-What-is-the-...
10 ns -> 10 GHz! I don't know the brightness involved, so conservatively say "MHz".
Secondly, let's say your screen is 1 m across, and an object moves 100 m/s. This 500 Hz screen would display just 5 samples of the object. 5 kHz refresh gets you 50.
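Spelled out as code, that arithmetic looks like this (same assumed figures as above):

```python
screen_width_m = 1.0          # assumed screen width from the example
object_speed_m_per_s = 100.0  # assumed object speed from the example
time_on_screen_s = screen_width_m / object_speed_m_per_s  # 0.01 s

for refresh_hz in (500, 5000):
    samples = time_on_screen_s * refresh_hz
    print(f"{refresh_hz} Hz -> the object appears in {samples:.0f} frames")
# 500 Hz -> 5 frames, 5000 Hz -> 50 frames
```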
mode_13h - Sunday, May 29, 2022 - link
> This source says people have seen 10 ns light pulses
If the intensity is well beyond what a screen can reasonably and safely output, then it's irrelevant.
Whatever intensity they're talking about, I'm very skeptical it's safe to be looking at such a high-energy light source for extended periods of time.
I think there's a further dimension to this question, which is the brain's ability to process what your photoreceptors are detecting. Seeing a single 10 ns light pulse is very different from processing 10 GHz worth of information.
> Secondly, let's say your screen is 1 m across, and an object moves 100 m/s.
> This 500 Hz screen would display just 5 samples of the object. 5 kHz refresh gets you 50.
For very fast-moving objects, it's difficult for your eye to *start* tracking them before they disappear. Therefore, the whole "eye-tracking" argument I made doesn't really apply. In these cases, you're better off just burning a little computation on motion blur. That quite likely ends up being computationally cheaper than rendering a bunch of additional frames.
AnnonymousCoward - Monday, May 30, 2022 - link
Thanks for your reply... but you're not making any counterpoints to what I said. "MHz" was already margining the 10 ns example by a factor of 10,000! My point remains.
Eye tracking isn't necessary: you can see a short shooting star with your side vision.
niva - Tuesday, May 31, 2022 - link
The brain can interpolate successfully at very low frame rates, evidenced by decades of the movie and television industry functioning at 24 - 30 fps. That's Hz, not even kHz. Your notion that vision operates at the MHz and GHz level is very misleading. The eye's ability to "see" intense flashes at the nanosecond level doesn't indicate how overall vision functions in terms of fps. If the eye could detect a very small light intensity change at that interval you might have a point, but that's not what that experiment showed. In the same experiment, if you had alternating nanosecond flashes, to our vision they would look like continuous blinding brightness. As a matter of fact, if you had those same intense nanosecond flashes showing up only a dozen times each second, evenly spaced apart, the result would be the same.
Data for pro gamers and their ability to react at the refresh rates shown here (500 Hz vs 144 Hz) are really questionable. We're entering territory where the latency for what you're seeing on screen is no longer driven by the screen; instead, you now have guaranteed longer latency from networks and peripheral processing capabilities. Maybe there are use cases where this really is important, though, and perhaps one person per a few billion can make use of it.
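As a rough illustration of that latency point, here is a sketch of an end-to-end budget; every figure is an assumption chosen for comparison, not a measurement:

```python
# All values are illustrative assumptions, in milliseconds.
budget_ms = {
    "display refresh wait, 144 Hz (avg half a frame)": 0.5 * 1000 / 144,
    "display refresh wait, 500 Hz (avg half a frame)": 0.5 * 1000 / 500,
    "panel pixel response (assumed)": 1.0,
    "mouse polling at 1 kHz (avg half a poll)": 0.5,
    "game simulation + render pipeline (assumed)": 10.0,
    "network round trip (assumed)": 30.0,
}
for source, ms in budget_ms.items():
    print(f"{source:50s} {ms:6.2f} ms")
# Moving from 144 Hz to 500 Hz trims roughly 2.5 ms from a budget dominated
# by the render pipeline and the network.
```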
mode_13h - Tuesday, June 7, 2022 - link
Good points. Overall, I think we agree.
> The brain can interpolate successfully at very low frame rates,
> evidenced by decades of movie and television industry functioning at 24 - 30 fps.
There are confounding factors, such as how CRT phosphors very quickly lose intensity and how movie projectors would use a strobe light. Also, cameras (especially older ones) have some amount of in-built motion blur.
Conventional LCDs don't strobe their backlight, which leads to the "latched pixel effect". The consequence of this is a smearing or blurring of objects as your eyes track them across the screen.
However, visual acuity rapidly diminishes as movement speed increases, which should mean there's a point beyond which fast-moving objects no longer need to be as sharp. Beyond that point, you should be able to simply use simulated motion blur to give the user a perception similar to that of even higher framerates.
The idea is that a missile streaking across the screen for a fraction of a second should appear as a blurry streak, rather than appearing unblurred in just a few places.
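For illustration, here is a minimal sketch of that simulated-motion-blur idea, averaging several sub-frame positions of an object into the one frame that actually gets displayed; all names and values are made up for the example:

```python
import numpy as np

def draw_frame(pos_x, width=64):
    """Render a 1-pixel-tall 'image' with a 3-px-wide bright object at pos_x."""
    frame = np.zeros(width)
    x = int(round(pos_x)) % width
    frame[max(0, x - 1):x + 2] = 1.0
    return frame

def motion_blurred_frame(start_x, velocity_px_per_frame, subsamples=8, width=64):
    """Average several renders spread across one frame interval into a streak."""
    acc = np.zeros(width)
    for i in range(subsamples):
        t = i / subsamples                          # fraction of the frame interval
        acc += draw_frame(start_x + velocity_px_per_frame * t, width)
    return acc / subsamples                         # dimmer, but continuous

sharp = draw_frame(10)
blurred = motion_blurred_frame(10, velocity_px_per_frame=20)
print("sharp frame lights up  ", np.count_nonzero(sharp), "pixels")    # a 3-px blob
print("blurred frame lights up", np.count_nonzero(blurred), "pixels")  # a ~21-px streak
```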
simonpschmitt - Wednesday, May 25, 2022 - link
I really like your comment and it improved my day. One can appreciate something new and accept its existence even though it has no personal benefit for oneself. I personally value image quality over fps, but that does not have to apply to everyone. And this trend of higher framerates will surely trickle over to other display segments. We already see wide color gamut displays with higher than 60 fps.
linuxgeex - Wednesday, May 25, 2022 - link
Best is a NO_HZ panel. We're a ways off from having enough bandwidth for that, but if the GPU were integrated with the display, then the display raster could be the framebuffer, and the display could do updates in the 100 kHz range even for a 4K display. We have thousands of stream processors bottlenecked by an expensive and slow display cable, so this is a natural evolution which will arrive some day.
mode_13h - Wednesday, May 25, 2022 - link
> if the GPU was integrated with the display then the display raster could be framebuffer
I don't see how that's supposed to work. If you render directly to the on-screen framebuffer, you get a flickery, jumbled mess.
You really don't want to watch frames as they're drawn. You only want to see the fully rendered frame.
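For illustration, a minimal sketch of the usual answer to that problem, double buffering, where the display only ever scans out a completed frame; the class and function names here are illustrative, not any real driver API:

```python
import threading

class DoubleBuffer:
    """Illustrative only: render into a hidden back buffer, flip when complete."""
    def __init__(self, size):
        self.front = bytearray(size)   # what the display scans out
        self.back = bytearray(size)    # what the renderer draws into
        self.lock = threading.Lock()

    def render(self, draw_fn):
        draw_fn(self.back)             # partially drawn pixels stay hidden here
        with self.lock:
            self.front, self.back = self.back, self.front  # flip only once complete

    def scanout(self):
        with self.lock:
            return bytes(self.front)   # the viewer only ever sees finished frames

def draw_white(buf):
    buf[:] = b"\xff" * len(buf)        # stand-in for a slow, multi-step render

fb = DoubleBuffer(16)
fb.render(draw_white)
assert fb.scanout() == b"\xff" * 16
```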
> We have thousands of stream processors bottlenecked by an expensive and slow display cable
Have you looked at game benchmarks lately? 4K with high detail? In most cases, GPUs aren't close to being limited by the display refresh rate.
Makaveli - Tuesday, May 24, 2022 - link
24" 1080 500hz and TN monitor hmm ya hard pass. Gamers really don't care about image quality at all?meacupla - Tuesday, May 24, 2022 - link
Asus TN gaming monitors have historically had good image quality, for a TN panel.
I own a PG278Q, and I don't really have any issues with it.
7BAJA7 - Tuesday, May 24, 2022 - link
Had the same screen and it was good, but it's now outdated. Changed to a 32" Asus ROG VA screen and, oh boy, not only is there a difference in size, but the colors and blacks are so much better; no ghosting either.
Ryan Smith - Tuesday, May 24, 2022 - link
"Gamers really don't care about image quality at all?"Competitive gamers don't. This is the same crowd that turns down image quality settings for lower frame render times. It's all about trying to minimize the end-to-end latency.
PeachNCream - Wednesday, May 25, 2022 - link
This is for "professionals" that play "e-sports" (LOL) not for us mere plebs that work at jobs that actually add value to our civilization. You've got to be a sweaty basement dweller with delusions of winning a trophy that says you clicked a mouse button better in Fortnite than some other slob that is disinterested in the idea of education or a career.mode_13h - Wednesday, May 25, 2022 - link
I'm not saying most incels don't deserve your biting critiques, but they feel almost like too easy targets.
These (mostly) guys would be frittering away their time and money on some fruitless pastime or another. If it weren't games, maybe hot rods, sports fandom, or who knows what else. I'm not defending gamer culture, but gaming seems to me like not the worst case scenario.
Alistair - Wednesday, May 25, 2022 - link
"Not to be confused with the standard TN panels"It isn't confused. It is a TN panel. A standard one.