Sychonut - Sunday, January 5, 2020 - link
360Hz? Who cares dawg, who cares? Another unnecessary "need" created to drive up sales.
willis936 - Sunday, January 5, 2020 - link
It's important, and it's nice to see temporal resolution being pushed. Refresh rates already push past what LCDs can handle; 360 Hz is well into OLED territory. A 360 Hz consumer OLED: now that would be cause to sit up in one's seat.
surt - Sunday, January 5, 2020 - link
Who cares? High-end gamers looking for an edge. Every ms of end-to-end latency matters.
A5 - Monday, January 6, 2020 - link
There are probably a handful of people in the world that can tell the difference between 240 and 360 once they're in-game.
JoeyJoJo123 - Monday, January 6, 2020 - link
240Hz is already a subtle difference from 120/144. Also, it's not about whether you can "tell" the difference or not, because ultimately you need to practice with your mouse, your keyboard, your desk, your monitor -- your entire setup -- and almost any change to it involves some personal retraining. Ordinarily you could just put people in a double-blind study (and have them guess which monitor is 240Hz and which is 360Hz), but really you need far more time than that to get attuned to the peripheral.
Oftentimes, pros are pros because they're highly attuned to their setup and can minimize the human element of ingesting information (the time taken to digest what's happening on the monitor or coming through the headphones), and especially the human element of interacting with input devices (controller/mouse/keyboard/arcade stick/wheel+pedals/etc.).
It's as naive an idea as going up to a fighting-game pro like Daigo Umehara, taking away their arcade stick, handing them a 3rd-party MadCatz controller, and saying "Hey, if you're so pro, then you should be able to win with this." No, it doesn't work like that; you need to actually get used to it before you can really start noticing any differences.
By the same token, having 5 minutes to compare the two monitors really isn't enough time. You need to let it sit for a month or so and see whether the player ends up preferring the new 360Hz monitor over the old one.
Additionally, as refresh rates (and brightness) climb ever higher, you can start doing black frame insertion on every other frame in software while still keeping a good enough effective refresh rate. Blanking every other frame at 360Hz = 180Hz + low display persistence (less ghosting/motion blur), and that reduced motion blur may help more than the slight extra smoothness of going from 240Hz to 360Hz.
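To put rough numbers on that trade-off, here's a minimal sketch; the 1000 px/s tracking speed is an assumed example value, the panel is treated as ideal sample-and-hold, and real panels add response-time blur on top:

```python
def blur_px(speed_px_s: float, persistence_ms: float) -> float:
    """Perceived smear while the eye tracks a moving object: roughly speed x persistence."""
    return speed_px_s * persistence_ms / 1000.0

speed = 1000.0                 # px/s tracking speed (assumed example value)

p_240_hold = 1000.0 / 240      # ~4.17 ms: each frame stays lit for a full refresh period
p_360_bfi = 1000.0 / 360       # ~2.78 ms: image lit for one period, blanked for the next
eff_hz_bfi = 360 // 2          # 180 unique images per second with every-other-frame BFI

print(f"240 Hz sample-and-hold: {p_240_hold:.2f} ms lit -> ~{blur_px(speed, p_240_hold):.1f} px blur")
print(f"360 Hz + BFI ({eff_hz_bfi} Hz eff.): {p_360_bfi:.2f} ms lit -> ~{blur_px(speed, p_360_bfi):.1f} px blur")
```

Roughly a third less eye-tracking blur than a plain 240Hz hold-type panel while only having to render 180 fps, at the cost of brightness.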
So there's a lot of stuff to consider before we go dismissing 360Hz just because of the "the human eye can't see past 23.998 fps cinema movies" memes.
crimsonson - Monday, January 6, 2020 - link
Daigo can pick the stick he uses in a tournament. Monitors are something the organizers pick because sponsors pay for them. It is not an apt analogy.
mdrejhon - Tuesday, January 7, 2020 - link
One wants to jump up the Hz curve geometrically.
60Hz->144Hz and 144Hz->360Hz are similar percentage jumps.
Benefits are most noticeable this way on the curve of diminishing returns. Tests have shown that the diminishing curve still yields benefits well beyond 1000Hz.
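For concreteness, a quick sketch of those two jumps (plain arithmetic, no assumptions beyond the refresh rates themselves):

```python
def frame_ms(hz: float) -> float:
    """Frame time (and thus worst-case persistence) in milliseconds."""
    return 1000.0 / hz

for lo, hi in [(60, 144), (144, 360)]:
    print(f"{lo} Hz -> {hi} Hz: {hi / lo:.2f}x jump, "
          f"frame time {frame_ms(lo):.2f} ms -> {frame_ms(hi):.2f} ms "
          f"(saves {frame_ms(lo) - frame_ms(hi):.2f} ms)")
```

Each step is roughly a 2.5x multiplier, yet the absolute millisecond saving shrinks from ~9.7 ms to ~4.2 ms, which is why a linear step like 240Hz->360Hz (only 1.5x) feels subtle.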
PeachNCream - Monday, January 6, 2020 - link
True statement. Within a product category there is a need to fight a specifications war to land sales and the current specifications battle has settled around refresh rate. Manufacturers NEED to create a reason behind the specifications or no one will care. In this case the industry consensus is to impress upon gullible customers that a higher refresh rate will improve their competitive gameplay so that they crawl out of their parents' basement to beg their parents to spot them a few hundred dollars for a new monitor just until they can get a few more Grubhub deliveries done and can pay them back or finally win that one vidja games tournament with said monitor.
Spunjji - Monday, January 6, 2020 - link
There's some interesting information on this topic available on specialist sites covering motion blurring in displays... In short, the closer we get to ~1000Hz, the less motion blur is visible on sample-and-hold displays - all while avoiding the brightness reduction of blank-frame-insertion.
willis936 - Monday, January 6, 2020 - link
No reason to cover up information. The site is tft central.
Awful - Monday, January 6, 2020 - link
willis936 - actually sounds more like blurbusters than tftcentral
SirDragonClaw - Sunday, January 5, 2020 - link
Nice, now tell me when they release a 1440p one at 300+ fps. Then I will buy it regardless of price.
But this is very good for the state of the industry.
isthisavailable - Sunday, January 5, 2020 - link
Can most game engines even output 360 fps? Most of them have a cap at like 200 fps.
lilkwarrior - Monday, January 6, 2020 - link
You don't design a monitor that's supposed to last 3-5 years after purchase merely around today's market. That's frankly moronic. Also, the hardware needs to precede the software that utilizes it, especially for by-the-numbers-oriented developers like game developers.
It's the same weird logic people used against Nvidia's RTX series, and it's faulty: game developers can't and won't make games leveraging ray tracing if there are no GPUs available to mass consumers, in addition to themselves and the consoles. That played out exactly as most expected, aside from the strong cynicism from gamers when those GPUs launched with no next-gen console information on the horizon (which was outside Nvidia's control); it has since been predictably corrected by the Xbox Series X's and PS5's ray-tracing support.
The same thing will happen here, especially now that Windows 7 is EOL'd: all modern GPUs will get a significant bump with new games during the lifecycle this monitor is sold, enough to be able to run at 360Hz.
Alistair - Monday, January 6, 2020 - link
Ray tracing is dumb (Control with RT off had no window reflections, even though there were many ways to accomplish them without ray tracing, such as planar reflections and screen-space reflections) and a waste of resources (an opportunity cost in silicon area and money) that would be better spent on other parts of the chip, which is why people don't like it. There's no faulty logic, I assure you.
Eliadbu - Monday, January 6, 2020 - link
No, it's just a matter of time. Like all new technologies in graphics, it will take time for developers to implement all the features and for hardware to be powerful enough to back them up; those things take years. And it is part of the ongoing path toward more realistic graphics in real-time rendering.
Spunjji - Monday, January 6, 2020 - link
The pushback against RTX was mainly for 3 reasons:
1) The initial implementation doesn't perform well.
2) Introducing it on the 12nm node bloated the costs of their products.
3) Pushing it as far down the stack as the 2060 exacerbated (1) and (2) for cost-sensitive buyers - to get the substantial performance benefits of Turing, you are forced to invest in what is - at that level - a completely useless feature that does very little now and will never perform well enough for the future games that (may) eventually use it better. The same's true for the 2070 for anybody who already invested in a 1440p or better display.
SeannyB - Monday, January 6, 2020 - link
SeannyB - Monday, January 6, 2020 - link
Even without a cap, a lot of games would hit single-threaded CPU performance limits before reaching 360 FPS. Like with 240Hz displays, these are made for only a handful of games.
PeachNCream - Monday, January 6, 2020 - link
Thank goodness, for the fate of humanity, that this statement is correct and that only a really stupid handful of niche people will lust after this product.
willis936 - Monday, January 6, 2020 - link
The only real reason most games can't, in theory, run at arbitrarily high frame rates is that animation is hard work, and without care taken to specifically ensure frame-rate scalability, the animations may only look good at certain frame rates (look at Halo: Reach on PC at frame rates above 60 fps). We've (mostly, not entirely) moved past physics engines and rendering engines being in lockstep.
willis936 - Monday, January 6, 2020 - link
Also, physics engines can have variable frame rates in game engines too. There's no reason that, too, could not be cranked into the hundreds of Hz, except for needing a CPU capable of doing the work.
HideOut - Sunday, January 5, 2020 - link
The human eye can only see to about 70 fps; if you're above 100Hz then you are capped out. This is pretty much a marketing gimmick.
Yojimbo - Sunday, January 5, 2020 - link
The human eye probably can barely tell a steady 30 from a steady 60. But this is not about how the human eye accepts motion from still images; this is about latency. When you issue a command, it takes longer on average to register on the screen when the game is running at a lower fps. When the motion of whatever the eye is tracking on the screen changes, it will probably be visible sooner at a higher fps than at a lower one. So the player with the higher fps is given a small boost to his reaction times, and that will affect the probability of who kills whom in the match.
willis936 - Monday, January 6, 2020 - link
>The human eye probably can barely tell a steady 30 from a steady 60.
Oh come on. I thought we had moved past people spouting easily falsifiable nonsense opinions.
https://www.ncbi.nlm.nih.gov/books/NBK11559/#!po=1...
mdrejhon - Tuesday, January 7, 2020 - link
You clearly haven't seen the latest research.
https://www.blurbusters.com/1000hz-journey
The diminishing curve of returns doesn't disappear until well beyond 1000Hz. There are stroboscopic and sample-and-hold effects to consider, which are different effects from what other tests measure (including the fighter-pilot 1/250 sec tests).
SaberKOG91 - Sunday, January 5, 2020 - link
Not as much of a gimmick as you might think. I remember reading that a fighter pilot could detect subtle changes in their environment in as little as 1/250th of a second. This indicates to me that there's also a degree of training and experience that goes into how easily you notice the changes. You would expect similar results from competitive gamers. The intensity of what you are seeing matters a lot too. The brain responds to small changes in intensity faster than large changes because of persistence of vision. So yeah, black to white and back might cap out at a pretty low frame rate, while small variations might be perceived at much higher frame rates. There's also a big difference between knowing exactly what changed and perceiving a change.
Tomatotech - Monday, January 6, 2020 - link
Agree. Most people can't detect flicker at over 60Hz. This isn't aimed at them. Many fluorescent lights / poor-quality LEDs noticeably flicker to me, but not to other people, so I'm willing to believe a very small number of people, especially twitch gamers, are even more sensitive than me.
SaberKOG91 - Monday, January 6, 2020 - link
Yup, and it's even more noticeable when your head is moving relative to the stationary light.
mdrejhon - Tuesday, January 7, 2020 - link
The thresholds are actually MUCH higher than that test suggests; the diminishing curve does not end until well beyond 1000 Hz.
The fighter-pilot test does not take into account the Talbot-Plateau law. It was a brief-flash object-identification test, and it did not flash twice as bright during the half-duration flashes.
There are other effects of ultra-high-Hz gaming monitors that benefit users and that are not tested in that fighter-pilot study (rough numbers in the sketch after this list):
-- Latency effects
-- Motion blur effects (sample-and-hold), https://www.blurbusters.com/1000hz-journey
-- Stroboscopic effects (phantom arraying), https://www.blurbusters.com/stroboscopics
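As a minimal sketch of those three effects, here are idealized numbers at a few refresh rates; the 2000 px/s pan speed is an assumed example, and the model ignores pixel response time, render latency, and the rest of the input chain:

```python
def effects(refresh_hz: float, pan_speed_px_s: float = 2000.0) -> dict:
    """Idealized per-refresh magnitudes of the three effects listed above."""
    frame_ms = 1000.0 / refresh_hz
    return {
        "frame transport latency (ms)": frame_ms,                          # one refresh period of added delay, worst case
        "sample-and-hold blur (px)": pan_speed_px_s * frame_ms / 1000.0,   # smear while the eye tracks the motion
        "stroboscopic step (px)": pan_speed_px_s / refresh_hz,             # spacing of phantom-array images
    }

for hz in (60, 144, 240, 360, 1000):
    vals = effects(hz)
    print(f"{hz:>4} Hz: " + ", ".join(f"{name} = {value:.2f}" for name, value in vals.items()))
```

In this simple model the blur width and the phantom-array spacing come out numerically equal (both are speed divided by refresh rate), but they are different perceptual artifacts: the smear appears while the eye tracks the motion, the discrete ghost images when it doesn't. At 360Hz a 2000 px/s pan leaves about 5.6 px of each, versus ~33 px at 60Hz, and only around 1000Hz do they drop to a couple of pixels.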
cosmotic - Monday, January 6, 2020 - link
Citation needed.
Devo2007 - Monday, January 6, 2020 - link
Completely false! Even on the Windows desktop, it's easy to see the difference between a 60Hz and a 144Hz panel. The mouse cursor is smoother and scrolling text is also clearer (and it's partly this that makes higher-refresh-rate displays more suitable for gaming). Turn around quickly on a 60Hz panel and everything turns into a blurry mess while you're turning; it's much improved at a higher refresh rate.
As others indicated, the latency is also reduced.
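As a toy model of just the refresh-rate piece of that latency (assuming a fixed-refresh panel with no VRR, and ignoring input polling, the game loop, the GPU, and pixel response):

```python
def display_latency_ms(refresh_hz: float) -> tuple[float, float]:
    """Average wait for the next refresh, and total time until scanout of a frame completes."""
    period = 1000.0 / refresh_hz
    avg_wait = period / 2               # a finished frame waits half a period on average
    return avg_wait, avg_wait + period  # scanning the full frame takes about one more period

for hz in (60, 144, 240, 360):
    wait, total = display_latency_ms(hz)
    print(f"{hz:>3} Hz: ~{wait:.1f} ms average wait, ~{total:.1f} ms until the frame finishes scanning out")
```

Going from 60Hz to 360Hz trims roughly 7 ms off the average wait alone, which is the kind of margin competitive players are chasing.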
lilkwarrior - Monday, January 6, 2020 - link
This is BS. Let this BS die in 2020 already, FFS.
Zizy - Monday, January 6, 2020 - link
Nah. An average eye saccade lasts 20-200 ms (5-50 FPS). This is your floor and the source of the often-quoted 42 FPS (because it is a nice number). This leads to ~90 FPS on the screen to avoid aliasing. Trained people who don't have any special eyes manage to *recognize* an aircraft after seeing it for a mere 5 ms (200 FPS). Part of this is afterimage, though, which makes a lot of the research quite problematic. Anyway, a few people can go above that, seeing artefacts at 2 ms (500 FPS) with intensity similar to the image - no afterimage to help. But all this is foveal vision, which is slower than peripheral. Evolution and stuff. I have no idea what the ceiling is there, but I wouldn't be surprised if some people can see that "something happened" from a mere 0.1 ms (10k FPS) stimulus.
Lord of the Bored - Tuesday, January 7, 2020 - link
Peripheral vision is definitely more sensitive. Back in the day on CRTs, I couldn't use my monitor at 60Hz for desktop stuff because of terrible flickering, but only around the edges. Any increase above 60 helped. I believe 80 was where it stopped being consciously perceptible, though it has been a few years.
mdrejhon - Tuesday, January 7, 2020 - link
There are OTHER effects than flicker to consider, such as stroboscopic effects (phantom arrays), persistence blur (sample-and-hold), and input latency (quicker frame transport).
69369369 - Monday, January 6, 2020 - link
"unlock gamer skill"Cringe.
PeachNCream - Monday, January 6, 2020 - link
I wonder how much they paid that washout CS player to spout some nonsense. Probably not much since, if he's like most people, he is up to his eyeballs in debt and needs the cash to cover all that debt.
TheWereCat - Monday, January 6, 2020 - link
I haven't seen it mentioned in the article, but I think it's safe to guess that this will be a TN panel?