27 Comments
Dhalmel - Saturday, January 3, 2015 - link
That feel when you don't know if you have a high refresh 1440p IPS monitor or a HQ IPS 60hz 4k monitor.....SOME DAY
Dhalmel - Saturday, January 3, 2015 - link
want*
kenansadhu - Sunday, January 4, 2015 - link
Lol =)
r3loaded - Sunday, January 4, 2015 - link
144Hz IPS 3840x2160 (or even 5120x2880!) monitor with adaptive sync. I can dream!
DanNeely - Sunday, January 4, 2015 - link
Unfortunately, to get 5K above 60Hz, even DisplayPort 1.3 will need two cables. :(
Kutark - Sunday, January 4, 2015 - link
Someone in another forum was mentioning that 1440p, 144Hz, 32-bit color is beyond DisplayPort 1.2's capabilities. I looked it up and it appears to be close. Is using two cables an option?
DanNeely - Sunday, January 4, 2015 - link
Since, AFAIK, there aren't any DP 1.3 capable cards out yet (the standard was only finalized in September 2014), all current 1440p 120/144Hz monitors and all 4K 60Hz monitors support dual-cable DP 1.2. I believe they all appear as two half-width monitors to your graphics card and require using Eyefinity/3D Surround to combine them into a single screen presented to the OS.
DanNeely - Sunday, January 4, 2015 - link
Err, strike part of that; I was off by one DP generation (and 2x speed) mentally. DP 1.2 is able to fit both of those resolutions in a single stream. For high bit-depth color, output is 30 bits (10 bits/channel), which does just squeeze in at 1440p/144Hz. What is normally called 32-bit color on your GPU is 24-bit color plus an 8-bit alpha (transparency) channel. The alpha is only used in rendering and is not output to the monitor. I'm not sure if 30-bit color is bit-packed to 40 bits (to minimize memory consumption) when an alpha channel is needed, or expanded to 64 bits/pixel (memory accesses are easier to program and generally faster when aligned to the size of processor data words).
Kutark - Monday, January 5, 2015 - link
Thanks man, not only clarified for me but gave me some ammo to use on another forum. (Some guy was saying this monitor is stupid because it can't even support 32-bit color @ 144Hz/1440p.) By the bandwidth numbers I was looking up it was close, but now that you've clarified how the video card actually does 32-bit color, it makes a ton more sense.
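For anyone who wants to redo the arithmetic from this exchange, here's a minimal back-of-the-envelope sketch. It assumes DP 1.2's four HBR2 lanes at 5.4 Gbit/s each with 8b/10b encoding (about 17.28 Gbit/s of usable payload) and ignores blanking intervals, which add a few percent on top:

```python
# Back-of-the-envelope check of the DP 1.2 bandwidth math discussed above.
LANES = 4
HBR2_PER_LANE_GBPS = 5.4
EFFECTIVE_GBPS = LANES * HBR2_PER_LANE_GBPS * 0.8   # 8b/10b leaves ~17.28 Gbit/s of payload

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
    """Bandwidth for the active pixels only (blanking intervals ignored)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for bpp, label in [(24, "24-bit (what GPUs actually send for '32-bit' color)"),
                   (30, "30-bit (10 bits/channel deep color)")]:
    need = video_bandwidth_gbps(2560, 1440, 144, bpp)
    print(f"{label}: {need:.2f} Gbit/s needed vs {EFFECTIVE_GBPS:.2f} Gbit/s available")

# 24-bit: ~12.74 Gbit/s, 30-bit: ~15.93 Gbit/s -- both under 17.28 Gbit/s,
# though the 30-bit case only just fits once real blanking overhead is added.
```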
zodiacfml - Sunday, January 4, 2015 - link
same.
cosmotic - Saturday, January 3, 2015 - link
I don't think Acer knows what "Frameless" means.
nathanddrews - Saturday, January 3, 2015 - link
They do know, which is why they didn't call it "bezel-less". The way LCD panels are made makes it near impossible to achieve a panel with zero frame or bezel.
sor - Saturday, January 3, 2015 - link
Seems horrible to have a red bezel... You'd want a neutral color so it doesn't potentially clash with what's on screen.
Samus - Saturday, January 3, 2015 - link
I agree. That's why I never understood Apple's white-bezel Cinema Displays (which eventually went to silver/aluminum bezels).
mobutu - Sunday, January 4, 2015 - link
You know, both black and white are neutral.
Mr.r9 - Sunday, January 4, 2015 - link
I know. I've even painted the wall behind my PC/TV matte black.
Hung_Low - Sunday, January 4, 2015 - link
Finally, 144Hz IPS!!! It's probably an AUO panel, since AUO announced a year ago that they were working on a 120Hz+ IPS panel.
Mr Perfect - Sunday, January 4, 2015 - link
This makes me wonder if press reports are calling panel types by the right name. AUO announced a 2560x1440 144Hz AHVA panel (http://www.tftcentral.co.uk/news_archive/31.htm#14... ), but a lot of sites reported it as an IPS panel because the tech is similar and everyone knows what IPS is.
So is this Acer display the AHVA panel being called an IPS for marketing simplicity, or did LG get their IPS panels up to 144Hz?
Kutark - Sunday, January 4, 2015 - link
AHVA and IPS are basically the same thing, and yes, AHVA panels are marketed as IPS because they are IPS panels; they do use in-plane switching.
Mondozai - Sunday, January 4, 2015 - link
The fact that they do not disclose the input lag on the IPS model is worrying. The main problem of IPS displays has always been input lag from a gaming perspective. Plus, the ROG Swift TN panel had very good colors. Still, it's good that they are moving in the right direction.
Now we just need similar FreeSync models so we can get rid of the stupid G-Sync tax.
mobutu - Sunday, January 4, 2015 - link
"IPS panel ... will give ... generally a better color accuracy as well, although that will have to be tested"You know, the worst IPS has better colors thatn the best TN. "will have to be tested" lol
QuantumPion - Monday, January 5, 2015 - link
That's not true at all. There are plenty of low-end IPS with crap color quality, and there are a few 8-bit TN panels with color quality nearly as good as high end IPS.
hpglow - Sunday, January 4, 2015 - link
Your comment clearly shows you know nothing of the tech you are talking about. They can't measure total input lag because they don't know what your system is. They can, however, use standard tests for response, which is what is listed: the latency from a received signal until panel alteration. Input latency is the whole chain from the button press to the monitor's response. That varies by device (PC, console) and is therefore almost impossible for a manufacturer to quantify and replicate. What they will do in many cases is attach a simple device with one button that changes the screen from black to white and measure that difference. When the button is pressed an internal timer starts, and when a photo sensor detects any change the timer stops; the difference is then displayed. Your controllers, mice, and keyboards all have ICs that add latency to this equation; asking a monitor maker to account for your given setup is inane.
Kutark - Sunday, January 4, 2015 - link
From my basic understanding of G-Sync, it also removes any input lag. Something to consider: http://www.geforce.com/hardware/technology/g-sync/...
Q: How is G-SYNC different than Adaptive V-SYNC?
A: Adaptive V-Sync is a solution to V-Sync’s issue of stuttering, but input lag issues still persist. NVIDIA G-SYNC solves both the stutter and input lag issue.
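Put into code form, the black-to-white test hpglow describes above boils down to something like the sketch below. Both hardware hooks are hypothetical stand-ins for whatever the real test rig wires up, and it only captures the display side of the chain, not mouse/keyboard latency:

```python
import time

def measure_display_lag_ms(send_white_frame, photosensor_triggered):
    """Start a timer when the white frame is pushed to the display under test,
    stop it when a light sensor taped to the panel first reports a change.
    Both callables are hypothetical stand-ins for the rig's actual hardware."""
    start = time.perf_counter()
    send_white_frame()                  # flip the test pattern from black to white
    while not photosensor_triggered():  # poll the sensor; a real rig would use a hardware timer
        pass
    return (time.perf_counter() - start) * 1000.0
```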
DanNeely - Sunday, January 4, 2015 - link
No, what's normally referred to as input lag when talking about a monitor is the delay from when a monitor receives a new frame to when it displays it on the screen. No one other than people who are confused or trying to obfuscate the issue tries to fold processing times in the computer into that number. This is why input lag can be measured and specified by monitor vendors: everything in the number is under their control.
That lag comes from processing the monitor does to implement scalers to show non-native resolution content full screen; to improve response time scores (by starting transitions that take longer to become apparent a frame or two ahead of ones that would occur faster - this was much more of a problem with *VA type displays from a half dozen years ago); and possibly for FRC dithering to make a cheap 6-bit panel appear to be 8-bit (or occasionally 8 appear 10). (I'm not sure if FRC implementations actually add latency, but being able to look a frame or two ahead would allow for algorithms that more accurately reproduce colors.)
What G-Sync/Adaptive VSync do is different. Normally your video card starts sending a new frame out as soon as it renders it; this results in tearing (although rarely at an apparent level) because the rendering pipeline isn't running at exactly 16.7ms/frame (60 Hz). VSync causes your GPU to hold the frame it just completed rendering until the next 16.7ms clock interval passes, so that it starts outputting it at the top of the screen instead of the middle, avoiding tearing. The problem with doing so is that if your GPU can't complete a frame in 16.7ms, it has to wait until the start of the next frame (another 16.7ms later) to begin sending it to the monitor. This means that if your GPU is taking 20ms/frame, instead of getting 50 FPS and a bit of tearing you get no tearing but only 30 FPS, because the GPU ends up sitting for 13ms after finishing each frame before outputting it. In the real world your games won't render all frames in the same amount of time, and Adaptive VSync just turns VSync on/off depending on whether its algorithm thinks tearing or delaying a frame is the greater evil.
Both G-Sync and FreeSync give the monitor the ability to receive frames of output at something other than a strict 60Hz rate, so the GPU can (to reuse my example from above) send out fresh frames every 20ms as they're rendered instead of having to add the synchronization delay for VSync.
The VSync delay that's being removed is not what's normally called input lag. My gut reaction is that nVidia marketing is playing fast and loose with definitions to make a deceptive claim, although it's possible that nVidia only allows its G-Sync modules to be used in monitors that don't do a frame or two of pre-processing before displaying them.
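DanNeely's 20ms example can be reduced to a few lines. This is a minimal sketch; the fixed 20ms render time and the simple wait-for-the-next-refresh model of VSync are assumptions for illustration only:

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60   # 16.7 ms per refresh at 60 Hz
RENDER_TIME_MS = 20.0             # hypothetical GPU render time per frame

def effective_fps(render_ms, vsync):
    if not vsync:
        # Frames go out as soon as they finish rendering (possible tearing).
        return 1000 / render_ms
    # With VSync, a finished frame waits for the next refresh boundary, so the
    # effective frame time is rounded up to a whole number of refresh intervals.
    intervals = math.ceil(render_ms / REFRESH_INTERVAL_MS)
    return 1000 / (intervals * REFRESH_INTERVAL_MS)

print(f"No VSync: {effective_fps(RENDER_TIME_MS, vsync=False):.0f} FPS")  # 50 FPS, some tearing
print(f"VSync on: {effective_fps(RENDER_TIME_MS, vsync=True):.0f} FPS")   # 30 FPS, no tearing

# G-Sync/FreeSync let the monitor wait for the frame instead of the other way
# around, so the 20 ms frame is shown as soon as it's ready: 50 FPS, no tearing.
```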
Guspaz - Sunday, January 4, 2015 - link
While you're right in theory, enabling vsync in many games can add a huge amount of latency, even when rendering at a solid 60FPS. Far more latency than the 17ms you'd normally expect. Perhaps it's a buffering issue.
theuglyman0war - Friday, January 9, 2015 - link
3D Vision?