32 Comments
nathanddrews - Tuesday, June 18, 2019 - link
Unless these also use dual-layer LCD, I'm worried that the contrast will still struggle. However, as a MicroLED stopgap goes, it could be nice.
mukiex - Tuesday, June 18, 2019 - link
Maaaan, dual-layer LCD would be awesome. How come we haven't seen anything on that since the HiSense demo in January?
TheWereCat - Tuesday, June 18, 2019 - link
Wouldn't the brightness be an issue then?
edzieba - Tuesday, June 18, 2019 - link
Because dual-panel costs far more than FALD, has absurd power consumption (the Dolby Vision concept display has a WATER COOLED backlight to avoid cooking the panels!), and has poor black levels without using a FALD backlight anyway.
PeachNCream - Tuesday, June 18, 2019 - link
I was under the impression that NVIDIA was transitioning to supporting FreeSync or something with their newer GPUs. Why G-Sync anything now if that's about to be abandoned?
0siris - Tuesday, June 18, 2019 - link
As I understand it, there will still be G-sync monitors which will have that branding and the physical G-sync module in the monitor, along with meeting certain standards Nvidia sets and tests the monitor for. Then there are also various FreeSync monitors without the G-sync module which Nvidia tests and then qualifies as "G-sync compatible" - basically stating that the VRR experience is good enough for their standards, even if it isn't actually a G-sync monitor. Then there's also G-sync ultimate, which has to do with 1000+ nits HDR or something.
Hxx - Tuesday, June 18, 2019 - link
If you know Nvidia, then you know they are always trying to be on the forefront. Everything out there except the latest LG panels uses gen 1. They needed a refresh badly but didn't want to miss out on those first-gen high refresh displays and GPU sales, so they started this G-Sync Compatible stopgap. Once they roll out G-Sync gen 2, or Premium, or whatever it's called, that's going to be their focus, and not so much G-Sync Compatible displays.
Ryan Smith - Tuesday, June 18, 2019 - link
NVIDIA added support for VESA Adaptive Sync monitors. However, they are not dropping G-Sync, particularly at the high end, since there's no equivalent VESA standard for all of the different requirements NVIDIA has for G-Sync Ultimate HDR monitors (brightness, color space, GtG, refresh rate, etc.).
halcyon - Wednesday, June 19, 2019 - link
No, nVidia just added software/driver-level support for VESA AdaptiveSync monitors so that you can use them in GSync Compatible (with artifacts and all w/ most AdaptiveSync monitors) mode with your nVidia GPU.
If you want:
- guaranteed low latency
- ULMB
- wider refresh rate range guaranteed
- HDR with G-Sync enabled (incl. lower latency, wide colour space, minimum contrast, brightness, etc)
Then you need to shell out money for a hardware-module-based nVidia GSync (or GSync HDR, aka GSync Ultimate) certified monitor, not just a GSync Compatible monitor (which has no GSync lab testing and no hardware module inside).
BurntMyBacon - Thursday, June 20, 2019 - link
@halcyon
To be fair, aside from the GSync hardware module, there is nothing stopping a (non-GSync) VESA Adaptive Sync monitor from supporting any or all of the requisite features offered by the higher-end nVidia certifications. In fact, there are FreeSync monitors available that have some of the features you listed. However, you would need to verify it all yourself. The nVidia certification makes things much easier. I'm relatively happy with the GSync program in its current form and wish it had evolved to this point much earlier.
Diji1 - Tuesday, June 25, 2019 - link
>aside from the GSync hardware module
Yeah, monitors can just magically do all that stuff that the G-sync hardware does without the G-sync hardware module guys.
Please talk sense.
JasonAT - Wednesday, June 19, 2019 - link
I know there is a lot of praise here for Gsync Ultimate monitors. However, Asus has recently announced a 43" LCD 4k144 w/ DP1.4+DSC1.2, DisplayHDR1000, w/ FreeSync, which carries the same VESA certification as these Gsync Ultimate monitors, with the added advantages of DSC1.2 (which allows 4k120 at full chroma over a single DP cable), a bigger screen size, and no Gsync tax.
JasonAT - Wednesday, June 19, 2019 - link
I should clarify that DSC1.2 allows for "visually lossless" 4k120; whether that is true 4:4:4 or not is debatable.
Sefem - Wednesday, June 19, 2019 - link
The backlight system of this monitor is unknown, and FreeSync HDR requires game developers to use an AMD proprietary API to offer proper tone mapping without incurring a big latency penalty. The pricing is also unknown, so there may be a FreeSync tax instead.
DigitalFreak - Tuesday, June 18, 2019 - link
$3000
Greyh0und - Tuesday, July 30, 2019 - link
Asus PG35VQ has 512 dimming zones (vs 576 for these monitors) and costs about €2700,00, so these new monitors are likely going to cost €3000+.
JasonAT - Tuesday, June 18, 2019 - link
These will cost >$2k, which is ridiculous in the age of dropping OLED prices. The bezels are huge and the stands are typical ugly gamer style. No HDMI 2.1. Limited by DP 1.4. No full chroma at 4k beyond 98 Hz.
IndianaKrom - Tuesday, June 18, 2019 - link
Yeah, VESA definitely needs to stop dragging their feet on the next DisplayPort standard.
Ryan Smith - Tuesday, June 18, 2019 - link
"No full chroma at 4k beyond 98 Hz."
Actually I'm still waiting to hear more on that. We're finally to the point where DSC is viable, but it's not clear if NV is going to use it (or if it can even be easily added to their current G-Sync U module).
Zanor - Tuesday, June 18, 2019 - link
Not too crazy considering that OLED monitors will probably always be a niche product, as the frequently static images from a computer and burn-in don't mix.
Guspaz - Thursday, June 20, 2019 - link
tl;dr: Why don't they just track subpixel aging and then dim the other pixels to match so that the image is always uniform?
One thing I don't understand is why OLED panels don't do per-subpixel lifespan tracking with adaptive dimming as one of the several techniques to avoid burn-in. OLED subpixels age at a predictable rate based on time, intensity, and colour, right? So why not track the total amount of light emitted by each OLED subpixel and then adjust the brightness of the entire screen on a per-subpixel basis to compensate? That way, instead of burn-in, you'd just decrease the overall brightness of the screen a bit so that the image was uniform.
It doesn't seem like it'd be that hard either: assign a weighted 8-bit aging value to each subpixel in each frame, and then accumulate the values for every frame in a 64-bit-per-subpixel non-volatile buffer. At 4K120 with four subpixels per pixel (as I believe LG TV panels have), that should be enough for roughly 167 billion operating hours of data in a ~256MB buffer. Then that data can also be used to generate a brightness map for the screen for compensation.
You don't need to write the data every frame; just keep the running tally in RAM and then write it periodically to SLC or something. Or hell, write it every frame to Optane; it's not like 256MB of anything costs much.
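For what it's worth, here is a minimal sketch of the bookkeeping being described (purely illustrative; the panel geometry, the linear wear weighting, and the exponential decay constant are all assumptions, not anything panel vendors document):

```python
import numpy as np

H, W, SUBPIX = 2160, 3840, 4     # assumed WRGB-style panel: 4 subpixels per pixel

# One 64-bit cumulative wear counter per subpixel (~256 MB, as estimated above).
wear = np.zeros((H, W, SUBPIX), dtype=np.uint64)

def accumulate_frame(frame):
    """Add one frame's wear. `frame` holds per-subpixel drive levels 0-255.
    A real controller would weight by colour and temperature; here wear is
    simply proportional to drive level (a simplifying assumption)."""
    global wear
    wear = wear + frame.astype(np.uint64)

def compensation_map(k=1e-12):
    """Gain map that dims every subpixel to the brightness the most-aged
    subpixel can still reach, so the picture stays uniform instead of
    showing burn-in. Assumes efficiency decays exponentially with wear."""
    efficiency = np.exp(-k * wear.astype(np.float64))   # 1.0 = brand new
    return efficiency.min() / efficiency                # per-subpixel gain in (0, 1]

# Per the point about writes: keep `wear` in RAM and flush it to non-volatile
# storage every few minutes, e.g. np.save("wear_counters.npy", wear).
```

Applying it would just be a per-subpixel multiply of each frame by the gain map before driving the panel, and the map would only need recomputing occasionally rather than every frame.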
halcyon - Wednesday, June 19, 2019 - link
LG already sells a 38" "1ms" IPS no-Gsync, faux-HDR screen for $2000 USD. That is ridiculous.
At least these monitors have:
- guaranteed/tested low latency
- True HDR
- FALD
- GSync (Hw) + HDR both on
Unlike the LG basic high refresh rate IPS screen.
zodiacfml - Wednesday, June 19, 2019 - link
Same thoughts. I'd rather take LG's flagship OLED TV with HDMI 2.1 and VRR for $2000.
spkay31 - Tuesday, June 18, 2019 - link
Far too small a display to appreciate the 4K resolution. I would be interested in a curved 40" or 43" display with these specs.
Cygni - Tuesday, June 18, 2019 - link
The big questions for me are a) have they fixed the DP bandwidth problem that led to chroma subsampling over 120 Hz [unfortunately it doesn't look like it], and b) have they gotten rid of the awful fan [unfortunately it doesn't look like it].
This is a disappointing second generation until they can fix those problems.
Sefem - Wednesday, June 19, 2019 - link
a) There's nothing to fix (nothing broke); it's a limitation of the DP protocol.
b) The FALD (and its driver) need to be cooled or they will damage the panel and themselves. The driver could be "easily" passively cooled; the FALD I'm not so sure about.
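To put a rough number on that DP limitation, here is a back-of-the-envelope check (a sketch only; the 4-lane HBR3 link, 8b/10b encoding, 10 bpc colour, and ~2% blanking overhead are assumptions, not figures from the article):

```python
# Rough check of the DP 1.4 ceiling (assumptions: 4-lane HBR3 link, 8b/10b
# encoding, 10 bits per component 4:4:4, ~2% overhead for reduced blanking).

RAW_LINK_GBPS = 4 * 8.1                  # four HBR3 lanes at 8.1 Gb/s each = 32.4 Gb/s
PAYLOAD_GBPS = RAW_LINK_GBPS * 8 / 10    # 8b/10b encoding leaves ~25.92 Gb/s for video

def required_gbps(width, height, refresh_hz, bits_per_component, overhead=1.02):
    """Approximate uncompressed 4:4:4 bandwidth for a given mode."""
    bits_per_pixel = 3 * bits_per_component
    return width * height * refresh_hz * bits_per_pixel * overhead / 1e9

print(PAYLOAD_GBPS)                        # ~25.9 Gb/s available
print(required_gbps(3840, 2160, 120, 10))  # ~30.5 Gb/s -> over budget: DSC or 4:2:2 needed
print(required_gbps(3840, 2160, 98, 10))   # ~24.9 Gb/s -> fits, hence the ~98 Hz cutoff
```

DSC sidesteps this by compressing the stream (nominally up to about 3:1), which is why it keeps coming up in this thread as the way to reach 4K at 120 Hz and above without chroma subsampling.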
Bensam123 - Wednesday, June 19, 2019 - link
This isn't a gaming monitor; this would be better suited for a TV. Higher refresh (beyond 240), lower response times, 0 input lag, sync technology... These are a few of our favorite things. Then you throw all the other bells and whistles at it, like 4k.
Slashchat - Wednesday, June 19, 2019 - link
I got banned on the forum and I can't contact the admin; the site is blocked. Please, AnandTech, reactivate my account, it was a simple mistake on my part! Please help me! I will not make the post again. Thank you!
jabbadap - Wednesday, June 19, 2019 - link
Are these still actively cooled?
peevee - Thursday, June 20, 2019 - link
"576-zone Mini LED-based backlighting system"
Give me 1920x1200x3 hard color-specific LEDs already.
naphtali - Sunday, September 22, 2019 - link
Will there be 32" mini LED monitors as well? 27" is somewhat small.
JEmlay - Tuesday, October 29, 2019 - link
Why is it 27" only? Lame!