36 Comments
jeremyshaw - Friday, November 1, 2019 - link
Still no Freesync HDMI support, lol. I wonder how long it will be before people blame Nvidia for AMD's actions.
plopke - Friday, November 1, 2019 - link
What do you mean no Freesync HDMI support? Unless it's Linux or something, doesn't FreeSync basically use the HDMI-VRR stuff?
plopke - Friday, November 1, 2019 - link
Never mind; after some googling I just got more confused. Holy moly, USB, display, and HDMI are so confusing these days with all these flexible "standards".
voicequal - Friday, November 1, 2019 - link
AMD was instrumental in getting VRR adopted into HDMI 2.1 as an open standard.
https://www.anandtech.com/show/12095/hdmi-21-speci...
And that's what NVIDIA is using here:
"NVIDIA has confirmed, under the hood this is all built on top of HDMI-VRR functionality"
Flunk - Friday, November 1, 2019 - link
That's what this is; it's literally FreeSync HDMI. Nvidia calls FreeSync support "G-Sync Compatible" because they don't want to admit they lost.
lilkwarrior - Saturday, November 2, 2019 - link
You're grossly incorrect about Nvidia losing. They forced the industry's hand to adopt Adaptive-Sync with more acceptable ranges than what it was originally going with. Nvidia is who pushed HDR1000 and adaptive-sync ranges to be as wide as they are, advocating for pros & high-end gamers to get what most would consider competent HDR on LCD panels.
Be more objective than looking at it through a fanboyish lens. Most of us read tech news on this site to avoid such nonsense.
Death666Angel - Saturday, November 2, 2019 - link
No pro gamer uses HDR or VRR. :)
Lesliestandifer - Saturday, November 2, 2019 - link
Oh how correct you are....
flyingpants265 - Monday, November 4, 2019 - link
Is adaptive sync even good? I've never used it. I think I'd rather just set my refresh rate to 90 Hz.
gamerk2 - Sunday, November 10, 2019 - link
Except it isn't; HDMI-VRR is an entirely different implementation. And given that I expect most higher-end TVs are going to support it by default, I suspect this will eventually kill off Freesync entirely.
lilkwarrior - Saturday, November 2, 2019 - link
You're misinformed; Nvidia leverages VRR so RTX & 16-series owners have adaptive sync for their content on this TV & other devices.
Some Freesync monitors aren't compatible because they have abysmal adaptive-sync ranges, not aligned with Nvidia's & most prosumers' idea of a sensible range for interactive content. Nvidia primarily caters to prosumers & high-end gamers and advocates for these wider ranges on their behalf, while AMD & the original standards at first had different ideas that weren't better but allowed cheaper & worse panels to be considered adaptive-sync-capable.
This has been contentious for as long as G-Sync vs. FreeSync has existed, with G-Sync panels consistently providing better experiences & ranges for adaptive-sync gaming than AMD and the standard.
This has been ironed out quite a bit with AMD FreeSync 2, which also considers HDR, and with G-Sync HDR. That said, G-Sync HDR has far better, more prosumer- & high-end-gamer-friendly HDR requirements than FreeSync HDR, such as requiring HDR1000 at minimum.
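As an aside, the kind of screening described here (a wide adaptive-sync range plus an HDR1000 brightness floor) is easy to express in code. The sketch below is purely illustrative: the 2x range ratio, the 1000-nit threshold, and the monitor entries are assumptions chosen for the example, not NVIDIA's or AMD's published certification criteria.

```python
# Illustrative only: check a monitor's advertised specs against example
# thresholds. Thresholds and monitor entries are hypothetical, not any
# vendor's actual certification rules.
from dataclasses import dataclass

@dataclass
class Monitor:
    name: str
    vrr_min_hz: int   # lowest refresh rate it can sync to
    vrr_max_hz: int   # highest refresh rate it can sync to
    peak_nits: int    # advertised peak HDR brightness

def wide_vrr_range(m: Monitor, min_ratio: float = 2.0) -> bool:
    """A max refresh of at least ~2x the minimum lets a driver double frames
    when a game dips below the range, so sync never drops out."""
    return m.vrr_max_hz >= min_ratio * m.vrr_min_hz

def strong_hdr(m: Monitor, floor_nits: int = 1000) -> bool:
    """Mirrors the 'HDR1000 minimum' bar mentioned above."""
    return m.peak_nits >= floor_nits

for m in (Monitor("Budget 48-75Hz panel", 48, 75, 350),
          Monitor("High-end 30-144Hz panel", 30, 144, 1000)):
    print(f"{m.name}: range ok={wide_vrr_range(m)}, HDR ok={strong_hdr(m)}")
```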
hyno111 - Saturday, November 2, 2019 - link
The panel is not the real difference between G-Sync/Freesync monitors. The difference is in the monitor drivers and firmware.
Nvidia basically sells the driver chip to "ensure a better experience". The chip itself is expensive, and the G-Sync version of a monitor usually costs $150 more than the Freesync monitor with the same panel, vendor, design, etc.
Dribble - Monday, November 4, 2019 - link
It's not just the Nvidia chip; it's the additional work the monitor maker had to do to pass all the requirements of the G-Sync standard (which Nvidia tests to check they've done). Freesync had nearly no requirements or testing, so they could slap a Freesync sticker on practically anything. That's still the case with G-Sync Compatible - makers are actually forced to get their monitor working to a certain standard, as opposed to implementing something, slapping a Freesync sticker on, and calling it a day.
Ryan Smith - Saturday, November 2, 2019 - link
Freesync-over-HDMI was always a stop-gap standard. Since the HDMI Forum opted to develop HDMI-VRR rather than promote Freesync-over-HDMI to an official HDMI extension, it was never going to get traction. Instead, it's largely served as a good proof of concept for the idea of variable refresh.
Going forward, everyone who implements variable refresh for TVs is going to be doing HDMI-VRR. That will go for both sources and sinks. Which is fine for AMD; they were the first vendor to ship a VRR source device anyhow (via the Xbox One X).
Dragonstongue - Monday, November 4, 2019 - link
Why in the hell would people be blaming Nv for AMD's actions? Nv does so plenty enough on their own. ROFL...
What I find funny, and would tick me off big time, is having paid $X on top of monitor/display costs for years since Nv made a big song and dance that G-Sync absolutely required that stupid module AND a specific GPU from them and them alone to be "graced" with the ability to use it in the first place. Seems this was outright BS by Nv to get even more $$$$ however they possibly could...
FUDGE NV, we do not need, IMHO, corporations playing BS like that... notice it says TURING... again, shank their own customers in the gut/back... how lovely.
Yojimbo - Monday, November 4, 2019 - link
While I am disappointed that NVIDIA has thus far never done much to justify a proprietary module being placed in the monitor instead of designating changes that need to be implemented in the scaler chips, I think you are mischaracterizing the difference between G-Sync and FreeSync, an important reason for the difference in price, and why NVIDIA is able to "grace" some monitors with the ability to be called "G-Sync". It is because G-Sync and FreeSync (as opposed to the "Adaptive-Sync" VESA standard) are certification programs, and the G-Sync certification program is much more stringent than the FreeSync one. The difference in the experience of G-Sync and FreeSync over the range of available monitors has been well documented over the years. More stringent certification means higher prices. FreeSync monitors that meet this more stringent certification without using NVIDIA's module are now called "G-Sync Compatible" or whatever the name is.
willis936 - Friday, November 1, 2019 - link
Announced just in time for TV season. I wonder if this will come before or after the B9 gets BFI actually added.
SeannyB - Friday, November 1, 2019 - link
I saw an article about this earlier today, and they said VRR 120Hz+ would be limited to 1080p and 1440p. Is there confirmation?
SeannyB - Friday, November 1, 2019 - link
Actually, it's confirmed in Nvidia's press release. (Click the Nvidia source link at the bottom of the article.)
Ryan Smith - Saturday, November 2, 2019 - link
Yeah. It's still HDMI 2.0 signaling, so there's not enough bandwidth for 4K above 60Hz.
Hixbot - Saturday, November 2, 2019 - link
There should be enough bandwidth for 4K 8-bit 4:2:0 at 120 Hz. But it doesn't work.
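For anyone who wants to check the arithmetic in this exchange, here is a rough sketch under stated assumptions: the common CTA-861 4K timing of 4400 × 2250 total pixels (including blanking), 8b/10b TMDS coding, and HDMI 2.0's 18 Gbit/s ceiling. The helper below is illustrative, not an official HDMI calculator, and real links carry extra overhead for audio and control data.

```python
# Back-of-the-envelope TMDS bandwidth estimate (illustrative assumptions only).
def tmds_gbps(h_total, v_total, refresh_hz, bits_per_component, chroma):
    """Approximate TMDS line rate in Gbit/s for a given video timing."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    payload_bps = h_total * v_total * refresh_hz * samples_per_pixel * bits_per_component
    return payload_bps * 10 / 8 / 1e9  # 8b/10b encoding overhead

HDMI_2_0_LIMIT_GBPS = 18.0  # total across the three TMDS data lanes

for label, hz, chroma in [("4K60  4:4:4 8-bit", 60, "4:4:4"),
                          ("4K120 4:2:0 8-bit", 120, "4:2:0"),
                          ("4K120 4:4:4 8-bit", 120, "4:4:4")]:
    bw = tmds_gbps(4400, 2250, hz, 8, chroma)
    verdict = "fits" if bw <= HDMI_2_0_LIMIT_GBPS else "exceeds"
    print(f"{label}: {bw:5.2f} Gbit/s ({verdict} HDMI 2.0)")
```

By this math, 4K120 at 4:2:0 8-bit lands at roughly the same rate as 4K60 4:4:4 (about 17.8 Gbit/s), which is why the comment above argues the bandwidth should be there even though the mode isn't exposed here.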
lilkwarrior - Saturday, November 2, 2019 - link
With LG's 2019 OLEDs having HDMI 2.1 along with Dolby Vision, HLG, Technicolor, & HDR10, this update practically makes the value proposition of all current "gaming TVs" & existing 4K@120Hz gaming monitors extremely questionable, including the Alienware OLED that uses the same panel.
Asus has their answer with the PA32UCG coming out next year (HDR1400+, 4K@120Hz VRR, Dolby Vision, HLG, & HDR10 support, Thunderbolt 3, and rumored HDMI 2.1 with its VRR support), but the other monitor manufacturers don't. Manufacturers creating specialized monitors for interactive content need to get their act together ASAP.
No self-respecting high-end 4K prosumer or high-end gamer should buy a 4K monitor next year without the following:
- Dolby Vision & HLG on top of HDR10 (ideally HDR10+ for dynamic HDR on the Web)
- HDMI 2.1; at least VRR or GTFO
- Thunderbolt 3 (charge the laptop connected to it while using only one cable; USB4 merges in TB3, so you're good for years).
lilkwarrior - Saturday, November 2, 2019 - link
DisplayPort 2.0 is ideal, but will only become viable during the 2nd half of 2020.
imaheadcase - Saturday, November 2, 2019 - link
Uhh, I've got some bad news for you then... that monitor is delayed indefinitely. But you should know that because, well, Asus always puts out press releases as proofs of concept rather than actual products for a long while. I mean, I have a bookmark for an Asus monitor I wanted almost 2 years ago that only just came out this year... and it didn't come close to the hype.
Also, Dolby Vision and HDR? Are you serious? No self-respecting person would even consider that for a monitor, especially since they don't sell well as it is; it's a gimmick.
crimsonson - Saturday, November 2, 2019 - link
HDR is a gimmick? It is one of the most important advances for displays in years, probably since HDTV. More relevant than 3D, 4K, etc.
Not sure how you can even say that. The current standard of SDR at 100 nits is decades old. Even current low-end monitors can do 200+ nits.
Yojimbo - Monday, November 4, 2019 - link
To some people, everything that doesn't take over the world in a year is a gimmick.
Yojimbo - Monday, November 4, 2019 - link
Examples: VR is a gimmick, ray tracing is a gimmick, HDR is a gimmick, television is a gimmick, etc.
andysab - Saturday, November 2, 2019 - link
Is there any chance this firmware update will eventually make it to the 2018 OLED line as well, or are there technical limitations preventing it?
crimsonson - Saturday, November 2, 2019 - link
My guess is they are not even going to explore it. Technically possible or not, they are going to try to sell 2019 models this Xmas season. It sucks, but that is capitalism for you.
This might tip the scale for me, as I was considering a Samsung QLED TV for the living room. OLED provides better blacks, but LCD can be brighter and more stable in terms of color rendering. OLEDs age incredibly fast.
Not sure though if I will game with the big TV instead of my desktop monitor. So not sure how useful this will be for me.
HammerStrike - Saturday, November 2, 2019 - link
A point in LG’s favor is that the E8/C8/B8 series panels and earlier are all HDMI 2.0. They’d never be able to go beyond 60 Hz with a 4K signal, and presumably they use a different HDMI chipset from the 9-series panels; assuming it could be made to work with HDMI-VRR, they’d have to engineer a second solution for a product line they no longer sell, and it would be limited in its headline mode (4K).
Not saying it wouldn’t be welcome, but it’s understandable if they are focusing on their current line that can support it in full. Personally eyeing up a C9 77” - if it goes below $5k for Black Friday I might have to pull the trigger.
megadirk - Tuesday, November 5, 2019 - link
I mean, technically it's below $5k: https://www.bestbuy.com/site/lg-77-class-oled-c9-s...
I'm in the same boat as you.
lipscomb88 - Saturday, November 2, 2019 - link
Crimsonson is probably right here.
But technically, there may be a processor limit in the 8-series OLEDs that wouldn't allow them to handle the duties required for this. They may have been done with the 8-series spec before starting this initiative. Also, there's the business case, as already stated.
gamerk2 - Sunday, November 10, 2019 - link
Unlikely, given the 2018 series is limited to supporting the HDMI 2.0 spec under the hood.
The LG 2019 TVs could already support the HDMI 2.1 specification, but there are no current HDMI 2.1 GPUs out there. NVIDIA is just exposing HDMI VRR while outputting HDMI 2.0. In theory, other devices that support HDMI VRR would also benefit, but right now nothing else does; LG was the only manufacturer that made its 2019 lineup HDMI 2.1 capable.
Next year, I suspect pretty much everything will support HDMI VRR, making this all moot.
Manch - Monday, November 4, 2019 - link
I used to use 3 24" 1080p monitors. now I use 1 50" TV as a monitor. Works great but I wanted a better panel than what I have now.I was eyeballing a 50" was hoping it would be supported. The price delta between 50 and 55" is a good bit and 55" is just a bit too large.Simon_Says - Monday, November 4, 2019 - link
Does this mean that AMD graphics will also work with this, or is it really locked down to NV cards? Be a shame if it was.
gamerk2 - Sunday, November 10, 2019 - link
Locked to NVIDIA's Turing line until AMD puts out a video card that supports the HDMI 2.1 specification.