jeremyshaw - Thursday, June 13, 2019 - link
Are there any DP1.4 outputs (outside of the new Navi 5700) that support DSC? Does this allow for VESA Adaptive Sync --> HDMI VRR?
JasonAT - Thursday, June 13, 2019 - link
Nvidia RTX & AMD Navi 5700 series both support DP1.4a with DSC 1.2.
repoman27 - Thursday, June 13, 2019 - link
And Intel will be in the mix with Ice Lake, so the full list goes:
NVIDIA Turing: DisplayPort 1.4a, HBR3, DSC 1.2, HDMI 2.0b, HDCP 2.2
AMD Navi: DisplayPort 1.4a, HBR3, DSC 1.2a, HDMI 2.0b, HDCP 2.2
Intel Gen11: DisplayPort 1.4, HBR3, DSC 1.1, HDMI 2.0b, HDCP 2.2
Looks like each vendor will have a slightly different version of DSC this time around. That should make things fun for end users.
JasonAT - Thursday, June 13, 2019 - link
It's a freaking mess. Please for the love of god just release HDMI 2.1 GPUs already.
FXi - Friday, June 14, 2019 - link
Seriously - get 2.1 products on the shelves, people. We've waited a huge amount of time to get 4k @ 120 or better.
wrkingclass_hero - Thursday, June 13, 2019 - link
Expect devices
Someguyperson - Thursday, June 13, 2019 - link
Does this mean you can finally use the full 4k 120Hz VRR output of the new LG OLED TVs? I think those TVs would be much better than those "Big Format Gaming Displays" from Nvidia. The HP G-Sync monitor is $4k-$5k while the 65" C9 is only $3200.
JasonAT - Thursday, June 13, 2019 - link
This sounds like something that would need to be implemented on-board an AIB GPU or motherboard, unless it is paired with an RTX/Navi 5700 series GPU with DP1.4+DSC1.2. Either way, this should allow 4K120 at 4:4:4, but whether you will get VRR is unknown.
nathanddrews - Thursday, June 13, 2019 - link
LG's 9-series OLED supports 4K120 input and it supports FreeSync, so it's highly likely that it will work. Wait for rtings or avsforum to test it.
JasonAT - Thursday, June 13, 2019 - link
Unfortunately, LG doesn't support FreeSync. It supports HDMI Forum VRR. FreeSync is AMD's derivative of VESA's DisplayPort Adaptive-Sync, which has been modified to work on HDMI as well.
JasonAT - Thursday, June 13, 2019 - link
Looking further into DSC1.2 specs, I am unsure it actually supports 4:4:4 chroma at high refresh rates. I think it is actually subsampled to 4:2:2 or 4:2:0.
Gunbuster - Thursday, June 13, 2019 - link
Or the TV manufacturers could have put on DisplayPort inputs in the first place...
weilin - Thursday, June 13, 2019 - link
It's more than just TVs; you'd need receivers and everything else in the home entertainment business to adopt it too.
Then the question becomes: why go to DisplayPort? Why not go straight to USB-C and support DisplayPort/HDMI alt mode?
stanleyipkiss - Thursday, June 13, 2019 - link
Exactly. USB-C will be better in every single case. Even with power delivery! Why not let my receiver power my Blu-ray player as well?
Sancus - Thursday, June 13, 2019 - link
HDMI 2.1 is better than DisplayPort, and it will be a more common standard. LG went the extra mile to add it to their 2019 TVs, and it's the correct step forward.
It isn't their fault that Nvidia and AMD don't seem to give a shit.
repoman27 - Thursday, June 13, 2019 - link
I think what you meant to say was LG went the extra mile to add an off-the-shelf HDMI 2.1 scaler to their consumer electronics device, and it isn't AMD or NVIDIA's fault that the HDMI 2.1 spec wasn't released by the time they were done taping out their 10.3B - 18.6B transistor GPUs.
timecop1818 - Thursday, June 13, 2019 - link
I like your answers, always so informative wrt DisplayPort, HDMI, Thunderbolt, etc.
TheUnhandledException - Thursday, June 13, 2019 - link
Yeah, that will never happen. The TV OEMs built HDMI to have a spec THEY control. DisplayPort is cheaper to implement and without licensing. Have you ever seen a single HDTV with even one DisplayPort input? Just one would be nice for connecting a computer, with the rest HDMI, but even that? Nope. HDMI or bust.
jeremyshaw - Friday, June 14, 2019 - link
At least one in Japan and, of course, the rather expensive HP Emperor (with Nvidia G-Sync, probably the only reason why it even has DP input). I guess the new ASUS version of the Nvidia BFGD will have a DP, too.
TheUnhandledException - Thursday, June 13, 2019 - link
" Keep in mind that the maximum bandwidth supported by a DP1.4 interconnection using the HBR3 data rates is 32.4 Gbps, which is enough for 8Kp60 when Display Stream Compression 1.2 is used. Meanwhile, the maximum bandwidth of HDMI 2.1 is 48 Gbps and it does not need to use DSC 1.2 for 8Kp60."That is not correct. 8K60 with CVT-R2 blanking requires 62.06 Gbps at 10 bit color (49.65 Gbps @ 8 bpc) for 4:4:4 or RGB. So even with the higher bitrate of HDMI 2.1 it requires DSC.
Now 4K120 (10-bit) requires 32.27 Gbps, so it can be done by HDMI 2.1 without DSC but would require DSC if using DisplayPort 1.4. Since LG OLEDs support 120 Hz input, the question is: do they support it with DSC?
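For anyone who wants to sanity-check those figures, here is a rough sketch of the CVT-R2 arithmetic in Python (assuming CVT-RBv2's fixed 80-pixel horizontal blanking and 460 µs minimum vertical blanking; real timings round Vtotal to an integer, so the outputs are approximations):

```python
# Rough CVT-R2 (CVT-RBv2) bandwidth estimate. Assumes the spec's fixed
# 80-pixel horizontal blanking and 460 us minimum vertical blanking;
# real timings round v_total to an integer, so results are approximate.
def cvt_r2_gbps(h_active, v_active, refresh_hz, bits_per_component):
    h_total = h_active + 80                       # pixels per line
    frame_us = 1e6 / refresh_hz                   # frame period in microseconds
    v_total = v_active * frame_us / (frame_us - 460)
    pixel_clock = h_total * v_total * refresh_hz  # pixels per second
    return pixel_clock * bits_per_component * 3 / 1e9  # 3 components (RGB/4:4:4)

print(cvt_r2_gbps(7680, 4320, 60, 10))   # ~62.1 -> the 62.06 Gbps figure
print(cvt_r2_gbps(7680, 4320, 60, 8))    # ~49.7 -> the 49.65 Gbps figure
print(cvt_r2_gbps(3840, 2160, 120, 10))  # ~32.3 -> the 32.27 Gbps figure
```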
JasonAT - Thursday, June 13, 2019 - link
Are HDMI 2.1 and DP1.4a w/ DSC1.2 equivalent at 4k resolution at high refresh rates? It seems like HDMI 2.1 still has advantages as far as chroma/color bit depth, not to mention additional standard features like VRR.
TheUnhandledException - Friday, June 14, 2019 - link
DSC supports multiple compression rates. So since HDMI 2.1 has a higher underlying bitrate than DP 1.4, it would require a lower compression ratio to hit the same target (resolution, bit depth, frequency). I am unsure if the difference would be visually noticeable, but it is a benefit in favor of HDMI 2.1.
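As a quick illustration of that point, here is the compression ratio DSC would need for 8K60 10-bit against each link's raw rate (a sketch using the article's headline numbers; the line-code overhead joevt describes further down makes the real ratios higher still):

```python
# Illustrative only: DSC ratio needed to fit 8K60 10-bit 4:4:4
# (62.06 Gbps uncompressed, CVT-R2 blanking) into each raw link rate.
UNCOMPRESSED_GBPS = 62.06
for name, link_gbps in [("DP 1.4 HBR3", 32.4), ("HDMI 2.1 FRL", 48.0)]:
    print(f"{name}: {UNCOMPRESSED_GBPS / link_gbps:.2f}:1")
# DP 1.4 HBR3: 1.92:1
# HDMI 2.1 FRL: 1.29:1
```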
repoman27 - Thursday, June 13, 2019 - link
Your math certainly checks out on the bandwidth calculations, but I'm guessing this chip may not even support DSC, because both of the listed resolutions / frame rates are possible at 10 bpc for DisplayPort 1.4 HBR3 and HDMI 2.1 FRL using 4:2:0 chroma subsampling. Which according to the HDMI Forum qualifies as "fully uncompressed video"!
I highly doubt VRR is in the offing either, because it would pose considerable technical challenges, but we'll have to wait until Realtek posts more detailed specs to know for sure what this thing can do.
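For reference, a sketch of the subsampling math behind that claim, assuming the CVT-R2 pixel clocks implied by the figures above (~2068.7 MHz for 8K60, ~1075.8 MHz for 4K120); 4:2:2 halves the chroma horizontally and 4:2:0 halves it in both dimensions:

```python
# Effective components per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5.
def stream_gbps(pixel_clock_mhz, bpc, components_per_pixel):
    return pixel_clock_mhz * 1e6 * bpc * components_per_pixel / 1e9

for fmt, comp in [("4:4:4", 3.0), ("4:2:2", 2.0), ("4:2:0", 1.5)]:
    print(f"8K60 10-bit {fmt}:  {stream_gbps(2068.7, 10, comp):.2f} Gbps")
    print(f"4K120 10-bit {fmt}: {stream_gbps(1075.8, 10, comp):.2f} Gbps")
# 4:2:0 brings 8K60 down to ~31 Gbps and 4K120 down to ~16 Gbps.
```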
TheUnhandledException - Friday, June 14, 2019 - link
Yeah, that is a good point. If it doesn't support DSC, then it may just be enabling those resolutions and framerates using 4:2:2 or 4:2:0 subsampling (which would be worse PQ).
jeremyshaw - Monday, June 17, 2019 - link
It shouldn't be too bad. Microsoft got it working in millions of Xbox One S/X consoles, and those use an off-the-shelf DP --> HDMI chip from TI (not AMD's native HDMI output block). Sony uses a custom LSPCon from Panasonic to do something similar in the PS4, though without HDMI Forum VRR support.
joevt - Sunday, August 11, 2019 - link
The max bit rates listed in the article don't include the encoding on the wire. DisplayPort uses 8b/10b, so 32.4 Gbps becomes 25.92 Gbps. HDMI 2.1 uses 16b/18b, so 48 Gbps becomes 42.67 Gbps.
The 62.06, 49.65, and 32.27 Gbps numbers are correct (they are the bitrates required for the resolutions without adding encoding; they include the extra pixels for horizontal and vertical blanking).
It does not follow that 32.27 Gbps requires DSC in the DisplayPort 1.4 case when the reader sees the 32.4 Gbps number; that conclusion only follows from the corrected 25.92 Gbps number.
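Put as code, the correction is just two multiplications (a sketch; the 8b/10b and 16b/18b factors are the line codes named above):

```python
# Effective payload bandwidth after line-code overhead.
links = {
    "DP 1.4 HBR3 (8b/10b)":   (32.4, 8 / 10),
    "HDMI 2.1 FRL (16b/18b)": (48.0, 16 / 18),
}
for name, (raw_gbps, efficiency) in links.items():
    print(f"{name}: {raw_gbps * efficiency:.2f} Gbps effective")
# DP 1.4 HBR3 (8b/10b):   25.92 Gbps effective
# HDMI 2.1 FRL (16b/18b): 42.67 Gbps effective
# So 4K120 10-bit (32.27 Gbps) exceeds DP 1.4's 25.92 Gbps payload,
# and DSC (or chroma subsampling) really is required there.
```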
Mikewind Dale - Sunday, June 16, 2019 - link
Awesome.
I can't figure out why, but I've never been able to get DisplayPort to work. I've tried several different brands of DisplayPort cables, and no matter what I do, my displays randomly flicker and lose signal.
By contrast, the same displays work fine with HDMI. So I just use active DisplayPort to HDMI adapters to connect my Radeon RX 580 to my computer monitors.
I can't figure out why. It's obviously not a problem with the cables, because I tried several brands. And it's not a problem with the DisplayPort outputs, because they work fine when combined with DisplayPort to HDMI adapters. Maybe it's a problem with the DisplayPort inputs on the monitors? But in any case, I can't get my computer monitors to work unless I use HDMI.
So I'm glad to see that DisplayPort to HDMI adapters continue to be developed.
jeremyshaw - Monday, June 17, 2019 - link
Same. DP has been problematic for me in ways that HDMI and DVI never have been (DVI holding a slight edge over HDMI). This is across many monitors (Dell, LG, BenQ, etc.), GPUs (Intel, AMD, and Nvidia), and OS's (Win 7-10, OSX, and several Linux distributions, though nothing decent). Of course, I now have a box of DP cables, too. Most often, the link just up and fails after turning the monitor off then on. Bonus points for turning on the computer first, then the monitor (DP fails), starting with the wrong input (using a source via HDMI, then switching to DP for my PC causes the audio output to disappear), just flat out failing after working fine for hours, etc.
I now have grown to dislike DisplayPort. I only put up with it since my G-Sync monitor requires DP to even function. Once that's gone, DP can go away, too.
It has gotten better over the years. The issues still remain, but the frequency has gone down. I can still reliably trigger DP failing by powercycling the monitor, though. Shame, since I was trying out a dual monitor setup, and I don't always need 2 monitors blaring at once (this is also the reason why switching inputs fail - I had another device hooked up to the HDMI input of one of the monitors).
I will say, Win 10 1903 with an Nvidia GPU running a G-Sync monitor has been the least problematic setup yet. This one only exhibits the audio issue (so far). Of course, numerous laptops run their internal displays with eDP just fine... Given all that, I'm willing to bet a lot of the DP issues I've experienced come from a bad chain of firmware and software.
fschwartz - Saturday, July 20, 2019 - link
"A number of modern high-end televisions featuring an 8K resolution are outfitted with HDMI ports that are technically ready to receive HDMI 2.1 signals, but are not marketed as HDMI 2.1 because they have not been certified by the HDMI Forum."This is technically inaccurate.
Per the HDMI Forum (https://www.hdmi.org/manufacturer/hdmi_2_1/), "Q: Can I use “HDMI 2.1” in my marketing
A: You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities or the functionality of the HDMI interface. And please note that NO use of version numbers is allowed in the labeling, packaging, or promotion of any cable product."
This means you will not see TVs marketed as "HDMI 2.1" to distinguish them from other TVs. "HDMI 2.1" can only be used in marketing materials when referring to an HDMI 2.1-specific feature. It is more likely (as we see today) that TV manufacturers will refer to the individual HDMI 2.1 features their TV supports (ALLM, eARC, QFT, QMS, VRR), and I suspect there is no requirement for all HDMI 2.1 features to actually be supported in a single device.
In this regard, I believe the presentation of this Realtek device is misleading, because it is highly unlikely that it will support all HDMI 2.1 features when converting from DisplayPort.
LGC94K120fps - Sunday, February 23, 2020 - link
It is now almost Q2 of 2020... where is this adapter?! Anyone know of it yet? Thanks!
4k@120hz - Wednesday, May 6, 2020 - link
It has been almost a year. Is there any update on this? Also, Club3D wanted to release a product capable of converting 4K@120Hz.