DanNeely - Thursday, January 5, 2017 - link
I'm a bit confused by the no-compression claim. 8K @ 120Hz @ 24-bit color needs 95.5 gigabits/sec. At a minimum that would require an equivalent to DP 1.4's ~2:1 compression and 128/130-bit encoding or equivalents. 48-bit color would need at least 4:1 compression at 8K/120. Depending on exactly what 10K means, you'd need somewhere around 6:1 compression in the same circumstances, more if only using 8/10-bit for some reason.
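A quick back-of-the-envelope sketch of the raw pixel rates being tossed around in this thread (plain Python, no blanking or link-encoding overhead; the 10240x5760 "10K" dimensions and the 12bpp effective figure for 8-bit 4:2:0 are assumptions, not anything from the spec):

```python
# Raw, uncompressed video bit rates; blanking and link-encoding overhead ignored.
def gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

print(f"8K@120,  24bpp (8-bit 4:4:4):    {gbps(7680, 4320, 24, 120):6.1f} Gbps")   # ~95.6
print(f"8K@120,  48bpp (16-bit/channel): {gbps(7680, 4320, 48, 120):6.1f} Gbps")   # ~191.1
print(f"8K@120,  12bpp (8-bit 4:2:0):    {gbps(7680, 4320, 12, 120):6.1f} Gbps")   # ~47.8
print(f"10K@120, 24bpp (10240x5760):     {gbps(10240, 5760, 24, 120):6.1f} Gbps")  # ~169.9
```

Measured against a 48Gbps link (less after link encoding), those numbers line up with the ~2:1 and ~4:1 ratios mentioned above.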
nathanddrews - Thursday, January 5, 2017 - link
The exact quote is: "48G cables enable up to 48Gbps bandwidth for uncompressed HDMI 2.1 feature support including 8K video with HDR."
Virtually all 4K and 8K video content that is available or WILL BE available is chroma 4:2:0, so the compression they are referring to is likely TRANSPORT COMPRESSION - which is what DisplayPort uses.
http://www.hdmi.org/manufacturer/hdmi_2_1/index.as...
chaos215bar2 - Thursday, January 5, 2017 - link
I think the article is somewhat confusingly written, but isn't claiming HDMI 2.1 supports 8k120 at full color rate. Other sources claim 8k60 and 4k120, which makes more sense.
I don't see why it shouldn't support 4k240 and even 2k960 with 24-bit color (and for that matter full resolution 3D for each configuration at half the frame rate) as long as the source and display are capable. It's HDMI, though, so they'll leave out a few obvious bits for HDMI 2.1a, HDMI 2.2, and/or HDMI 3.0.
extide - Thursday, January 5, 2017 - link
It shows right in the table -- 8K at 120 FPS is only with 4:2:0 color -- which you could argue is a form of compression.
qap - Thursday, January 5, 2017 - link
Those numbers don't add up. By tripling bandwidth (and I'm being generous) you cannot push ~6x more pixels (10K vs 4K) and double the refresh rate (120 vs 60Hz) without compression. For 10K@120Hz with 24bpp (8 bit per channel) you need 10240*5760*24*120 bps ≈ 170Gbps. Plus some overhead - so more like a 190Gbps cable. For this resolution you need at least 4 HDMI 2.1 cables or lossy (though probably "visually lossless") compression.
bcronce - Thursday, January 5, 2017 - link
You got me curious. Definitely does not pass the sniff test. I had to look up Nvidia's Pascal memory compression to have something to compare against. Nvidia is getting between 4:1 and 8:1 compression ratios on textures and frame buffers. This would mean that at least 4:1 should be doable for lossless low-latency compression.
qap - Friday, January 6, 2017 - link
No, it's not. Not for general-purpose compression of images. For example, random noise (if you remember analog TV noise) is incompressible. Nvidia does "up to" 8:1 compression - meaning some parts of the image are compressed 8:1.
Any compression that has to guarantee a compression ratio must be lossy (or at least lossy in some situations).
bcronce - Friday, January 6, 2017 - link
4:1+ is "typical". I'm not going to argue against contrived edge cases like random data. There will always be outliers, but it is safe to assume a 4:1 compression ratio on average with a low standard deviation. The main problem is that there is no guarantee, so the system will need to gracefully handle situations where there is not enough effective bandwidth.
qap - Friday, January 6, 2017 - link
Check this page: http://www.compression.ru/video/ls-codec/index_en....
There are some tests of offline lossless codecs under different situations, and for the most part 3x compression is the limit - on average, over many frames. Our use case requires that every frame meets your size limit and is compressed in real-time. I wouldn't call that an "edge case". That is why DisplayPort uses the term "visually lossless" for its compression, and I imagine it will be very similar here.
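To illustrate why a guaranteed ratio is impossible for lossless coding, here is a tiny sketch using Python's zlib - a general-purpose codec rather than a video codec, but the principle is the same: the achievable ratio depends entirely on the content, and truly random input doesn't shrink at all.

```python
import os
import zlib

def ratio(data):
    # original size divided by losslessly-compressed size
    return len(data) / len(zlib.compress(data, 9))

flat = bytes(1_000_000)        # a perfectly flat "frame": compresses enormously
noise = os.urandom(1_000_000)  # random noise: no lossless codec can shrink it
print(f"flat frame:   {ratio(flat):.1f}:1")
print(f"random noise: {ratio(noise):.3f}:1")  # comes out slightly below 1:1
```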
ddriver - Friday, January 6, 2017 - link
That's really bad news for random noise viewing enthusiasts. They will not be able to enjoy their random noise at a fluid frame rate. Going back to the drawing board then!
qap - Friday, January 6, 2017 - link
My point is that not even the best lossless compression can maintain ANY predefined compression ratio. If you look at some pages that specialize in this kind of video compression, you will find that under pretty normal circumstances even 2:1 is sometimes not achieved. And I am talking about averages over longer sequences, and those are not real-time low-latency codecs, but heavy-weight codecs! Peaks are obviously even worse. Check for example (MSU looks like one of the best):
http://www.compression.ru/video/ls-codec/index_en....
or
http://forum.doom9.org/showthread.php?p=1463278#po...
HardwareDufus - Thursday, January 5, 2017 - link
The capability has increased so much that I was surprised they did not call it HDMI 3.0... However, seeing that 2.0 license holders have access to this 2.1 upgrade explains it....
Michael Bay - Thursday, January 5, 2017 - link
Capability will actually increase when, between you and me, we have two TV sets and all the companion hardware with full support of the standard. Until then it's just a storm in a teacup.
lilmoe - Thursday, January 5, 2017 - link
Not bad, summing up all the damn problems in display tech in one sentence.
+1
Poik - Thursday, January 5, 2017 - link
How can 2m be the max for a standard cable? I wish HDMI would just die. DP gives you 3m and always seems to be a half step ahead.
DanNeely - Thursday, January 5, 2017 - link
Because crosstalk and external interference problems get steadily worse at higher signaling rates. Most likely, longer cables will end up looking more like Thunderbolt or SFP networking cables, with transceiver modules in the cable plugs themselves to put the long-distance signaling hardware on the cable side of the socket, able to transparently - to the end user - switch from wire to fiber.
As for DP always being half a step ahead, HDMI 2.1 appears to reflect the HDMI association getting tired of people saying that and deciding to make a much larger leap ahead. The latest DP standard only has 32Gbps of pre-encoding bandwidth.
ruthan - Thursday, January 5, 2017 - link
Why is there another stupid limit - 120 Hz - when there are already 144 Hz displays? Why not 240 Hz for 4K and 120 Hz for 8K?
A5 - Thursday, January 5, 2017 - link
120 is a far more common target in TVs because it is a multiple of both 24 (film) and 30/60 (NTSC TV).
The HDMI forum is not particularly concerned with computer displays, having ceded that space to DisplayPort.
alphasquadron - Thursday, January 5, 2017 - link
Does it cost more or something to make it 144Hz instead of 120Hz? DisplayPort is great, but not every TV has those.
weilin - Thursday, January 5, 2017 - link
30/60 doesn't really divide cleanly into 144, so yeah, it would take extra hardware to support both 120Hz and 144Hz refresh rates. You can't just support 144Hz, or 30/60Hz sources will start experiencing judder...
Also, I have never seen a TV supporting a 144Hz refresh...
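The divisibility argument is easy to sanity-check - a quick sketch (judder shows up whenever the panel refresh rate isn't an integer multiple of the source frame rate):

```python
# Which common source frame rates pull down evenly onto a given panel refresh rate?
for panel_hz in (120, 144):
    for source_fps in (24, 30, 60):
        even = panel_hz % source_fps == 0
        print(f"{source_fps} fps on {panel_hz} Hz: "
              f"{'even pulldown' if even else 'uneven pulldown (judder)'}")
```

A 144Hz panel handles 24fps film cleanly (6:1) but not 30/60Hz sources, which is exactly the judder described above.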
Meteor2 - Friday, January 6, 2017 - link
I'd be amazed if anyone could tell the difference between 120 Hz and 144.
damianrobertjones - Thursday, January 5, 2017 - link
That won't happen as, in a few years, they'll increase the spec in order to ensure the $$$$/££££ keeps on going. Plus, by then, they'll probably change the connection form factor to something new. Repeat.
nils_ - Thursday, January 5, 2017 - link
10k? I guess I'll need LASIK at some point...
DanNeely - Thursday, January 5, 2017 - link
VR goggles with peripheral vision displays.
stephenbrooks - Thursday, January 5, 2017 - link
I'm not sure if it needs to go all the way to the display, but fulldome 360 video needs a ridiculous number of pixels to look sharp. At least 4x the horizontal pixel count.
I played with it a bit; 4K looks outright blurry, 8K is acceptable, and 16K is noticeably better even though I was panning around on a 2K (1080p) display...
DanNeely - Thursday, January 5, 2017 - link
When they first started hyping Eyefinity, I remember someone at AMD citing a number in the tens of megapixels as what's needed to fully fill the human field of view at a high enough resolution to not need anti-aliasing. 10K is in the same ballpark. It'll be a long time before we can build anything capable of running decently at that resolution, though.
beisat - Thursday, January 5, 2017 - link
*sigh*, can't we separate video and audio again so that one can keep one's AV receiver while only updating the screen and player / PC / console...
extide - Friday, January 6, 2017 - link
They need to update SPDIF -- that way you could have an out port on the TV and just feed that into the receiver -- then you wouldn't need a new receiver too. Nobody seems to have said anything about that, though.
Murloc - Friday, January 6, 2017 - link
I do that on my PC.
TristanSDX - Thursday, January 5, 2017 - link
If Variable Refresh is required, then NV must break their G-Sync monopoly :)
Kevin G - Thursday, January 5, 2017 - link
This certainly makes it difficult for nVidia to only support G-Sync. More likely they'll keep G-Sync support standard for DP and adopt HDMI 2.1 VRR for the single HDMI port on the card.
Spunjji - Saturday, January 7, 2017 - link
My thoughts exactly. That would be wonderful, assuming it's not an optional part of the spec!
Meteor2 - Friday, January 6, 2017 - link
I wonder where the equivalent DP spec is, let alone an equivalent DP spec as a USB-C alt mode. That's what I want to see.
MobiusStrip - Friday, January 6, 2017 - link
"4Kp60"WTF is that? Which format has a vertical resolution of 4000+?
And it's time to move to DisplayPort. HDMI is a vestige of CRTs.
Spunjji - Saturday, January 7, 2017 - link
1) You know they mean 60p (progressive).
2) No it isn't. HDMI is based off the same tech as DVI, ergo it's digital. Nothing whatsoever to do with CRTs.
yannigr2 - Friday, January 6, 2017 - link
I hope AnandTech improves financially so it doesn't have to rely on writers like Anton.
Can someone find any mention of 10K in the press release? Because I can't.
Anato - Saturday, January 7, 2017 - link
This can wait. I've got a 60" plasma, but the sofa is far enough away that I can't distinguish 720p from 1080p. For graphics, I'm starting to have "internal antialiasing" and my eyesight isn't getting better. After 1080p there are diminishing returns for most of us. HDR may be something to wait for, but other than that, meh.
I would prefer gear that is made to last and has good upgradeability, not a new HDMI 4.5c...
Sivar - Monday, January 9, 2017 - link
It would be great to see some DSC 1.2 before and after images.
Light speed range x29 - Friday, March 10, 2017 - link
Poor performance - 96-bit colour is not available this round. HDMI 2.1 only goes up to 48-bit colour. Why?
Light speed range x29 - Friday, March 10, 2017 - link
HDMI 2.1 version (missing 96-bit colour)