41 Comments
nathanddrews - Wednesday, February 11, 2015 - link
AMD/NVIDIA FreeSync/G-Sync mobile displays, here we come! It's too bad so many manufacturers choose to use 20-pin (768p) and 40-pin (1080p) LVDS; I hope these are cheap enough to get everyone on the same eDP setup.
DanNeely - Thursday, February 12, 2015 - link
As long as there's a penny to be saved, a lot of laptops will keep using the old connector. And unfortunately, it seems that as long as that minuscule savings is around, Intel will continue to yield to OEMs and keep LVDS support in their chipsets. Back in 2010, Intel and AMD both committed to phasing LVDS out of their chipsets by 2013...
http://techreport.com/news/20099/intel-amd-to-phas...
willis936 - Wednesday, February 11, 2015 - link
"Visually lossless". They need to cut this shit out. Also loosy compression on display comm standards is forgivable when it's a shim until the phy is there but now they're using it in lower resolution displays? Consumers won't even know they're getting fucked.nathanddrews - Wednesday, February 11, 2015 - link
What's most absurd about it is that the primary users of these better displays will be the professional graphics market (illustration, post-production, etc.), none of whom have any interest in lossy compression.
peterfares - Wednesday, February 11, 2015 - link
They'll probably team this lossy compression up with PenTile screens to make it a double-whammy!
nevcairiel - Wednesday, February 11, 2015 - link
While I personally wouldn't want my display to use this, I kind of understand the problem that VESA is facing. An 8K display requires absurdly high data rates to run at 60 Hz and 8 bpc (bits per component), and that's not even "professional grade" yet, as they want 10 or 12 bpc there.
DP is already one of the fastest interconnects in our PCs; the sheer amount of data going to your display is staggering once you consider 4K and 8K.
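To put rough numbers on those data rates, here is a back-of-the-envelope sketch that counts active pixels only and ignores blanking and line-coding overhead, so real link requirements run somewhat higher:

```python
# Back-of-the-envelope 8K data rates: active pixels only, so blanking and
# line-coding overhead (which real links also carry) are ignored.

WIDTH, HEIGHT, REFRESH_HZ = 7680, 4320, 60  # 8K UHD at 60 Hz

def raw_rate_gbps(bits_per_component: int, components: int = 3) -> float:
    """Uncompressed pixel data rate in Gbit/s."""
    bits_per_frame = WIDTH * HEIGHT * components * bits_per_component
    return bits_per_frame * REFRESH_HZ / 1e9

for bpc in (8, 10, 12):
    print(f"{bpc} bpc: {raw_rate_gbps(bpc):.1f} Gbit/s")

# Prints roughly 47.8, 59.7, and 71.7 Gbit/s -- all well beyond the
# ~25.9 Gbit/s of usable bandwidth four HBR3 lanes provide after 8b/10b coding.
```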
JarredWalton - Wednesday, February 11, 2015 - link
I did see the DSC stuff at CES, and they even had the ability to run with it, without it, and show the difference. The actual diff image was black to the eye, so they had to crank up the brightness of the "missed" pixels by something like 10x to make them visible. Even then, all you could see was a scatter of "dust" pixels that were not the same between the two modes. Many people don't notice the use of FRC on 6-bit LCD panels, and I'm quite sure DSC is even less visible.
Mr Perfect - Wednesday, February 11, 2015 - link
Does the consumer have the ability to turn DSC off, too? Or is this going to be one of those situations where everything is designed around the lowest common denominator and none of the hardware can handle enough bandwidth to run uncompressed?
nevcairiel - Thursday, February 12, 2015 - link
It wouldn't seem logical to use DSC when you actually have enough bandwidth to not use it, so I doubt it's going to be an option. Either you have the bandwidth and don't get DSC, or, well, you don't and you do.
sunbear - Thursday, February 19, 2015 - link
Given that the majority of video is already in a compressed format (e.g. H.264, H.265, etc.), why bother
1) decoding the already-compressed stream, then
2) recompressing using DSC?
All this would seem to achieve is to increase the required bandwidth, hurt picture quality, add component expense, and increase latency. Sure, DSC might be useful in cases where the computer / set-top box isn't displaying a compressed video stream, but why not start with being able to pass through unadulterated H.264/H.265 before introducing a new compression method?
CSMR - Wednesday, February 11, 2015 - link
Why not just use multiple interconnects for 8K? The market for 8K will be extremely small and can afford multiple cables, and even multiple graphics cards, to drive it.
Fulldome will be the main application of 8K, but if you can afford a fulldome then you can afford 2-4 cables.
Xenonite - Wednesday, February 11, 2015 - link
I agree completely. What really ticks me off is the reason they gave for basically giving up on keeping the same pace of bandwidth improvements: POWER.
The same excuse that killed single-core CPU improvements, and is even starting to affect the performance of the highest-end GPUs, has now apparently also infected the display chain.
Now sure, in mobile devices every saved watt counts, but using that as an excuse not to engineer more powerful desktop solutions really makes me worried for the future of the entire industry. I strongly suspect that the quest for ever thinner display connectors, together with this low-power initiative, is going to cap display performance well below the true "retina" limit of 20/8 vision at any refresh rate appreciably higher than 60 Hz.
On the bright side:
"Meanwhile HBR3 + DSC also offers an alternative to using lower chroma subsampling resolutions to drive 8K displays, replacing the loss in chroma resolution with DSC." Now we just need the rec.2020 standard to stop recommending that 4:2:0 subsampling be used as default for all UHD multimedia recordings, but I wouldn't hold my breath for that either.
AnnonymousCoward - Thursday, February 12, 2015 - link
This is eDP--not an "excuse not to engineer more powerful desktop solutions".
Etsp - Wednesday, February 18, 2015 - link
It's not an excuse, and the problem of DP needing more power is separate from the clock speed limitations on the CPU. In the lab, silicon transistors can switch at 50 GHz; in practice, switching that fast requires more power than can be effectively dissipated in an actual processor.
For the ability to transfer information from one point to another, the noisy-channel coding theorem applies. It basically determines how much bandwidth is possible through a channel where noise is present (which is everywhere).
To get more bandwidth, you either need to raise the power of the signal, use more channels, or lower the noise. We're approaching the limits of copper in terms of how low the noise floor is and how many frequencies we can use to carry a signal. The only other variable that isn't facing significant diminishing returns is upping the signal strength, which requires more power. The alternative is to have more wires in the cable, or to use a different medium (like optical).
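The concrete form of that theorem for a band-limited channel with Gaussian noise is the Shannon-Hartley limit. A small sketch, with purely illustrative numbers rather than the real electrical parameters of a DisplayPort link, shows how capacity scales with bandwidth and signal-to-noise ratio:

```python
import math

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley limit C = B * log2(1 + S/N) for an AWGN channel, in Gbit/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

# Illustrative values only, not measured DP channel characteristics.
for bw_ghz in (4, 8):
    for snr_db in (10, 20, 30):
        c = shannon_capacity_gbps(bw_ghz * 1e9, snr_db)
        print(f"B = {bw_ghz} GHz, SNR = {snr_db} dB -> C ~ {c:.0f} Gbit/s")

# Doubling bandwidth doubles capacity, but each extra 10 dB of SNR
# (ten times the signal power, all else equal) adds only about 3.3 more
# bits/s per Hz of channel -- the diminishing return described above.
```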
Fergy - Thursday, February 12, 2015 - link
I would give up picture-perfect if I could get 120-144 Hz. Twice as many pixels per second looks a lot better than having every pixel perfect. Unless you are a graphic designer, of course.
thingi - Wednesday, February 11, 2015 - link
A “visually lossless” lossy codec. If that isn't an oxymoron, I don't know what is.
nevcairiel - Wednesday, February 11, 2015 - link
You may think so, but "visually lossless" and "audibly lossless" (for audio) are very common terms, and they make sense (even if some people don't like them).
If a human cannot discern the difference, it's visually lossless, even if there are tiny mathematical differences from the compression. It's as simple as that.
jerwood - Wednesday, February 11, 2015 - link
The lossless audio codecs like FLAC and Apple Lossless are actually lossless, like a zip file.
althaz - Wednesday, February 11, 2015 - link
Yes, and they aren't distinguishable from properly encoded MP3s (although the "properly encoded" part is why I only buy CDs and lossless) - this has been shown in double-blind trials.
thingi - Sunday, February 15, 2015 - link
There are very distinguishable differences between MP3s and FLACs. However, it isn't until you move up to properly mastered 24-bit/96 kHz audio files running through a decent external DAC that the differences become hugely apparent, even in double-blind testing.
Stuff mastered for CD already had loads stripped out; MP3 compression didn't make much difference at that point because it was already gone.
So my point stands: a “visually lossless” lossy codec is an oxymoron.
psyq321 - Tuesday, February 17, 2015 - link
Can you please point us to the double-blind tests which are able to discern between 16-bit and 24-bit audio?
Gnarr - Thursday, February 19, 2015 - link
http://xiph.org/~xiphmont/demo/neil-young.html
mavere - Wednesday, February 11, 2015 - link
...it's a ridiculously conservative 3:1 compression ratio. The designers failed at their job if there's even a chance of it *not* being visually lossless.
darkfalz - Wednesday, February 11, 2015 - link
You could probably do lossless at an average of 2:1, but the problem is, you'd get situations where compressed lossless could require something like 95% of the bandwidth of uncompressed. So it's understandable that they need a guaranteed compression ratio and it has to be lossy. True, it might intellectually irk purists, but as you say, at 3:1 there's really no excuse for artifacts of any kind (for context, a highest-quality JPEG is still something like 20:1, and that's with superseded algorithms). You'd have a bigger chance of noticing on a large screen at a low resolution, but on a large screen at a very high resolution (4K or 5K) it would be pretty hard considering the DPI.
extide - Thursday, February 12, 2015 - link
Exactly, all these idiots complaining about this have no clue how compression actually works. 3:1 is nothing in terms of image/video compression, and can easily be done without any visual difference.
psychobriggsy - Thursday, February 12, 2015 - link
Remember the algorithm has to run at 32 Gbps, in real time, and guarantee a rate of compression. Something like H.265 just isn't viable.
sunbear - Saturday, February 21, 2015 - link
@psychobriggsy - of course H.265 is viable if the video has already been encoded as H.265. Next-gen Blu-ray discs will use H.265, so it seems totally stupid to require the Blu-ray player to
1) uncompress the perfectly good H.265
2) recompress as DSC
3) send DSC down the cable to the monitor
4) uncompress the DSC inside the monitor
5) display video
It would make much more sense to just send the H.265 over the DisplayPort link to the monitor and let the monitor decompress the H.265. It would require an order of magnitude less bandwidth than DSC, and also have lower latency and better picture quality.
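To put rough numbers on that comparison: the sketch below assumes a 4K60, 8-bpc stream and a Blu-ray-class ~100 Mbit/s H.265 bitrate picked purely for illustration, not taken from any display spec.

```python
# Approximate link bandwidth for a 4K60 8-bpc stream (active pixels only,
# blanking and protocol overhead ignored). The H.265 bitrate is an assumed,
# typical high-end value used for illustration.

uncompressed_gbps = 3840 * 2160 * 24 * 60 / 1e9   # ~11.9 Gbit/s
dsc_3to1_gbps = uncompressed_gbps / 3              # ~4.0 Gbit/s
h265_gbps = 0.1                                    # ~100 Mbit/s

print(f"Uncompressed 4K60: {uncompressed_gbps:.1f} Gbit/s")
print(f"DSC at 3:1:        {dsc_3to1_gbps:.1f} Gbit/s")
print(f"H.265 stream:      {h265_gbps * 1000:.0f} Mbit/s, "
      f"roughly {dsc_3to1_gbps / h265_gbps:.0f}x smaller than the DSC stream")
```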
SirKnobsworth - Wednesday, February 11, 2015 - link
Your eyes and brain can only process so much information, and it's usually much less than the amount of information the display can pump out. It doesn't matter if the compression is technically lossy if the viewer can't tell the difference.
althaz - Wednesday, February 11, 2015 - link
Then I guess you don't know what is. "Visually lossless" means there's no visible difference (although TBH I sorta doubt this without running my own eyes over it); "lossy" means some information is lost.
Take MP3 for example (in theory; in practice it greatly depends on the encoding, which is why MP3 sounded so bad when it first hit the scene and sometimes still does). It cuts out the information you can't hear, so you lose information, but it's been shown in blind tests that nobody can hear the difference between a good enough MP3 and lossless (although not being able to hear the difference doesn't mean you should always use MP3; you still lose information, which is very important if you are a sound engineer).
Now of course in practice, it's not uncommon for there to be an audible difference (to those with good enough ears) between MP3 and lossless formats, which is why I doubt that this compression will be as perfect as it claims.
extide - Thursday, February 12, 2015 - link
As mentioned earlier, the compression is only 3:1 -- a 1080p display at 60 Hz with 32-bit color is 474 MiB a sec! Now, think about Netflix Super HD... 6Mbit(2MiB)/sec -- that's 1:237 compression. 79 times more compressed than they are doing here... BASICALLY NOTHING. You WILL NOT be able to see it!!
extide - Thursday, February 12, 2015 - link
Actually, most video is ~30 fps, not 60, so cut my numbers in half: it's "only" 40 times (not 79 times) more compressed.
extide - Thursday, February 12, 2015 - link
Shit, my calculator failed. But lol, it makes my point even stronger: 6 Mbit/s is actually 0.75 MB/s, so Netflix Super HD is actually compressed at a ratio of 1:316. That's 105.33 TIMES more compressed than this.
474 / .75 = 632
632 / 2 = 316 (60 fps to 30fps)
Netflix = 1:316
316 / 3 = 105.3333
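The same arithmetic as a quick script, using the figures above as given (so the MiB-vs-MB mismatch in the 0.75 figure is carried along unchanged):

```python
# Re-running the numbers above: 474 MiB/s for uncompressed 1080p60 at 32 bpp,
# 0.75 MB/s for a 6 Mbit/s Netflix stream (the MiB/MB mismatch is kept as-is).

uncompressed_mib_s = 1920 * 1080 * 4 * 60 / 2**20   # ~474.6
netflix_mb_s = 6 / 8                                 # 0.75

ratio_60fps = uncompressed_mib_s / netflix_mb_s      # ~633
ratio_30fps = ratio_60fps / 2                        # ~316 for 30 fps content
vs_dsc = ratio_30fps / 3                             # ~105x more compressed than 3:1 DSC

print(f"Netflix vs. uncompressed 30 fps video: ~1:{ratio_30fps:.0f}")
print(f"Compared with DSC's 3:1: ~{vs_dsc:.0f}x more compressed")
```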
AnnonymousCoward - Thursday, February 12, 2015 - link
Guess what, 24-bit color quantization and 60 Hz samples already introduce "loss".
willis936 - Saturday, February 14, 2015 - link
And then there's loss in the reconstruction of the analog signal too. You know where there should never be any loss? When transmitting a digital signal. That's one of the biggest reasons we moved to digital. Analog circuits can provide much higher bandwidth for video even today, if you're willing to sacrifice quality at every cheap analog circuit the signal has to pass through.
dgingeri - Wednesday, February 11, 2015 - link
OK, so now we have a standard for internal communication with laptop displays; what we need next is a standard form factor for laptops, so people can completely custom-build a laptop with the capabilities they want. Why do we not already have this?
boeush - Wednesday, February 11, 2015 - link
It irks me to no end that Google, for example, has been developing Project Ara (a fully modular design) for smartphones for so long already, yet there is nothing like that in existence for laptops - which have been around much longer, and could benefit from it much more!
Joel Kleppinger - Wednesday, February 11, 2015 - link
There's no need to let it irk you. Project Ara is likely a pipe dream, just like modular/buildable laptops were. When size, complexity, and price are all priorities, modularity always suffers. It has a shot only because Google is involved, but at the end of the day, more people buy based on looks than features. With mobile contracts encouraging new phones, why upgrade some part?
Laptops have been modular (RAM, HD, CD drive, CPU, battery), but attempts to make the screen/graphics modular didn't have the desired effect (upgradeability), and every laptop was re-engineered so that the motherboard, most importantly, couldn't be modular. There just isn't enough demand, and when you get down to it, a new laptop is probably what you need anyway (better WiFi, better memory support, newer CPU support, lower-power chipset, faster SATA/PCIe support). By the time you're thinking about all that, and factoring in that upgraded parts always cost more than they should because of low demand and manufacturing volume, you're looking at a new laptop.
In fact, due to the space/price/complexity battle, laptops have gotten significantly less modular: CPUs are usually soldered in, batteries aren't easily removable, RAM can be soldered in, and cases are harder to open.
In mobile, modularity has moved external. In fact, with most recent phones, it's arguable that even the case is modular: cases these days are so thin that most people put big protectors on them because they aren't as durable.
darkfalz - Wednesday, February 11, 2015 - link
Considering the custom shape, size, cooling, etc. of every laptop, it's just not practical to ever make the motherboard modular. But the screen shouldn't be as much of an issue.
extide - Thursday, February 12, 2015 - link
Check out Clevo/Sager.
daku123 - Tuesday, February 17, 2015 - link
On the very first chart, it should be Gbps (bits), not GBps (bytes).
Hrobertgar - Thursday, February 19, 2015 - link
Higher-resolution display output options ARE useful for laptops, as many people use a laptop as their primary machine with an external monitor while at home. It's similar to the need driving people to get external graphics amplifiers from Alienware and, now, others. This could save someone from having to own two computers, while still enabling them to run a 4K or bigger monitor at home.