50 Comments
SaolDan - Thursday, December 1, 2016 - link
Neat!!
TesseractOrion - Thursday, December 1, 2016 - link
Great stuff, hope it makes it to market soon, along with DP 1.4...
nathanddrews - Friday, December 2, 2016 - link
Haha, you'll be lucky if it has HDMI 2.1.

This is actually a very simple design: one IPS panel acts as the light modulation layer between the backlight and the IPS panel on the surface. This method has probably been employed with other combinations of display technologies in the past, but the low cost of IPS now makes it more realistic. Assuming both are commonly available 1,000:1 contrast panels, Panasonic is just multiplying the ratios together to get 1,000,000:1. The 1,000 nits is probably just a starting point as well; there's no reason they can't put a brighter light source behind it all, though that might affect the black levels. Even at a 1,000,000:1 contrast ratio, 0.001 nits for black is still infinitely brighter than OLED's true black.
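The ratio-multiplication argument above works out in a couple of lines. A sketch, where the 1,000:1 per-layer contrast and 1,000-nit peak are the round numbers assumed in the comment, not measured specs:

```python
# Two stacked modulation layers multiply their contrast ratios,
# since each independently attenuates the same light path.
panel_contrast = 1000   # assumed contrast of one off-the-shelf IPS layer
peak_nits = 1000        # assumed peak brightness

stacked_contrast = panel_contrast * panel_contrast
black_level = peak_nits / stacked_contrast

print(stacked_contrast)  # 1000000, i.e. the advertised 1,000,000:1
print(black_level)       # 0.001 nits for black
```

A brighter backlight scales the black level up proportionally, which is exactly why pushing peak brightness would hurt the blacks rather than the ratio.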
Valantar - Friday, December 2, 2016 - link
Well, adding another layer would block out more light (even while wide open), so sustaining 1000 nits with an extra liquid crystal layer (if that's what it is) is impressive.
austinsguitar - Thursday, December 1, 2016 - link
:D
kpb321 - Thursday, December 1, 2016 - link
Offhand, it sounds like they've placed a second LCD layer, one without any color filters, between the backlight and the regular layer. I'm sure it's more complicated than that, but that seems to be the basics of what they've done.
BlueScreenJunky - Thursday, December 1, 2016 - link
That's what I understood too, and it seems like it should work provided you have a powerful enough backlight. I'm a little worried about power efficiency though.
Guspaz - Thursday, December 1, 2016 - link
This would primarily be for desktop or laptop use, since most things smaller than that have moved to OLED already, and the thickness of the extra layers would be unwelcome.

Desktop LCD efficiency isn't really all that big a concern (and if they're using LEDs, they'd still be far more efficient than CCFL-backlit screens).
Roland00Address - Thursday, December 1, 2016 - link
While it may work for laptops, I expect desktops and TVs will be using this, and anything that cares about batteries will not. Any extra layer means you need a stronger, more powerful backlight to push light through it. Think of it like a filter or sunglasses: any form of barrier takes more power to get the same effective brightness on the receiving end.
Pneumothorax - Thursday, December 1, 2016 - link
Agreed. This is dead in the water for laptops. Considering how laptop battery life already suffers with QHD vs FHD screens, laptops would take an even bigger hit from adding another layer and a brighter backlight. It also goes against the stupid 'Thinner is Better' trend that's going around right now (I'm looking at you, Apple), killing performance and battery life for the sake of thinness.
ABR - Friday, December 2, 2016 - link
Dead in the water for thin-and-light laptops maybe, but not big honkin' gaming laptops and desktop replacements that usually get run plugged in anyway.
ajp_anton - Friday, December 2, 2016 - link
But if a fully opened LCD cell passes through 99%* of the light, then you only lose another 1% by adding a second layer.

*Ignoring the 50% that gets lost by polarizing the light.
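As a sketch of that light budget (both figures are assumptions, including the 99% open-cell transmission from the comment):

```python
polarizer = 0.5    # roughly half the unpolarized backlight dies at the first polarizer
open_cell = 0.99   # assumed transmission of one fully open liquid crystal cell

single_layer = polarizer * open_cell       # ~0.495 of the backlight survives
dual_layer = polarizer * open_cell ** 2    # ~0.490: the second layer costs only ~1%

print(single_layer, dual_layer)
```

Under these assumptions the extra layer is nearly free while wide open; the real unknown is the actual open-state transmission of the added layer.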
Old_Fogie_Late_Bloomer - Thursday, December 1, 2016 - link
As I understand this, there's going to be an issue where the brighter you make a fully saturated pixel (one where at least one primary is zero), the less actually-saturated the pixel will be. For instance, for red at 100%, you would have the filter 100% open and the maximum possible backlight bleed through the green and blue subpixels, whereas for red at 50%, you could have the filter 50% open, the red subpixel at 100%, and half the amount of backlight bleed through the green and blue subpixels.

Still, it seems like a pretty cool idea.
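A toy model of the bleed arithmetic in that example, assuming a 1,000:1 color layer (so a fully closed subpixel still leaks 1/1000 of whatever light reaches it):

```python
LEAK = 1 / 1000  # assumed bleed through a fully closed cell of a 1000:1 panel

def red_pixel(modulator, red_cell):
    """Return (red output, green/blue bleed) as fractions of full backlight."""
    return modulator * red_cell, modulator * LEAK

bright_red = red_pixel(1.0, 1.0)   # red 100%: filter wide open, full bleed
dim_plain = (0.5, LEAK)            # red 50% on a plain panel: bleed unchanged
dim_stacked = red_pixel(0.5, 1.0)  # red 50% via the filter: bleed is halved

print(bright_red, dim_plain, dim_stacked)
```

So the stacked filter halves the bleed for the dimmed pixel, while a fully bright pixel is stuck with the color layer's native leakage.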
MrSpadge - Friday, December 2, 2016 - link
No, they didn't specify whether they apply the additional filter per "logical pixel" or per "color spot". The latter would not have the disadvantage you're describing.
bcronce - Friday, December 2, 2016 - link
"At a high level, one could think of them like gates placed behind each pixel on the display."
MrSpadge - Friday, December 2, 2016 - link
I understand this as Anton's vague interpretation of the device to give the reader a rough idea, not necessarily Panasonic's own words.
MrSpadge - Friday, December 2, 2016 - link
Yep, this also matches the contrast ratio: two layers of IPS at 1,000:1 each would yield 1,000,000:1 combined contrast.
knightspawn1138 - Thursday, December 1, 2016 - link
This does seem like it would add cost to the monitor and make the display a little more complicated to control. I would guess the light-filtering layer is also variable, and would thus need something to translate the intended RGB value for each pixel into an additional light-intensity value that twists the filtering layer's crystals open. Let's say it's an LCD-like layer with cells large enough to encompass all of the subpixels that make up an RGB pixel. That would decrease the complexity of this layer, but there still has to be a component in the monitor figuring out how much light to allow each pixel cluster. I wonder if this would require additional data from the video source, or if the monitor makes this per-pixel computation on its own.

It's an interesting solution to the problem of controlling light in an LCD/IPS monitor. It basically gives each pixel its own adjustable spotlight behind it. The real trick will be maintaining the wide viewing angles that IPS monitors are known for.
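One hypothetical way a monitor could make that per-pixel computation on its own, with no extra data from the source: open the modulator just far enough for the brightest subpixel, then drive the color cells harder to compensate. The function and the max()-based scheme below are illustrative assumptions, not Panasonic's actual algorithm:

```python
def split_pixel(r, g, b):
    """Turn one 8-bit RGB pixel into (modulator opening, rescaled subpixels)
    such that modulator * cell reproduces each original value."""
    peak = max(r, g, b)
    if peak == 0:
        return 0.0, (0, 0, 0)   # true black: modulator fully closed
    modulator = peak / 255      # just enough light for the brightest subpixel
    cells = tuple(round(c * 255 / peak) for c in (r, g, b))
    return modulator, cells

# A half-brightness orange: the modulator handles the dimming,
# while the color cells keep the hue at full drive.
print(split_pixel(128, 64, 0))
```

Here `split_pixel(128, 64, 0)` opens the modulator to about 0.5 and drives the cells at (255, 128, 0), so dark pixels get deep blacks without touching the color data path.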
Guspaz - Thursday, December 1, 2016 - link
It's just going to involve a modified LCD controller, and if they're doing it on a per-pixel level rather than a per-subpixel level, it wouldn't be any more complicated than the controller for existing displays that use RGBW in a 4-subpixel arrangement.
MadAd - Thursday, December 1, 2016 - link
Outstanding. Hopefully we'll see this added to gaming-themed monitors.
Mr Perfect - Thursday, December 1, 2016 - link
Indeed. Throw this on a 27"+ 4K HDR 120Hz screen sometime next year, and I'm down for one!
Vatharian - Thursday, December 1, 2016 - link
My first thought when I read the title was: oh, so they basically developed a panel that sparks 'The beacons of Gondor have been lit!', and that's how they'd claim that contrast? It sure still has that damn IPS glow.
lefty2 - Thursday, December 1, 2016 - link
I don't know why they compare the pixel size to a tennis ball. Presumably, some early version display had tennis ball size pixels and it didn't pan out too well.
MrSpadge - Friday, December 2, 2016 - link
They don't. It's a slide from a different company offering quantum dots, trying to visualize how small "a few nm" is. Anton put it there because Samsung is applying them to displays.
id4andrei - Thursday, December 1, 2016 - link
I recall from the Surface Book review that Panasonic made the display. Even if less efficient than competing alternatives, MS wanted the increased contrast which Panasonic offered.
guidryp - Thursday, December 1, 2016 - link
This is absent any kind of useful detail.

What the hell are light-modulating cells? That is essentially what LCD is in the first place. If you have some other kind of light-modulating cell, you can essentially ditch LCD and use that.
Basically what we have been told is LCD + magic layer that sounds better than LCD = better LCD.
I am betting this is just another local dimming setup with different marketing.
jhh - Thursday, December 1, 2016 - link
Well, if you had an LCD behind the color LCD, there would be two sources of dimming which could be independently controlled. I doubt there would be a linear relationship between the two dimming sources, but you could have much greater control over the final output. If one layer can attenuate by 75%, and a second by another 75%, you get a minimum luminosity of 6.25%.
guidryp - Thursday, December 1, 2016 - link
Stacked LCD would explain why this is limited to very high-end applications.

But it doesn't explain why they don't just say that it is stacked LCD.
QChronoD - Thursday, December 1, 2016 - link
Because that doesn't sound very fancy and therefore wouldn't command a large price premium.
BrokenCrayons - Thursday, December 1, 2016 - link
Interesting, but once laptops stopped shipping with passive matrix displays and moved to active matrix TFT LCDs, I pretty much haven't cared about advances in display technology since. Give me a cheap, low resolution TN panel on my devices, don't waste your time with calibration or any of that nonsense, and get out of my way. Color accuracy, contrast, response time... after a certain point that we already reached in the 1990s, it just hasn't mattered to me. In fact, before passive matrix trash, a lot of monochrome LCD panels were quite nice looking and more than good enough for most things that are meaningful to me on a PC.
MrSpadge - Friday, December 2, 2016 - link
Yes, yes. And there's probably a market for 5 computers in the world. Glad you don't run IT companies!
BrokenCrayons - Friday, December 2, 2016 - link
I tried to make it very clear that I was aware my thoughts were my own opinions on the matter of screen quality and applied where my own usage was concerned. Short of starting with an all-caps disclaimer festooned with blinking lights and sirens, I'm not sure how much more obvious I could have been. I guess there's always someone who wants to get upset about how someone else feels, regardless of whether or not it's relevant to them. Also, I did own an IT company in the late 1990s that was very successful, thanks. :)
MrSpadge - Friday, December 2, 2016 - link
Valid points, sorry!
bji - Wednesday, December 7, 2016 - link
But what you failed to mention is why you thought your very personal preferences were worth broadcasting to the world in this forum. You are not adding any information to the discussion that has relevance beyond your own small world bubble. Why should we care?
BrokenCrayons - Friday, December 9, 2016 - link
Why? I don't know why, but it's clear that you do care, since you bothered to reply. Take a look inward and you'll probably find an answer.
KoolAidMan1 - Sunday, December 4, 2016 - link
You have low standards. Thanks for sharing!
Houdani - Thursday, December 1, 2016 - link
Maybe this explains the fire sale on Dell monitors; clearing space for the new goodness.
fanofanand - Thursday, December 1, 2016 - link
Best case scenario, this creates competition in the high-end panel segment and helps bring OLED to the masses. One can dream, right?
hMunster - Thursday, December 1, 2016 - link
So what all this mumbo-jumbo is saying is that it's basically a black-and-white LCD strapped to a color LCD. There, that's easier to read.
Hulk - Thursday, December 1, 2016 - link
Good way of looking at it. The questions I'm wondering about...

What is the light transmission loss for a fully opened LCD cell?
Cost of this extra hardware?
dwade123 - Thursday, December 1, 2016 - link
LCD keeps on improving. Amazing tech. Can't say the same about one-trick-pony plasma crap.
Hulk - Friday, December 2, 2016 - link
Plasma was a good "bridging" technology from CRTs to LCDs, I think. But I have to admit they never appealed to me.
3DoubleD - Friday, December 2, 2016 - link
Plasma was great, but it maxed out its potential with regard to resolution and thinness. I bought one of the last great plasma TVs before they discontinued production, a Panasonic ST60. It still has one of the best pictures I've seen; LCD can't touch it. This new technology might finally bring LCD into the picture-quality realm of plasma, but with the benefits of higher resolution and quantum-dot-assisted wide-gamut color. We might finally get a worthy successor to plasma, so long as they can tighten LCD response times. OLED doesn't seem like it will ever be a good laptop or TV display technology; it has worse wear and brightness problems than plasma ever did, and it costs a boatload.
KoolAidMan1 - Sunday, December 4, 2016 - link
Plasma was a dead end in terms of size and pixel density. But if we're talking about image quality, then LCD can't come anywhere close, especially to the late-model Panasonics. Those are still the best 1080p HDTVs around. OLED is the next big hope for high-quality HDTV, and it's what I would get right now if I were looking for a 4K set. The LG sets are pretty great; I'm just waiting until I can get an 85" for less than the price of a car.
darkich - Friday, December 2, 2016 - link
Not sure if LCDs are even worth investing in any more.

It seems this will add some thickness to the panel, which will further emphasise AMOLED's thinness advantage. And there are other AMOLED advantages besides contrast: response time, and lower production prices when mass-produced.
haukionkannel - Sunday, December 4, 2016 - link
It depends on price. If a normal 27" monitor costs $250, this high-contrast one costs $600, and AMOLED costs $1,500, there is room for it. Large AMOLED panels are still very expensive, so even if this doubles the price of a normal LCD screen, it's still much cheaper than AMOLED.
MrSpadge - Friday, December 2, 2016 - link
"but since it uses the term “cells”, it clearly indicates that we are dealing with a relatively thick layer of liquid crystals, not a thin layer of quantum dots"

Anton, in IT a "cell" can be a logical unit of any size, totally unrelated to biological cells. Consider e.g. a standard cell in circuit layout with maybe 4 fins; the required area would be about (200 nm)² in a modern process.
Penti - Friday, December 2, 2016 - link
Penti - Friday, December 2, 2016 - link
Hasn't Panasonic shuttered their Himeji plant? So what makes them relevant in this field today? Sharp is the only one making large panels in Japan now.
scb6 - Wednesday, December 7, 2016 - link
The Himeji Gen 8.5 facility is still operating. I was there a few months ago.
Penti - Saturday, December 10, 2016 - link
From what I gather it is, for a while longer, but it was supposed to close back in September according to reports in Japanese media. There were conflicting reports, though, and I'm pretty sure they were supposed to stop making large panels (for TVs) at least, which leaves some of their professional and medical lineup, I guess. That removes much of the relevancy, though, at least compared to Sharp, which makes panels for everything from phones to large UHD TVs and professional monitors.

I'd like to see some reporting on what Panasonic's plans are going forward... It would be fun to see them recover some of their business rather than just die like every other Japanese manufacturer.