ikjadoon - Tuesday, July 10, 2018 - link
Excellent overview, Brett. I'll be linking to this for many weeks to come.

I’m curious how you were able to measure the SB2’s display power usage—that sounds incredibly handy, as panel efficiency seems to be the name of the game here. Is this through software or hardware, like clamping or voltage measurements?
I had high hopes for IGZO penetrating and overtaking a-Si, but it seems like it’s the forgotten middle child, save for one or two poster models like the Razer Blade.
Seeing LTPS proliferate, though, is welcome: Lenovo’s using it on their X1 Yoga HDR display and Huawei’s MateBook has won a lot of hearts (and eyes).
MajGenRelativity - Tuesday, July 10, 2018 - link
I enjoyed this article very much. I didn't know VA was a different technology, and assumed it was some subtype of IPS, so I'm glad that was cleared up.

I look forward to in-depth articles about other components!
Brett Howse - Tuesday, July 10, 2018 - link
Thanks!

Ehart - Tuesday, July 10, 2018 - link
Really nice article, but you're falling into some common confusion on HDR10. HDR10 is really only defined as a 'media profile', and for a display it means that it accepts at least 10 bits to support that profile. For PC displays, they often can accept a 12 bit signal. (I'm using one right now.)
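Just to put the bit depths in perspective, a rough sketch (using naive linear spacing; real HDR10 signalling uses the nonlinear PQ curve, so this is only illustrative):

    # Rough sketch: how many gray levels each bit depth gives, and the average
    # step size IF the 0-10,000 nit HDR range were spaced linearly. Real HDR10
    # uses the nonlinear PQ curve, so actual steps are far finer in the shadows -
    # this is only to show the scale of 8 vs 10 vs 12 bits.
    PEAK_NITS = 10_000

    for bits in (8, 10, 12):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} code values, ~{PEAK_NITS / levels:.1f} nits/step if linear")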
Is "3k" eg 3200x1800 going out of favor on 13" laptops? I'd be rather disappointed if it is.At 280 DPI it's equivalent to 4k on a 15.6" panel, and on anything that doens't have broken DPI scaling is high enough resolution that you can pick whatever scaling factor you want and have sharp can't come close to seeing the pixels anymore. The higher, going higher eg 4k and 330DPI doesn't really get anything except higher power consumption and lower battery life IMO.
Brett Howse - Tuesday, July 10, 2018 - link
Seems to be fewer options for 3200x1800 these days.

CaedenV - Tuesday, July 10, 2018 - link
"so the loss of 16:10 was mourned by many."Yeah... I miss my 1200p 16:10 display. It wasnt the best quality... but man was it useful!
keg504 - Tuesday, July 10, 2018 - link
If nit is not an SI unit, why not use lux, which is, and is the same quantity (from my understanding)?

Death666Angel - Tuesday, July 10, 2018 - link
It isn't, though. The SI unit for nits would be candela/square_meter [cd/m²]. Lux = lumen/square_meter [lm/m²] has an additional light-source component and a distance component in it, because it is used to measure the light that hits a certain point, not the source itself. Most non-US based tech reviewers I frequent use cd/m².
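If you want to connect the two units, here's a back-of-the-envelope sketch (assuming an idealized Lambertian panel, which real LCDs only approximate):

    import math

    # Back-of-the-envelope: total light output of a laptop panel from its
    # luminance (nits = cd/m^2), assuming an ideal Lambertian emitter, for
    # which luminous exitance M = pi * L [lm/m^2].
    luminance_nits = 400             # cd/m^2, a typically bright laptop panel
    panel_area_m2 = 0.294 * 0.165    # ~13.3" 16:9 panel, in metres (approx.)

    exitance = math.pi * luminance_nits       # lm/m^2 leaving the panel surface
    flux_lumens = exitance * panel_area_m2    # total luminous flux
    print(f"~{flux_lumens:.0f} lm total")     # on the order of 60 lm

    # Lux, by contrast, is lm/m^2 *arriving* at a surface, so the reading depends
    # on where you hold the meter - which is why panel specs quote cd/m^2.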
Amoro - Tuesday, July 10, 2018 - link

What about adaptive refresh rate technologies?

heffeque - Tuesday, July 10, 2018 - link
Recently, an important part of my decision when buying a laptop has become PWM.

When I start getting a headache I tend to lower the brightness to make it easier on my eyes, and I recently noticed that lowering the brightness was actually making it worse. I read a few articles and saw that it was due to PWM. Since then, if I start getting a headache, I put the brightness at 100% and it goes away. So I'm definitely going to be more careful about PWM when choosing my next laptop and/or screen.
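For anyone wondering why dimming makes it worse, a quick sketch of the timing involved (the frequencies below are illustrative, not measured from any particular panel):

    # Quick sketch: with PWM dimming, lower brightness = shorter "on" pulses
    # separated by longer dark gaps, which is what some eyes/brains pick up
    # as flicker. The frequencies here are illustrative only.
    def pwm_timing(freq_hz, brightness):      # brightness = duty cycle, 0..1
        period_ms = 1000.0 / freq_hz
        return period_ms * brightness, period_ms * (1 - brightness)

    for freq in (240, 1000, 20000):           # low- vs high-frequency PWM
        on_ms, off_ms = pwm_timing(freq, 0.25)    # screen dimmed to 25%
        print(f"{freq:>6} Hz: on {on_ms:.3f} ms, dark {off_ms:.3f} ms per cycle")

    # At 240 Hz and 25% brightness the panel goes dark for ~3 ms at a stretch;
    # at 20 kHz the gaps are ~40 microseconds, far too short to notice.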
nfriedly - Tuesday, July 10, 2018 - link
Why don't they make RGB LED displays? That seems like it would have a lot of the benefits of OLED, but without the risk of burn-in.

Also, you mentioned RGB LED backlighting - how does that work? What's the benefit?
DanNeely - Tuesday, July 10, 2018 - link
We can't make normal LEDs small enough for individual pixels unless you're looking at something like a Jumbotron-sized display.

LEDs small enough to serve as pixels for a monitor are aspirationally branded as micro-LED, and are a subject of active research. To get an idea of where we're at with something near production-ready, there's talk that the 27/32" gen 2 FALD LCDs will have an ~1000 zone micro-LED backlight - up from the current 384 zone models.
You can get a wider range of colors out of separate red, blue, and green LED backlights than from a single white LED. The tradeoff is that they're more expensive and draw more power.
Brett Howse - Tuesday, July 10, 2018 - link
What Dan said :)

boeush - Wednesday, July 11, 2018 - link
I'm confused about the power math. I'd think one big LED (let's say 1 cm²) would consume the same amount of power outputting the same amount of light as the same LED broken into smaller pieces (let's say 100 chunks of 1 mm² each) - or am I wrong? I thought for a given LED technology, you get X units of light per Y units of energy per unit of area - i.e. I thought the efficiency of an LED does not depend on its dimensions but only on its materials, down to some quantum-level limit - am I wrong?

DanNeely - Wednesday, July 11, 2018 - link
The wider gamut means you're putting light into a larger section of the visible spectrum. You've also got 3x the control circuitry/etc in the loop drawing power.

MrSpadge - Wednesday, July 11, 2018 - link
What Dan said. Plus: if you distribute the LED area, it will run cooler and gain some efficiency. But the comment about increased power draw was referring to the RGB backlight, not to distributing micro LEDs.

MrSpadge - Wednesday, July 11, 2018 - link
"We can't make normal LEDs small enough for individual pixels"It's actually relatively easy, at first thought: use the substrate as common back contact, epitaxially grow the emitting layer and use a decently transparent conductive oxide as top electrode, just like in OLEDs. Structure the top contact into pixels.
However, doing that is simply prohibitively expensive, as the entire display area would have to receive the super expensive epitaxial growth process, which generally only works on small samples (2 - 4") and requires expensive substrates. Regular LEDs are only affordable because the amount of actual chip area per device is tiny.
And with a common back contact there'd be a major problem of addressing many pixels individually, so one would need some trick to structure the back contact as well.
mr_tawan - Friday, July 13, 2018 - link
I think Samsung has already demoed a 146" micro-LED display. That screen is 4K resolution, so I guess they could do 75" Full HD with the same technology.

However, even at 75" it's still too large for my liking :)
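The scaling roughly checks out if you keep the same LED pitch and just drop the pixel count:

    import math

    # Rough check: keep the same LED pitch as the 146" 4K demo, quarter the
    # pixel count (4K -> Full HD), and the diagonal roughly halves.
    diag_4k_in = 146
    px_4k = (3840, 2160)
    px_fhd = (1920, 1080)

    pitch_in = diag_4k_in / math.hypot(*px_4k)      # inches per pixel
    diag_fhd_in = pitch_in * math.hypot(*px_fhd)
    print(f'Full HD at the same pitch: ~{diag_fhd_in:.0f}" diagonal')   # ~73"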
Martin654 - Wednesday, July 11, 2018 - link
When looking at the display for a laptop, there are four items to look over: screen size, resolution, screen type, and graphics processor. For most people, only the screen size and resolution will really matter. The graphics processor tends to make a difference mainly for those looking to do some mobile gaming or high-definition video, though it can be used for more than that. Pretty much all laptops use some form of backlit active-matrix display to allow for bright, fast displays capable of video playback.
"High resolution used to be a liability in terms of battery life..."I'd imagine it still is - even if not as much due to the display backlight having to work harder, but still due to the sheer physical requirement to process, render, transmit, and update many more pixels per frame?
doggface - Wednesday, July 11, 2018 - link
I wish, oh I wish that 1366x768 displays would die. Just die already.

MrSpadge - Wednesday, July 11, 2018 - link
I think they're OK - on budget 4" devices.

mkozakewich - Wednesday, July 11, 2018 - link
I've been using a High-Contrast theme with customized colours (because the default ones are just everywhere) to get a real and proper dark mode, and I'd *love* to try it on an OLED system. I can really notice my backlight, even at the lowest power usage and with the Intel Power-Saving thing active.

Evil Underlord - Wednesday, July 11, 2018 - link
Glad to see more hope of a return to a saner 3:2 ratio. Not all of us spend all our time watching movies or on spreadsheets.

Speaking of which, I'd love to see a similar article on keyboards. (In my view, the #1 important feature.)
boeush - Wednesday, July 11, 2018 - link
Regarding spreadsheets: by all means, keep all those horizontal pixels of a 'wide-screen' display - just give me some additional vertical pixels while you're at it! (Sometimes, working with spreadsheet-style documents can become a real PITA when the rows get too tall to fit on the screen...)

linuxgeex - Wednesday, July 11, 2018 - link
"Because pixel density is inversely proportional to how well a panel blocks its backlight – denser panels will block more light – increasing the pixel density of a display requires ramping up the strength of the backlighting system as well. And while that’s not an issue for desktop monitors because of their constant power source, for laptops it can have a significant impact on battery life.""pixel density is inversely proportional to how well a panel blocks its backlight" - means that a higher pixel density display will struggle to achieve good contrast - in your words.
"increasing the pixel density of a display requires ramping up the strength of the backlighting system as well" - you are saying that as a result of the LCD blocking less of the light due to the higher pixel density struggling to block the light from passing through it. So... explain why you need more light from the backlight when the LCD blocks less of it. Sounds to me like you don't understand the technology, or that you don't understand yourself...?
The much more obvious cost to increased pixel density that you are not addressing is the cost of driving the extra pixels. The RAM that backs that display. The increased bus throughput. The increased frequency of the driving components. The increased work of the OS and applications to generate the extra pixel content. That's what kills the battery. That's why a WQHD laptop will get 2/3rds the idle operating time that an otherwise identical FHD model will, and why everyone who cares about all-day battery performance should stay away from the *QHD / 4K display models for the next couple years until those costs become a smaller portion of the idle power budget.
Brett Howse - Wednesday, July 11, 2018 - link
It's the TFT that's blocking the backlight, and a denser TFT requires a stronger backlight to get the same brightness from the display.

Driving the extra pixels with the GPU and other components is a tiny difference. That's a common misconception you've stumbled upon.
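A toy model of that relationship, just to show the trend (the per-pixel overhead figure is invented for illustration, not a real panel spec):

    # Toy model (numbers invented for illustration): each pixel loses a roughly
    # fixed area to its transistor and wiring, so as pixels shrink, the
    # transparent "aperture" fraction falls and the backlight has to be driven
    # harder to hit the same screen brightness.
    FIXED_OVERHEAD_UM2 = 1500.0            # hypothetical TFT + wiring area per pixel

    def relative_backlight(ppi):
        pixel_pitch_um = 25400.0 / ppi     # 1 inch = 25,400 um
        aperture = 1.0 - FIXED_OVERHEAD_UM2 / pixel_pitch_um ** 2
        return 1.0 / aperture              # backlight power vs. a lossless panel

    for ppi in (141, 188, 282):            # ~FHD, QHD, 4K on a 15.6" panel
        rb = relative_backlight(ppi)
        print(f"{ppi} PPI: aperture ~{1 / rb:.0%}, backlight x{rb:.2f}")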
linuxgeex - Thursday, July 12, 2018 - link
What I am pointing out is that the author's comments conflict with each other. He needs to research it and come up with a consistent statement. Preferably one that correlates well with facts.

The denser the display, the more transistors are blocking the passage of light, and the brighter the backlight needs to be. That is what the author should have said instead of coming up with a much more wordy description that he himself obviously didn't even understand.
linuxgeex - Thursday, July 12, 2018 - link
"Driving the extra pixels with the GPU and other components is a tiny difference. That's a common misconception you've stumbled upon."Going from 1920x1080 to 3840x2160 is 4x the rendering cost, minimum (recognise that given more than 2 layers to composite you can easily exceed the CPU's L3 cache size with a 4k display), and that is 4x the amount of time that the CPU and all related subsystems can't drop to C7 sleep.
It's not a tiny difference at all. If it were negligible, then why is the OS trying to use PSR (Panel Self Refresh) and FBC (FrameBuffer Compression) to reduce the IO channel and RAM access overheads, when those costs are themselves negligible compared to keeping the CPU and GPU spinning with rasterizing and compositing?
What's keeping your OS and apps compositing constantly? Your browser, which now does full-page 60 Hz updates of every pixel, changed or not, so the OS can't send only the damaged pixels to the display device as in earlier versions. Why? Because modern machines are fast enough, and it's a "small difference" but keeps the render pathways hot in the caches so fewer frames are dropped. Welcome to 2018, when your battery life got slaughtered and people haven't quite clued in yet.
erple2 - Sunday, July 22, 2018 - link
PSR and FBC are tackling the 20% case, though, namely the parts at idle where 80% of the power consumed is just directly from keeping the backlight bright enough that the LCD can be seen. Note also that PSR and FBC don't make that much of a difference in battery life overall. I've seen up to about 10% in some cases. And that's consistent with doubling the GPU rendering pipeline efficiency _at idle_ for the entire display pipeline. Doubling the efficiency of 20% of your overall budget decreases power consumption by around 10%.

Note that much of the compositing engine is offloaded (in modern GPUs) from the heavyweight parts of the 3D rendering pipeline, so those costs aren't that high in comparison. It's not like you're keeping all 2048 stream processors (or however many equivalent GPU processors) active 60 times a second. That was the first "revolution" in GPU efficiency gains a while back - you didn't need to keep your entire GPU rendering silicon active all the time if it wasn't all being used.
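The budget arithmetic above, spelled out (the 80/20 split is the illustrative figure from the first paragraph, not a measurement):

    # Rough illustration of the budget argument: if ~80% of idle power is the
    # backlight and ~20% is the render/display pipeline, then even making the
    # pipeline twice as efficient only trims total idle power by ~10%.
    backlight_share = 0.80
    pipeline_share = 0.20

    saving = pipeline_share * 0.5          # pipeline cost halved
    new_total = backlight_share + pipeline_share - saving
    print(f"Idle power drops to {new_total:.0%} of before (a ~{saving:.0%} saving)")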
linuxgeex - Wednesday, July 11, 2018 - link
"Less expensive displays may even reduce this more to 6-bit with Frame Rate Control (FRC) which uses the dithering of adjacent pixels to simulate the full 8-bit levels."No. FRC uses Temporal dithering. It shows the pixel brighter or darker across multiple frames which average out to the intended intensity. On displays with poor response times this actually works out quite nicely. On TN displays, you can actually see the patterns flickering when you are close to a large display and cast your gaze around the display. Particularly in your peripheral vision which is more responsive to high-speed motion changes.
VA - You mentioned MVA, which is one type of PVA arrangement. PVA is Patterned Vertical Alignment, where not all of the VA pixels/subpixels are aligned in the same plane. Almost all VA displays are PVA. PVA lets you directly trade display brightness for wider viewing angles, and choose in which direction those tradeoffs will be made. For example, a PVA television will trade off mostly in the horizontal direction because that allows people to sit in various places around the room and still see the display well. They don't need to increase the vertical viewing angle so that the roof has a good view of the TV. ;-) But for a laptop just the opposite is true. You want to still see the display well when you stop slouching or stand up, but you don't really care if the people to your sides can see your display well. In fact, people purchase privacy guard overlays that reduce the side viewing angles intentionally.
Brett Howse - Wednesday, July 11, 2018 - link
Excellent info, thanks!

linuxgeex - Thursday, July 12, 2018 - link
The author was obviously in a hurry, saw the word "dithering", and jumped to the conclusion that it was spatial error-distribution dithering, as is commonly used in static images to create the appearance of a larger palette (i.e. GIFs, printers). But for video there's a 3rd dimension to perform dithering in, which doesn't trade off resolution or cause edge-flickering artefacts, so of course they're going to use FRC (Frame Rate Control, which is basically a form of PWM) instead of spatial dithering.

linuxgeex - Thursday, July 12, 2018 - link
Oh Brett, lol that's you. ;-)

UtilityMax - Friday, July 13, 2018 - link
WTF, you guys still test laptop displays at a time when more than half of personal computing has already moved onto mobile devices, like phones or tablets, which you no longer review? Mmmokay.
Actually they have reviewed new phones within the last 30 days... Mmmokay.

Zan Lynx - Saturday, July 14, 2018 - link
A tablet is just a gimped laptop without a keyboard.

madskills42001 - Tuesday, July 17, 2018 - link
Given that contrast is the most important factor in subjective image quality tests, why is more discussion not given to it in this article?

s.yu - Saturday, July 21, 2018 - link
So... no mention of what non-PWM screens use for dimming? Or do high-frequency PWM screens not count as PWM? And a lot of people seem to be complaining in particular about PWM on OLED screens, so why do OLED screens (at least the RGB variety) all use low-frequency controllers?

Solandri - Saturday, July 28, 2018 - link
Very nice article. Some minor corrections/additions. (I'll post one at a time since the site is flagging it as spam.)

Light from our sun is actually about 5800K. 6500K is the combination of sunlight and blue sky on a sunny day. It's bluer than direct sunlight because part of the red light gets scattered by the atmosphere (and sent to regions experiencing sunrise/sunset). Daylight in the shade (lit mainly by the blue sky) is closer to 9000K. That combined with the 5800K direct sunlight produces about 6500K.
Solandri - Saturday, July 28, 2018 - link
Non-RGB subpixel arrangements can offer the same viewing experience as RGB while using fewer subpixels. Your eyes have the best resolution in green, not so good in red, and absolutely terrible in blue. The non-RGB subpixel layouts take advantage of this, usually by using two green subpixels for each red and blue subpixel. This results in fewer total subpixels (a "lower" resolution), but no discernible loss of resolution to your eye. Older video standards like NTSC and even newer image encoding algorithms like JPEG and MPEG do the same thing to reduce storage space by decreasing the color resolution. So this isn't something new - every TV show you've viewed growing up had its colors mangled this way, and you've never noticed it. So it's silly to suddenly pretend that non-RGB is inferior.
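The same trick in miniature - the 4:2:0-style chroma subsampling that JPEG/MPEG use (a rough sketch with BT.601-ish luma weights, not any particular codec's exact math):

    import numpy as np

    # Rough sketch of the same idea JPEG/MPEG use (4:2:0 chroma subsampling):
    # keep brightness at full resolution, store color at quarter resolution.
    # The eye's color resolution is much lower than its luminance resolution,
    # so the difference is very hard to see.
    rgb = np.random.rand(8, 8, 3)                    # stand-in for an image

    # Approximate luma/chroma split (BT.601-style luma weights)
    y  = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    cb = rgb[..., 2] - y
    cr = rgb[..., 0] - y

    # Store chroma at half resolution in each direction (average 2x2 blocks)
    cb_sub = cb.reshape(4, 2, 4, 2).mean(axis=(1, 3))
    cr_sub = cr.reshape(4, 2, 4, 2).mean(axis=(1, 3))

    full = y.size + cb.size + cr.size
    kept = y.size + cb_sub.size + cr_sub.size
    print(f"samples: {full} -> {kept} ({kept / full:.0%} of the original data)")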
Solandri - Saturday, July 28, 2018 - link

Ah, it was the website link for this which was flagging the comment as spam. Google "your eyes suck at blue" and you'll get the site with a graphical example of how you can completely mangle the blue channel and the picture will still look the same.

Solandri - Saturday, July 28, 2018 - link
The oddball RGB subpixel layout often used in OLED panels can actually be better for devices which are meant to be used in both portrait and landscape orientation. The RGB subpixels are usually arranged so the relative position of red, green, and blue is the same when rotated 90 degrees (though there might be a shift of one subpixel). In contrast, the traditional RGB stripe layout is completely different when rotated 90 degrees.

Keeping the subpixel layout the same in both orientations allows you to do subpixel rendering in both orientations. Subpixel rendering improves the apparent resolution of the screen without increasing the actual resolution. So if you're rendering a diagonal white line, instead of rendering it as (capital letters are lit, lowercase are black):
rgbRGB
rgbRGB
RGBrgb
RGBrgb
You render it as
rgbRGB
rgBRGb
rGBRgb
RGBrgb
And you've tripled the screen's apparent resolution without adding any new pixels. This trick is most often used with fonts (ClearType in Windows). But the traditional RGB stripe layout means apparent resolution can only be increased in one direction, and fonts designed for subpixel rendering will only work in one screen orientation. If you rotate the screen 90 degrees, suddenly the subpixels don't fall the way you expect, and your subpixel rendering breaks. Not so with the subpixel layout used in some OLED screens. If you turn the screen 90 degrees, the layout remains the same, and your subpixel rendering still works.
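In code terms, the ClearType-style trick looks roughly like this (a bare-bones sketch that skips the color-fringe filtering real implementations apply):

    # Bare-bones sketch of RGB-stripe subpixel rendering: rasterize the glyph
    # at 3x horizontal resolution, then drive each of a pixel's R, G and B
    # subpixels from its own sample. Real implementations (e.g. ClearType)
    # also filter the result to hide color fringes; this skips that entirely.
    def subpixel_row(coverage_3x):
        # coverage_3x: glyph coverage sampled at 3x width, values 0..1
        pixels = []
        for i in range(0, len(coverage_3x) - 2, 3):
            r, g, b = coverage_3x[i], coverage_3x[i + 1], coverage_3x[i + 2]
            pixels.append((r, g, b))     # each subpixel gets its own sample
        return pixels

    # A 1/3-pixel-wide vertical line lands on a single subpixel:
    print(subpixel_row([0, 0, 0, 1, 0, 0]))   # -> [(0, 0, 0), (1, 0, 0)]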
Solandri - Saturday, July 28, 2018 - link
You forgot to mention the RGBW subpixel layout, which I wish would die but keeps coming back. That's where they add a white subpixel to increase the apparent brightness of the screen. The R, G, and B subpixels generate those colors by blocking 66% of the light. So when you display a white pixel (R, G, and B lit), you're actually only seeing 33% of the backlight brightness. Someone came up with the idea of adding a white subpixel, so when displaying white you see 3*(33%*25%)+(100%*25%) = 50% of the backlight brightness. So the screen can appear brighter at the same power level, or the laptop will use less power at the same image brightness.

Unfortunately this comes at the cost of muting colors, since your colors are now only transmitted through 75% of the subpixels instead of 100%. And you end up with pale red, green, blue, and mustard yellows and lavender purples.
Solandri - Saturday, July 28, 2018 - link
The solution many vendors came up with to customers complaining about poor colors is to create a mode where the white subpixel is always off. But if you do that, now your whites are generated by only letting 3*(33%*25%)+(100%*0%) = 25% of the backlight through. And now your screen will either be dimmer than RGB, or will use more power at the same brightness as RGB. In other words, you've defeated the purpose of using RGBW subpixels in the first place. If you see a review mention the screen is RGBW, that's a big red flag and it should be avoided.
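The arithmetic from these two comments, spelled out (idealized 1/3-transmission color filters and equal-area subpixels, as assumed above):

    # The RGBW arithmetic above, spelled out. Idealized assumptions: each color
    # subpixel passes 1/3 of the backlight, a white subpixel passes all of it,
    # and each subpixel is 1/4 (RGBW) or 1/3 (RGB) of the pixel's area.
    def white_transmission(layout):
        if layout == "RGB":
            return 3 * ((1 / 3) * (1 / 3))                    # three color subpixels
        if layout == "RGBW":
            return 3 * ((1 / 3) * (1 / 4)) + 1.0 * (1 / 4)    # plus a clear subpixel
        if layout == "RGBW, W forced off":
            return 3 * ((1 / 3) * (1 / 4))
        raise ValueError(layout)

    for layout in ("RGB", "RGBW", "RGBW, W forced off"):
        print(f"{layout}: ~{white_transmission(layout):.0%} of the backlight gets through for white")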