43 Comments
edzieba - Tuesday, September 8, 2020 - link
It's been a long road for Panel Self Refresh (the eDP protocol used as the basis for G-Sync and later DP Adaptive Sync) to finally make its way to mobile as it was originally intended. https://www.anandtech.com/show/7208/understanding-...
Andrei Frumusanu - Tuesday, September 8, 2020 - link
PSR has been present in essentially all mobile devices for the past 5+ years. PSR stops the SoC from sending new image data to the DDIC over the MIPI interface when the frame hasn't changed. That's the de-facto standard in any phone today.

In this case, we're talking about stopping the DDIC from refreshing the panel matrix unless it's needed.
Luminar - Tuesday, September 8, 2020 - link
Damn, call 911, he needs a burn unit.
Olaf van der Spek - Tuesday, September 8, 2020 - link
Does this mean the panel stores a copy of the image in internal memory?
Andrei Frumusanu - Tuesday, September 8, 2020 - link
Yes, there's memory called GRAM in the DDIC which holds the image frame data.
flyingpants265 - Tuesday, September 8, 2020 - link
I remember reading about this exact thing with the release of an LG phone, no idea which one. May have been G2 or G3.
Spunjji - Wednesday, September 9, 2020 - link
G2 :)
close - Wednesday, September 9, 2020 - link
The G2 had it and I feel it made a difference; 8 years later mine still lasts a day with light usage (half a day if I keep mashing the screen).
Andrewsacar - Thursday, November 12, 2020 - link
https://forums.anandtech.com/threads/massive-20gb-...
ss96 - Tuesday, September 8, 2020 - link
We have seen some OLED displays (usually LG's) that look pretty bad when displaying low-brightness content, which is especially noticeable at low ambient brightness. Maybe these new Samsung displays have a similar issue.

Andrei, what prevents an LTPS display from being able to seamlessly vary its refresh rate? If I am not mistaken, IGZO displays (like the ones in the iPad Pro) can also have seamless VRR.
Andrei Frumusanu - Tuesday, September 8, 2020 - link
LG's issue has been related to their sub-par DDICs. They use the DAC drivers in the OLED matrix to manipulate brightness, but these only have a certain bit-depth available to them. At low brightness, there isn't sufficient bit-depth available and it results in a bad quality picture.

Samsung's and Synaptics' DDICs (and probably recent LG and MagnaChip ones) drive brightness via separate PWM drivers that don't use up DAC bit-depth for the voltage generation that drives the colour of the pixels.
> Andrei, what prevents an LTPS display from being able to seamlessly vary its refresh rate? If I am not mistaken, IGZO displays (like the ones in the iPad Pro) can also have seamless VRR.
I'm not aware of the iPads having VRR? In any case, the issue seems to be the response times of the transistors in the backplane and the whole timing architecture between the panel, DDIC, and OS.
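On the bit-depth point above: a toy calculation (assuming a hypothetical 8-bit DAC; the real bit-depths aren't public) of why scaling brightness through the same DAC that encodes colour costs grey levels at low brightness:

```kotlin
// Toy illustration, assuming a hypothetical 8-bit source DAC:
// if brightness is produced by scaling the same DAC codes that
// encode colour, a dim screen only gets a fraction of the code
// range, so gradients band visibly.
fun main() {
    val dacLevels = 256 // hypothetical 8-bit DAC
    for (brightness in listOf(1.0, 0.5, 0.1, 0.02)) {
        val effectiveLevels = (dacLevels * brightness).toInt()
        println("${(brightness * 100).toInt()}% brightness -> $effectiveLevels grey levels")
    }
}
```

A separate PWM brightness driver sidesteps this: the full code range stays available for colour at any brightness level.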
ddarko - Tuesday, September 8, 2020 - link
Andrei, this quote is from the 2018 iPad Pro review, which you co-authored:
"In addition, Apple’s iPad Pro offers their ProMotion technology, which means it is a 120 Hz display, but one that supports variable refresh rates in order to lower the display's refresh rate for power management purposes. The iPad Pro will go as low as 24 Hz, presumably chosen to match the framerate of most movies."
https://www.anandtech.com/show/13661/the-2018-appl...
Andrei Frumusanu - Tuesday, September 8, 2020 - link
Ah fair enough, Brett had time with the iPad as I never had the opportunity.

Still, I'm not aware if it's true VRR or whether it's normal refresh rate switching.
ikjadoon - Tuesday, September 8, 2020 - link
According to Apple, they insinuate (like everyone) that it's seamless VRR:
"ProMotion also improves display quality and reduces power consumption by automatically adjusting the display refresh rate to match the movement of the content.”
But, during at least the 2017 iPad Pro launch, their marketing slide shows only 24, 48 and 120 Hz options. I'd be curious to see the ramp-up / ramp-down rates and whether it's brightness-dependent.
https://youtu.be/oaqHdULqet0?t=5903
dotjaz - Wednesday, September 9, 2020 - link
iPad doesn't have a 24Hz mode, so it can't be mode switching.
Centurio_Macro - Monday, September 14, 2020 - link
Apple's iPad Pro displays supposedly support multiple refresh rates. The OS switches between 24 Hz, 48 Hz, 60 Hz and 120 Hz depending on the content. This is said to be true at least since the 10.5 iPad Pro from 2017.
https://www.flatpanelshd.com/news.php?subaction=sh...
GC2:CS - Tuesday, September 8, 2020 - link
To my knowledge LTPS excels at electron mobility, so it can charge the pixel quickly and accurately, even with tiny sizes and high ppi. Oxide TFTs on the other hand hold their charge for longer, so they can enable lower refresh rates - see the first-gen iPad Pro with 60 and 30 fps modes. And LTPO can do both.

I think LTPS simply does not go low enough on refresh rate for VRR to be feasible. You do not need seamless 60-120 Hz when you are in one mode or the other 99.9% of the time.
Bigos - Tuesday, September 8, 2020 - link
A theory of mine (which I believe you should be able to easily verify) is that the piece of paper you were using to cover the light sensor was reflecting the light from the high-brightness white screen, increasing the sensed light value above the threshold.

By covering the sensor directly (without any distance), or using a black material, you should be able to verify that.
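For anyone wanting to reproduce this at home, a minimal Android sketch (standard SensorManager API; the LuxLogger name is made up) that logs the raw lux values the phone actually senses while you cover the sensor:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.util.Log

// Logs raw illuminance readings so you can see exactly what the
// phone senses while the sensor window is covered.
class LuxLogger(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val lightSensor: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)

    fun start() {
        lightSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // For TYPE_LIGHT sensors, values[0] is illuminance in lux.
        Log.d("LuxLogger", "ambient = ${event.values[0]} lx")
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed */ }
}
```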
Andrei Frumusanu - Tuesday, September 8, 2020 - link
That doesn't make sense. The light sensor measured 0 lux when covered up, even with the paper. Plus, it would not engage the low-brightness mode if it sensed higher ambient light.
lilmoe - Tuesday, September 8, 2020 - link
This limitation makes perfect sense if it was there because of PWM. VRR wouldn't 'look' very compelling when the screen is flickering to emulate lower brightness. I hate this characteristic of OLED.

Try an overlay app to artificially darken it and slide the brightness of the display all the way up and see if this persists.
lilmoe - Tuesday, September 8, 2020 - link
**It makes perfect sense if it was there because of PWM...

Smartphone panels now have VRR and Anandtech comments still don't have an edit button....
Lolimaster - Tuesday, September 8, 2020 - link
Wait for Half-Life 3 and Nintendo's next-gen console. Or wait for Intel's 7nm :D
Spunjji - Wednesday, September 9, 2020 - link
This is pretty much what I surmised, too.
Olaf van der Spek - Tuesday, September 8, 2020 - link
Isn't OLED's power consumption supposed to be very low when displaying all-black content anyway?
Andrei Frumusanu - Tuesday, September 8, 2020 - link
Luminosity power is 0, yes. Measuring power at pure black is good to measure other panel or system overheads that might exist. In this case, refreshing the pixel matrix, even with pure black, costs 200mW at 120Hz.
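For scale, a back-of-the-envelope sketch using only the 200mW figure above (and assuming the overhead is spread evenly across refreshes, which is my assumption, not a measurement):

```kotlin
// Rough energy cost per full-matrix refresh implied by the
// 200 mW @ 120 Hz all-black figure quoted above.
fun main() {
    val refreshOverheadWatts = 0.200 // quoted panel overhead, pure black, 120 Hz
    val refreshRateHz = 120.0
    val millijoulesPerRefresh = refreshOverheadWatts / refreshRateHz * 1000
    println("≈ %.2f mJ per refresh".format(millijoulesPerRefresh)) // ≈ 1.67 mJ
}
```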
brucethemoose - Tuesday, September 8, 2020 - link
There's an app in Google Play called "Proximity Service" that can turn off (not just black out) a phone's display without root, adb or anything. Screen-off power consumption might be a good data point.

I'm glad to see positive steps towards true VRR! I'm surprised all this detection (content brightness, environment brightness, user input and such) doesn't have a ton of power overhead itself.
Lolimaster - Tuesday, September 8, 2020 - link
One of the main offenders is the light sensor still pinging for data when in manual brightness. Doesn't it happen in other Galaxies? S20/S10/S9 series?
Jetcat3 - Tuesday, September 8, 2020 - link
Wonderful work Andrei. Any thoughts on how this will be implemented in the Z Fold2? Can we expect similar behavior?
IanCutress - Wednesday, September 9, 2020 - link
Convince Samsung to send us a review unit and we'll check.
Luminar - Wednesday, September 9, 2020 - link
A lot of discussion about "lux" and the "lux meter" on the Note 20... but...
1) Is the lux meter on the Note 20 cosine corrected?
2) What is the spectral response curve of the Note 20's lux meter? Does it have roughly the same spectral response as the human eye? Or does it over/under-read certain light sources, especially metal-halide and LED light sources, which have very different spectral power distribution curves versus incandescent light sources?
Zooster - Wednesday, September 9, 2020 - link
Hi Andrei, do you think it's possible for Samsung to backport the VRR to the S20 series through a firmware update? Of course there are hardware limitations, but what about some software trick to mimic VRR?
Andrei Frumusanu - Wednesday, September 9, 2020 - link
It's not possible as it relies on a new DDIC. LFD relies on a new panel.
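For what it's worth, the closest software-side "trick" available to apps today is discrete refresh-rate switching via the platform's frame-rate hint, which is not panel-level VRR. A sketch of that API (standard Android Surface.setFrameRate, available since Android 11; hintFrameRate is a made-up helper name):

```kotlin
import android.os.Build
import android.view.Surface

// Hints a desired frame rate so the OS can switch the display to a
// matching mode. This is discrete mode switching handled by the OS,
// not the seamless panel-level VRR/LFD that needs the new DDIC.
fun hintFrameRate(surface: Surface, fps: Float) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
        surface.setFrameRate(fps, Surface.FRAME_RATE_COMPATIBILITY_DEFAULT)
    }
}
```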
trenzterra - Wednesday, September 9, 2020 - link
Just wondering, is this any different from FreeSync?
PeachNCream - Wednesday, September 9, 2020 - link
I think I'd rather have the battery life over the benefits of higher refresh rates. That's less charging and a slower discharge, which also helps with longevity since phone manufacturers are too inept to engineer a removable back panel.
PaulHoule - Wednesday, September 9, 2020 - link
There is quite a bit I find mystifying about this.

Why can't you just send one frame to a panel and lock it to display that frame until it gets a new frame? For that matter, why do you have to blast out a whole rectangle full of frames? Why aren't people shocked that cell phones in 2020 are designed around the limitations of CRT tubes in 1940?
If you could just send the data that changes when it changes, you would get better latency and better power, no questions about it. You could get 1200 Hz class latency for small updates (e.g. type a character) but be able to shut the communications bus down when it is not necessary.
The Rube Goldberg approach used by this phone means you are going to have to squint at lots of charts to guess at whether or not it is faster or saves power. Consumers are getting wary about power and battery life claims: we've all bought the laptop that claims to have a 12-hour battery life, maybe has a 9-hour battery life when it is new, but after a year it is more like a 9-minute battery life (e.g. unplug it from the charger to move it to another room and it is crying for mommy).
linuxgeex - Wednesday, September 9, 2020 - link
It's possible to send sub-rectangles to the DRM driver for the display, but for the most part the OS composites full frames because it doesn't take significantly more power these days than jumping through all the hoops which would be required to determine which sub-rectangles have changed.

Why are we still stuck on CRTC concepts? Because the DRM drivers rely on CRTC concepts, and it remains the lowest common denominator because almost all devices to this day are still expected to be able to drive NTSC and/or VGA signals. It's legacy that keeps not going away, and probably won't for another 20 years.

But the comparison you are drawing has nothing to do with that. Instead, it has to do with the temporal dithering (FRC) used by modern displays to fake higher bit-depths than the panel actually supports. FRC and gamma are not very good friends. VRR and FRC are not very good friends. Change the framerate, change the gamma. Change the gamma regularly, see flicker. Lower the framerate, see the FRC flicker. Lower the ambient light, see the flicker even at higher frame rates. Lower the content brightness, see the flicker even at higher frame rates. That's why Samsung is pegging the frame rate with low content brightness and low ambient brightness - to prevent people from complaining about the image quality.
linuxgeex - Wednesday, September 9, 2020 - link
As an aside, I read sometime in the last year about a recent kernel getting DRM Damage support for syncing sub-rectangles, but Android is usually more than a year behind on kernel versions. For Damage to work, apps need to send in their updates with bounding boxes, all the way from text rendering to the DRM. That may sound trivial, and it can be, but phone apps are using UI toolkits which pass their work off to GPU-accelerated render stacks, which use a scene graph to organize layers and minimise work via occlusion culling. They composite with MSAA to avoid artifacts at the borders. Calculating where those borders are and rendering 4x as many pixels at every border when there's hundreds of rectangles (ie a page of the Settings app) is a lot of work. In many cases it's less work to just redraw the full frame.

Firefox recently gave up and always sends in "60fps full-frame updates" in their quest to provide silky-smooth display like a video game. Android UI has had the same "buttery smooth" goals since Jellybean. So bearing that in mind, with the apps and middleware having goals which are opposed to DRM Damage having a hope in hell of sending only the subrectangles which have been modified... then you might be accepting of why, when the only thing on the display is an 8-character clock updating at 1hz, the OS is still sending in 120hz full-frame updates.

Meanwhile a microcontroller hooked up via MIPI can do a clock with 1hz updates and save a ton of power... and now you know why modern phones can't manage 1 month of standby.
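To make the app-side half of that concrete, a minimal sketch (standard Android SurfaceHolder API; ClockRenderer and the rectangle are made up for illustration) of what "sending updates with bounding boxes" looks like: redrawing only the clock's region rather than the full frame. Whether anything downstream (compositor, DRM, DDIC) actually honours the hint is the open question above.

```kotlin
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.Rect
import android.view.SurfaceHolder

// Redraws only the clock's bounding box instead of the whole surface.
class ClockRenderer(private val holder: SurfaceHolder) {
    private val clockBounds = Rect(40, 40, 440, 120) // hypothetical clock region
    private val paint = Paint().apply { color = Color.WHITE; textSize = 64f }

    fun drawTime(text: String) {
        // lockCanvas(dirty) requests a canvas limited to the damaged region;
        // the system may expand the rect, so redraw everything inside it.
        val canvas: Canvas = holder.lockCanvas(clockBounds) ?: return
        try {
            canvas.drawColor(Color.BLACK)
            canvas.drawText(text, clockBounds.left.toFloat(),
                clockBounds.bottom.toFloat(), paint)
        } finally {
            holder.unlockCanvasAndPost(canvas)
        }
    }
}
```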
Tunnah - Wednesday, September 9, 2020 - link
Started reading this article thinking "huh it's almost weird that anandtech explains things, this is really newb friendly" and then as it progressed I got more and more confused. Now I have a headache lol
eastcoast_pete - Wednesday, September 9, 2020 - link
Thanks Andrei! Interesting technology, not sure about the implementation in the Note 20 Ultra. That picture of the selection menu already shows the problem: 60Hz refresh without any dynamic adjustment is pointed to as consuming less power. I was/am hoping that having true VRR will save power over 60 Hz all the time; as your data show, that's indeed not what's happening here.
steevedavidson - Friday, September 11, 2020 - link
Will it support 5G networks as well?
https://www.celluniverse.ca/cell-phone-repair-vanc...
mcdonsco - Wednesday, December 16, 2020 - link
Curious: if you set the panel to 60Hz, is it also variable then, or does it lock at 60Hz no matter what is on the screen?
tribalfs - Thursday, January 28, 2021 - link
The ambient thresholds are in place to avoid visible flicker when the display switches refresh rates. The thresholds depend on how Samsung has fine-tuned the gamma curves across different refresh rates. Darker environments have a higher threshold since human eyes are more sensitive to flicker in darker environments.
consultant1027 - Wednesday, March 24, 2021 - link
First, you have a typo under the section with the bold headline "The Effect on Battery Life": you incorrectly reference the S20 Ultra when that should be the Note 20 Ultra.

I also think there's a secondary question to be asked, which is: aside from gaming applications, how easily (if at all) can an average user with typical usage perceive any significant difference between 60 and 120Hz, and if it is perceptible, what is the actual tangible benefit that would outweigh having a longer battery life using 60Hz?
It seems common with a lot of technology that, due to marketing strategies, consumers become somewhat obsessed with technical specifications while failing to ask themselves: how much does it REALLY matter?
I have an S20+ that I run at 60Hz because I don't play games and I want the longest battery life. When I switch to 120Hz I can't tell any significant difference in reading my email, browsing a web page, checking the weather forecast, taking a photo, sending/receiving text messages. So there's no practical reason to even use 120Hz.