Original Link: https://www.anandtech.com/show/7192/update-on-gpu-optimizations-galaxy-s-4

Yesterday we posted our analysis of the Exynos 5 Octa's behavior in certain benchmarks on international versions of the Galaxy S 4, first discovered by Beyond3D user @Andreif7. Samsung addressed the issue on its blog earlier today:

Under ordinary conditions, the GALAXY S4 has been designed to allow a maximum GPU frequency of 533MHz. However, the maximum GPU frequency is lowered to 480MHz for certain gaming apps that may cause an overload, when they are used for a prolonged period of time in full-screen mode. Meanwhile, a maximum GPU frequency of 533MHz is applicable for running apps that are usually used in full-screen mode, such as the S Browser, Gallery, Camera, Video Player, and certain benchmarking apps, which also demand substantial performance.

The maximum GPU frequencies for the GALAXY S4 have been varied to provide optimal user experience for our customers, and were not intended to improve certain benchmark results.

Samsung Electronics remains committed to providing our customers with the best possible user experience.

The blog post seems to confirm our findings that the 533MHz GPU frequency is available for certain benchmarks ("a maximum GPU frequency of 533MHz is applicable for running apps that are usually used in full-screen mode, such as the S Browser, Gallery, Camera, Video Player, and certain benchmarking apps"). The full-screen statement doesn't make a ton of sense (both GLBenchmark 2.5.1 and 2.7.0 are full-screen apps, but with different GPU behavior). Samsung claims, however, that a number of its first party apps (S Browser, Gallery, Camera and Video Player) can also run the GPU at up to 532MHz, which actually explains something else we saw while digging around.

Looking at resources.arc inside TwDVFSApp.apk, we find references to Samsung's pre-loaded apps.
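
Pulling those strings out yourself is straightforward if you want to verify this on your own device. The sketch below is just that, a rough sketch: it assumes the APK has already been pulled off the phone, that the compiled resource table sits at the standard resources.arsc entry, and the package-name filters are illustrative rather than a confirmed list of what Samsung ships.

```python
import re
import zipfile

APK_PATH = "TwDVFSApp.apk"          # pulled from the device beforehand, e.g. via adb
RESOURCE_ENTRY = "resources.arsc"   # standard name of the compiled resource table

with zipfile.ZipFile(APK_PATH) as apk:
    blob = apk.read(RESOURCE_ENTRY)

# The resource table is a binary string pool; rather than writing a full parser,
# a crude scan for printable strings (in both UTF-8 and UTF-16LE encodings) is
# enough to surface embedded package names.
ascii_strings = re.findall(rb"[\x20-\x7e]{8,}", blob)
utf16_strings = re.findall(rb"(?:[\x20-\x7e]\x00){8,}", blob)

found = {s.decode("ascii") for s in ascii_strings}
found |= {s.decode("utf-16-le") for s in utf16_strings}

# Illustrative filters only: surface anything that looks like an app package
# name or a benchmark reference.
for text in sorted(found):
    if text.startswith("com.") or "benchmark" in text.lower():
        print(text)
```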

This is what we originally assumed was happening with GLBenchmark 2.5.1 (that it was broadcasting a boost intent to Samsung's DVFS software), but Kishonti told us that wasn't the case when we asked.

As we mentioned in our original piece, there's a flag that's set whenever this boost mode is activated: /sys/class/thermal/thermal_zone0/boost_mode. None of the first party apps get that flag set, only the specific benchmarks we pointed out in the original article. Of those first party apps, S Browser, Gallery and Video Player all top out at a GPU frequency of 266MHz (which makes sense, as none of these apps are particularly GPU intensive). To confirm, I ran WebGL content in S Browser (the Aquarium demo with 500 fish) and edited some photos in Gallery - 266MHz was the max observed GPU frequency in both cases.
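
For reference, checking this is just a matter of polling the relevant sysfs nodes while exercising each app. The snippet below is a minimal sketch of that loop; the boost_mode path is the one mentioned above, but the GPU clock node is a placeholder (the exact path varies by device and kernel build), and the script assumes it is run from a root shell on the device.

```python
import time

BOOST_FLAG = "/sys/class/thermal/thermal_zone0/boost_mode"
GPU_CLOCK  = "/sys/devices/platform/gpusysfs/gpu_clock"   # placeholder path, varies by device/kernel

def read_node(path):
    # Each sysfs node exposes a single value as text.
    with open(path) as f:
        return f.read().strip()

while True:
    print(f"boost_mode={read_node(BOOST_FLAG)}  gpu_clock={read_node(GPU_CLOCK)}")
    time.sleep(0.5)   # sample twice a second while exercising the app
```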

Max Observed GPU Frequency

                 | S Browser | Gallery | Video Player | Camera | Modern Combat 4 | AnTuTu
Samsung GT-I9500 | 266MHz    | 266MHz  | 266MHz       | 532MHz | 480MHz          | 532MHz

The camera app on the other hand is a unique example. Here we see short blips up to 532MHz if you play around with the live filters aggressively (I just quickly tapped between enabling all of the filters; the rugged/oil pastel/fish eye filters seemed to be the most likely to trigger a 532MHz excursion). I never saw 532MHz held for more than a second though. The chart below plots GPU frequency vs. sampling time to illustrate what I saw there:
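
A chart like that is easy to reproduce from polled samples. The sketch below assumes a simple two-column log (elapsed seconds and MHz per line), which is an illustrative format rather than the output of our actual tooling.

```python
import matplotlib.pyplot as plt

times, freqs = [], []
with open("gpu_freq_samples.log") as f:
    for line in f:
        # Assumed log format: "<elapsed_seconds> <frequency_mhz>" per line
        t, mhz = line.split()
        times.append(float(t))
        freqs.append(int(mhz))

# A step plot makes the discrete DVFS transitions (and brief 532MHz blips) obvious.
plt.step(times, freqs, where="post")
plt.xlabel("Sampling time (s)")
plt.ylabel("GPU frequency (MHz)")
plt.title("Camera app: GPU frequency vs. sampling time")
plt.show()
```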

What appears to be happening here is that the boost routine grants access to raised thermal limits, which makes it possible to sustain the 532MHz GPU frequency for the duration of certain benchmarks. Since the camera app doesn't get the boost mode flag, it doesn't seem to sustain 532MHz - only 480MHz.

Otherwise the Samsung response is consistent with our findings and we're generally in agreement. Games aren't given access to the 532MHz GPU frequency, while certain benchmarks are. Samsung's pre-loaded apps can send a boost intent to the DVFS (dynamic voltage and frequency scaling) controller, but they don't appear to be given the same thermal boost ability as we see in the benchmarks. Samsung's reasoning for not giving games access to the higher frequency makes sense as well. Higher frequencies typically require higher voltage to reach, and power scales quadratically with voltage, so that's typically not the best way of increasing performance - especially at the limits of one's frequency/voltage curve. In short, you wouldn't want to run games at the higher frequency for thermal (and battery life) reasons. The debate is ultimately about what happens within those specified benchmarks.
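
To put rough numbers on that argument: dynamic power scales roughly with frequency times voltage squared, so the last step up the frequency curve is disproportionately expensive. The figures below use assumed voltages purely for illustration, not values from Samsung's actual DVFS tables.

```python
# Back-of-envelope illustration: dynamic power ~ f * V^2.
# The voltages are assumed placeholders, not Samsung's real operating points.
def relative_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

base  = relative_power(480, 1.00)   # assumed voltage at 480MHz
boost = relative_power(533, 1.10)   # assumed (higher) voltage at 533MHz

print(f"Frequency gain: {533 / 480 - 1:.1%}")             # ~11% more frequency
print(f"Dynamic power increase: {boost / base - 1:.1%}")  # ~34% more power
```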

Note that we're ultimately talking about an optimization that would increase GPU performance in certain benchmarks by around 10%. It doesn't sound like a whole lot, but in a very thermally limited scenario it's likely the best you can do. 

I stand by the right solution here being to either allow the end user to toggle this boost mode on/off (for all apps) or to remove the optimization entirely. I suspect the reason Samsung wouldn't want to do the former is that you honestly don't want to run a phone in a less thermally constrained mode for extended periods of time. Long term I'm guessing we'll either see the optimization removed or see access to the current GPU clock obscured. I really hope it's not the latter, as we as an industry need more insight into what's going on underneath the hood of our mobile devices, not less. Furthermore, I think this also highlights a real issue with the way DVFS management is done in the mobile space today. Software is absolutely the wrong place to handle DVFS; it needs to be done in hardware.

Since our post yesterday we've started finding other devices that exhibit the same CPU frequency behavior we reported on the SGS4. This isn't the most significant part of the discovery, since the CPU frequencies offered are available to all apps (not just specific benchmarks). We'll be posting our findings there in the near future.
