


One of the biggest announcements from Huawei this year is that of its new GPU Turbo technology. The claim that it could provide more performance at lower power, without any hardware change, gave us quite a bit of pause – internally, more than a few eyebrows were raised. Through our discussions with Huawei at IFA this year, as well as some pre-testing, we now have a base understanding of the technology, as well as additional insight into some of the marketing tactics – not all of which are the most honest representations of the new feature.

GPU Turbo: A Timeline

GPU Turbo is billed as a new mechanism that promises significant performance and power improvements on both new and existing devices. The ‘technology’ was first introduced in early June with the Chinese release of the Honor Play, and will be updated to a version ‘2.0’ with the launch of EMUI 9.0 later in the year.

Over the next few months Huawei plans to release the technology on all of its mainstream devices, as well as going back through its catalogue. Huawei promises that all devices, irrespective of hardware, will be able to benefit.

GPU Turbo Rollout
Month | Huawei | Honor
July/August | Mate 10, Mate 10 Pro, Mate 10 Porsche Design, Mate RS Porsche Design, P20, P20 Pro | Honor 10, Honor Play, Honor 9 lite
September | P20 lite, P smart, Nova 2i, Mate 10 lite, Y9 | Honor View 10, Honor 9
October | - | -
November | Mate 9, Mate 9 Pro, P10, P10 Plus | Honor 8 Pro
December | - | Honor 7x

In the weeks following the release of GPU Turbo on the first few devices, we saw a great deal of hype and marketing effort from Honor and Huawei to promote the feature. Across all the presentations, press releases, promoted articles, and fuzzy press analysis, one important thing was consistently missing: a technical explanation of what GPU Turbo actually is and how it works. Everything was about results, and nothing was about details. At AnandTech, it’s the details that determine whether we consider a new feature genuine or not.

Huawei, to its credit, did reach out to us, but neither the company nor its PR ever really responded when we asked for a more technical briefing on the new mechanism. We’re not sure what the reason was for this, as historically the company has often been open to technical discussions. On the plus side, at this year’s IFA we finally had the chance to meet with a team of Huawei’s hardware and software engineers and managers.

Through these discussions, we finally obtained detailed explanations that make sense of the past few months’ marketing claims. The upside is that we now have a better understanding of what GPU Turbo actually does (and it makes sense); the downside is that it puts a chunk of the marketing slides on the ignore pile.

In this first piece on GPU Turbo, we’re going to go through the story of the feature in stages. First, we’ll look at Huawei’s initial claims about the technology, specifically the numbers. Second, we’ll go deeper into what GPU Turbo actually does. Third, we’ll examine the devices we do have with the technology, to see what differences we can observe. Finally, we’ll address the marketing, which really needs to be scrutinized.

It should be noted that, time and resources permitting, we want to go deeper into GPU Turbo. With the upcoming launch of the Mate 20 and the new Kirin 980 SoC inside it, we will want to do a more detailed analysis with more data. This is only the beginning of the GPU Turbo story.



The Claimed Benefits of GPU Turbo: Huawei’s Figures

Some of the most popular and widely quoted figures from the marketing slides during the release of GPU Turbo were the claims that the new feature would somehow achieve up to 60% higher performance while also reducing SoC power by up to 30%. Huawei (and Honor) are very keen on these numbers, and show the results in the best possible light.

These figures were so widespread that they were reiterated again at last Thursday’s IFA global launch of the Honor Play. Over the last few days, I’ve seen plenty of publications reporting these figures and attributing them to the benefits of GPU Turbo. As of yet, however, no real analysis of them has taken place.

Most slides we have seen look something like this, making it very easy to just put numbers into an article:

As AnandTech is very much a technically oriented publication, these numbers looked suspicious to us from the get-go. We take pride in our own independent analysis, and in checking whether promoted numbers are even remotely achievable – after all, figures like these run counter to common-sense engineering progress. This is exactly where the marketing numbers fall apart. More on that later.

So What is GPU Turbo? We Finally Have Some Clarity

When we initially went looking for a detailed explanation of GPU Turbo, I (Andrei) first attempted to figure out for myself what it does. The lack of details, aside from projected performance improvements, is a very weak place to start. It also ended up being quite a nightmare of a task for one specific reason: despite years of allowing devices to be rooted, at around the same time as GPU Turbo was launched, Huawei and Honor stopped allowing bootloader unlocking. This prevents users from modifying the system firmware, and it prevented us from rooting the devices to help with benchmark profiling. Furthermore, all new firmware that ships with GPU Turbo is said to be vastly more locked down. It has not been made clear whether the two items are directly related, though anyone who likes a good conspiracy might be inclined to think so.

In the context of GPU Turbo, my plan had been to profile the GPU via Arm’s Streamline tool, as this would give us exact information on the workloads the GPU is processing and let us deeply analyze what is going on under the hood. Either intentionally or unintentionally, the lockdown of the bootloader prevents us from doing this, so unfortunately this approach was a dead end in my testing.

I had started writing this article without any further detailed explanation in hand; however, we met with the EMUI software team at IFA and were finally able to get a more comprehensive explanation of what GPU Turbo is. We discussed the technology with both the hardware and software teams, and had very different conversations with each.

The hardware team – specifically HiSilicon – made it clear that this is purely a software technology. The mechanisms in GPU Turbo are aided by the controls the hardware puts in place, but the actual way GPU Turbo works is all down to software. This is good to know, and also explains why Huawei is able to roll it out across all of its smartphone range. It is also not tied to the NPU, although having an NPU in the mix apparently does help.

However, the first public hint of what it actually does came in Friday’s Kirin 980 keynote presentation, where it was referred to as “AI Loading Prediction Technology”.

What the slide tries to convey is that GPU Turbo allows the SoC to follow the compute workload more closely, adjusting the hardware performance states to better match demand. For example, when the CPU needs more power than the GPU, the available power budget can be rebalanced between the two, improving both performance and efficiency.

We go into a bit more depth on the next page, where we finally have a good explanation of the mechanism. We have to thank Dr. Wang Chenglu, Huawei’s President of CBG (Smartphone) Software, for this information.



The Detailed Explanation of GPU Turbo

Under the hood, Huawei uses TensorFlow neural network models that are pre-trained by the company on a title-by-title basis. By examining the title in detail, over many thousands of hours (real or simulated), the neural network can build its own internal model of how the game runs and its power/performance requirements. The end result can be put into one dense sentence:

Optimized Per-Device Per-Game DVFS Control using Neural Networks

In the training phase, the network analyzes and adjusts the SoC’s DVFS parameters in order to achieve the best possible performance while minimizing power consumption. This entails selecting the lowest DVFS states on the CPUs, GPU, and memory controllers that still allow the game to hit 60fps, without going to any higher state than is necessary (in other words, minimizing performance headroom). The end result is that for every unit of work the CPU/GPU/DRAM has to do or manage, the corresponding hardware block gets exactly the amount of power it needs. This has a knock-on effect for both performance and power consumption, but mostly the latter.
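
To make this optimization target concrete, below is a minimal sketch in Python of the search the training phase effectively has to solve: pick the lowest-power combination of CPU/GPU/DRAM operating points that still sustains 60fps. To be clear, the operating-point tables, the toy frame-time model, and all numbers here are hypothetical placeholders of our own, not Huawei’s actual values.

```python
# Hypothetical sketch of the training-time objective described above: choose
# the lowest-power CPU/GPU/DRAM DVFS states that still sustain 60 fps.
# All operating points and the frame-time model are illustrative placeholders.
from itertools import product

# (frequency in MHz, estimated power cost in mW) per domain - made-up numbers
CPU_STATES = [(1402, 350), (1844, 600), (2362, 1100)]
GPU_STATES = [(403, 500), (556, 900), (747, 1600)]
DRAM_STATES = [(800, 200), (1200, 350), (1866, 600)]

TARGET_FRAME_TIME_MS = 1000.0 / 60.0  # 60 fps target -> ~16.67 ms per frame


def predicted_frame_time_ms(workload, cpu, gpu, dram):
    """Toy frame-time model: work in each domain scales inversely with clock.
    A real training setup would measure this on actual hardware instead."""
    cpu_ms, gpu_ms, dram_ms = workload  # ms of work at the maximum clocks
    return (cpu_ms * CPU_STATES[-1][0] / cpu[0]
            + gpu_ms * GPU_STATES[-1][0] / gpu[0]
            + dram_ms * DRAM_STATES[-1][0] / dram[0])


def best_dvfs_states(workload):
    """Exhaustively search for the lowest-power state combination that still
    meets the 60 fps target; the network effectively learns this mapping so
    it never has to be searched at runtime."""
    best, best_power = None, float("inf")
    for cpu, gpu, dram in product(CPU_STATES, GPU_STATES, DRAM_STATES):
        if predicted_frame_time_ms(workload, cpu, gpu, dram) > TARGET_FRAME_TIME_MS:
            continue  # this combination would miss 60 fps
        power = cpu[1] + gpu[1] + dram[1]
        if power < best_power:
            best, best_power = (cpu, gpu, dram), power
    return best, best_power


# Example: a scene costing 5 ms of CPU, 7 ms of GPU and 2 ms of DRAM work at
# maximum clocks leaves headroom, so lower (cheaper) states can be selected.
states, power_mw = best_dvfs_states((5.0, 7.0, 2.0))
print("chosen (MHz, mW) states:", states, "-> total", power_mw, "mW")
```

A brute-force search like this would be far too slow to run every frame, which is precisely why the mapping is distilled into a neural network that can be evaluated very quickly at runtime.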

The resulting model is then included in the firmware for devices that support GPU Turbo. Each title has a specific network model for each smartphone, as the workload varies with the title and the available resources vary with the phone model. As far as we understand the technology, on the device itself there appears to be an interception layer between the application and the GPU driver which monitors render calls. These serve as inputs to the neural network model. Because the network model was trained to output the DVFS settings that are optimal for a given scene, the GPU Turbo mechanism can apply this immediately to the hardware and adjust the DVFS accordingly.
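
As a rough illustration of how such a per-frame control loop could be structured (interception of render calls, feature extraction, model inference, DVFS update), here is a hypothetical sketch; every class, function, and feature in it is an illustrative stand-in, since the actual interception layer and models are proprietary.

```python
# Illustrative per-frame loop: intercept render calls, condense them into
# features, run the per-game network, and apply the predicted DVFS states.
# Every name in this sketch is a hypothetical stand-in for proprietary parts.
import numpy as np


class FakePerGameModel:
    """Stand-in for the pre-trained, per-game network shipped in firmware."""

    def predict(self, features: np.ndarray) -> np.ndarray:
        # Returns indices into the CPU/GPU/DRAM DVFS tables (0..2 here).
        return np.clip(np.round(features[:3] * 2), 0, 2)


def extract_features(render_calls) -> np.ndarray:
    """Condense the intercepted draw calls into model inputs, e.g. draw-call
    count, triangle count, and an estimate of texture traffic."""
    draws = len(render_calls)
    triangles = sum(call.get("triangles", 0) for call in render_calls)
    texture_mb = sum(call.get("texture_mb", 0.0) for call in render_calls)
    return np.array([draws / 500.0, triangles / 1e6, texture_mb / 100.0])


def apply_dvfs(cpu_idx: int, gpu_idx: int, dram_idx: int) -> None:
    """A real implementation would write to the kernel's DVFS interfaces;
    here we simply log the decision."""
    print(f"DVFS -> CPU state {cpu_idx}, GPU state {gpu_idx}, DRAM state {dram_idx}")


def on_frame(render_calls, model=FakePerGameModel()) -> None:
    """Called by the (hypothetical) interception layer once per frame."""
    cpu_idx, gpu_idx, dram_idx = model.predict(extract_features(render_calls)).astype(int)
    apply_dvfs(cpu_idx, gpu_idx, dram_idx)


# Example: one intercepted frame consisting of a few hundred draw calls.
on_frame([{"triangles": 3000, "texture_mb": 0.5}] * 350)
```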

For SoCs that have one, the inferencing (execution) of the network model is accelerated by the SoC’s own NPU, which allows for extremely fast prediction; where GPU Turbo is introduced on SoCs that don’t sport an NPU, a CPU software fallback is used. One thing I do have to wonder about is just how much rendering latency this induces, although it can’t be much, and Huawei says it has focused heavily on this area of the implementation. Huawei confirmed that these models are all 16-bit floating point (FP16), which means that for future devices like the Kirin 980, further optimization might come from using INT8 models that take advantage of the new NPU’s support for them.
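
For reference, this is roughly what exporting such a model as FP16 looks like with TensorFlow Lite’s standard post-training quantization; the tiny network below is a made-up placeholder, and an INT8 export would additionally require a representative calibration dataset. We do not know what toolchain Huawei actually uses.

```python
import tensorflow as tf

# Placeholder per-game network: a handful of per-frame features in, predicted
# DVFS settings for CPU/GPU/DRAM out. Purely illustrative, not Huawei's model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3),
])

# Standard TensorFlow Lite FP16 post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_types = [tf.float16]

with open("per_game_dvfs_model_fp16.tflite", "wb") as f:
    f.write(converter.convert())
```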

Because GPU Turbo is in effect a DVFS mechanism that works in conjunction with the rendering pipeline and at a much finer granularity, it’s able to predict the hardware requirements for the coming frame and adjust accordingly. This is how GPU Turbo is able to claim much reduced performance jitter versus more conventional "reactive" DVFS drivers, which just monitor the GPU utilization rate via hardware counters and adapt after the fact.
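
For contrast, a conventional reactive GPU governor looks something like the sketch below: it reads a utilization counter for an interval that has already elapsed and then steps the clock up or down, so it inevitably lags behind sudden workload changes. The frequency table and thresholds are illustrative, not taken from any real driver.

```python
# Minimal sketch of a conventional "reactive" GPU DVFS governor: it looks at
# how busy the GPU was over the interval that has already passed and steps
# the clock up or down after the fact. Frequencies and thresholds are made up.
GPU_FREQS_MHZ = [403, 556, 747]
UP_THRESHOLD = 0.90    # step up when the GPU was busier than this
DOWN_THRESHOLD = 0.60  # step down when the GPU was idler than this


def reactive_governor(current_idx: int, busy_fraction: float) -> int:
    """Return the next frequency index based on the previous interval only."""
    if busy_fraction > UP_THRESHOLD and current_idx < len(GPU_FREQS_MHZ) - 1:
        return current_idx + 1
    if busy_fraction < DOWN_THRESHOLD and current_idx > 0:
        return current_idx - 1
    return current_idx


# Example: a sudden load spike is only answered one interval later, and the
# governor climbs one step at a time - the kind of lag GPU Turbo tries to avoid.
idx = 0
for busy in [0.55, 0.58, 0.97, 0.99, 0.99, 0.70]:
    idx = reactive_governor(idx, busy)
    print(f"observed busy={busy:.2f} -> next interval at {GPU_FREQS_MHZ[idx]} MHz")
```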

Thoughts After A More Detailed Explanation

What Huawei has done here is certainly an interesting approach with clear potential for real-world benefits. We can see how distributing resources optimally across the available hardware within a limited power budget will help performance, efficiency, and power consumption – already a careful balancing act in smartphones. So the detailed explanation makes a lot of technical sense, and we have no issues with it at all. It’s a very impressive feat that could have ramifications in a much wider technology space, eventually including PCs.

The downside to the technology is its per-device and per-game nature. Huawei did not go into detail about how long it took to train a single game: the first version of GPU Turbo supports PUBG and a Chinese game called Mobile Legends: Bang Bang. The second version, coming with the Mate 20, includes NBA 2K18, Rules of Survival, Arena of Valor, and Vainglory.

Technically the granularity is per-SoC rather than per-device, although different devices will have different thermal or memory performance limits. It is obvious that while Huawei is very proud of the technology, this is a slow, per-game rollout. There is no silver bullet here – the ideal end goal would be a single optimized network that can deal with every game on the market, but until then every other title has to rely on the default DVFS mechanisms to get the job done.

Huawei is going after its core gaming market first with GPU Turbo, which means plenty of Battle Royale and MOBA action, like PUBG and Arena of Valor, as well as tie-ins with companies like EA/Tencent for NBA 2K18. I suspect on the back of this realization, some companies will want to get in contact with Huawei to add their title to the list of games to be optimized. Our only request is that you also include tools so we can benchmark the game and output frame-time data, please!

On the next page, we go into our analysis on GPU Turbo with devices on hand. We also come across an issue with how Arm’s Mali GPU (used in Huawei Kirin SoCs) renders games differently to Huawei’s competitor devices.



The Difficulty in Analyzing GPU Turbo

I still haven’t managed to get two identical devices with and without GPU Turbo. The closest practical comparison I was able to make is between the Huawei P20 and the Honor Play. These are two devices that use the same SoC and memory, albeit in different chassis.

The differences between the two phones go beyond the introduction of GPU Turbo: the Honor Play also ships with a newer Arm Bifrost driver, r12p0, while the P20 had the r9p0 release. Unfortunately no mobile vendor publishes driver release notes, so we can’t differentiate between possible improvements on the GPU driver side and actual improvements that GPU Turbo brings.


Huawei P20 (no GPU Turbo)


Honor Play (GPU Turbo)

For raw frame rates, it was extremely hard to tell the two phones apart, as PUBG tops out at 40 FPS on both. We could have invested a lot more time inspecting frame-time jitter and just how noticeable it is in practice, but one thing that can be empirically measured is power consumption.

Here the Honor Play did seemingly have an advantage, coming in at ~3.9W while rendering the above scene, a tad less than the P20’s ~4.7W. These figures are total device power, and obviously the screen and the rest of the device components differ between the two models. It nonetheless represents around a 15% difference in power, although to be clear we can't rule out the possibility that the two chips are simply different bins, i.e. they have different power/voltage characteristics due to random manufacturing variance, which is common in this space.

Huawei has quoted data for the Kirin 980:

Still, it does very much look like GPU Turbo has an efficiency advantage; however, the 10% figure presented during the Kirin 980 keynote seems a lot closer to reality than the 30% promised in the marketing materials.

GPU Turbo Is Real, Just Be Wary of Marketing Numbers

One thing that should not be misunderstood from this article is that GPU Turbo is not just a marketing ploy; rather, it is a very real and innovative solution that tries to address the weaknesses of the current generation of Kirin chipsets. Kirin still sits well behind the performance and efficiency of Snapdragon-based Adreno graphics, and because Huawei cannot license Adreno, it has to make the best of what it has, short of dedicating more die space to its GPUs.

However, much of the technical merit of GPU Turbo has been overshadowed by overzealous marketing claims that are nothing short of misleading. More on this on the next page.

Being a software solution, GPU Turbo is something that augments the hardware, and if the hardware can’t deliver, neither can the software. A lot of the confusion and misleading material can be directly attributed to the way the Honor Play was presented to the public. The reality is that, even with GPU Turbo, the Honor Play is still not competitive with Snapdragon 845 devices, even though it wants to portray itself as such. The differences in silicon are simply too great to be overcome by a software optimization, no matter how innovative the new mechanism is.



Problems with PUBG: Not All GPUs Render Equally

During our testing with PUBG, we stumbled across a particularly alarming scenario of a kind we never really see with standardized testing. When comparing Snapdragon to Kirin, trying to observe Huawei's quoted performance differences, there appears to be a major difference between what Adreno-powered phones render and what Mali-powered phones render and display.

Looking in more detail, it’s very obvious that the OnePlus 6 tested here (a Snapdragon/Adreno phone) produced far better image quality than the other phones.


There are two notable characteristics. First of all, the Adreno render is simply a lot sharper. It looks like the game uses a very different image scaling algorithm. For equality testing, we set the rendering resolution to 720p and upscaled to 1080p on all of the phones. While the Adreno shows up as relatively sharp, the Mali phones are seemingly quite blurry, and this is actually also noticeable on the phone when playing.

The second noticeable element, and arguably more important, is that the Adreno phone actually has anisotropic texture filtering enabled, while the Mali devices are seemingly ignoring it and falling back to bilinear filtering. In a game like PUBG, this is also very noticeable when playing and creates quite big picture quality differences. This also puts quite a differential load on the graphics, resulting in an apples-to-oranges comparison.

Consequently, Huawei’s GPU Turbo marketing comparisons to the competition are also questionable: the anisotropic filtering issue can impact framerates by as much as 16-18% on its own. Because the Mali GPU devices have this issue, it creates a very unequal comparison when diagnosing performance at this level of detail. It means that out of the gate, the performance of the Mali phones is already up 16-18%, but at the expense of quality. (Ed: We used to see this a lot in the PC space over 10 years ago, where different GPUs would render different paths or have ‘tricks’ to reduce the workload. They don’t anymore.)

It should also be noted that while the Mali devices should in fact have a workload advantage, given that they’re not doing nearly as much texture filtering work as the Adreno, the performance and efficiency of the Adreno smartphones is still better. Admittedly the performance differences are minor, given that the game caps out at 40fps at maximum quality. That only leaves power efficiency as the metric.

For power efficiency, even with the difference in rendering paths and quality, the Snapdragon 845 phones have a massive advantage, playing the game at 2.5-3W with AF enabled, while the Kirin 970 phones routinely average 4-4.5W. The higher power consumption and lower efficiency mean that battery life on those devices will take a hit.
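
To put rough numbers on that gap, taking the midpoints of the power ranges above and assuming both platforms sit at the game’s 40fps cap: the Snapdragon 845 phones land at roughly 40 fps / 2.75 W ≈ 14.5 fps/W, versus roughly 40 fps / 4.25 W ≈ 9.4 fps/W for the Kirin 970 phones. That works out to around 35% lower gaming efficiency for the Kirin devices, before even accounting for the missing anisotropic filtering.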

Real World vs. Synthetic Testing

While I fully understand Huawei’s focus on real-world performance comparisons in PUBG rather than synthetic benchmarks, we use synthetic benchmarks to determine the veracity of new features for a good reason – they are industry standards and well understood. Honor’s and Huawei’s marketing focus on PUBG seems poorly thought out when it comes to actual technical comparisons in that regard, which we address on the next page.

There is the added aspect of different GPUs not even rendering the same graphics path, as described above: the fact that Adreno GPUs apply anisotropic filtering and use higher quality image scaling effectively means they’re running at a noticeably higher image quality level. This is not taken into account in the performance and efficiency comparisons in Huawei’s materials, making those materials a lot less credible.

The Bottom Line

Still, GPU Turbo is a promising new technology that will give Huawei a competitive edge, all other things being equal. The sad fact is that for the Kirin 960 and Kirin 970, things are not equal. The competitive landscape will change a lot with the Kirin 980, but until then, users of current generation devices need to have a clear understanding of, and realistic expectations for, what GPU Turbo can actually bring to the table.



The Minor Issue of Overzealous Marketing

As mentioned earlier in the piece, the most common numbers from Huawei and Honor about the new technology follow the same pattern: GPU Turbo is going to offer up to 60% extra performance and 30% lower power consumption. Since launch, out of all the marketing materials we have seen, there is exactly one instance where either company expands on these figures. It is in the footnotes of the Honor Play’s English global product page, explaining the context of the 60%/30% numbers:

Honor Play's Product Website GPU Turbo Explanation

Here is what that tiny bullet point says:

*2 The GPU Turbo is a graphics processing technology that is based on Kirin chips and incorporates mutualistic software and hardware interaction. And it supports some particular games. 
Results are based on comparison with the previous generation chip, the Kirin 960.

This is a big red flag. Normally when promoting a new technology, the performance difference should be quoted with the feature off versus on. So it shouldn’t be hard to see why using the Kirin 960 as the baseline is a massive issue: it means the marketing materials are mixing up their claims – gains that are explicitly attributed to GPU Turbo, a software technology, are conflated with silicon improvements between two generations of chipsets.

The honest comparison would be the Kirin 970 with GPU Turbo off against the Kirin 970 with GPU Turbo on. Instead, the baseline result is the Kirin 960 with no GPU Turbo, compared against the newer Kirin 970 with GPU Turbo on.

For our readers unfamiliar with the generational improvements of the newer Kirin 970 chipset, I recommend referring back to our in-depth review of the chipset, published back in January. In terms of advancements, the Kirin 970 brings a new Mali-G72MP12 GPU running at 747MHz, manufactured on a new TSMC 10nm process. This represented quite an improvement over the 16nm-manufactured Kirin 960, which featured a Mali-G71MP8 at up to 1037MHz.

AnandTech | Kirin 970 | Kirin 960
Mfg. Process | TSMC 10FF | TSMC 16FFC
CPU | 4x A73 @ 2.36 GHz + 4x A53 @ 1.84 GHz | 4x A73 @ 2.36 GHz + 4x A53 @ 1.84 GHz
GPU | Mali-G72MP12 @ 746 MHz | Mali-G71MP8 @ 1035 MHz
NPU | Yes | No
Modem | Cat 18/13 | Cat 12/13

Furthermore, the Kirin 960’s GPU performance and efficiency were extremely problematic, showcasing some of the worst behavior we’ve seen in any smartphone ever released. We’re not going to rehash why this happened, but it was a competitive blow to the Kirin 960.

Now, the Kirin 970 improved from these low figures, as we’ve shown in our reviews. But the 60% performance improvement and 30% power improvement quoted for GPU Turbo, while they might sound impressive in isolation, aren't nearly as impressive once we know what they're based on. Being measured relative to the badly performing Kirin 960 completely changes their meaning. Users who enable GPU Turbo on their devices will not experience a 60%/30% difference in performance.

This also flies in the face of Huawei’s own data presented throughout the lifetime of GPU Turbo. Quoting 60%/30% makes for impressive headlines (regardless of how honest they are), yet even Huawei’s own analysis shows that those figures are wildly optimistic:

Ultimately, Huawei presented the 60%/30% figures as the differential between GPU Turbo on and off. Anyone expecting that on their device will be sorely disappointed. The fact that the companies obfuscated the crucial comparison point of the Kirin 960 is almost unreal in that respect.

On the image above, we also have to heavily criticize the fact that the bar charts misrepresent the gains: the 3 FPS gain in PUBG is drawn as if it were a 25% improvement. Companies misrepresent growth like this because it makes for a more impressive graph, rather than adhering to the standard of starting graph axes at zero.

Why Using The Kirin 960 Is An Issue: Starting With A Low Bar

Going back to our GPU power efficiency tables measured in GFXBench Manhattan 3.1 and T-Rex, we put the two chipsets back into context:

GFXBench Manhattan 3.1 Offscreen Power Efficiency (System Active Power)
Device (SoC) | Mfc. Process | FPS | Avg. Power (W) | Perf/W Efficiency
Galaxy S9+ (Snapdragon 845) | 10LPP | 61.16 | 5.01 | 11.99 fps/W
Galaxy S9 (Exynos 9810) | 10LPP | 46.04 | 4.08 | 11.28 fps/W
Galaxy S8 (Snapdragon 835) | 10LPE | 38.90 | 3.79 | 10.26 fps/W
LeEco Le Pro3 (Snapdragon 821) | 14LPP | 33.04 | 4.18 | 7.90 fps/W
Galaxy S7 (Snapdragon 820) | 14LPP | 30.98 | 3.98 | 7.78 fps/W
Huawei Mate 10 (Kirin 970) | 10FF | 37.66 | 6.33 | 5.94 fps/W
Galaxy S8 (Exynos 8895) | 10LPE | 42.49 | 7.35 | 5.78 fps/W
Galaxy S7 (Exynos 8890) | 14LPP | 29.41 | 5.95 | 4.94 fps/W
Meizu PRO 5 (Exynos 7420) | 14LPE | 14.45 | 3.47 | 4.16 fps/W
Nexus 6P (Snapdragon 810 v2.1) | 20Soc | 21.94 | 5.44 | 4.03 fps/W
Huawei Mate 8 (Kirin 950) | 16FF+ | 10.37 | 2.75 | 3.77 fps/W
Huawei Mate 9 (Kirin 960) | 16FFC | 32.49 | 8.63 | 3.77 fps/W
Huawei P9 (Kirin 955) | 16FF+ | 10.59 | 2.98 | 3.55 fps/W
GFXBench T-Rex Offscreen Power Efficiency (System Active Power)
Device (SoC) | Mfc. Process | FPS | Avg. Power (W) | Perf/W Efficiency
Galaxy S9+ (Snapdragon 845) | 10LPP | 150.40 | 4.42 | 34.00 fps/W
Galaxy S9 (Exynos 9810) | 10LPP | 141.91 | 4.34 | 32.67 fps/W
Galaxy S8 (Snapdragon 835) | 10LPE | 108.20 | 3.45 | 31.31 fps/W
LeEco Le Pro3 (Snapdragon 821) | 14LPP | 94.97 | 3.91 | 24.26 fps/W
Galaxy S7 (Snapdragon 820) | 14LPP | 90.59 | 4.18 | 21.67 fps/W
Galaxy S8 (Exynos 8895) | 10LPE | 121.00 | 5.86 | 20.65 fps/W
Galaxy S7 (Exynos 8890) | 14LPP | 87.00 | 4.70 | 18.51 fps/W
Huawei Mate 10 (Kirin 970) | 10FF | 127.25 | 7.93 | 16.04 fps/W
Meizu PRO 5 (Exynos 7420) | 14LPE | 55.67 | 3.83 | 14.54 fps/W
Nexus 6P (Snapdragon 810 v2.1) | 20Soc | 58.97 | 4.70 | 12.54 fps/W
Huawei Mate 8 (Kirin 950) | 16FF+ | 41.69 | 3.58 | 11.64 fps/W
Huawei P9 (Kirin 955) | 16FF+ | 40.42 | 3.68 | 10.98 fps/W
Huawei Mate 9 (Kirin 960) | 16FFC | 99.16 | 9.51 | 10.42 fps/W

So while the Kirin 970 is an advancement and improvement over the 960, in the context of the competition it still has trouble holding up against this generation’s Exynos and Snapdragon chipsets.
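
The tables also quantify how much of any Kirin 960-based 60%/30% claim the silicon alone accounts for: in T-Rex, the Kirin 970 (Mate 10) delivers 127.25 fps at 7.93W versus the Kirin 960 (Mate 9) at 99.16 fps and 9.51W, which is roughly 28% more performance at about 17% less power before any GPU Turbo software enters the picture. Manhattan 3.1 paints a similar picture, with 37.66 fps at 6.33W against 32.49 fps at 8.63W.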

The key point I’m trying to make here, in the context of the GPU Turbo claims, is that the 60%/30% figures are very much unrealistic and extremely misleading to users. If Huawei and Honor are not clear about their baseline comparisons, the companies' own numbers cannot be trusted in future announcements.

How Much Does GPU Turbo Actually Provide?

On Friday, Huawei CEO Richard Yu announced the new Kirin 980, and some of the presentation slides addressed the new mechanism, showcasing a more concrete figure for the effect of GPU Turbo on the newly announced chipset:

Here the actual performance improvement is rather minor, because the workload is V-sync capped and the GPU has no trouble in that regard; the power improvement, however, should still be representative. The measured power improvement was 10% – a far more reasonable and believable gain to attribute to software.



Conclusion: Still A Plus, But

Back in 2015, AnandTech was in the extraordinary position of being the first western publication to actually meet with HiSilicon, reporting on the Kirin 950 from its Beijing media briefing. One of the things I still remember from back then was a definite sense of humility and a very good sense of self-awareness with regard to the company’s products. Things have changed a lot in the past three years, and Huawei as a company has grown considerably since then.

Huawei and Honor combined now sell 153 million smartphones a year, and Huawei is now the second largest smartphone manufacturer worldwide, recently overtaking Apple. Honor claims it is fifth worldwide. Huawei has also stated that it will raise its future R&D budget from 15% of total revenue (US$13.23 billion) up to 20-30% during the race to 5G. Both are major players in this space, to the point where Huawei’s in-house silicon design team, HiSilicon, creates its own SoCs to help differentiate its products from those based on Qualcomm and MediaTek SoCs.

Huawei’s explosive growth has fed not only into R&D, but also into marketing. As the company has expanded, we have seen smartphone launch after smartphone launch held in glamorous locations across Europe, along with perhaps the biggest sampling program for smartphone launches ever: all members of the press invited to a show, typically around 500-2000 people, are sampled. Compare this to Samsung and Apple, who offer limited units to select press only, or LG, which does both unit distribution at briefings (30+) and sampling to press. In its desire to gain brand awareness and recognition, Huawei has gone into overdrive, which has been a metaphorical double-edged sword – while it delivers innovative solutions on the technical side, the brand’s message has in some ways strayed from its more humble roots.

Developing technologies like GPU Turbo is going to be key for Huawei. Huawei has obvious gaming performance, image quality, and power efficiency deficits compared to Qualcomm’s Snapdragon. Playing with the silicon die for price/performance/area is a balancing act and will offer some gains, but typically at the expense of other specifications. This is why GPU Turbo is important: it is a "free" efficiency gain from a hardware perspective. Rather than involving new silicon, it is developed through a dedicated software effort, and as a result of the gains the feature is being rolled out across the range of smartphones that both Huawei and Honor offer at retail. GPU Turbo still has a way to go, such as expanding the number of games supported, and there are potential improvements on the horizon, such as a single model covering all use cases, but the future looks good for it.

Marketing these technologies, like GPU Turbo, is also important. Comparing the headline 60% better performance and 30% better efficiency of the Kirin 970 with GPU Turbo to the Kirin 960 without GPU Turbo, almost a year after the launch of the Kirin 970, and not declaring this in every presentation, is a deliberate attempt at obfuscation. This is borne out by Huawei’s own data in the same presentations, which shows the Kirin 980 achieving only minor performance and efficiency gains when the feature is enabled. The gains are still worth having, but they are vastly lower than the numbers Huawei likes to promote every time GPU Turbo is mentioned.

This, along with the recent discovery of benchmark detection and acceleration to achieve higher scores, suggests that someone at Huawei is trying to pull the wool over users’ eyes. It is not a good look, and despite Huawei’s explosive growth, key figures at the company are going to have to look inwards at how they want the company to be perceived on the main stage. The bigger the company, the more it ends up under the microscope, so maintaining a level of honesty is important – otherwise the trust of the consumer is quickly lost. Trying to hide the real-world numbers just to climb a performance table looks bad. It is bad.

The Final Word on GPU Turbo

As mentioned several times in this article, now that GPU Turbo has been explained in greater detail, the technology is genuine as far as we can tell, and it is likely to be a catalyst for a new way for vendors to differentiate themselves across the mobile spectrum, as the competition will be looking into similar solutions.

For everything we have just said about Huawei’s presentations, we hope that the company is giving plenty of kudos and praise to the teams that had the idea and developed the technology. In an ever more competitive landscape where SoC vendors need to differentiate and try to one-up each other, technologies like GPU Turbo are emerging as innovative advantages that improve the overall user experience.

Time, hardware, and software permitting, we will be aiming to go and get fully accurate benchmark data with and without GPU Turbo on the Kirin 980 after its launch. Stay tuned.

Benchmarking and Core Analysis by Andrei Frumusanu
Extra Analysis and Conclusion by Ian Cutress
