Original Link: https://www.anandtech.com/show/11935/huawei-mate-10-pro-hands-on-with-kirin-970
Hands-on & More With Huawei's Mate 10 and Mate 10 Pro: Kirin 970 Meets Artificial Intelligence
by Ian Cutress on October 16, 2017 9:00 AM EST

This morning Huawei is taking the wraps off of its latest generation flagship smartphone, the Mate 10 series. Powered by subsidiary HiSilicon’s Kirin 970 SoC, the new phones are a mix of something old and something new: a design that is not simply a carbon copy of the earlier Mate phones, but is still very much a traditional smartphone, paired with cutting-edge silicon. It’s an interesting balancing act, and one that, if consumers agree, will further bolster Huawei’s success in the international smartphone market while at the same time pushing a nascent technology to the forefront of the mobile industry.
That technology is, of course, artificial intelligence, which has become the buzzword for the latter half of this decade in the world of technology. Long a lofty goal of computer science – if not perhaps its holy grail – recent advancements in the field have opened the door to new methods and new applications. And while this era of neural networking-driven AI is not by any means producing devices that actually think like a human, even this weak form of AI is, in the right use cases, far more capable than anything that has come before it.
Of course, neural networking hardware is only as useful as the applications that run on it, and in these still-early days of the field the industry as a whole is trying to figure out what those applications should be. Having a self-driving car or a smart NPC in a video game makes sense, but applying AI to a smartphone is less obvious at first. Huawei announced that its new Kirin 970 chipset has dedicated silicon for running artificial intelligence networks, and the Mate 10 series will be the first devices running this chip. Today the company is announcing the smartphones and unveiling their features.
The Mate 10, Mate 10 Pro, and Mate 10 Porsche Design
The devices themselves are part of Huawei’s yearly cadence with the Mate series. Every year at around this time we see a new smartphone SoC and the first two devices that power it: the Mate and the Mate Pro. Both the hardware and the design are meant to be iterative – Huawei’s HiSilicon division takes the ‘best’ IP available from ARM to develop the processor, and the design team takes cues from the industry as to what will be the next statement in aesthetics.
One of the big trends for 2017 (and moving into 2018) is full-screen display technology. In previous years, manufacturers have often quoted ‘screen-to-body’ ratios to show how much of the face of the device is taken up by screen, but it is this year that manufacturers have really started to push the boundaries on this front. Arguably devices such as Xiaomi’s MI MIX range were instrumental in pushing this, and the upside is more screen for everyone, or the same sized screen in smaller devices. Huawei is pushing the trend with its ‘FullView Display’ (the marketing name for it).
The Mate 10 comes with a 5.9-inch FullView display, using a glass front for the 2560x1440 LCD display, coming in at 499 pixels per inch. Huawei is quoting panels capable of a 1500:1 contrast ratio, while the color space is listed at a less-than-useful metric of 96% NTSC.
The Mate 10 Pro (and Porsche Design) is slightly bigger with its 6.0-inch display, although this time it is an OLED panel at a 2160x1080 resolution. This is a lower pixel density (402 ppi) and resolution than the regular Mate 10, but the panel is rated at 112% NTSC and a 7000:1 contrast ratio. The lower resolution and the use of OLED should also assist battery life, and overall the unit is lighter than the Mate 10.
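The quoted pixel densities follow directly from resolution and diagonal size, and are easy to sanity-check:

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from panel resolution and diagonal screen size."""
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_in

# Mate 10: 5.9-inch 2560x1440 panel
print(round(pixels_per_inch(2560, 1440, 5.9)))   # 498, essentially Huawei's quoted 499

# Mate 10 Pro: 6.0-inch 2160x1080 panel
print(round(pixels_per_inch(2160, 1080, 6.0)))   # 402, matching the quoted figure
```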
Neither device goes to the extreme of the display completely covering the front, as doing so requires relocating internals such as the front camera (to the bottom bezel on the MI MIX, into the notch on the iPhone X), as well as working out where to put the fingerprint sensor. One of the biggest design deviations for this generation of Mate devices is that the Mate 10 regular edition now has the fingerprint sensor on the front of the phone, rather than the rear. In my eyes this is a pretty big jump, given that the Mate S, the Mate 8, and the Mate 9 regular editions all had fingerprint sensors on the rear. The Mate 10 Pro, by contrast, does keep the sensor on the rear.
This pre-production unit hasn't updated the logo
There is no difference between the devices in terms of the SoC inside, with each getting the full-fat Kirin 970. This means four ARM Cortex-A73 cores at 2.36 GHz and four ARM Cortex-A53 cores at 1.84 GHz, paired with Mali-G72 MP12 graphics (at an unstated frequency), the i7 sensor processor, and Huawei’s new Neural Processing Unit, or NPU (more on this later). All of the units use Huawei’s latest Category 18 integrated LTE modem, capable of 1.2 Gbps downloads using 4x4 MIMO with 3-carrier aggregation and 256-QAM. Each device supports dual-SIM LTE concurrently (along with dual-SIM VoLTE), although this limits downloads to Category 16. Uploads are at Category 13.
Only one memory and storage option is available for the Mate 10, with Huawei settling on 4GB of LPDDR4X DRAM and 64GB of NAND, with microSD card support further augmenting that, though it takes up one of the SIM slots. For some reason microSD support is listed as limited to 256GB cards, though I will ask about the new 400GB microSD cards.
The Mate 10 Pro will be available in 4GB/64GB and 6GB/128GB versions, although the latter will be dependent on region – we are told around 20 countries are on the initial list. The Mate 10 Porsche Design model will be only available in a 6GB/256GB configuration, similar to last year.
All the devices come with the typical dual-band 802.11ac Wi-Fi support, extending to BT 4.2, and will include NFC. All three devices use USB Type-C, but only the base model has a headphone jack. Despite the Mate 10 Pro/PD being physically bigger than the standard Mate 10, all three devices use a 4000 mAh battery, which is TUV certified for SuperCharge. That is fairly large for a modern flagship, perhaps a benefit of only a few smartphone companies now competing on the ‘under 7mm’ thickness metric: the Huawei devices come in at 8.2 mm for the Mate 10 and 7.9 mm for the Mate 10 Pro.
The cameras on all the devices are identical as well, with Huawei further leveraging its Leica brand cooperation. The front camera is an 8MP f/2.0 unit, while the rear camera does something a little bit different this time around. The dual camera is arranged vertically, like the Mate 9, but without the extra protective shroud around the lenses. The cameras are the same 12MP RGB and 20MP monochrome pairing found on last year’s flagships, although this time they are both f/1.6 and use Leica SUMMILUX-H lenses with AI-powered bokeh. This allows for ‘2x hybrid zoom’ (which we established last year is more like a crop than a zoom), and the phones also have 4-way focus (PDAF, CAF, Laser, Depth) and a dual-LED flash.
Huawei will launch these devices on Android 8, using its custom implementation called EMUI. Last generation was EMUI 5, and this generation will be called EMUI 8. The reason for the jump is two-fold: the number 8 is considered highly auspicious in Chinese culture, and it also addresses comments as to why the EMUI numbering system was ‘behind’ the Android version. Huawei intends to keep EMUI’s version number paired with the Android version for the foreseeable future.
Huawei Mate 10 Series

| | Mate 10 | Mate 10 Pro | Mate 10 Porsche Design |
|---|---|---|---|
| SoC | HiSilicon Kirin 970: 4x Cortex-A73 @ 2.36GHz, 4x Cortex-A53 @ 1.84GHz, ARM Mali-G72 MP12 @ ? MHz | (same) | (same) |
| Display | 5.9-inch 2560x1440 RGBW LCD | 6.0-inch 2160x1080 OLED | 6.0-inch 2160x1080 OLED |
| Dimensions | 150.5 x 77.8 x 8.2 mm, 186 g | 154.2 x 74.5 x 7.9 mm, 178 g | 154.2 x 74.5 x 7.9 mm, 178 g |
| RAM | 4 GB LPDDR4X-1833 | 4/6 GB LPDDR4X-1833 | 6 GB LPDDR4X-1833 |
| NAND | 64 GB (UFS 2.1) + microSD | 64/128 GB (UFS 2.1) | 256 GB (UFS 2.1) |
| IP Rating | IP53 | IP67 | IP67 |
| Battery | 4000 mAh (15.28 Wh), non-replaceable | (same) | (same) |
| Front Camera | 8MP, f/2.0 | (same) | (same) |
| Rear Camera | Color: 12MP, f/1.6; Monochrome: 20MP, f/1.6; PDAF + Laser AF + Contrast AF + Depth; OIS; HDR; dual-tone LED flash | (same) | (same) |
| Modem | HiSilicon LTE (Integrated), 2G / 3G / 4G LTE, Category 18/16 Download, Category 13 Upload | (same) | (same) |
| SIM Size | 2x NanoSIM (dual standby) | (same) | (same) |
| Wireless | 802.11a/b/g/n/ac, BT 4.2 LE, NFC, IrLED, GPS/Glonass/Galileo/BDS | (same) | (same) |
| Connectivity | USB 2.0 Type-C, 3.5mm headset | USB 2.0 Type-C | USB 2.0 Type-C |
| Fingerprint Sensor | Front | Rear | Rear |
| Launch OS | Android 8.0 with EMUI 8.0 | (same) | (same) |
| Launch Price | 699 Euro (4/64) | 799 Euro (6/128) | 1349 Euro |
| US Price (est.) | ~$549-$599 | ~$649-$699 (4/64), ~$749-$799 (6/128) | - |
Pricing for the Mate 10 and Mate 10 Pro is likely to mirror the pricing for last year’s flagships. This means around $549-$599 for the regular edition and $649-$699 for the Pro. Add in another $100 for the higher capacity model, and probably another $250-$400 for the Porsche Design version. (Updated in table)
The Look
For a few generations now, Huawei has been cultivating a specific look for its devices. The machined aluminium unibody, combined with the lines required for the antennas, meant that the Mate S, the Mate 8, the Mate 9, and the P9 all felt like part of the same family. I didn’t get that feeling with the base P10 models, and I don’t get it with the Mate 10 either. There are three immediate reasons I can think of.
First are the color choices. As I am writing this piece, I have only seen the Mate 10 and Mate 10 Pro in dark colors. When I put them side-by-side with other devices, they do not look significantly different.
Huawei P9, Huawei Mate 10, Huawei Mate 9, LG V30+
This is especially true in low light, and there’s no defining ‘Huawei’ feature. On the rear, the dark color again hides the fact that it is a Huawei device, aside from the perhaps odd way the dual cameras look. There is a band on some of the colors to signify a ‘strip’ where the cameras are, but this is not part of Huawei’s regular look; the strips we have seen to date came on the P9 and P10, not on the Mate units. One caveat to all this: when Huawei launched the P10 in ‘Greenery’, in collaboration with Pantone, it seemed odd at the time. But now I can pick that phone out of a crowd; it is that recognizable. There is something to be said for being different.
A note on colors: the Mate 10 will be offered in Mocha Brown, Black, Champagne Gold, and Pink Gold. The Mate 10 Pro will be in Midnight Blue, Titanium Gray, Mocha Brown, and Pink Gold. The Mate 10 Porsche Design will be in Diamond Black only.
Second is the fingerprint sensor. This is perhaps more of a personal issue, but to date I have preferred rear fingerprint sensors. Moving to the front for the P10 put me off a little (especially in a dark color), and the fact that the regular Mate 10 now goes this way, with a thin fingerprint sensor, seems a little off-putting.
Third is the display. With most major smartphone manufacturers focusing on this ‘all-screen’ display technology, there is little room left for OEMs to make an individual mark. Apple, either by luck or by design, got this right: despite the backlash over the iPhone X’s little notch for the cameras, there is no mistaking that a phone with a notch is an iPhone X. The Mate 10 and Mate 10 Pro do not have the same instantly recognizable look. How to make them obviously recognizable (and different from the iPhone) is a question for someone paid a lot more than me, but it means the Mate 10 and Mate 10 Pro have the potential to be lost in the crowd. The P11 (if there is one next year) will have to do something on this front.
The Silicon: The Kirin 970
On the silicon side, at the heart of the new Mate 10 phones is the Kirin 970 SoC, fabbed at TSMC on its smartphone-focused 10nm process. We were expecting Huawei/HiSilicon to be the first SoC vendor to 10nm last year, but its release cycle fell just before 10nm ramped up for mass production. The chip uses the same ARM Cortex-A73 and Cortex-A53 cores as the previous generation, although this time built from more mature blueprints. Last generation, Huawei was first out of the gate with ARM’s latest cores, which raised some concerns on the power side, as shown in Matt’s review. ARM announced the next-generation A75/A55 cores earlier this year, but in a repeat of that timing mismatch, those designs were not ready for mass production in time for the Kirin 970.
A PCB mockup of the Kirin chip, alongside a 1.4 cm square Core i7 logo
Aside from the A73/A53 cores, the Kirin 970 uses ARM’s latest Mali-G72 graphics, this time in an MP12 configuration. This means 50% more graphics cores than the G71 MP8 in the previous generation, along with the architectural improvements from G71 to G72. The benefit of a ‘wider’ graphics engine is that it can typically run at lower frequencies, nearer its peak power-efficiency point, saving power. In the silicon game of cat and mouse, balancing die size against cost and power, Huawei has opted for added cost/die size in order to reduce power consumption.
HiSilicon High-End Kirin SoC Lineup

| SoC | Kirin 970 | Kirin 960 | Kirin 950/955 |
|---|---|---|---|
| CPU | 4x A73 @ 2.36GHz, 4x A53 @ 1.84GHz | 4x A73 @ 2.36GHz, 4x A53 @ 1.84GHz | 4x A72 @ 2.30/2.52GHz, 4x A53 @ 1.81GHz |
| GPU | ARM Mali-G72 MP12 @ ? MHz | ARM Mali-G71 MP8 @ 1037MHz | ARM Mali-T880 MP4 @ 900MHz |
| Memory | 2x 32-bit LPDDR4 @ 1833MHz | 2x 32-bit LPDDR4 @ 1866MHz, 29.9GB/s | 2x 32-bit LPDDR4 @ 1333MHz, 21.3GB/s |
| Interconnect | ARM CCI | ARM CCI-550 | ARM CCI-400 |
| Storage | UFS 2.1 | UFS 2.1 | eMMC 5.0 |
| ISP/Camera | Dual 14-bit ISP | Dual 14-bit ISP (Improved) | Dual 14-bit ISP, 940MP/s |
| Encode/Decode | 2160p60 Decode, 2160p30 Encode | 2160p30 HEVC & H.264 Decode & Encode, 2160p60 HEVC Decode | 1080p H.264 Decode & Encode, 2160p30 HEVC Decode |
| Integrated Modem | Kirin 970 Integrated LTE (Category 18): DL = 1200 Mbps (3x20MHz CA, 256-QAM), UL = 150 Mbps (2x20MHz CA, 64-QAM) | Kirin 960 Integrated LTE (Category 12/13): DL = 600 Mbps (4x20MHz CA, 64-QAM), UL = 150 Mbps (2x20MHz CA, 64-QAM) | Balong Integrated LTE (Category 6): DL = 300 Mbps (2x20MHz CA, 64-QAM), UL = 50 Mbps (1x20MHz CA, 16-QAM) |
| Sensor Hub | i7 | i6 | i5 |
| NPU | Yes | No | No |
| Mfc. Process | TSMC 10nm | TSMC 16nm FFC | TSMC 16nm FF+ |
The third main feature of the hardware is its new ‘Neural Processing Unit’, or NPU. This is silicon dedicated to running artificial intelligence calculations and frameworks, in the form of neural networks. As with other task-specific processors, these AI tasks can technically be run on the CPU or GPU, but because a neural network can run at lower precision and has fixed calculation steps, dedicated hardware allows for higher performance at much lower power: the same basic rationale behind GPUs for graphics, ISPs for image processing, and so on.
The IP for Huawei’s NPU comes from Cambricon Technologies, and at a high level it might be considered similar to NVIDIA’s Tensor Cores. We are under the impression that the Huawei NPU runs several 3x3x3 matrix-multiply engines, whereas the Tensor Cores run 4x4x4. Huawei runs all of this in 16-bit floating point mode, and lists a performance of 1.92 TFLOPS. This is a relatively high number; for reference, it is twice the throughput Apple quotes for the new Neural Engine found in the A11 Bionic processor in the iPhone 8 and iPhone X.
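Treating the quoted figure as peak throughput, a back-of-envelope calculation shows what it implies. The NPU clock below is an assumption for illustration, not a confirmed Kirin 970 spec:

```python
# Back-of-envelope check on the quoted 1.92 TFLOPS FP16 figure.
PEAK_FLOPS = 1.92e12          # Huawei's quoted FP16 throughput
CLOCK_HZ = 1.0e9              # assumed ~1 GHz NPU clock (illustrative only)

# One multiply-accumulate (MAC) counts as 2 FLOPs: a multiply and an add.
macs_per_cycle = PEAK_FLOPS / 2 / CLOCK_HZ
print(f"{macs_per_cycle:.0f} MACs per cycle")          # 960

# A 3x3 by 3x3 matrix multiply needs 3*3*3 = 27 MACs, so the quoted
# figure would imply a few dozen such engines working in parallel.
engines = macs_per_cycle / 27
print(f"~{engines:.0f} matrix engines at this clock")  # ~36
```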
The latest unconfirmed reports I have seen put Huawei’s NPU at around 25-30% of the full silicon area. Huawei is quoting ‘under 100 mm2’ for the total die size, and a total of 5.5 billion transistors. That comes out to a surprising 55 million transistors per square millimeter on TSMC’s 10nm process, which is double the density of AMD’s Ryzen design, and even above Intel’s own 48MTr/mm2 estimate given at its manufacturing day.
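The density figure is straightforward to reproduce from Huawei's two quoted numbers:

```python
# Reproducing the density arithmetic from Huawei's quoted figures.
transistors = 5.5e9       # total transistor count
die_area_mm2 = 100.0      # 'under 100 mm2', so this is an upper bound on area

density_mtr_per_mm2 = transistors / die_area_mm2 / 1e6
print(f"{density_mtr_per_mm2:.0f} MTr/mm^2")  # 55, a lower bound since the die is smaller
```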
If Huawei did not have an NPU, the die size would be a lot smaller, and here comes a fundamental fact as we move to even smaller process nodes (as in, physically smaller, rather than just a smaller number for a name): it becomes harder and harder to extract pure performance out of a non-parallel design. A chip designer either makes a smaller chip, or spends the transistors on dedicated hardware – either supporting a new video encoder algorithm, a new DSP, or in this case, hardware specifically for artificial intelligence networks.
Smartphone as a Desktop
I remember, almost ten years ago, one of Anand’s prophecies. It went something like this:
“Give me a smartphone, with all my files, I can dock and use as a PC, and it will revolutionize personal computing.”
At the time, Anand predicted that Microsoft had all the key elements in place: an OS, a smartphone platform, and potentially a gaming platform in the Xbox. All Microsoft had to do was put them all together, although at the time they were focusing on other matters, such as Windows 8 and fixing Windows 8.
Initially we saw Windows RT running on ARM on some hybrid tablets, but the ecosystem did not bite. Eventually we saw Windows' Continuum functionality hit the scene to little fanfare. It required significant grunt, and after a device from Acer and a device from HP, it too died a slow death.
Qualcomm is going to push the concept via the Windows on Snapdragon platform, using the Snapdragon 835. Qualcomm is working with Microsoft, and combined they are working with most of the major laptop OEMs to provide ARM devices that can run an almost full-blown copy of Windows. These are still laptops, though, and not Anand’s original vision of a smartphone.
Huawei is going to try and roll its own solution to this. When connecting to a TV, a custom Linux interface will spring up like a traditional desktop operating system, somewhat similar to Samsung's recently launched DeX feature. Bluetooth devices can be connected, and it will have access to all the standard Android apps. The smartphone itself can act as a trackpad for a mouse, or a keyboard, and be connected to something like the MateDock (sold alongside the original Matebook) for additional functionality such as Ethernet, more USB ports, and additional video outputs.
As the headlines for the Mate 10 will be around artificial intelligence, this feature is likely to be left as a footnote for now, similar to how DeX has been on the Galaxy S8 series. In order to get it off the ground, I suspect that Huawei will have to offer some type of ‘Desktop Dock’ that allows for additional attachments as well as charging at the same time; at this point Huawei says that users will have to buy a splitter cable to support charging while connected. This is the first generation, so there are some rough edges: it only supports displays at their native resolution up to 1920x1080 for now, and when using a Bluetooth device I did notice some lag. Other features, such as an equivalent to Windows Snap, should be high on the list.
Artificial Intelligence
For the readers that are not too familiar with the new wave of neural networks and artificial intelligence, there are essentially two main avenues to consider: training, and inference.
Training involves putting the neural network in front of a lot of data, and letting the network improve its decision-making capabilities with a helpful hand now and again. This process is often very computationally expensive, and done in data centers.
Inference is actually using the network once it has been trained. For a network trained to recognize pictures of flowers and determine their species, for example, the ‘inference’ part is showing the network a new picture and having it calculate what that picture is most likely to be. The accuracy of a neural network is its ability to succeed at inference, and the typical way to make an inference network better is to train it more.
The mathematics behind training and inference are pretty much identical, just at different scales. There are also methods and tricks, such as reducing the precision of the numbers flowing through the calculations, that trade off memory consumption, power, and speed against accuracy.
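As a concrete illustration of that precision trade-off, here is a toy NumPy example: dropping a layer's weights from 32-bit to 16-bit floats halves the memory footprint while barely perturbing the result. This illustrates the general principle only, not Huawei's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 'layer': weights and an activation vector, as a network would store them.
weights = rng.standard_normal((256, 256)).astype(np.float32)
activations = rng.standard_normal(256).astype(np.float32)

reference = weights @ activations                  # full-precision result

# Halving precision halves memory traffic; the result barely moves.
w16, a16 = weights.astype(np.float16), activations.astype(np.float16)
approx = (w16 @ a16).astype(np.float32)

print(weights.nbytes // w16.nbytes)                # 2: half the memory
print(np.max(np.abs(reference - approx)))          # small error from reduced precision
```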
Huawei’s NPU is an engine designed for inference. The idea is that software developers, using either Android’s Neural Network APIs or Huawei’s own Kirin AI APIs, can apply their own pre-trained networks to the NPU and then use it for their software. This is basically the same as how we run video games on a smartphone: the developers use a common API (OpenGL, Vulkan) that leverages the hardware underneath.
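Conceptually, what a developer hands to such an API is a frozen set of weights, and the device then executes a fixed forward pass. Stripped of any acceleration, that forward pass is nothing more than a short sequence of matrix operations, which is exactly the workload an NPU speeds up. The weights below are placeholders, not a real trained model:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# A frozen, 'pre-trained' two-layer classifier. In practice these weights
# come from training done offline, typically in a data center.
W1 = np.array([[0.5, -0.2], [0.1, 0.9]])
b1 = np.array([0.0, 0.1])
W2 = np.array([[1.2, -0.7], [-0.3, 0.8]])
b2 = np.array([0.05, -0.05])

def infer(features):
    """Inference: a fixed sequence of matrix ops. No learning happens here."""
    hidden = relu(W1 @ features + b1)
    return softmax(W2 @ hidden + b2)

probs = infer(np.array([1.0, 2.0]))
print(probs)   # class probabilities summing to 1
```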
The fact that Huawei is stating that the NPU supports the Android NN APIs is going to be a plus. If Huawei had locked it down to its own API (which it will likely use for first-party apps), then, as many analysts had predicted, the feature would have lived and died with Huawei alone. By opening it up to a global platform, we are more likely to see NPU-accelerated apps come to the Play Store, possibly even as ubiquitously as video games do now. Much like video games, however, we are likely to see different levels of AI performance with different hardware, so some features may require substantial hardware to get on board.
What Huawei will have a problem with regarding the AI feature set is marketing. Saying that the smartphone supports Artificial Intelligence, or that it is an ‘AI-enabled’ smartphone is not going to be the primary reason for buying the device. Most common users will not understand (or care) if a device is AI capable, much in the same way that people barely discuss the CPU or GPU in a smartphone. This is AnandTech, so of course we will discuss it, but the reality is that most buyers do not care.
The only way that Huawei will be able to mass market such a feature is through the different user experiences it enables.
The First AI Applications for the Mate 10
Out of the gate, Huawei is supporting two primary applications that use AI – one of its own, and a major collaboration with Microsoft. I’ll start with the latter, as it is a pretty big deal.
With the Mate 10 and Mate 10 Pro, Huawei has collaborated with Microsoft to enable offline language translation using neural networks. This will be through the Microsoft Translate app, and consists of two main portions: word detection and then the translation itself. Normally both of these features happen in the cloud and require a data connection, so this is the next evolution of the idea. It also leans on the trend of moving functionality that commonly exists in the cloud (better compute, but ‘higher’ cost and a data connection required) to the device or ‘edge’ (power and compute limited, but ‘free’). Theoretically this could have been done offline many years ago, but Huawei cites the NPU as allowing it to be done more quickly and with less power consumed.
The tie-in with Microsoft has potential, especially if it works well. I have personally used tools like Google Translate to converse in the past, and it kind of worked. Having something like that work offline is a plus; the main questions are how much storage space is required and how accurate it will be. Both might be answered during today’s presentations announcing the device, and it will be interesting to hear what metrics they use.
The second application to get the AI treatment is in photography. This uses image and scene detection to apply one of fourteen presets to get ‘the best’ photo. When this feature was originally described, it sounded like the AI was going to be the ultimate pro photographer, adjusting all of the pro-mode settings based on what it thought was right. Instead, it distils the scene down to one of fourteen potential scene types and runs a predefined script for the settings to use.
Nominally this isn’t a major ‘wow’ use-case for AI. It is a marketable scene-detection feature that, if enabled automatically, could significantly help the quality of auto-mode photography. But that is a feature users will experience without knowing AI is behind it: the minute Huawei starts to advertise it with the AI moniker, it is likely to get overcomplicated fast for the general public.
An additional side note: Huawei states that it is using the AI engine in two other parts of the device under the hood. The first is battery power management: recognizing which parts of the day typically need more power and responding through the DVFS curve accordingly. The idea is that the device can work out what power can be expended, and when, in order to provide a full day of use per charge. Personally, I’m not too hopeful about this, given the light-touch explanation, but the results will be interesting to see.
The second under-the-hood addition is in general performance characteristics. In the last generation, Huawei promoted that it had tested its hardware and software to provide 18 months of consistent performance. The details were (annoyingly) light, but it related to memory fragmentation (which shouldn’t be an issue with DRAM), storage fragmentation (which shouldn’t be an issue with NAND), and other functionality. When we pressed Huawei for more details, none were forthcoming. What the AI hardware inside the chip should do, according to Huawei, is enable the second generation of this feature, improving performance retention.
Killer Applications for AI, and Application Lag
One of the problems Huawei has is that while these use cases are, in general, good, none of them is a killer application for AI. The translate feature is impressive; however, as we move into a better-connected environment, it might be better to offload that sort of compute to servers if they can be more accurate. The problem AI on smartphones has is that it is a new concept: with both Huawei and Apple announcing dedicated hardware for running AI neural networks, and Samsung not far behind, there is going to be some form of application lag between implementing the hardware and getting the software right. Ultimately it is a big gamble for the semiconductor designers to dedicate so much silicon to it.
When we consider how app developers will approach AI, there are two main directions. The first is existing applications adding AI into their software: they have a hammer and are looking for a nail. The first ones out of the gate publicly are likely to be the social media apps, though I would not count professional apps out either. The second segment of developers will be those creating new apps built around an AI requirement; their application would not work otherwise. Part of the issue here is having an application idea that is AI-limited in the first place, and then having a system that falls back to the GPU (or CPU) if dedicated neural network hardware is not present.
Then comes the performance discussion. Huawei was keen to point out that its solution is capable of running an image-recognition network at 2000 images per minute, around double that of its nearest competition. While that is an interesting metric, it is ultimately a synthetic: no one needs 2000 images identified every minute, minute after minute. Perhaps this can extend to video, e.g. real-time processing and image recognition combined with audio transcription for later searching, but an application that does that is not currently on smartphones (or if one exists, it is not using the new AI hardware).
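For scale, converting that headline figure into per-frame terms shows why video is the natural extension:

```python
# Putting Huawei's '2000 images per minute' recognition figure in context.
images_per_minute = 2000
images_per_second = images_per_minute / 60
latency_ms = 1000 / images_per_second

print(f"{images_per_second:.1f} images/s")   # 33.3, roughly one per frame of 30 fps video
print(f"{latency_ms:.0f} ms per image")      # 30 ms
```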
One of the questions I put to Huawei and HiSilicon is around performance: if Huawei is advertising up to 2x the raw performance in FLOPS compared to say, Apple, how as a user is that going to affect my day-to-day use with the hardware? How is that extra horsepower going to generate new experiences that the Apple hardware cannot? Not only did Huawei not have a good answer for this, they didn’t have an answer at all. The only answer I can think of that might be appropriate is that the ideas required haven’t been thought of yet. There’s that Henry Ford quote about ‘if you ask the customer, all they want is faster horses’ which means that sometimes a paradigm shift is needed to generate new experiences; a new technology needs its killer application. Then comes the issue about the lag of app development behind these new features.
The second question to Huawei on this was about benchmarking. We already benchmark the CPU and the GPU extensively, and now we are going to have to test the NPU. Currently no real examples exist, and the applications using the AI hardware are not sufficient to get an accurate comparison of the hardware available, because a feature either works or it does not. Again, Huawei didn’t have a good answer for this, outside of its 2000 images/minute metric. To a certain extent, they don’t need an answer right now: the raw appeal of dedicated AI hardware is the fact that it is new. The newness is the wow factor; the analysis of that factor is something that typically happens in the second generation. I made it quite clear that as technical reviewers we would be looking at how to benchmark the hardware (if not this generation, then perhaps the next), and I actively encouraged Huawei to work with the common industry-standard benchmark tools to do so. Again, Huawei has given itself a step up by supporting Android’s Neural Network APIs, which should open the hardware up to these developers.
On a final thought: last week at GTC Europe, NVIDIA's keynote mentioned an understated yet interesting use of AI in graphics. Ray tracing, which provides realistic scene rendering compared to polygon rasterization, is usually a very computationally intensive task, but the payoff is extreme visual fidelity. What NVIDIA showed was AI-assisted ray tracing: predicting the colors of nearby pixels based on the rays already computed, and then updating as more computation is performed. While true ray tracing for interactive video (and video games) might still be a far-away wish, AI-assisted ray tracing looks like an obvious way to accelerate the problem. Could this be applied to smartphones? With dedicated AI hardware such as the NPU, it could be a good fit to enable better user experiences.