57 Comments
Ranger1065 - Monday, May 30, 2016 - link
Interesting, but not quite the GPU review the faithful are awaiting... hope springs eternal.
Shadow7037932 - Monday, May 30, 2016 - link
This is more interesting than a GTX 1070/1080 review imo. We more or less know what the nVidia cards are capable of. This ARM GPU design will be relevant for the next 2-3 years.
Alexey291 - Monday, May 30, 2016 - link
Well yeah, ofc it's not interesting anymore, because by the time their reviews hit, whatever they are reviewing is a known quantity. The 1080 in this case is a perfect example.
name99 - Tuesday, May 31, 2016 - link
Oh give it a rest! Your whining about the 1080 is growing tiresome.
SpartanJet - Monday, May 30, 2016 - link
Relevant for what? Phone GPUs are fine as is. For mobile gaming? It's all cash shop garbage. For productivity? Android and iOS pale in comparison to a real OS like Windows 10.
I find the Nvidia much more interesting.
name99 - Tuesday, May 31, 2016 - link
"Phone GPUs are fine as is."And there, ladies and gentlemen, is the "640K is enough for anyone" of our times...
Truly strong the vision is, in this one.
imaheadcase - Tuesday, May 31, 2016 - link
He is not wrong though. A faster GPU in a phone offers nothing for people as of right now. If you look at the mobile apps that actually get used, not a single one comes close to stressing what phones have now, and phones are at max quality for the screen size they use.
The only way phones can improve now that users would notice is storage, CPU, and better app quality in general, which is terrible.
shadarlo - Tuesday, May 31, 2016 - link
You realize that GPUs are used for things beyond prettying up a screen, right?
Let's just stop advancing mobile GPUs, because in the future we will never use anything more advanced that needs more power or lower power usage... *eye roll*
ribzy - Monday, November 21, 2016 - link
I think what he's trying to say is that the content is not available or sufficient right now to really justify the cost. Let's say you built a car but don't have the fuel that's needed to power it. Or what about those who got a 4K TV early and are now stuck with something that doesn't have HDR and that runs old, non-updateable TV software. I understand his concern.
mr_tawan - Thursday, June 2, 2016 - link
Well, as phone and tablet resolutions go ridiculously higher every day, having a strong GPU is a must. It's not all about gaming; it affects day-to-day use as well. Everything drawn on screen in every OS nowadays is hardware-accelerated, and current mobile GPUs are underpowered compared to the task they're given (e.g. driving 2K displays at 60 fps).
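To put a rough number on that, here is a back-of-the-envelope sketch of my own (assuming a WQHD panel; these are not figures from the article):

```c
#include <stdio.h>

/* Rough illustration: how many pixels per second a GPU has to produce just to
 * keep a "2K" phone panel fed at 60 fps, before any overdraw, layer
 * composition or actual 3D work is counted. */
int main(void)
{
    const long width  = 2560;   /* assumed WQHD panel */
    const long height = 1440;
    const long fps    = 60;

    long pixels_per_frame  = width * height;          /* ~3.7 million  */
    long pixels_per_second = pixels_per_frame * fps;  /* ~221 million  */

    printf("Pixels per frame:  %ld\n", pixels_per_frame);
    printf("Pixels per second: %ld\n", pixels_per_second);
    return 0;
}
```

And real UIs touch each pixel more than once (overdraw), so the effective fill-rate requirement is a multiple of that number.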
Shadow7037932 - Tuesday, May 31, 2016 - link
Mobile VR (hopefully, meaning "cheaper" VR) for starters.
Spunjji - Wednesday, June 1, 2016 - link
Your lack of imagination is staggering.
Zyzzyx - Wednesday, June 1, 2016 - link
This technology is going to affect millions if not billions of customers over the next few years; your 1080 will be used by a limited number of gamers, and we already know about the Pascal architecture, as it has already been covered. Claiming that mobile GPUs have no importance sounds like Ballmer saying the iPhone had no market.
You might also have missed the iPad Pro and what you can use it for, and no, it is not mobile gaming...
Also, your claim that only Windows 10 is a real OS shows your short-sightedness. We will see in 5-10 years which OS will be dominating, since I am sure both Android and iOS will slowly but surely keep creeping into the professional space as features are added.
Wolfpup - Friday, June 3, 2016 - link
You sound like me LOL. Still interesting though just on a tech level, but I use PCs and game systems as they have actual games (and good interfaces to play them).
SolvalouLP - Monday, May 30, 2016 - link
We are all waiting for the AT review of the GTX 1080, but please, this behaviour is childish at best.
Ryan Smith - Monday, May 30, 2016 - link
And your hope will be rewarded.
makerofthegames - Tuesday, May 31, 2016 - link
Go ahead and take as long as you need. I don't read AT for the hot takes, I read AT because you do real testing and real analysis to give us real information. I definitely want to read the review, but I'm willing to wait for it to be good.
edlee - Monday, May 30, 2016 - link
It all sounds great; unfortunately I am stuck with devices powered by Qualcomm SoCs, since I have Verizon, and most flagship phones use Qualcomm Snapdragons.
Krysto - Monday, May 30, 2016 - link
It's a shame Samsung isn't selling its Exynos chip to other device makers, isn't it? I mean, it's probably not even economically worth it for Samsung to design a chip for only 1 or 2 of its smartphone models. I don't understand why they don't try to compete more directly with Qualcomm in the chip market. I also don't understand why they aren't buying AMD so they can compete more directly with Intel as well, but I digress.
Tabalan - Monday, May 30, 2016 - link
Samsung does sell certain Exynos SoCs to Meizu: the Meizu MX4 Pro had the Exynos 5430, the Pro 5 had the Exynos 7420. With the Pro 6 they went with the MediaTek X25.
About design costs and profit: they used to use CPU cores and GPUs designed by ARM, and it's cheaper this way than buying a license to modify these cores (and you have to add the R&D costs of modifying the uarch). Moreover, Samsung is using its latest process node only for high-end SoCs (Apple AX series, Snapdragon 8xx series), which is a very profitable market segment. It could be easier to just manufacture SoCs and get cash for it than to look for partners and buyers for their own SoCs. Plus, they would have to create a whole lineup of Exynos SoCs to compete with Qualcomm (I assume Qualcomm would give discounts for buying chips only from them).
Andrei Frumusanu - Monday, May 30, 2016 - link
> it's cheaper this way than buying a license to modify these cores (and you have to add the R&D costs of modifying the uarch).
There is no such license. ARM does not allow vendors to modify the licensed micro-architectures; even under the newly announced license it is ARM themselves who do the modifications, without giving vendors access to change the RTL.
http://www.anandtech.com/show/10366/arm-built-on-c...
Tabalan - Monday, May 30, 2016 - link
I meant an architecture license, custom-designed cores. My bad, I used the wrong phrase; thanks for pointing that out.
Shadow7037932 - Monday, May 30, 2016 - link
Samsung sells the Exynos, but vendors are unlikely to jump ship from Qualcomm because the OEMs already know the Qualcomm stuff well.
FullmetalTitan - Monday, May 30, 2016 - link
Also a bit of global supply economics in play there. Samsung uses Exynos chips in their flagship phones in the Asian market and sometimes select parts of the European market, but typically they buy from Qualcomm for the NA market and majority of Europe due to marketing, existing penetration, etc. It also helps that Samsung currently is a fabricator of the newest Snapdragon SoCs in both Korea and the U.S. and that affords them prime pricing.
They would be slapping the right hand with the left by getting too deep into the R&D and fighting for market share with Exynos. It would be taking revenue from their foundry business to try to grow their design business, and margins in mobile logic are pretty slim these days.
Howard72 - Monday, May 30, 2016 - link
Mali G71 parallel architecture is now consider as RISC SIMD?
Howard72 - Monday, May 30, 2016 - link
*considered
OEMG - Monday, May 30, 2016 - link
I hope they also push this to the low-end, at least for the sake of API parity (Vulkan! Vulkan! Vulkan!) among a wide range of devices.
LeptonX - Monday, May 30, 2016 - link
"In just six years, the number of GPU vendors with a major presence in high-end Android phones has been whittled down to only two: the vertically integrated Qualcomm, and the IP-licensing ARM."What about PowerVR? Why did you omit them? Aren't they still a major player with GPUs that are at the top both in terms of overall performance and energy-efficiency?
Ariknowsbest - Monday, May 30, 2016 - link
I would consider PowerVR a major player; after several generations I find them to be very balanced and powerful. But they have lost market share to ARM.
So far I have only bought products with PowerVR and Adreno, and a couple of Tegras.
Colin1497 - Monday, May 30, 2016 - link
I think the "android" statement is the qualifier. PowerVR is obviously big with Apple as a customer. In the non-Apple space I guess they're more limited? MediaTek uses them?Ariknowsbest - Monday, May 30, 2016 - link
Mainly MediaTek and Rockchip that I can remember.
Ryan Smith - Monday, May 30, 2016 - link
Bingo. I haven't forgotten about the IMG crew, but in the Android space (which is really the only competitive space for GPU IP licensing) they've lost most of their market share, especially at the high-end.
name99 - Tuesday, May 31, 2016 - link
However, it would be interesting to know how these various features (e.g. primacy of SIMT rather than SIMD, coherent common address space) compare to PowerVR.
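For anyone who hasn't followed the SIMD-vs-SIMT distinction the article centres on, here is a toy sketch of my own in plain C (the function names and the 4-wide width are purely illustrative, not anything from ARM's or IMG's actual ISAs):

```c
#include <stdio.h>

#define N 8

/* "SIMD" flavour: the vector width (4 here) is baked into the code, and one
 * instruction conceptually operates on a whole packed vector at once. */
static void add_simd_style(const float *a, const float *b, float *out)
{
    for (int i = 0; i < N; i += 4) {
        out[i + 0] = a[i + 0] + b[i + 0];
        out[i + 1] = a[i + 1] + b[i + 1];
        out[i + 2] = a[i + 2] + b[i + 2];
        out[i + 3] = a[i + 3] + b[i + 3];
    }
}

/* "SIMT" flavour: scalar code for a single work-item; the hardware decides
 * how many copies of this run in lockstep across its wide ALUs. */
static void add_simt_style(const float *a, const float *b, float *out, int tid)
{
    out[tid] = a[tid] + b[tid];
}

int main(void)
{
    float a[N], b[N], c[N], d[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    add_simd_style(a, b, c);
    for (int tid = 0; tid < N; tid++)   /* stand-in for the GPU's thread scheduler */
        add_simt_style(a, b, d, tid);

    for (int i = 0; i < N; i++)
        printf("%g %g\n", c[i], d[i]);
    return 0;
}
```

The practical difference is who owns the vector width: the programmer/compiler in the SIMD case, or the hardware's scheduler in the SIMT case, which is why SIMT tends to waste fewer lanes on divergent or oddly sized workloads.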
lucam - Tuesday, May 31, 2016 - link
At this point I think it is a blessing that IMG has Apple as a big customer; without it they would have completely lost all mobile market share.
Ariknowsbest - Tuesday, May 31, 2016 - link
But it's not good to be dependent on one large customer. Maybe the emergence of VR can help them to retake market share.
lucam - Tuesday, May 31, 2016 - link
Totally agree with you. PowerVR is a hell of a solution, but for some reason IMG has lost its leadership in the mobile market and has almost disappeared from Android.
I wonder, if IMG didn't have Apple, what the situation would be now. Maybe even worse...
zeeBomb - Monday, May 30, 2016 - link
Stay frosty my friends.
Krysto - Monday, May 30, 2016 - link
I guess ARM will abandon HSAIL now that SPIR-V and Vulkan are here. It probably makes sense to stop focusing on OpenCL as well, if developers can just use some other language than OpenCL with SPIR-V.
mdriftmeyer - Monday, May 30, 2016 - link
One uses C99 or C11/C++ in OpenCL 2.x; with SPIR-V it's the same thing. Why would I care to write in SPIR-V unless it was a requirement for portability? If I want a lower-level, higher-performance result I'll skip SPIR-V (which bridges to OpenCL via LLVM IR) and go straight to using Clang/LLVM and OpenCL.
Don't confuse SPIR-V with the HSA Foundation. They are solving different needs, and SPIR-V doesn't address what AMD's APUs are designed to resolve.
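For context on what "writing in OpenCL" looks like: kernels are written in a C dialect, and SPIR-V is just another container for getting that code to the driver. A minimal sketch of my own (a trivial kernel, not tied to any particular vendor):

```c
/* Minimal OpenCL C kernel (a C99-based dialect). The same source can be handed
 * to clCreateProgramWithSource() at run time, or compiled offline to SPIR-V and
 * loaded via clCreateProgramWithIL() on OpenCL 2.1+ drivers; what the developer
 * writes looks the same either way. */
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out)
{
    size_t i = get_global_id(0);   /* one work-item per array element */
    out[i] = a[i] + b[i];
}
```

SPIR-V mostly matters for shipping kernels without source and for feeding the driver from front-ends other than OpenCL C, not for squeezing out extra performance.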
beginner99 - Tuesday, May 31, 2016 - link
Yeah, that's a bit of a bummer. For me this pretty much means HSA is DOA. No software company will invest in something HSA-compatible if it is only available on AMD APUs.
mdriftmeyer - Tuesday, May 31, 2016 - link
Aren't you glad you commented yesterday? See the update to HSA.
prisonerX - Wednesday, June 1, 2016 - link
You're confusing OpenCL C with OpenCL. SPIR-V is an intermediate language also supported by OpenCL.
pencea - Monday, May 30, 2016 - link
How about the review for the GTX 1080? It's been days since the card came out. Other major sites have already posted their reviews on both the 1080 and 1070, while AnandTech still hasn't posted one yet. Only a pathetic preview.
Quake - Monday, May 30, 2016 - link
Quality over quantity, my friend. Anandtech is known for writing concise, detailed and thorough articles. For me, sites like Engadget are tabloid newspapers, while Anandtech is a respected newspaper that takes its time to write thorough and intelligent reviews.
Tabalan - Monday, May 30, 2016 - link
True, but they could split the review into 2 parts: the 1st would consist of tests, benchmarks, etc. (like other websites do), the other would cover the uarch on its own. This way almost everyone would be happy.
funkforce - Monday, May 30, 2016 - link
Engadget?! Come on! PCWorld, PCGamesHardware, HardOCP, KitGuru, Hothardware, Tomshardware, TechSpot, HardwareCanucks, TweakTown all posted their reviews, many almost as thorough as Anandtech... 13 days ago! And we could forgive Mr Smith if it was a one-time thing, but it's been like this for every GPU review since he took over as Editor in Chief.
When Anand was in charge, no review was late like this, and it still was as thorough.
Now I LOVE Ryan's writing, it's the best bar none, and he is an awesome guy for sure!
But please! For the love of GOD, step down as Editor and focus on writing only; delivering on time without a boss to push you is not your thing... Just check earlier reviews, same thing, 1-2 weeks late, even though it was promised so many times that they would come out a week or two before they were actually published. (Except the GTX 960 review, which never got published at all after 7 weeks of promises and then just silence.)
Alexa shows this website has lost an insane amount of readers in 1 year. I have been here for almost 20 years and I just want AT to be great again. Please someone do something! Anyone?! <3 Please save AT!
r3loaded - Tuesday, May 31, 2016 - link
Because while other sites will be satisfied with a review that covers "zomg runs Crysis 3 and GTA V on max settings at xyz FPS, temps and noise are pretty good", Anandtech doesn't really roll that way. They won't be satisfied with their review until they've completed a deep dive on the Pascal architecture, the merits of GDDR5X and how it compares with GDDR5 and HBM/HBM2, and quantifying frame latency and consistency.
Other reviews are written by gamers and computer enthusiasts. Anandtech reviews are written by computer engineers.
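For anyone wondering what "quantifying frame latency and consistency" means in practice, here is a rough sketch of my own (made-up frame times and a crude nearest-rank percentile, not AT's actual methodology):

```c
#include <stdio.h>
#include <stdlib.h>

static int cmp_double(const void *x, const void *y)
{
    double a = *(const double *)x, b = *(const double *)y;
    return (a > b) - (a < b);
}

int main(void)
{
    /* hypothetical per-frame render times in milliseconds */
    double frame_ms[] = { 8.3, 8.5, 8.4, 9.0, 8.6, 25.1, 8.4, 8.7, 8.5, 8.6 };
    size_t n = sizeof(frame_ms) / sizeof(frame_ms[0]);

    double sum = 0.0;
    for (size_t i = 0; i < n; i++) sum += frame_ms[i];
    double avg_fps = 1000.0 * (double)n / sum;

    qsort(frame_ms, n, sizeof(double), cmp_double);
    size_t idx = (size_t)(0.99 * (double)n);   /* crude nearest-rank percentile */
    if (idx >= n) idx = n - 1;
    double p99 = frame_ms[idx];

    printf("Average FPS         : %.1f\n", avg_fps);
    printf("99th pct frame time : %.1f ms\n", p99);
    /* A single 25 ms hitch drags the tail metric to 25 ms while the average
     * still reads a healthy ~98 FPS; averages alone hide where the stutter is. */
    return 0;
}
```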
prisonerX - Tuesday, May 31, 2016 - link
1080 whiners like you are really tedious. I hope they cancel the review.
jjj - Monday, May 30, 2016 - link
Any clue about cache sizes, and whether a reduction there is factored into the perf density math? Also wondering about thermals; some of the mentioned changes will help, but some more details would be nice.
allanmac - Monday, May 30, 2016 - link
Nice review. Delivering a full-featured Vulkan/SPIR-V 1.1 GPU to the masses is something we're all ready for.
mosu - Tuesday, May 31, 2016 - link
What is really meaningful for me is that ARM is confirming the validity of AMD's approach to heterogeneous computing and graphics processing. I wonder why ARM didn't emulate nVidia, and if they did try, what were the results?
mkozakewich - Tuesday, May 31, 2016 - link
Ermagerd, intergers!
TheFrisbeeNinja - Wednesday, June 1, 2016 - link
Love this article; well done. This one and the A73 one (http://www.anandtech.com/show/10347/arm-cortex-a73... are the primary reason I continue to visit this site.
Scali - Sunday, June 5, 2016 - link
"This is a very similar transition to what AMD made with Graphics Core Next in 2011, a move that significantly improved AMD’s GPU throughput and suitability for modern rendering paradigms."nVidia already did this in 2006(!) with the GeForce 8800. The main reason was of course CUDA, and nVidia's forward-looking perspective that CUDA would be running different types of workloads than the graphics workloads at the time.
Scali - Sunday, June 5, 2016 - link
See the whitepaper here: http://www.nvidia.co.uk/content/PDF/Geforce_8800/G...
lolipopman - Monday, October 3, 2016 - link
What are you trying to prove, exactly? How is this in any way relevant?
NoSoMo - Thursday, June 16, 2016 - link
Bi-Frost huh? Does it teleport you between planets? Did Thor have a hand in this?