jjj - Thursday, November 12, 2015 - link
The GPU is MP12, seen a slide, so rather bananas.
jjj - Thursday, November 12, 2015 - link
Here's the slide: http://img1.mydrivers.com/img/20151112/a61e1ad5507...
Andrei Frumusanu - Thursday, November 12, 2015 - link
Thanks, tracked it down to their Twitter account: https://twitter.com/SamsungExynos/status/664585940...
I updated the article confirming the MP12 configuration.
AciMars - Thursday, November 12, 2015 - link
Just wow, using 12 cores of the T880... I highly doubt Samsung can maintain peak performance after 1 minute of heavy GPU load. LOL
lilmoe - Thursday, November 12, 2015 - link
Andrei has a legitimate concern over smaller core counts plus higher frequencies in terms of efficiency and sustained performance. So do I. But throttling and performance degradation are not nearly as severe in real-world 3D gaming as they are in 3D benchmarks. Benchmarks like Manhattan push the GPU to "unrealistic" limits. There are simply no games on Android that stress the GPU as much. T-Rex is probably more realistic.
What we really need is to benchmark the actual games themselves and see how throttling and performance are affected over time (one possible way to script such a test is sketched below).
Either way, Samsung is finally going for a wider implementation. That's a great start.
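As a purely illustrative sketch of the kind of sustained, real-game test suggested above: the snippet below polls Android's dumpsys gfxinfo frame counters over adb at a fixed interval and logs how the jank rate changes while a game runs, which is one rough way to watch throttling appear over time. The package name, sampling interval, and duration are placeholder assumptions, and it presumes adb access to a connected device with the game already in the foreground.

```python
# Minimal sketch (not a finished tool): log a running game's frame statistics
# over time via adb so throttling shows up as a rising jank rate.
# Assumptions: adb is on PATH, one device is connected, and the game named
# below (placeholder package) is already running in the foreground.
import re
import subprocess
import time

PACKAGE = "com.example.game"   # hypothetical package name of the game under test
INTERVAL_S = 30                # sample every 30 seconds
DURATION_S = 30 * 60           # observe for 30 minutes

def sample_gfxinfo(package):
    """Return (total_frames, janky_frames) parsed from 'dumpsys gfxinfo'."""
    out = subprocess.run(
        ["adb", "shell", "dumpsys", "gfxinfo", package],
        capture_output=True, text=True, check=True,
    ).stdout
    total = re.search(r"Total frames rendered:\s*(\d+)", out)
    janky = re.search(r"Janky frames:\s*(\d+)", out)
    return (int(total.group(1)) if total else 0,
            int(janky.group(1)) if janky else 0)

start = time.time()
prev_total, prev_janky = sample_gfxinfo(PACKAGE)
while time.time() - start < DURATION_S:
    time.sleep(INTERVAL_S)
    total, janky = sample_gfxinfo(PACKAGE)
    new_frames, new_janky = total - prev_total, janky - prev_janky
    jank_pct = 100.0 * new_janky / new_frames if new_frames else 0.0
    print(f"{time.time() - start:6.0f}s  frames={new_frames:5d}  janky={jank_pct:5.1f}%")
    prev_total, prev_janky = total, janky
```

This only captures frame statistics; pairing it with GPU frequency or temperature readings would make the throttling mechanism itself visible, but those interfaces differ between SoC vendors.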
Flunk - Thursday, November 12, 2015 - link
Those games would need to implement a benchmark mode. Android isn't really conducive to using 3rd party programs to test games.
Kutark - Sunday, November 15, 2015 - link
Games on Android... The mere utterance of the phrase makes me /facepalm...
darkich - Sunday, November 15, 2015 - link
The mere utterance of your comment makes me feel sick. Short-sighted and ironic beyond belief.
OCedHrt - Thursday, November 12, 2015 - link
Article still says MP8.
s.yu - Saturday, November 14, 2015 - link
MP12?? I remember (though from another source) that the Kirin 950 uses an MP4 configuration of the T880?? This should equate to at least twice the Kirin 950's graphics power then?
colinisation - Thursday, November 12, 2015 - link
Wow, very surprising considering the comments by other manufacturers that phones are really not GPU bound. Maybe wide and slow gives better power consumption.
Andrei Frumusanu - Thursday, November 12, 2015 - link
Wide and slow is definitely more efficient, if that is indeed what they're doing here instead of going wide and fast.
jjj - Thursday, November 12, 2015 - link
In part it is your fault. Nobody is testing actual games, so they go for marketing wins in empty benchmarks to match the iToys, even if the GPU might be overdimensioned.
MP12 does seem like it should easily beat the SD820, assuming the T880 performs as ARM claims.
Andrei Frumusanu - Thursday, November 12, 2015 - link
I've been quite harsh on the topic in the 7420 review and it's something I want to continue to actively address in the future.
http://anandtech.com/show/9330/exynos-7420-deep-di...
jjj - Thursday, November 12, 2015 - link
I am unhappy about synthetics, but I've got other complaints too.
In battery tests there are a bunch of factors that have a relevant impact, but the biggest one in AT's case has to be the fact that you guys load a page every X minutes while users browse in bursts. This means that the time the screen is on is very different in testing vs. the real world and renders the results irrelevant (a toy illustration of that gap is sketched after this comment). Some other parts have different power consumption in this case too, but the screen should be the biggest. Testing with nothing in the background makes it worse, especially when you've got a big difference in the number and size of cores as well as RAM (less RAM is favored). Then there are huge differences in signal strength, and when you test close to the signal source you favor the ones that have poor antennas and would do much worse when further away from the tower/router. Some of these of course impact perf testing too. What pages are loaded matters too; aiming for just the home page of some popular sites is not ideal. Sites with the most individual users are not always the ones with the most clicks. Home pages can be very different from other pages on the site. Users don't browse just very popular sites, which tend to not be very heavy. And then there needs to be a balance between mobile and desktop versions.
Not testing signal strength is not good, since it's important and there are huge differences.
Motion blur is a big problem in cams, and OEMs as well as the press aren't educating users about it at all. In many cases users think exposure is about brightness, and motion blur is self-inflicted since the user just makes it brighter without knowing how it all works.
Just the few things that bug me the most, to keep it short.
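To make the screen-on-time point concrete, here is a toy back-of-envelope with entirely hypothetical numbers (not measured data from any test or device):

```python
# Illustrative only: how screen-on time can differ between a fixed-interval
# page-load script and bursty real-world browsing of the same number of pages.
# All numbers below are hypothetical assumptions.
pages_viewed = 120

# Scripted test: one page load every 90 s with the display lit the whole time.
scripted_screen_on_h = pages_viewed * 90 / 3600   # 3.0 h

# Bursty use: the same pages read in short sessions, ~30 s of screen-on time
# per page, display off between sessions.
bursty_screen_on_h = pages_viewed * 30 / 3600     # 1.0 h

print(f"scripted: {scripted_screen_on_h:.1f} h screen-on")
print(f"bursty:   {bursty_screen_on_h:.1f} h screen-on")
# If the display is the largest single consumer, the scripted run weights
# display power about 3x more heavily than the bursty pattern for the same
# amount of browsing, which shifts how SoC efficiency shows up in the result.
```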
MrSpadge - Thursday, November 12, 2015 - link
"... but got other complains too"Well, that's kind of self-explanatory in (almost) each of your posts. I'm not saying you're wrong, but you're certainly complaining a lot.
s.yu - Saturday, November 14, 2015 - link
I gotta agree on the signal strength. As for the web page loading, it may not be a good criterion at all, or at least should be considered along with other criteria.
edzieba - Thursday, November 12, 2015 - link
Not so surprising for Samsung, they've got GearVR releasing within a week. VR really needs more GPU grunt, so if they intend to continue GearVR for their next generation of phones, they need to focus on having a better GPU.
darkich - Monday, November 16, 2015 - link
^this
zeeBomb - Thursday, November 12, 2015 - link
This is going to be one heck of a chipset. 12-cluster GPU, own custom cores... What more can 2016 ask for?
jimjamjamie - Thursday, November 12, 2015 - link
Less throttling and overheating perhaps?
BMNify - Thursday, November 12, 2015 - link
"zeeBomb -What more can 2016 ask for?"samsung to finally use wideIO2 ram in all their devices would be a very good start.
http://www.samsung.com/semiconductor/insights/news...
Samsung Develops Mobile DRAM with Wide I/O Interface
Seoul, Korea on Feb 21, 2011
https://www.youtube.com/watch?v=O4esBZsWiWY
ARM TechCon 2013 - Wide I/O with Ian Huh, Samsung
iwod - Thursday, November 12, 2015 - link
So..... Apple, where is your SoC with a modem, or your own modem chip?
Pissedoffyouth - Thursday, November 12, 2015 - link
They use Qualcomm currently due to Qualcomm having pretty much the best and most compatible modems. Maybe they'll switch to Intel next year.
theduckofdeath - Thursday, November 12, 2015 - link
Qualcomm doesn't have the best modems, they just have the widest licenses on LTE, which was the main reason Exynos never appeared on the GS5 in the US. Samsung has apparently signed a license agreement with Qualcomm to be able to use something not-Qualcomm.
MrSpadge - Thursday, November 12, 2015 - link
If the Qualcomm one works very well (best), then that's fine. Not much to be gained by building it yourself.
iwod - Thursday, November 12, 2015 - link
Cost; these modem chips are as expensive as the CPU/SoC itself.
SydneyBlue120d - Thursday, November 12, 2015 - link
I do not understand if this is "custom" or not: I mean, the Snapdragon 820 uses a custom core, while this Exynos seems to be A53, so how can this be considered "custom"? Because it is a slightly modified A53 core?
WorldWithoutMadness - Thursday, November 12, 2015 - link
Reading it more carefully will help you by miles.
Here to help you, keywords: Exynos M1, 4+4 big.LITTLE, SCI
Wilco1 - Thursday, November 12, 2015 - link
It uses both a custom core (M1) and a licensed core (A53). A SoC consists of lots of parts, each of which might be custom or licensed, and most use a mix of both.
SydneyBlue120d - Thursday, November 12, 2015 - link
Thanks a lot, so the big core is custom and the LITTLE is a "standard" A53. Very interesting.
lilmoe - Thursday, November 12, 2015 - link
I've commented once on a "best of both worlds" scenario where custom wide cores are employed along with big.LITTLE. This seems to be it. All we need now is a nice little deep dive from Andrei to see the real limits of this SoC.
I'm not too worried about the GPU's clock speed and what it shows (or advertises) in benchmarks. Since it's an MP12 implementation, the "sustained" performance level *should* be much better and more efficient than in years past.
I'm interested to see what Samsung also added to compete with Qualcomm's new custom ISP/DSP and DAC. Wide custom cores are great and all, but at this point, I'm more interested in the other "goodies" SoCs employ.
tuxRoller - Thursday, November 12, 2015 - link
The article states that the M1 cores are "only" 3-wide, or the same as the past few big ARM cores.
lilmoe - Thursday, November 12, 2015 - link
It also mentions a 30% improvement. I'm going to assume it's a clock-for-clock improvement, because the performance boost plus the clock speed difference falls in line with the following benchmark results:
http://www.sammobile.com/2015/10/06/samsungs-mongo...
Previous sources are saying that the gap between the M1's single-core performance and Apple's Twister is similar to the gap between the 7420's Cortex A57 and Apple's Typhoon. Multi-core performance is speculated to be insane. But that's all speculation. It'll be interesting when the GS7 is released and performance data starts showing up.
extide - Thursday, November 12, 2015 - link
Yeah, x2 that this is not a wide core -- only 3 wide, nothing amazing. BUT remember that Apple's first custom core (Swift, I think) was not super amazing; it was really their second one that took everyone by surprise. Hopefully Samsung is on a similar path. Time will tell.
Frenetic Pony - Thursday, November 12, 2015 - link
I'm guessing their homegrown CPUs aren't living up to their own hopes, given that the claims here aren't even close to the claims made by Qualcomm for the 820 and that they hired Jim Keller last month. Will be interesting to see which, if any, models of the S7 or whatever end up with this inside.
theduckofdeath - Thursday, November 12, 2015 - link
Qualcomm still claim there's no heat issue with the 810. Maybe take what companies say about their own products with a pinch of salt?
Nvidia always say their next-next SoC will be many times better than anything ever seen....
Frenetic Pony - Thursday, November 12, 2015 - link
Certainly Qualcomm screwed up the 810, but previous to that they were THE high-end Android SoC IHV. Samsung continuously produced its own SoCs, and continuously used Qualcomm's in anything where it mattered. The only reason they used their own for this last generation was that Qualcomm didn't have its own cores, so there was no perf advantage there, and Samsung could jump to its own 14nm LPE process for its SoC to get an advantage.
Which isn't to say Qualcomm isn't overhyping the 820, but one can't assume Samsung isn't overhyping its own SoC just as much. I wouldn't be surprised if the 820 shows up in "top tier" markets like Europe and the US, while Samsung's own SoC gets less performance per watt and shows up in markets they don't deem as competitive.
extide - Thursday, November 12, 2015 - link
Wait, Jim Keller went where? Samsung, or Qualcomm?
ratte - Thursday, November 12, 2015 - link
Samsung
Kvaern2 - Thursday, November 12, 2015 - link
Is he like the bicycle repairman for companies struggling with CPU design?
TechGod123 - Friday, November 13, 2015 - link
He really is, isn't he? He did Apple's custom SoCs and Zen, and next will be a Samsung SoC that he'll seriously spruce up...
jospoortvliet - Monday, November 16, 2015 - link
Let's hope so. It will take 3 years before we see results of course, but by then I'll be glad the industry runs Keller chips :-)
sseemaku - Thursday, November 12, 2015 - link
The million-dollar question is, will Samsung use the SD820 or Exynos in the next Galaxy phone? They now have a good SoC. Even if this SoC doesn't beat the SD820, if it's close enough, then Samsung can ditch Qualcomm once again!
lilmoe - Thursday, November 12, 2015 - link
They're allegedly using both.
Laststop311 - Thursday, November 12, 2015 - link
Well, let's think back: every time Qualcomm has used a custom core they have obliterated the competition, not only due to their custom cores but also to their superior modem integration offering the lowest power consumption of anything. Yes, QC messed up when they scrapped their next custom core because it was still 32-bit and released a standard ARM design to get them into 64-bit. They would have been better off releasing the planned 32-bit custom core instead of the abomination that was the 810, but people are idiots and would've said it sucked because 32-bit. Now they are back on track with their first real planned 64-bit chip.
I am calling it now: their domination shall continue. This Adreno graphics they are using now is going to kill the standard Mali designs used by Samsung. Why do you think Samsung had to put the huge MP12 version on their SoC? They knew the new Adreno was going to be massively powerful. Maybe the MP12 Mali helps them somewhat keep up, but at what power cost? And nobody can compete with the on-die integrated modems from Qualcomm; they are the fastest and lowest-power-consuming modems available. The 600/150 speed it can push maximum will ensure that the modem is never the weak link in the connection as well.
People only think the Exynos is so great because Qualcomm faltered on one chip, making a bad decision to scrap the next 32-bit design and release a standard 64-bit ARM design just to jump on the 64-bit bandwagon. I've been saying this from day 1 when Qualcomm released the standard 64-bit cores, that they will be right back on top when they release their true Krait successors and not that simple stopgap solution. Exynos has sucked and always will suck.
I won't be upgrading anyways. I'd still be on the Galaxy Note 2 if it wasn't for T-Mobile putting band 12 700MHz LTE in my area and my phone not supporting it. So I was forced to get a Galaxy Note 4, which I love, but really everything I ran worked perfectly fine on the Note 2; the ability to use band 12 LTE was well worth it though, and the Note 4's extra battery life was a nice bonus too. I can use the phone as heavily as I want and I know it will make it 16 hours a day with sporadic charging in my car when I drive; the Note 2 I had to baby a little or I'd find myself without a charge.
Even if the 820 is totally amazing, the speed in my Note 4 is more than enough for everything I need, as is the battery life. The Galaxy Note 6 will have to bring something truly revolutionary for me to upgrade, and nothing has done that for me so far. Most likely I will stick to this phone until 600MHz starts being deployed and I need a new phone to take advantage of the new frequencies, as T-Mobile is sure to win a lot of 600MHz, though I fear I won't be seeing it, as they said it will mainly be for areas where they don't have 700MHz coverage and I have 700MHz coverage already. It will still be useful for travel though. I'm still locked in on T-Mobile's $70 a month unlimited everything with 7GB of 4G LTE tethering, and with the phone rooted you can actually cheat the tethering limit. And you don't get throttled at 21GB; you only get throttled after 21GB if a cell site is congested. Basically, once a cell site is congested, anyone over 21GB on unlimited gets lower priority, as well as people on MVNOs.
TechGod123 - Friday, November 13, 2015 - link
But the A9 still beats this SoC and Qualcomm's at single thread. The A9 has a beastly GPU and I'm not sure if Sammy or Qualcomm can compete.
geekbot - Saturday, November 14, 2015 - link
The Mali T880MP12 will be more powerful than the PowerVR GPU in the A9. The T880MP12 will be a little more than twice as powerful as the T760MP8 currently being used in the Exynos 7420. As for the CPU, the single-thread performance of the new Exynos will be a little less than the A9's, but as it is quad-core, it is absolutely going to destroy the A9 in multi-core performance. It will be really amazing if the custom core in the Exynos reaches (is only slightly behind) the single-thread performance of the A9 despite having approximately half the die size.
ciderrules - Tuesday, November 17, 2015 - link
So Samsung is claiming a 30% improvement over the 7420, yet the "leaked" score posted previously for the 8890 shows single core at 70% faster than the 7420.
I doubt Samsung would claim "only" 30% if their custom cores were actually a full 70% faster than the A57 cores used in the 7420. Which means that "leaked" score is probably wrong and the real single-core score for the 8890 would be more like 1750. A far cry from the 2500 the A9 gets, and the A9 is clocked 500MHz slower to boot.
Apple is still way ahead of Samsung here.
Nenad - Friday, November 20, 2015 - link
It is possible that 30% refers to the improvement at the same frequency (clock for clock), while 70% refers to the total improvement that also assumes a higher frequency. Or not ;p
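A quick sanity check of that reading: the 7420's A57 cluster ran at 2.1GHz, while the 30% and 70% figures below come from the claim and the leak discussed above and are treated here as assumptions rather than confirmed data.

```python
# Rough arithmetic reconciling a 30% clock-for-clock gain with a 70% total gain.
# The 2.1 GHz figure is the Exynos 7420's A57 clock; the gains are the claimed
# and leaked numbers from the discussion above, used here as assumptions.
ipc_gain = 1.30          # claimed per-clock improvement over the A57
total_gain = 1.70        # "leaked" overall single-core improvement
a57_clock_ghz = 2.1      # Exynos 7420 big-cluster clock

implied_clock_ratio = total_gain / ipc_gain             # ~1.31
implied_m1_clock = a57_clock_ghz * implied_clock_ratio  # ~2.75 GHz

print(f"implied clock increase: {implied_clock_ratio:.2f}x -> ~{implied_m1_clock:.2f} GHz")
# Unless the M1 clocks around 2.7-2.8 GHz, the 30% and 70% figures can't both
# be right as stated: either the leak overshoots or the 30% isn't clock-for-clock.
```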
alex3run - Friday, November 13, 2015 - link
Adreno was never a GPU to beat. And I doubt that the T880MP12 will struggle to compete with the Adreno 530. Apple's GPU, the PowerVR GT7600, seems to be a stronger player, at least in real games, as synthetics often lie.
"Adreno graphics they are using now is going to kill the standard mali designs used by Samsung"It's still unknown that the improvement of the new Adreno is purely architechture gain or they put more "cores" also.
"Maybe the mp12 mali helps them somewhat keep up but at what power cost"
More "shader cores" mean MORE power efficiency at the same workload. The real cost here is the bigger die size.
"People only think the exynos is so great because qualcomm faltered on 1 chip making a bad decision to scrap the next 32 bit design and release a standard 64 bit arm design just to jump on the 64 bit bandwagon" "Exynos has sucked and always will suck"
You're wrong. Many people including me still remember how crap the Crapdragon S3 was. So it's not only once for Qualcomm. The Exynos 4210 in S2 and Note 1 was simply lightyear ahead of Cradragon S3 that time.
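To put rough numbers on the wide-and-slow argument: dynamic power scales roughly with core count × frequency × voltage squared, so the sketch below compares a hypothetical MP8 at full clock with an MP12 run slower at a lower voltage for the same total throughput. The scaling factors are illustrative assumptions, not measured Mali or Adreno figures.

```python
# Illustrative dynamic-power comparison: P_dyn ~ cores * f * V^2.
# All factors are hypothetical, chosen only to show the wide-and-slow trade-off.
def relative_power(cores, freq, volt):
    return cores * freq * volt ** 2

# Baseline: 8 shader cores at full clock and nominal voltage.
narrow = relative_power(cores=8, freq=1.00, volt=1.00)

# Wider part: 12 cores clocked at ~2/3 speed (12 * 0.67 ~ 8, same throughput),
# which in turn allows a lower supply voltage (assumed ~0.85 of nominal).
wide = relative_power(cores=12, freq=0.67, volt=0.85)

print(f"same throughput, relative power: {wide / narrow:.2f}x")   # ~0.73x
# The saving comes from the V^2 term; the cost is ~50% more GPU die area.
```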
jakeuten - Thursday, December 31, 2015 - link
Ahhh yes, the Exynos S2 on AT&T destroyed the S2 Skyrocket. I remember those days.
poohbear - Thursday, November 12, 2015 - link
Wow, so Qualcomm is left out in the cold. Not boding well for Qualcomm....
geekbot - Saturday, November 14, 2015 - link
I am more curious to see what Apple will be doing with the A10 chip next year. The increase in clock speed plus the improvement in process (20nm to 14nm) helped them with their performance this year. As the 10nm tech will only be ready in 2017, I'm not sure how much of an improvement they can make. They still have room to increase the clock speed from 1.8GHz to something like 2.2, or to add an additional core.
jospoortvliet - Monday, November 16, 2015 - link
I am curious too as, by now, they have used all 'obvious' technologies to make their CPU cores perform great in single threaded workloads. Now they will have to innovate and break new ground...