  • twizzlebizzle22 - Thursday, February 12, 2015 - link

    The speed on modern/flagship SoCs is phenomenal. The right implementation and power savings are what I'm focused on this year.
  • ddriver - Thursday, February 12, 2015 - link

    Either there is a typo in the "PNG Comp ST" test, or Exynos 5433 is ~1000 times faster than the competition...
  • MrCommunistGen - Thursday, February 12, 2015 - link

    Probably a comma instead of a decimal point. You'll see that the Multithreaded PNG score for the Exynos 5433 is roughly in line with the other SoCs and much "lower" than the Single Threaded score.
  • Mondozai - Thursday, February 12, 2015 - link

    "The speed on modern/flagship SoCs are phenomenal."

    Yes, but not this chip. It's going to be Qualcomm's main chip in 2015, it's still getting beaten by year old tech. Then again, the OEMs want a "total solution" and while Nvidia is crushing them in the GPU benchmarks, Nvidia still doesn't have a good integrated LTE solution, for example.

    Nevertheless, GPU power matters. This SoC will struggle with 4K, and it's supposed to be the high-end. Disappointing.
  • Makaveli - Thursday, February 12, 2015 - link

    Does 4k really matter that much on a 5' display?
  • fokka - Thursday, February 12, 2015 - link

    i say no, but sadly that is where the market will go, especially on phablets and tablets. there already are rumours about an lg g4 with a 1800p screen and as we see on qualcomm's reference platform, i'm pretty sure we'll see some 4k tablets enter the market pretty soon.
  • Frenetic Pony - Friday, February 13, 2015 - link

    Then don't buy their bullshit, that's easy enough. Anything beyond 1080p for sub-6" is ridiculous and wasteful.
  • Uplink10 - Friday, February 13, 2015 - link

    I think anything beyond HD for a smartphone is worthless; the difference is not worth the price and energy. Do people need 4K, Full HD, QHD screens because they edit photos and videos on their smartphones, which we then see in the cinemas?
  • xnay - Saturday, February 14, 2015 - link

    I totally agree with you. And I'm waiting impatiently for the new HTC M9 because it's said to be using a 1080p display.
  • Laststop311 - Friday, February 20, 2015 - link

    I'm with you. I wish they would stick to standard full HD and focus on: lowering reflectance of outside light (better performance in this area is critical, as it allows easier viewing in sunlight without having to crank the brightness up and use more power); luminance per watt, for either a brighter screen or the same brightness at less power (which is easily possible if they quit using smaller pixels that block more of the backlight); and better color accuracy and gamma, with an even higher-bit screen to display more colors while keeping accuracy high. Pre-calibrated with professional tools at the factory, the way Dell does with their high-end U3014.

    Almost 100% of the people I know would trade fewer pixels for a couple extra hours of battery life. Fewer pixels = less power used by the GPU, a lower-power backlight, less heat generated by the backlight, a smaller backlight (can make the phone a bit thinner), and a more responsive phone: when scrolling, fewer pixels have to be rendered for the scroll animation, so it's smoother and faster and uses less energy. And there isn't really a downside. You would have to have superhuman eagle eyes to see the difference between a 1080 RGB stripe and a 1440 RGB stripe. Many more benefits to sticking with 1080. Anything higher is utterly ridiculous for a 5-6 inch phone.

    I could honestly get by with 1280x720 or 1366x768 or whatever it is. I loved the screen on my 5.5" Galaxy Note 2 with its RGB-stripe 1280x720 AMOLED. Everything looked plenty crisp, and switching to the Note 4, sure, things do look a bit more crisp, but just imagine the battery life saved if it were 1280x720 (rough pixel math below). Bet hours would be added to it.
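
    A quick sketch of that pixel math (standard panel resolutions; actual power scaling depends on more than raw pixel count, so treat this as illustrative):

        # Pixel counts of common phone resolutions, relative to 1080p.
        resolutions = {
            "720p":  (1280, 720),
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "4K":    (3840, 2160),
        }
        base = 1920 * 1080
        for name, (w, h) in resolutions.items():
            print(f"{name}: {w * h / 1e6:.2f} MP, {w * h / base:.2f}x 1080p")
        # 720p pushes ~0.44x the pixels of 1080p; 1440p is ~1.78x; 4K is 4.00x.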
  • Laststop311 - Friday, February 20, 2015 - link

    Also forgot to mention: with less heat generated by a backlight that doesn't need to be as strong, this also gives more thermal room to the SoC, so it doesn't have to throttle at all, or throttles in fewer situations.
  • Antronman - Saturday, February 14, 2015 - link

    It's all about the buzzwords.

    The last year or so, "4k" and "UHD" have been the buzzwords. If it's sub 3k, it isn't acceptable. 4k is decent. In two years people will be complaining about a lack of 8k panels.
  • Wwhat - Sunday, March 15, 2015 - link

    People put them in VR headsets, and then resolution is never enough.
  • open4g - Wednesday, March 18, 2015 - link

    One factor driving 4K onto smartphones, phablets, and other small-screen devices will be Miracast and entertainment stations. I don't think 4K will be what sets the majority of consumer expectations for mobile devices for another 3-4 years. By that time, SoCs should be manufactured on smaller geometries that reduce power consumption for H.265 by 25%-50% compared to the first-generation HEVC-capable mobile chips. And there should be more available content.

    There are limits to human visual acuity on small-screen devices that 4K butts up against. What will eventually be needed is partly anticipated by dual-camera GPU work, particularly the encoding side, which can enhance perceived resolution on small screens. It's a topic for another discussion.
  • Nandy - Saturday, February 14, 2015 - link

    You'll need the highest resolution you can get if you use VR.
  • leliel - Thursday, February 12, 2015 - link

    On a five foot display? Yeah, it helps.

    Now if we're talking 5"... I've used 480x800 and 1080x1920 phones for an extended period and I suspect even current screens like the latter are borderline overkill. 720x1280 might have been the sweet spot for performance/battery. 4K is definitely a negative feature in my books.
  • SilthDraeth - Thursday, February 12, 2015 - link

    If I could upvote you, instead of replying I would have. LOL the 5' vs 5" comment is priceless.
  • Notmyusualid - Thursday, February 12, 2015 - link

    Yes, for the love of the Lord - please switch to Disqus already!

    Allows us to edit / delete comments too...

    And I believe (mistakenly?) it's free, too.
  • YUGogo - Friday, February 13, 2015 - link

    I'd say 1080p will do just fine on a performance-battery ratio. I'm OK with 720p atm (I use a BlackBerry Z30), but I can see its limitations compared to 1080p. While not everyone can (the whole "retina display" thing), 20/20 vision isn't actually "perfect," and there are other measures of vision beyond that. The ideal resolution is the one where AA is no longer needed. The one-arcminute-at-12" figure being the maximum is false; it's not uncommon for humans to resolve as low as 0.3 arcminutes at 12" for some forms of acuity (rough math below). 2160p is "sufficient" for almost any human, but I'd be very happy with "just" 1080p for phones. I don't want to need a 4000mAh battery just to last the day.
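
    Rough sketch of the arcminute-to-PPI math behind those numbers (my own back-of-the-envelope, assuming simple geometry and a 12" viewing distance):

        import math

        # PPI at which one pixel subtends a given visual angle at a given distance.
        def ppi_for_acuity(arcmin: float, distance_in: float) -> float:
            pixel_size_in = distance_in * math.tan(math.radians(arcmin / 60.0))
            return 1.0 / pixel_size_in

        for acuity in (1.0, 0.5, 0.3):  # arcminutes resolvable per pixel
            print(f'{acuity} arcmin at 12": {ppi_for_acuity(acuity, 12.0):.0f} PPI')
        # ~286 PPI for the classic 1-arcmin figure, ~955 PPI at 0.3 arcmin.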
  • Uplink10 - Friday, February 13, 2015 - link

    That is what they usually do: phones (the ones from LG, Samsung, Motorola...) get bigger batteries, but they also get higher resolutions, and in the end you are a little better off instead of a lot. The Droid Turbo has a 3900 mAh battery but comes with a 1440p display.
  • jerrylzy - Friday, February 13, 2015 - link

    "Yes, but not this chip. It's going to be Qualcomm's main chip in 2015, it's still getting beaten by year old tech."

    Please specify what "year old tech" is. The first mobile A57 implementation is the Exynos 5433, which is by no means "year-old tech." Also, Snapdragon 810 uses a newer core revision, r1p1. I cannot see your point here.

    "Nevertheless, GPU power matters. This SoC will struggle with 4K and its supposed to be the high-end. Disappointing."

    Excluding Apple and NVIDIA, Adreno 430 has the highest GPU performance in the mobile space. It also has much higher power efficiency compared to ARM Mali-T760 implementations. Adreno 430 may not be competitive enough in the tablet space, but it is one of the best smartphone GPUs available right now.
  • jerrylzy - Friday, February 13, 2015 - link

    I also doubt that a 4K display will appear any time soon on 5.5" or smaller phones.
  • mkozakewich - Friday, February 13, 2015 - link

    Also, it'll do just fine on 4K. The benchmarks were running complex graphical scenes, like games, but most games run at lower resolutions anyway.
  • Ethos Evoss - Sunday, February 22, 2015 - link

    Yeah, they're just tricking people, putting the best HW into smartphones WHICH ppl will NEVER use, like LTE Cat 6. Yeah, OK... please, Qualcomm, show me who supports it RIGHT now so that I can fully use it...
    And they're implementing Cat 9, jesus christ, which will be standard, like, within 5-10 years?
    Come ooon, Qualcomm, what for?
    Packing so many things into the SoC we won't even use within 5 years...
    And exactly, the battery still lasts half a day, right?
  • douglord - Thursday, February 12, 2015 - link

    VERY disappointing performance. This will be crushed by the iPhone 7 in both CPU and GPU. And this can't even compete with K1 in the tablet market. The X1 should own the Android tablet space.
  • kron123456789 - Thursday, February 12, 2015 - link

    Yeah, it should, but unfortunately it won't. Just like previous Tegra chips. Tegra X1 still has impressive GPU performance though.
  • JarredWalton - Thursday, February 12, 2015 - link

    Tegra X1 needs to get power far lower to be in most tablets and smartphones, and given the lack of success with getting Tegra K1 into lots of devices I wouldn't expect X1 to fare any better.
  • blanarahul - Thursday, February 12, 2015 - link

    I'd like to see AnandTech's opinion on big.LITTLE vs. Qualcomm's, Intel's, and Apple's approaches.
  • blaktron - Friday, February 13, 2015 - link

    Won't high-end Android tablets start shipping with Core M chips this year? If not, why not? There isn't any technical reason. There really isn't any way for other SoC makers to catch up once manufacturers move to big chips...
  • kron123456789 - Friday, February 13, 2015 - link

    The Core M chip costs $281 per unit.
    http://ark.intel.com/products/family/83613/Intel-C...
    That's why nobody will put it in a tablet priced under $1000. And that is too much for an Android tablet, even a high-end one.
  • Uplink10 - Friday, February 13, 2015 - link

    That price is too high and is not what manufacturers actually pay. Bay Trail chips also have a high list price, but for the price of a Bay Trail chip you get a motherboard with the chip included. Also, that chip is overkill for a tablet; with that many features (vPro, remote BIOS...) it should be used in a server.
  • metayoshi - Friday, February 13, 2015 - link

    The price on that website IS for manufacturers:

    Recommended Customer Price (RCP) is pricing guidance only for Intel products. Prices are for direct Intel customers, typically represent 1,000-unit purchase quantities, and are subject to change without notice. Prices may vary for other package types and shipment quantities. Listing of RCP does not constitute a formal pricing offer from Intel.
  • Taneli - Thursday, February 12, 2015 - link

    K1 is tablet-only with no integrated modem, and X1 is closer to a 10W TDP, so it needs active cooling. The Snapdragon here is a totally different chip. And Apple doesn't sell SoCs.
  • dragonsqrrl - Thursday, February 12, 2015 - link

    Errr... what? 10W TDP? Link please?
  • kron123456789 - Friday, February 13, 2015 - link

    Well, Nvidia claimed that Tegra X1 consumes 10W while running The Elemental demo.
  • jjj - Thursday, February 12, 2015 - link

    Funny how you insist on comparing it to the 805, not the 5433, when commenting on the Geekbench results. That kind of behavior shows a desire to please the maker of the product, not to inform the users, yet it is a persistent AnandTech problem.
    I might have missed it, but you don't seem to even mention clocks either; the SD 810 is at 2GHz, I assume, and the Exynos is at 1.9GHz, and 5% is plenty. Nothing on throttling, nothing on power; you basically help them repair their damaged image without having the complete data. That's unethical, and you just accept being manipulated into it.
    As for overheating, we don't know what clocks they targeted, and I haven't noticed any mention of which revision you are testing. As for delays, there are already delays (that's not debatable) compared to their most optimistic previously disclosed timing.
    Just showing the perf numbers we already knew, without looking beyond that, doesn't really help.
  • A5 - Thursday, February 12, 2015 - link

    I really doubt Qualcomm was going to let them take apart their reference tablet to get power numbers.

    That said, 2015 seems like a good "skip year" on the Android flagship front unless the Exynos 7420 is a real blockbuster. I don't really see anything in this article that supports the conclusion at the end.
  • TylerGrunter - Thursday, February 12, 2015 - link

    I fully support you there. Snapdragon 810 seems to be competitive with the Exynos 5433 and A8 chips; the sad news is that those are 6 months old, and they are not the SoCs it will have to compete with.
    The Exynos 7420 has been a beast in preliminary benchmarks, so I'm hoping for that (or for Intel to come up with something) so we don't have what you are calling a "skip year".
    And for tablets we'll always have Intel or Nvidia.
  • JimRamK - Thursday, February 12, 2015 - link

    What about Intel chips?
  • A5 - Thursday, February 12, 2015 - link

    AFAIK, they don't have any design wins in phones. I don't think 14nm is going to change that, but we'll see.
  • phoenix_rizzen - Thursday, February 12, 2015 - link

    The Asus ZenFone 2 uses an Intel chipset.

    There's a couple of other ones as well.
  • blanarahul - Thursday, February 12, 2015 - link

    It's probably at Snapdragon 800 level. Intel won't compete with the S810 and the best Exynos chips of the world until Airmont... Well, at least I hope so. *sigh*
  • serendip - Friday, February 13, 2015 - link

    Intel on Android also has problems with app compatibility and speed, despite Intel's assurances to the contrary. Apps with ARM-compiled native code
  • serendip - Friday, February 13, 2015 - link

    Apps with ARM-compiled native code either don't run or run slowly under the code translator. It's almost like Intel is giving away these phone and tablet Atom SoCs to get a foot in the mobile market. I'm quite happy with my cheap Windows 8.1 tablet, but Android on Intel has a way to go yet.

    (the lack of comment editing isn't fun, especially when the Submit Comment button is so easy to click)
  • phoenix_rizzen - Friday, February 13, 2015 - link

    That's (mainly) for games developed using the Android NDK, correct? Doesn't the switch to the Android Runtime (ART) and pre-compiling the apps at install time mitigate this? Or does ART not apply to NDK apps?
  • jjj - Thursday, February 12, 2015 - link

    Look at the phrasing from 2 articles just a few days apart.
    In this one: "Thanks in large part to the new cryptographical capabilities of the ARMv8 cores, Snapdragon 810 gets off to a very good start in Geekbench 3's integer benchmarks ... Snapdragon 810's overall performance improvement here is a rather large 45%, though if we throw out the especially large gains that come from Lua MT, the overall performance advantage is closer to 30%."
    And from the Note 4 review: "GeekBench's integer benchmarks paint a similar picture - if we disregard the huge boost to the cryptography scores we see an average advantage of 31% for the Exynos 5433's A57 cores, or 29% when we normalize for clock speeds."

    So for the Note they clearly point out and discard the encryption gains, and they normalize for clocks. That's good and fair and the proper way to look at it (although quantifying the importance of the encryption gains would be a plus).
    Here, not only is encryption left alone, but clocks are not even mentioned; some readers might not even be aware that there is a clock difference.
    The tone and objectivity are fundamentally different: a nice review for the Note, while here it's all about easing concerns and making the SD810 look good.
  • Sushisamurai - Thursday, February 12, 2015 - link

    +1. It's not unethical, because it's on a reference platform that Qualcomm is sourcing out. It'd be "unethical" (your term, not mine - I would use the word disappointing) if they didn't investigate throttling and power for actual retail/shipping devices. But I haven't been disappointed yet, so...
  • gonchuki - Friday, February 13, 2015 - link

    2015 was already shaping up to be a good skip year, with flagships going full retard at 6" in late 2014 and OEMs gimping the specs of 5" phones just to increase sales of their higher-margin phablets.
  • Andrei Frumusanu - Thursday, February 12, 2015 - link

    I added the clocks for the CPU; apologies, the article was still being edited when published.

    As for power and thermals, we have no way to test these until we get a shipping review device with the SoC. Josh had only a couple of hours on hand with the MDP, making more extensive testing not possible. Calling that unethical is pretty harsh.
  • warreo - Thursday, February 12, 2015 - link

    Andrei, power and thermals aside, I notice you didn't address why there was not a greater focus on Exynos vs. S810. At a bare minimum, you should include the % difference vs. the Exynos in the table in addition to the difference vs. the S805. The absence of any kind of discussion there is what makes it easy for people to cry "unethical," NOT the absence of power and thermals, which, as you said, was due to lack of testing time.
  • JoshHo - Thursday, February 12, 2015 - link

    Comparing the Snapdragon 810 to the Exynos 5433 wouldn't be of much value as the S810 won't be competing with the Exynos 5433 in flagship 2015 devices. We hope to make a valid comparison to an Exynos SoC in the near future.
  • warreo - Thursday, February 12, 2015 - link

    I disagree. This article is already primarily a comparison of S810 and S805, which like the Exynos will obviously not be competing for flagship 2015 devices. Does that make the comparison invalid? No, it's just a matter of context. People know that Exynos 5433 is an older SoC, but it's still interesting to see how S810 compares to it, just like it is interesting to see how it compares to S805.

    In reading this article the most interesting takeaways that I got are that on the CPU side, S810 is in a dead heat with Exynos (or barely outperforms it), and on the GPU side, there was a more substantial outperformance (call it 20-25%) vs. Exynos. The sad thing is that I had to draw that conclusion myself, because it was barely addressed in the article.

    As an aside, can someone please learn me on how this performance is considered good when Exynos 7420 is right around the corner? Am I missing something?
  • Andrei Frumusanu - Thursday, February 12, 2015 - link

    The vast majority of users will want to compare performance to the S805, seeing as the 5433 is only found in one variant of the Note 4 and probably won't be seeing any other implementation.

    As for your last point, we just can't comment on performance of unreleased products.
  • warreo - Thursday, February 12, 2015 - link

    Unfortunately, I still disagree. While most people will never use the 5433 because it is limited to the Note 4, it is still a relevant comparison because the S6 will use the 7420, which is the next iteration of the 5433. Lest we forget, there are (likely) millions of people who will buy or consider buying the S6, making the comparison against 5433 an early preview of 7420 vs. S810 which I'm willing to bet is HIGHLY interesting to readers of this site.

    No direct comments on the unreleased 7420 would be necessary; just a more in-depth discussion of how the S810 fares against the 5433 would be helpful and would let readers extrapolate to the 7420 themselves. The reality is, the data and benchmarks are all there; I'm just a bit mystified why it is apparently not worth the effort to add the % difference to the table and discuss those results more in the text of the article.

    I'll make this my last comment on the matter as you've at least shown you've thought about the matter and had a reason as to why you didn't really discuss Exynos. I remain cynical as to whether this is a good reason (or even the true reason), but I do at least appreciate the responses. I hope you'll take my comments in the spirit in which they were given: constructive criticism to improve the quality of the article.
  • lopri - Thursday, February 12, 2015 - link

    Millions of people considering the S6 will want to know how the S810 (or rather the Exynos 7420) performs compared to the S600/S800/S801, because those are the platforms they are currently using. Millions of people also do not have access to the Exynos 5433 Note 4, and will not be upgrading from or to it. It would be akin to comparing some obscure Xeon CPU to a widely popular Core i5 CPU.

    I fully expect there will be a comparison between S810 and whatever else it competes against in due time.
  • Jumangi - Friday, February 13, 2015 - link

    Millions? Let's be real here. 99%+ of the people who go out to buy the next Galaxy phone or any smartphone for that matter won't have the slightest clue of the SoC in the thing.
  • tdslam720 - Thursday, February 12, 2015 - link

    Way to miss out on all the hype. Take some hints from pro wrestling or the UFC. Samsung vs. Qualcomm is the hype right now. Exynos vs. 810. You claim people want to see 810 vs. 805, but no one cares about that. Give us 810 vs. Exynos and get tons more ad money while maintaining your credibility. Right now it just looks like Qualcomm is influencing you to play nice.
  • melgross - Thursday, February 12, 2015 - link

    You say no one cares about that, but that's just you saying that. Samsung doesn't sell a whole lot of Notes, particularly compared to the number of devices Qualcomm sells into.
  • tdslam720 - Thursday, February 12, 2015 - link

    No, but they'll sell millions of S6es, which use basically the same chip.
  • blzd - Thursday, February 12, 2015 - link

    We should care about a CPU in a phone that we will never use? Because perhaps we will be able to use the next iteration? Um, no.

    S800 vs S810 is what I want to know personally.
  • TerdFerguson - Thursday, February 12, 2015 - link

    I'm inclined to agree with you, especially after seeing the dual-channel 32-bit bus being described as having a total of 64 bits. Wow, that's as bad as marketing for 1990s consoles.
  • extide - Thursday, February 12, 2015 - link

    How is that bad marketing? A dual-channel 32-bit bus IS effectively 64 bits wide... (quick math below).
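
    Quick math (the LPDDR4 data rate here is an assumed figure for illustration, not the MDP's actual speed bin):

        # Effective width and peak bandwidth of a multi-channel memory bus.
        channels = 2
        width_bits = 32        # per channel
        data_rate_mtps = 3200  # assumed mega-transfers per second

        effective_bits = channels * width_bits
        peak_gb_s = effective_bits / 8 * data_rate_mtps / 1000
        print(f"{effective_bits}-bit effective, {peak_gb_s:.1f} GB/s peak")
        # -> 64-bit effective, 25.6 GB/s peak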
  • dawheat - Thursday, February 12, 2015 - link

    The reference platform has a 6.2" display, making it quite the gigantic phone. I'm guessing it avoids thermal issues which may impact other, more normal-sized phones.
  • Andrei Frumusanu - Thursday, February 12, 2015 - link

    The preview tests were actually done on the MDP tablet, not the MDP phone.
  • dawheat - Thursday, February 12, 2015 - link

    Ouch - then this is really best-case performance of the S810 as I'd imagine the tablet MDP has way higher thermal headroom than the phones it's being compared to.
  • lopri - Thursday, February 12, 2015 - link

    I am usually very harsh on reviewers, but I do not think your argument holds up. The Exynos 5433 is a vanity product. While technically interesting, it is used in one product (AFAIK), and even that product is not widely available. I would definitely prefer to learn about the improvement of the S810 over the S800/S805, it being Qualcomm's generational product.

    And it is not like the review tried to hide the Exynos 5433 or anything. The numbers are right there for you to see. Furthermore, AT covered the Exynos 5433 very extensively only a few days ago.

    Likewise, throttling or power data is meaningless at this stage without knowing what the shipping product is going to be like. And I expect to see that information in due time. Unless you can point to false benchmark data, I do not see the merit in picking on every single aspect of an SoC that was not covered (yet).

    Not everyone wants to read corporate conspiracy theories on a tech article, either. I, for one, do not like to read unverified rumors in a technical article.

    Only thing that I agree with you about is the missing clock frequency information on the charts. But again, the focus is rightfully on S805 vs. S810. I will give the authors the benefit of the doubt.

    Only thing that I do not like about the article is its timing. I mean, I haven't even finished the Exynos 5433 article yet, and there are already 2 laptop articles and now an introduction to the S810. It's too much information for me to digest. Obviously this is a subjective point, and I do not expect everyone to agree with me.
  • warreo - Thursday, February 12, 2015 - link

    Exynos 5433 is indeed a vanity product, nobody is arguing otherwise. You are missing the point that I made earlier. Exynos 5433 is the immediate predecessor to the 7420. Hence, comparing the S810 to the 5433 is a good starting point for how the S810 will compare to the 7420, and it would be one completely based on real, hard data, not just speculation. Are you honestly saying that you would not be interested in a better preview of how the S810 vs. Exynos 7420 will shake out?
  • lopri - Thursday, February 12, 2015 - link

    I mean, if AT really wanted to boost the S810's image, the authors could have omitted the 5433's numbers from the Geekbench sub-score comparison. The rest of the charts also look free of manipulation, so I do not see how you get that impression.

    As you can see from this very comment section, not everyone is impressed by the S810. Apparently the authors did not do a good enough job - you know, the job you are insinuating here.
  • warreo - Thursday, February 12, 2015 - link

    True enough, but the fact that AT provides the 5433 data makes it more mystifying why they almost completely gloss over it in the text (not to mention not provide the % differences in the tables). Josh and Andrei have already stated they intentionally kept the focus on S805 vs. S810, but my point is this would be a much stronger article if there was more depth given to the Exynos 5433 comparison. Clearly, I'm not the only one who thinks so.
  • Tchamber - Friday, February 13, 2015 - link

    @warreo
    Get over yourself. So far the 5433 has made it into one product. We all know that it's a good performer, but the Snapdragon line makes it into many more devices... always has. What you're asking would be like me writing to Car and Driver and saying "hey, you guys are doing your comparisons wrong. I want you to test your Corvettes, Camaros, and GT500s against a Lamborghini Aventador." Well, they have all the test results from the Lambo... but that's just not what the others were made to compete with, so it would be meaningless to run them against each other at the road course. Do your own math if you don't like not seeing a percent sign with easy-to-digest material. All the time you've been talking down to AT, you could have posted your own conclusions and helped out all those people who agree with you.
  • warreo - Wednesday, February 18, 2015 - link

    HAHAHA Tchamber you are a jewel. Thanks for making my morning. Here I was wondering if a week later anybody else had anything intelligent to say....

    Your analogy of the 5433 as a Lamborghini and the Snapdragons as Corvette/Camaro/GT500 is horrible. Period. Anybody who reads this site should know that. If you really want to get into an argument with someone, you should actually know what you're talking about before insulting them.

    As for me, I wasn't talking to down to anyone. I gave AT my observations and also did in fact summarize my own conclusions if you'd bothered to read my comments in totality. Just because I disagree with them doesn't mean I'm talking down to them. You, however, should run along back to pre-school and learn how not to be a prick to others.
  • djvita - Thursday, February 12, 2015 - link

    found some typos

    last paragraph GPU performance "Qualcomm has narrowedmuch"

    CPU performance
    PNG Comp ST 0.82 MP/s 1110 MP/s 1.11 MP/s 35%

    is 1110 correct? found the difference to be very high....

    All in all, the preliminary benchmarks look good. Seems AnandTech will need a Flex 2/Mi Note Pro or the upcoming HTC M9 at MWC (for Sony, no rumors until July I think; the LG G4 maybe in May; the S6 won't be Qualcomm).
  • Ratman6161 - Thursday, February 12, 2015 - link

    Another typo:

    There are three tables at the top of the CPU Performance page. The last column in the first table says "Snapdragon % Advantage", which clearly isn't correct, because in the first line alone the Samsung has about a 2-to-1 advantage yet it says the Snapdragon advantage is 608%. I assume you actually meant this column to say "S810 > S805 % Advantage" like in the second two tables.
  • djvita - Thursday, February 12, 2015 - link

    they fixed them all now, it was 1.11
  • SydneyBlue120d - Thursday, February 12, 2015 - link

    Very interesting article. Do you think it is possible the Galaxy S6 devices will use the MDM9x45 modem?
  • deathBOB - Thursday, February 12, 2015 - link

    Subjective impressions? Andrei pointed out that the Exynos was subjectively faster than the 805. How does the 810 fare?
  • MrCommunistGen - Thursday, February 12, 2015 - link

    Thanks for the informative article! The scope of the article as a whole goes far beyond a Preview of Snapdragon 810, specifically the sections on RF and Qualcomm's scheduler.

    With that in mind, I'll hold off on passing judgement on S810's performance until we see shipping silicon. Between pre-release drivers and differences in chassis/thermals, "Performance Preview" *is* spot on for the whole benchmarks section.

    Even though S810 is Qualcomm's stopgap and there's only so much you can do (for better or worse) with the performance of off-the-shelf A57/A53 cores, I'm glad they're still in the game - or at least not out of it. Even as a preview, it is clear that Adreno 430's performance is more than just an iterative increase over Adreno 420.

    Regardless of how S810 shakes out, I'm sure Qualcomm is baking all of their learnings from working on this SoC into their in-house ARMv8-A design.
  • Mr.r9 - Thursday, February 12, 2015 - link

    Even though this is a preview and the drivers/kernel will definitely improve... I still feel that the 810 will underwhelm.
  • djvita - Thursday, February 12, 2015 - link

    considering I still have an msm8960 device, this will be a huge jump for me.
  • tviceman - Thursday, February 12, 2015 - link

    This performance preview just reaffirms two of my beliefs.

    1) It's a shame that Nvidia couldn't get more products with Tegra K1 in them, seeing how K1 has been on the market for many months and generally outperforms the 805 (sometimes by a wide margin).

    2) It's a shame that Tegra X1 will likely suffer the same limited-release fate that Tegra K1 suffered, even if manufacturers were to downclock Tegra X1 to meet smaller TDP demands. X1, even if downclocked, will run circles around the 805.
  • tipoo - Thursday, February 12, 2015 - link

    Unless you have information we don't, we still have no sweet clue about the TDP of the X1. So I'll give that a [citation needed].
  • kron123456789 - Friday, February 13, 2015 - link

    Well, there is one clue about that from Nvidia — they claimed that Tegra X1 consumes 10W while running The Elemental demo (which is, considering the frame drops, a full load on the GPU).
  • tipoo - Friday, February 13, 2015 - link

    Exactly. Way too high for a phone. They'd have to cut wattage to nearly a *third* of that, so I'm not sure I believe that simply clocking it lower would have them lead on performance per watt.

    And I hope the bogus 1 TFLOPS number wasn't part of the OP's calculus.
  • kron123456789 - Saturday, February 14, 2015 - link

    You say "drop wattage by nearly *triple*" like other SoCs consumes no more than 3-3.5W.
    And i think this 1TFLOP isn't bogus, it's just in FP16 mode.
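
    Rough math (core count is the published 256 CUDA cores; the ~1 GHz clock is an assumption for illustration):

        # Back-of-the-envelope check on the 1 TFLOPS FP16 claim for Tegra X1.
        cores = 256
        flops_per_fma = 2   # a fused multiply-add counts as two FLOPs
        fp16_per_lane = 2   # two packed FP16 operations per FP32 lane
        clock_ghz = 1.0     # assumed

        fp32_gflops = cores * flops_per_fma * clock_ghz
        fp16_gflops = fp32_gflops * fp16_per_lane
        print(fp32_gflops, fp16_gflops)  # 512 GFLOPS FP32, 1024 GFLOPS FP16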
  • serendip - Friday, February 13, 2015 - link

    I assume Intel and Nvidia are still behind Qualcomm and Samsung when it comes to integrating LTE capability into their SoCs. Then again, the article mentioned that the power saving from having integrated LTE isn't much compared to other components.

    Any idea why Samsung went with Intel modems on some Exynos variants? The proliferation of so many LTE bands creates a mess of SKUs. It's interesting that some Galaxy S5 Snapdragon variants have access to TD-LTE, FDD-LTE, CDMA2000, WCDMA and GSM in one device.
  • hlovatt - Thursday, February 12, 2015 - link

    Really liked all the RF info, and as you said, RF performance is just as important to overall phone performance as the CPU and GPU. Now all we need to know is how it performs in an actual phone.
  • Gunbuster - Thursday, February 12, 2015 - link

    Maybe now that we can see this is not blowing other SoCs out of the water, the big players can get some good pricing from Qualcomm. Perhaps Microsoft could make a really affordable flagship this time around... (or make the weak-ass S4xx "affordable flagship" actually affordable at $200).
  • tipoo - Thursday, February 12, 2015 - link

    Any plans on throttling tests? That was the big controversy, with Samsung rumoured to not use it in the upcoming GS6 because of overheating.
  • JoshHo - Thursday, February 12, 2015 - link

    We intend to do deep testing on the first S810 phone we get, to get to the bottom of the story.
  • tipoo - Friday, February 13, 2015 - link

    Good to know, looking forward to you guys getting to the bottom of it. I've been wondering if Samsung was just saying that to hype up their own Exynos, or if the other phone manufacturers are going to have problems with S810.
  • PC Perv - Thursday, February 12, 2015 - link

    Did the review state which OS the benchmarks were run on? KitKat, Lollipop, 64-bit/32-bit, etc.? Sorry if I missed it.
  • Gigaplex - Thursday, February 12, 2015 - link

    Why go through all that detail on how their software stack for big.LITTLE improves over stock ARM without testing to see if it works? The Exynos article the other day showed that big.LITTLE flat out didn't work, performing worse than parking tasks on the little cores while consuming more energy. Does Qualcomm's version actually improve things here?
  • bigstrudel - Thursday, February 12, 2015 - link

    I'm beyond skeptical of 810's performance under actual thermal constraints like inside a flagship smartphone.
  • PC Perv - Thursday, February 12, 2015 - link

    I am not sure how useful those system-level benches (Basemark, 3DMark) are for comparing different platforms. On the same platform (OS), I can see the value.
  • HisDivineOrder - Thursday, February 12, 2015 - link

    Given all the press runs (here, PCper) Qualcomm are doing, the loss of that Samsung contract must have REALLY got someone's knickers in a twist.
  • blzd - Thursday, February 12, 2015 - link

    Good article. Thanks for including some S800 devices in some of the device comparisons, more of that (older SoC for comparison) please if you can.
  • tuxRoller - Thursday, February 12, 2015 - link

    I'll say this once again: email Rob Clark of Red Hat. He's been working on a clean-room driver implementation for Adreno (https://github.com/freedreno/freedreno) for a few years and has gotten quite far (GL 2.1/GLES 2.0, IIRC).
    He's a super nice guy, and given that Qualcomm has been contributing, a bit, to his project, he may be loath to harm the relationship, but, if nothing else, you can read through his repo to understand the arch.
  • aryonoco - Thursday, February 12, 2015 - link

    Fantastic preview, thanks guys, it's been great at AT over the last couple of weeks!

    Just a note: in the future, and especially when reviewing shipping devices, could you pay some attention to the 2-year upgrade performance improvement as well? Most people (in the developed world at least) seem to be on 2-year upgrade cycles, and so it makes sense to compare the current generation to the phone that's in their hand. AT does this for desktop/laptop CPUs and GPUs (for example, informing people that if you already have Ivy Bridge, there's not much performance to be gained from Haswell, etc.), so it would be great if that coverage extended to mobile platforms as well (for example, comparing the SD810 with the SD600 and the level of improvement one might expect between them).
  • wyewye - Friday, February 13, 2015 - link

    Why no WiFi tests?

    You say it supports MU-MIMO and 802.11ad, but everywhere else I read that only "ac" and MU-MIMO are supported.
  • PC Perv - Friday, February 13, 2015 - link

    Page 6, after the Geekbench floating-point chart, you said:

    "In this case Snapdragon 810 performance is relatively close to Exynos 5433 performance even though it has the advantage of running in AArch64 mode, which should give the FP numbers a boost over the Exynos. This is likely an isolated case where the Krait architecture and Snapdragon 805's high clock speed play to its favor."

    And I have no idea what you are saying. I do not want to sound rude, but this kind of writing is what I saw in previous articles written by Mr. Ho (and Mr. Chester).
  • aryonoco - Friday, February 13, 2015 - link

    Yes, that sentence is totally meaningless.
  • JoshHo - Sunday, February 15, 2015 - link

    I didn't write the Geekbench analysis, I'm currently looking into the issue.
  • randymorrisplano - Friday, February 13, 2015 - link

    You guys who say 2K or 4K is too much for anything under 6" are seriously not taking into account the near future of ubiquitous VR on our handsets. Putting the display that close to the eyes, with magnifying optics, makes it a whole new ballgame.
  • Gigaplex - Friday, February 13, 2015 - link

    If I was going to do that, I'd use a VR headset, rather than holding my phone over my eyes.
  • 68k - Friday, February 13, 2015 - link

    Interesting that AArch32 gets a slightly better integer score in Geekbench compared to AArch64. The lack of HW support for AES and SHA-1 in earlier AArch32-capable CPUs, and the fact that earlier 32- vs. 64-bit comparisons were not done on the same CPU microarchitecture, made it tricky to directly compare results between AArch32 and AArch64.

    Adjusting for the difference in clock frequency between the Exynos 5433 and Snapdragon 810, AArch32 is about 12% faster. Removing the AES result, which is an outlier in favor of the Exynos, the performance lead for AArch32 is still about 5%.

    AArch64 seems to do better in the MT cases compared to the ST cases; the average lead for AArch32 in the ST cases with AES removed is 16% (clock normalization sketched below).
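
    A sketch of the clock normalization used above (A57 clocks per this thread: Exynos 5433 at ~1.9 GHz, Snapdragon 810 at ~2.0 GHz; the sub-scores are made up purely to show the method):

        # Normalize a benchmark score by clock frequency before comparing.
        def per_ghz(score: float, clock_ghz: float) -> float:
            return score / clock_ghz

        aarch32_score, aarch64_score = 1190.0, 1115.0  # hypothetical sub-scores
        lead = per_ghz(aarch32_score, 1.9) / per_ghz(aarch64_score, 2.0) - 1.0
        print(f"AArch32 lead per GHz: {lead:.0%}")  # -> 12%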
  • dragonsqrrl - Friday, February 13, 2015 - link

    So many awesome reviews and previews from Anandtech in the past week. Keep up the good work!
  • lilmoe - Friday, February 13, 2015 - link

    At this point of software optimization, I still believe big.LITTLE core migration is the way to go. Software isn't yet up to the task of GTS; most of that complexity should be handled by hardware.
  • fivefeet8 - Friday, February 13, 2015 - link

    Your GFXBench 3.0 driver overhead numbers seem to be off for the Shield Tablet. Unless maybe Android 5.x is causing a degradation there.
  • sonicmerlin - Friday, February 13, 2015 - link

    So how many bands will this new modem be able to support? And why do Apple phones always seem to support more LTE bands than the competition?
  • aryonoco - Friday, February 13, 2015 - link

    1) It depends on how many filters and transceivers the OEM fits the phone with. The baseband makes it easier to support more bands, but the actual band support would still be OEM-dependent.

    2) Because Apple does not shy away from a high BoM and can cram in as many filters and transceivers as they want in order to reduce the number of SKUs. Android manufacturers (unfortunately) don't think like that.
  • PC Perv - Saturday, February 14, 2015 - link

    Not an accurate description of the state of affairs. It is because Apple has power over the carriers that other OEMs lack. I wish Congress would intervene in the situation and rein in the carriers. That would not only benefit U.S. consumers but also potentially influence the world market.

    It is absolutely not "because Apple spends more money and Android OEMs do not want to spend money".
  • name99 - Friday, February 13, 2015 - link

    "While there are multiple solutions to solving the power problem that comes with OoOE, ARM currently sees big.LITTLE as the best solution. "

    I can't help but think (based on all the evidence we've seen so far) that big.LITTLE is the VLIW of low energy CPUs. Just like VLIW would be totally awesome if we could only solve those pesky compiler issues (which are just out of reach, but maybe next year...), so big.LITTLE would be awesome if we could only solve those pesky scheduler issues (which will, likewise, maybe be solved next year...)

    It's nice that QC claim they have a better scheduler; it would be even nicer if they were confident enough about it to provide actual power/energy NUMBERS...
  • TT Masterzz - Sunday, February 15, 2015 - link

    Amazing article. Although, to be frank, I hardly understood the antenna part. It would be great if the authors at AnandTech made an article explaining RF systems/modems/naming schemes and baseband processors in depth. Also, an article explaining some terms like CPU pipeline length/branch misprediction would be amazing.
  • Laststop311 - Friday, February 20, 2015 - link

    All this is telling me is that it can barely beat the last-generation Exynos. The Exynos 7 most likely stomps this in performance, which is why Samsung had to go with it for all countries. People would be too mad if only South Korea got the super-fast Exynos 7 and everyone else got the slower 810. Before, Snapdragon had the slight performance edge, but it looks like Exynos may finally be the better chip.

    That is, until Qualcomm busts out their custom-made 64-bit core that just wasn't ready in time, which is why they had to use standard ARM cores to get 64-bit to market faster. The custom 64-bit Krait successor, whatever they call it (Krait 500 or something), will most likely beat Exynos again.
  • Zingam - Wednesday, February 25, 2015 - link

    Will these be DX12/OpenGL Next compatible, or will we have to wait another 5 years for sufficient market penetration?
  • Keermalec - Saturday, March 21, 2015 - link

    So the 1-year-old Nvidia K1 trounces the yet-to-arrive Snapdragon 810...
    And yes, LTE is not integrated into the K1 so that OEMs have a choice between WiFi and WiFi+LTE tablet versions. Nvidia CAN integrate LTE into an SoC, as they did with the Tegra 4i. It was just sound business practice not to do so with the more powerful chip.
  • radeonex - Saturday, April 11, 2015 - link

    I want to point out that in linear amplifier circuits, most of the transistors operate in the saturation region (they do not act as switches but rather as voltage-controlled current sources). High electron mobility helps with transconductance and other characteristics, especially in the context of combating short-channel effects (it helps smaller devices). It also helps reduce the minimum voltage drop required to keep the saturated transistors in the correct region of operation (first-order relations below).
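
    For reference, the first-order square-law relations behind that (a long-channel simplification; short-channel devices deviate, but the mobility dependence carries over):

        % Saturation-region drain current (valid for V_DS >= V_GS - V_th):
        I_D = \tfrac{1}{2}\,\mu_n C_{ox}\,\frac{W}{L}\,\left(V_{GS}-V_{th}\right)^2
        % Transconductance rises directly with carrier mobility:
        g_m = \frac{\partial I_D}{\partial V_{GS}} = \mu_n C_{ox}\,\frac{W}{L}\,\left(V_{GS}-V_{th}\right)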
  • Ning3n - Monday, July 27, 2015 - link

    To give a "joe sixpack" review of the 810. I recently replaced my HTC M7 with an M9....

    As far as I've seen/noticed, the 810 (combined with the 430 GPU), is *ROUGHLY* 15-20% faster than the 600 series I've upgraded from.

    Gaming performance (for a cellular device) is great! But, it took over an hour to encrypt just under 5Gb of mp3s, and 1.5Gb of pictures.

    Hardly a "phenomenal" improvement.
  • b.akhil96 - Tuesday, June 21, 2016 - link

    How do you categorize the loads? The max(avg, recent) policy applies when loads are categorized as peak or non-peak. What would be an ideal policy to apply to moderate loads (similar to max(avg, recent))?
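
    For reference, a minimal sketch of what a max(avg, recent) demand estimator might look like (illustrative only; not Qualcomm's actual scheduler code):

        from collections import deque

        class TaskDemand:
            """Tracks a task's busy fraction over recent scheduler windows."""
            def __init__(self, window_count: int = 5):
                self.windows = deque(maxlen=window_count)

            def record_window(self, busy_fraction: float) -> None:
                self.windows.append(busy_fraction)

            def demand(self) -> float:
                if not self.windows:
                    return 0.0
                avg = sum(self.windows) / len(self.windows)
                recent = self.windows[-1]
                # max(avg, recent) ramps up immediately on a burst (recent
                # wins) but decays slowly after one quiet window (avg wins).
                return max(avg, recent)

        est = TaskDemand()
        for load in (0.1, 0.1, 0.9):  # a sudden burst after two quiet windows
            est.record_window(load)
        print(est.demand())  # 0.9 -> the burst is reflected immediately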
