
  • lilmoe - Wednesday, September 7, 2016 - link

    I'm really jealous. I can't believe Apple can still get away with ~720p, taking advantage of the optimal efficiency of their processors. I f'ing hate this stupid resolution race. 720p is good enough for screens 4.7" and lower, and 1080p is more than enough for phablets. Come on...

    Oh, and just to throw this here again. YEA, more cores are the future, folks ;)
  • lowlymarine - Wednesday, September 7, 2016 - link

    "720p is good enough for screens 4.7" and lower."
    Perhaps if you have imperfect vision and don't care about other use cases like VR, sure. I owned a Nexus 4 (which had a 4.7" 768p IPS display in 2012 for $299) and could easily discern individual pixels in high-contrast scenarios. When I replaced it with a Galaxy S5 (5.1" 1080p OLED) the difference in pixel density was immediately noticeable. I won't pretend the difference in density between the S5 and my newer S7 (5.1" 1440p) is noticeable in general usage; however, when paired with the Gear VR or a Cardboard headset, you can still easily see the screen-door effect even at this 'stupid' resolution.

    This is all ignoring the fact that a modern ~1080p AMOLED panel wouldn't use more power on average than their 750p IPS panel anyway. This is plain and simple cost cutting, which is unacceptable on a $649 flagship. (Well, there's also the lazy way iOS does resolution scaling which makes using commodity panels at specific sizes difficult, hence why we have the wacky 1334x750 resolution to begin with.)
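
    For what it's worth, the densities being argued about are easy to check. A quick back-of-the-envelope sketch (Python; using the diagonal sizes and resolutions named above):

        import math

        def ppi(width_px, height_px, diagonal_in):
            # pixels per inch = diagonal resolution / diagonal size
            return math.hypot(width_px, height_px) / diagonal_in

        panels = [
            ("Nexus 4",   1280,  768, 4.7),
            ("iPhone 7",  1334,  750, 4.7),
            ("Galaxy S5", 1920, 1080, 5.1),
            ("Galaxy S7", 2560, 1440, 5.1),
        ]
        for name, w, h, d in panels:
            print(f"{name}: ~{ppi(w, h, d):.0f} ppi")
        # Nexus 4: ~318, iPhone 7: ~326, Galaxy S5: ~432, Galaxy S7: ~576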
  • Kytael - Wednesday, September 7, 2016 - link

    I'd expect to be able to see differences up to 1080p in PenTile sub-pixel arrangements, but I'm not sure I'd be able to tell 720p from 1080p with plain RGB.
  • ZeDestructor - Thursday, September 8, 2016 - link

    I can. That said, for phone use, 1080p is enough for me up to 5.5" (400ppi seems to be my cutoff), and in practice, having 1440p on my HTC 10 has made zero improvement for anything I do on my phone, even for reading very small text sizes.
  • Meteor2 - Thursday, September 8, 2016 - link

    As lowlymarine says, the increased resolution makes a big difference for phone-based VR. 1080p isn't enough for that.
  • ZeDestructor - Thursday, September 8, 2016 - link

    Definitely, but I did say it made for zero improvement for anything I do on my phone. Emphasis on the phone part.

    Still salty that the new Sony compact STILL isn't 1080p.
  • crispbp04 - Thursday, September 8, 2016 - link

    Apple won't care about VR until they can figure out how best to monetize it into their strategy. They do lower-res screens because it's all about scale and margin: lower res produces higher yields and lower production costs, which means higher margins. They have unlimited market power in the sense that they will sell units even with marginal spec increases. Why open the flood gates when you can milk the margins for years with incremental updates? They are increasing the power where it gives them the ability to force obsolescence through software (a more complex OS that degrades the experience on older devices), while minimizing fragmentation of their ecosystem by not changing things like resolution - just bump up the horsepower and improve the color of the display. Also, there is no correlation between unit sales and screen resolution; "Retina" has proven good enough to drive the largest unit sales in the world.
  • HAL_hello - Tuesday, September 13, 2016 - link

    It is categorically bad design work to produce a device with a higher resolution when the human eye cannot distinguish the difference. It's laughable to claim otherwise.
  • Spectrophobic - Wednesday, September 7, 2016 - link

    You tout VR and high PPI as if they're necessities like calling and texting. So what if you can discern pixels? Will a slightly pixelated letter "A" at 3 inches render the whole word unreadable? Only people who regularly use complex CJK characters have the right to complain.

    I'll gladly take the extra graphical performance that comes with a lower resolution.
  • lowlymarine - Wednesday, September 7, 2016 - link

    If calling and texting are your only concerns, then you should probably not be looking at $650+ flagships because a $60 Blu R1 HD will do those things just as well (and still has a 720p IPS panel, natch).

    "Only people who use complex CJK characters regularly has the right to complain." Oh well good thing Apple doesn't sell iPhones in those regions then, huh?

    Your comment about graphical performance is nonsense in this context. The iPhone is a completely closed ecosystem, much like game consoles. Developers will just target whatever resolution, image quality, and framerate they feel best suits their game. Hooking your PS4 up to a 720p TV won't magically get you better performance or PC-quality graphics, it just throws a lot of frames away to Vsync.
  • tabascosauz - Wednesday, September 7, 2016 - link

    Say whatever you want about Android's "open" superiority. Graphics and NAND performance, both of which are *the* fundamentals of a good user experience (as long as CPU performance is adequate, which it has always been in the ARM space for the past 2 years), are sorely lacking outside of the Apple ecosystem, and the numbers do not lie. Time and again, the greatest of Snapdragon- and Exynos-powered Android phones have touted ground-breaking features, yet they cannot solve the simple puzzle of providing adequate GPU power and responsive NAND.

    Apple SoCs power an entire lineup of devices, and their GPUs have proved time and time again that they can scale beyond 1080p whilst retaining good performance. Adreno and Mali can't hold a candle to their contemporaries at anything beyond 720p. GPU performance does not lie.
  • lilmoe - Wednesday, September 7, 2016 - link

    Read my reply below. The GPU in the GS7 isn't only adequate, it's plain overkill. The GPU is mostly underutilized. Android is the problem, not the processors running Android.

    That being said, NO ONE is saying that system performance isn't acceptable. We've reached a point where we want the most "efficiency" that hardware can achieve. Other than that, Android has a lot fewer trade-offs compared to iOS for most people, but if iOS is your thing (just like for a lot of others), then by all means.
  • HAL_hello - Tuesday, September 13, 2016 - link

    thank you. spot on
  • Spectrophobic - Wednesday, September 7, 2016 - link

    Never said those two are the only things that concern me.
  • Meteor2 - Thursday, September 8, 2016 - link

    "You tout VR and high PPI as if they're necessities like calling and texting." Hmmm, think you did.
  • lilmoe - Wednesday, September 7, 2016 - link

    I'm not worried about the power draw of the panel itself; what I'm worried about is the amount of power it takes to process the data before sending it to that panel.

    Android is a LOT less efficient in rendering text and graphics compared to iOS and Windows Mobile. They all have some sort of hardware acceleration, but for the same workload, Android utilizes a much greater portion of the CPU in comparison. CPU utilization grows more dramatically with increased resolution, leaving the GPU underutilized (again, in comparison). You're not seeing the "huge" benefits of big.LITTLE on most Android devices because of this very reason. System responsiveness, scrolling smoothness and system power vary dramatically according to how OEMs configure their devices and how they set their scheduling priorities (responsiveness vs battery life).

    Samsung, for example, prioritizes battery life; when you touch the screen, the little cores don't ramp up to their max clock speed (they go to 1.2GHz, while the max is ~1.7) unlike Google's Nexus and other OEMs with 1440p screens. That's why, for the same battery size and screen resolution (ie: normalized), Galaxies usually have better battery life (leaving other factors out, like background services and other bloat). I mention this because it's mind boggling that Android's rendering engine is actually saturating the little cores for the most mundane UI tasks. Google has a LOT of work to do; ramping up clocks upon user touch is NOT a solution.
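
    (If you want to watch this behavior yourself, it shows up through the standard Linux cpufreq sysfs interface. A rough sketch - it assumes a rooted device with cpu0 on the little cluster - that polls the little cores' clock while you touch the screen:)

        import time

        CPUFREQ = "/sys/devices/system/cpu/cpu0/cpufreq/"

        def read_khz(name):
            # cpufreq sysfs files report frequency in kHz
            with open(CPUFREQ + name) as f:
                return int(f.read())

        max_khz = read_khz("cpuinfo_max_freq")
        for _ in range(50):                      # ~5 seconds of samples
            cur = read_khz("scaling_cur_freq")
            print(f"{cur / 1000:.0f} MHz ({100 * cur / max_khz:.0f}% of max)")
            time.sleep(0.1)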

    I believe Apple's implementation will see MUCH better results NOT because their hardware implementation is better, but because most UI tasks in iOS probably wouldn't even saturate 50% of the little cores' power (as it should be). Apple are being conservative with their power saving claims; they might actually be a lot better.

    Back to screen sharpness. I can understand that 1080p is generally more visually appealing than 720p on a 5" screen, but beyond that it's just plain spec-sheet marketing. VR is a novelty most people don't care about. It's nice to try, but most people won't bother using it on a frequent basis. At least not for now.

    Most of the difference you're seeing is sub-pixel arrangement. When I downscale some apps on my 5.5" GS7e to 720p, the text is almost razor sharp; that's because each pixel is represented by 4-6 sub-pixels, instead of 3 on your Nexus.

    Down-sampling a pentile display yields amazing results. I'd love to see a 4K diamond pentile AMOLED screen down-sampled natively to 1080p. It'll be the best 1080p panel you can get, without the added overhead of processing 4k on a damn phone. I'd challenge you to tell any visual difference. Screen resolution isn't the only way you can improve a display.
  • zeeBomb - Wednesday, September 7, 2016 - link

    Explain more about downsampling PenTile AMOLED... so the pixels are already non-4K, or what?
  • lilmoe - Thursday, September 8, 2016 - link

    It was a suggestion, not a description of any current practice (not that I know of). I'm suggesting that Samsung and others start downsampling PenTile AMOLED panels to get the best picture quality possible at a resolution the panel itself can out-resolve.

    The panel can be 1440p or UHD (4K) native, but it would announce itself as 1080p. Since there would be more than 3 sub-pixels per pixel, it would be sharper than any other native 1080p panel and would potentially support a larger and more accurate color gamut, yet without the overhead of rendering a higher resolution on the platform as a whole. It's very similar to downsampling a 4K video to 1080p, where the picture quality is higher and sharper than any video shot at 1080p natively.

    The visual difference between the current native 1440p implementations and the down sampled one would be virtually imperceptible. I know, because I tried.
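
    The arithmetic behind "more than 3 sub-pixels per pixel", assuming a diamond PenTile layout (which averages two physical subpixels per native pixel, RG-BG):

        # Physical subpixels on a native 1440p diamond-PenTile panel
        subpixels = 2 * 2560 * 1440              # ~7.37 million

        for label, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720)}.items():
            print(f"{label}: ~{subpixels / (w * h):.1f} subpixels per rendered pixel")
        # 1080p: ~3.6, 720p: ~8.0 - versus exactly 3 on a native-resolution RGB stripe panel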
  • tuxRoller - Monday, September 12, 2016 - link

    Do you have a reference for the assertion "Android is a LOT less efficient in rendering text and graphics compared to iOS and Windows Mobile"?
  • StevoLincolnite - Thursday, September 8, 2016 - link

    I had a Lumia 920 with a 4.5" 1280x720 screen.
    Now I have a Samsung Galaxy Note 5 with a 5.7" 2560x1440 screen.

    There is a stupidly massive difference.

    The downside is though... Repairing my Galaxy Note 5's screen cost me almost $400 AUD, whereas an iPhone screen would top out at $200 AUD.
    But, like all things, you get what you pay for. The Samsung panel is superior.

    I personally won't accept a panel less than 1440P on any of my devices now, PC included.
  • lilmoe - Friday, September 9, 2016 - link

    The visual difference between 720p and 1080p panels is significant, but not so much above that.

    The new Samsung AMOLED screens are incredible, but that's not because of resolution alone. They're in a PenTile arrangement anyway; you'd be hard pressed to find any visual difference if that particular panel were downscaled to 1080p.

    Download the Game Tuner app from the Galaxy store, downscale an app of your choice to 75% (1080p) and tell me if you see a difference. Try downscaling again to 50% (720p) and tell me again how much of a difference that makes. Compare with your previous Lumia.
  • HAL_hello - Tuesday, September 13, 2016 - link

    There may be a stupidly massive difference; it's just not because of the number of pixels on the panel. But carry on, my 1440p-loving friend.
  • HAL_hello - Tuesday, September 13, 2016 - link

    Umm... imperfect vision? It's very basic, beyond a certain resolution per unit area the human eye cannot distinguish a difference. It's a calculable equation, not some claim. Your experiences are because of other factors in the technology. Carry on believing otherwise if it makes you happy.
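
    The calculation, for reference: a 20/20 eye resolves about one arcminute, so the density beyond which adjacent pixels blur together is a simple function of viewing distance (a simplified small-angle model; contrast and subpixel layout also matter in practice):

        import math

        ARCMIN = math.radians(1 / 60)            # one arcminute, in radians

        def acuity_ppi(distance_in):
            # PPI at which one pixel subtends one arcminute at this distance
            return 1 / (distance_in * ARCMIN)

        for d in (10, 12, 18):
            print(f'{d}" viewing distance -> ~{acuity_ppi(d):.0f} ppi')
        # 10" -> ~344 ppi, 12" -> ~286 ppi, 18" -> ~191 ppi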
  • Kytael - Wednesday, September 7, 2016 - link

    I felt this way too until they started using phone screens for VR.
  • lilmoe - Wednesday, September 7, 2016 - link

    VR is niche.

    My free Gear VR headset is sitting here collecting dust. It was cool and interesting when I first got it, but now it's just meh. I should have opted for a Gear S2 instead. Stupid me.

    I don't like how they're trying to shove the "next big money maker" down our throats to make it more mainstream when barely anyone cares. If someone wants VR, they should get dedicated hardware for that and that alone. Consumers shouldn't pay the price of advancing a product they mostly don't care about. Companies should stop treating people like beta testers.
  • CSMR - Wednesday, September 7, 2016 - link

    I agree. 1440p on phones is crazy for normal use and just causes poor performance. But for VR it's ideal.
  • darkich - Thursday, September 8, 2016 - link

    Haven't you heard about the wonderful solution in the Note 7?
    It has a user-changeable screen resolution setting - 720p, 1080p, 1440p
    Done and done!
  • yhselp - Friday, September 9, 2016 - link

    More cores are the future? Depends on what you mean by that. Let's not get ahead of ourselves. A10 is still a dual-core design; apps can only use two cores at a time. If the A10 Fusion indicates anything about the future it must be that asymmetrical, big.LITTLE type configurations are the best way to conserve battery life in mobile. Would performance be the same though?

    Apple said the little cores would be used for things like Mail. Is Mail going to open as fast and run as smoothly as it would on the big cores? If not then the extra 1-2 hours of battery life on average wouldn't be worth it for me. The potential to comfortably last two days on a charge is appealing, but not at the cost of performance - I'd rather charge every day and/or carry a portable battery pack. Many modern battery packs are so compact, light, trendy, and affordable that it's almost crazy to not carry one just in case.

    As for the future in terms of performance, I can't see how a many-cores design would be better than strong per-core (single thread) performance. Video games might benefit from simultaneous access to all cores - big and little. Apple might ditch cluster migration in the future, although it does seem to be the better approach in general.

    Speaking of cluster migration, Apple is the second company to favor the approach in an asymmetrical design. NVIDIA has been doing it with Tegra since 2015. Curiously, the upcoming Nintendo NX console is based on Tegra. It is a quad-core design though - four big and four little cores.
  • lilmoe - Friday, September 9, 2016 - link

    Tegra has had a single high-efficiency companion core since Tegra 3 back in 2010-2011, and TI was doing additional high-efficiency companion cores since OMAP4 back in 2012. But those didn't work out very well because Android was a mess back then. Samsung has been messing with big.LITTLE since 2013 with the Exynos 5410 (it was sort-of broken, but that was ARM's fault; Samsung fixed it later).

    Regardless of the implementation, dynamic scaling is the end goal of big.LITTLE and similar architectures. No single core can scale efficiently from low clocks and low power to higher clocks and higher performance. It all depends on the types of dedicated or common workloads you want to offload. There's a certain threshold of "acceptable" performance/experience; beyond that is the domain of diminishing returns, unless you're going beyond 60fps and millisecond differences in response times.

    The ones getting ahead of themselves were those who thought that wide cores are the answer to everything. Apple's previous implementations (at least the core cluster part), unlike what everyone wants to believe, are not as sophisticated as modern competing implementations. They're relatively basic and cheap to implement. Their cores draw more power at higher clocks.

    Apple has gotten away with this because of their direct control over their platform; their cores are IDLE most of the time. iOS is more efficient than Android at common UI/consumer workloads, their screen resolutions are much lower than the competition's, AND, unlike on Android, lots of iOS first- and third-party apps take (better) advantage of dedicated hardware acceleration (another form of offloading). Android apps still utilize the CPU for most of their workloads; it makes sense to have even more cores to deal with those needs because 1) Android is getting better at parallelizing workloads, and 2) most heavy mobile tasks are content-consumption tasks that are easily parallelized.

    That being said (despite what Apple said about "running Mail on a smaller core", instead of separate thread scheduling; Apple tends to oversimplify), modern apps and workloads are proving harder and harder to offload to dedicated hardware accelerators, thus the increased need for more efficient general computing (ie: the CPU). If Apple wants to expand the capabilities of their platform and mobile OS, they need to pay special attention to that. A good example is how much less efficient iOS has gotten on older hardware. One major reason for that, I believe, is that iOS's text and graphics scaling was relatively dumbed down in the past; it was an effective doubling of the resolution where the CPU didn't have to do much and most of the burden was thrown at the GPU. NOT the case anymore.

    A good use case for the smaller cores in Apple's implementation is the main UI thread after most of the heavy lifting is done. We'll see. I suspect even more cores in the future (regardless of arrangement).

    On a side note (and I suspect), Apple has hinted at how much less efficient the model with the higher resolution is: "2 vs 1 hour of extra battery life". The Plus model had significantly better battery life, and it's only natural to expect more, or at least equal, battery savings. Other factors might be at play here; not entirely sure.
  • yhselp - Saturday, September 10, 2016 - link

    Thank you for the insightful comment.

    What I meant to say is that NVIDIA and Apple are the first to favor cluster migration after it fell out of use because of the implementations you mentioned.

    What you're saying about core efficiency, platform control, and task management is well-known. It doesn't change the end result though. Apple's SoCs might not be more sophisticated than their counterparts, and of course wider cores draw more power in isolation, and more cores might make more sense for the way Android is built today, but all those are just circumstantial facts. At the end of the day, it's the real-world result that matters. You can't get another SoC as powerful as the A9 in a phone as small as the iPhone SE. You can't get better or even the same performance on Android, despite what makes more sense in theory, in a lab.

    I'm not as well-versed as you seem to be, and I can't provide an accurate explanation, but you're spot-on about iOS performance on older devices - iOS 9 on iPhone 5s is very clearly slower than iOS 8 - both shell and in-app performance - not to mention the UI is not well-designed for 4-inch devices. I suspect that even if iOS 10 boosts shell performance it would be at a further cost of in-app performance. Haven't seen the preview, but I'm sceptical about UI improvements for the 4-inch devices, which is a real shame.

    Again, I'm not an expert, but I can't see why a wide dual-core CPU can't be the better approach for fanless designs, despite your comments about modern apps and workloads. Intel seem to be getting along just fine - you can get great performance out of two cores even on a desktop OS; on both Windows and OS X. Not to mention Intel hold the performance crown for a fanless CPU with a dual-core design, and Windows is an open platform.

    The difference in power savings between the iPhone 7 and 7 Plus might be proof that a larger screen has a bigger impact on battery life than the SoC.

    As far as performance of the small cores goes, I'd be very interested to see in-depth analysis of tasks known to run on the small cores compared to how the same tasks run on an iPhone 6s with iOS 9.
  • lilmoe - Saturday, September 10, 2016 - link

    ---"but all those are just circumstantial facts"

    Agreed. There isn't a single best solution for "all". Each platform has its needs and priorities. What's best for iOS is absolutely not the same for Android; part of the argument we had back in Andrei's article about task/thread scheduling.

    ---"The difference in power savings between the iPhone 7 and 7 Plus might be proof that a larger screen has a bigger impact on battery life than the SoC"

    But the Plus model also has a significantly larger battery, and it usually gets better overall in-use battery life (screen-on time) than the smaller model. Since the larger battery makes up for the extra screen real-estate, and then some, I had initially thought the Plus would gain even more battery savings. It only makes sense since the newer processor should be more power efficient, and that's before adding the little cores to the equation. It's unlikely that the new screen isn't as efficient as the old one at least (should be more so).

    This might be a direct result of the extra screen resolution, where the little cores (or the improved efficiency of the processor; not sure about Apple's implementation) are enough to offset 720p (or the larger cores are more efficient at lower clocks), but not so much for 1080p.

    Another possibility is the fact that iOS 10 has increased framerates and overall responsiveness. Not so sure, but this might *not* be a direct result of more efficient software, but a more aggressive governor/shell where the SoC is being pushed harder for common tasks. As a result, iPhone 6/6S users might actually get lower in-use battery life compared to previous iOS versions on average. Again, who knows.

    ---"I can't see why a wide dual-core CPU can't be the better approach for fanless designs, despite your comments about modern apps and workloads. Intel seem to be getting along just fine"

    I'm no "expert" either. But Laptops (regardless of CPU cooling), or ultrabooks, are a completely different ballgame, let alone PCs with constant power supply and a larger fan. Absolutely not comparable. Apple and Intel's dual core solutions are adequate for their intended workloads, but that's not the issue. It's an issue of *how long* can you afford pushing these CPU at an average/high power draw before racing to sleep. Efficiency can vary dramatically on the type of workload and how the platform is implemented. Intel's first 5 watt Core-M iterations proved LESS efficient than their 15 watt counterparts because of this very reason.

    I'm sure the difference is apparent to you now. What's acceptable for a laptop's relatively large battery and thermal headroom is absolutely NOT the case for a relatively tiny smartphone. Don't underestimate the power draw of these processors (especially Apple's); they can completely overshadow the max power draw of the screen if pushed hard enough. It's a shame AnandTech refuses to make direct power draw comparisons between Ax SoCs and the competition.

    You probably know this, and I don't want to be redundant, but we've reached a point of perfectly acceptable single-core performance (Twister, Kryo, M1, A73). The spec race needs to slow down here and move to the next step: efficiency. More processing cores aren't just about performance; they can play a major efficiency role, even for wide cores, when racing to sleep isn't fast enough. Even for the same architecture, 4 cores at 800MHz would yield more sustainable performance and much more efficiency than 2 running at 1.6GHz. This goes a long way when you have 2 clusters, each with multiple cores, and each for specific dedicated tasks OR for global task scheduling. It would be interesting to see how well Intel's upcoming quad-core Core M processors fare against their dual-core counterparts in future products in terms of battery life.
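
    To put toy numbers on the 4x800MHz vs. 2x1.6GHz point: dynamic power goes roughly as C·V²·f, and the lower clock permits a lower voltage (the voltages below are assumed for illustration, not measured silicon):

        def dyn_power(volts, ghz, cores, c=1.0):
            # classic dynamic-power approximation: P = C * V^2 * f, summed per core
            return c * volts**2 * ghz * cores

        fast = dyn_power(1.10, 1.6, cores=2)     # 2 cores at 1.6GHz, 1.10V
        slow = dyn_power(0.80, 0.8, cores=4)     # 4 cores at 800MHz, 0.80V
        print(f"same aggregate throughput, {fast / slow:.1f}x less dynamic power")
        # -> roughly 1.9x, before counting leakage or race-to-sleep effects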
  • yhselp - Saturday, September 10, 2016 - link

    The way I'm thinking is the Plus has a bigger screen, but it also has a bigger chassis, which can house a larger battery, which in turn is enough to offset the impact of the big screen and even add additional battery life compared to the standard model. However, all that doesn't mean that the screen still can't have a bigger impact on battery life than the SoC. Therefore, an SoC-related power improvement could be more beneficial for the standard model, despite the smaller battery. I could be totally wrong, of course. As you've said - there's no way to know for sure now. The power draw of Apple's SoCs might very well be higher. It'd be nice to get more statistics from Anandtech; but, again, what we can see is that the performance and battery life of an iPhone are both there, despite what the power draw might be.

    I do understand that different platforms and form factors have different needs, but judging by your arguments I might be underestimating just how different those might actually be. I simply don't know enough. Still, how can a tablet (not a laptop or desktop) like the Surface Pro 4 powered by a wide dual-core Core M running a full-fledged desktop OS have great performance and battery life if the ballgame is completely different? It'd be very interesting indeed to see a quad-core ~4.5 W Core CPU from Intel.

    You're not being redundant, on the contrary. The way I understand what you're saying - and correct me if I'm wrong - is that we've reached a point where a single core of a mobile CPU is fast enough for its intended purpose, and so from now on we should favor more cores per cluster as a means to improve efficiency, because that has become the more important thing to try and improve. Which essentially means that the iPhone 6s and other similar products represent the best performance we can expect of smartphones for a long while. If that is the case, it'd be very unfortunate from my point of view. I'm the kind of user that feels you can never have too much performance, that no solution is ever "good enough". More battery life is nice, but I would always choose more performance with the battery life we have today if given a choice. I feel that only a historic scientific breakthrough that produces a new power source can dramatically improve the battery life of electronics without increasing their weight and size.

    I'd like to think that I'm aware this greed for performance might not be the best thing overall, but I still want it, and if it weren't for it I wouldn't invest in a flagship smartphone - that's not to say performance is the only thing I favor in a device, far from it, and I definitely haven't and won't upgrade every year. At the end of the day, I think that a 6700K at 4.6 GHz with 3200 MHz memory and a 480 GB SanDisk Extreme Pro is definitely not fast enough. It is not disappointing, it does meet expectations, but it can definitely be faster, and I'm very much looking forward to what Intel (and AMD, hopefully) come out with in two and five years respectively, judging by their current roadmaps.

    Wonderful discussion we're having.
  • lilmoe - Saturday, September 10, 2016 - link

    ---"Therefore, an SoC-related power improvement could be more beneficial for the standard model, despite the smaller battery. I could be totally wrong"
    I read your take on it and was like, DUH. I was just explaining how CPU efficiency isn't as crucial in laptops with larger batteries.... You're totally right, completely my oversight.

    ---"Still, how can a tablet (not a laptop or desktop) like the Surface Pro 4 powered by a wide dual-core Core M running a full-fledged desktop OS have great performance and battery life if the ballgame is completely different?"
    The Surface, despite being in tablet form, has laptop internals. Internals and OS are totally identical. Just like how you offset the larger screen with a larger battery, you do the same for the higher power draw of the processor during average usage. Again, for the same processor/SoC, a desktop OS would generally consume more power because the processor is working harder, relatively. That being said, Windows 10 is pretty damn efficient for a desktop OS. Despite that, you can clearly see how the iPad Pro has significantly better battery life despite having a slightly smaller battery; that's because the iPad runs iOS, not OS X.

    ---"I'm the kind of user that feels you can never have too much performance"
    Look at Intel. How much did their IPC improve over the past 3-4 years? Nothing close to what Apple is reporting. AMD's Zen isn't breaking any ground by any means. I'm not saying that Apple is lying, but the alleged improved performance comes at a cost. A huge one, in fact. Despite that, it only shows in very short bursts in practice, and the difference is virtually imperceptible provided they don't cripple their older hardware too much (sorry, but their history proves otherwise; part of my argument: if older hardware had more cores, the burden would have been much easier to remedy).

    I'm not trying to be hateful or offensive, but this is absolutely NOT in the best interest of the Apple consumer, and consequently the consumers of OEMs that follow suit. Finding more and more ways to increase perceived performance is rendering older hardware obsolete at shorter periods of time. Yes they're improving their OS, but they're not working hard enough to make those changes meaningful in a way that older hardware doesn't suffer.

    I totally respect your desire to have the latest and greatest, but not everyone can afford (or even wants) that. They sell tens of millions of those, and I believe consumers are better catered to with sustained performance of their older devices, NOT the "latest version of iOS", while still getting security updates of course. BUT, consumers are either forced to update to a "slower" OS, or face potential security issues... That's one of my gripes with Apple.

    I'm glad you agree though. A flagship isn't only about performance. But a flagship shouldn't only be better in performance compared to lower-range devices; it should also outlast their life spans by a VERY significant margin. That's why you buy better appliances, electronics, cars, etc.
  • lilmoe - Saturday, September 10, 2016 - link

    "I was just explaining how CPU efficiency isn't as crucial in laptops"

    Quoting myself, and just to make it clear that I'm not misunderstood. Efficiency is JUST as important in any form factor. It's which processor that's *more efficient* for a given workload, at an "acceptable performance level", that matters.
  • yhselp - Monday, September 12, 2016 - link

    I completely agree with most of what you're saying, and despite my belief that performance is very important, I don't upgrade every year, not even every two years. A flagship should absolutely last very long - at least four years - and as you've said, Apple have been guilty of slow performance on relatively new devices. If there's one thing I truly hate about them it's that. I still remember how apps used to run fine on my 3GS until I upgraded iOS. I still think it's ludicrous that the 5s - the most revolutionary iPhone in terms of performance - drops frames, runs many apps slower, is slower overall, and has a UI that isn't well-optimized for its 4" screen on iOS 9, which was released just two years after its launch. I really hope iOS 10 fixes those issues.

    And if the solution to those problems is the addition of more cores per cluster at the expense of the most up-to-date flagship not being the absolutely fastest it could be for the short period of one to two years, then so be it. I haven't really thought about it this way before.

    Of course, the sad thing is that Apple might not be interested in providing stellar performance for their iPhones for too long.

    On the Surface Pro subject - I'm not so sure about what you're saying. The Surface Pro 4 has the same battery capacity as the iPad Pro, weighs just a few per cent more, and its battery life is not dramatically lower - less than 30% difference, I think. In this case, I consider Microsoft's tablet the superior product in almost every way - it's faster, more practical, and still sexy. Doesn't matter how it's built; the end-result matters. I'd be more than happy to sacrifice about 30% of battery life for all the rest when I still get many hours of use.

    Sorry for taking so long to reply - I've been busier than usual.
  • lilmoe - Tuesday, September 13, 2016 - link

    It's OK. I have a busy life as well, and mostly participate from my phone when I'm out (unless it's a long reply).

    I have no doubt that modern Windows 10 tablets are the superior form of a tablet computer, and the Surface is the best one of those. Even among mobile operating systems, I believe Windows 10 Mobile is the absolute best of both worlds; it's like (or better than) iOS in terms of sandboxing and efficiency, AND it's pretty close to the flexibility and power of Android. It's just too bad, and heartbreaking, that Microsoft failed miserably in delivering, despite the amazing job their engineers are doing with the Universal Windows Platform. Windows 10 Mobile also benefits from indefinite updates delivered directly from Microsoft (unless the update is in firmware). I always dreamed of an Exynos-powered Galaxy S running Windows 10... Oh well.

    Back to the topic of battery: I was specifically debating battery life in comparison with a mobile OS. Had the same Surface Pro been equipped with Windows 10 Mobile, for example, it would have definitely yielded better battery life, albeit while being less practical. Rundown tests rarely paint a true picture of actual battery life in practice, if ever. You see phones lasting just as long as others, give or take, when in practice one could have double the battery life of the other...
  • yhselp - Wednesday, September 14, 2016 - link

    It's a shame about Microsoft, yes. I remember being hyped about the original Windows Phone 7 being the best of both worlds back in 2010. I also remember one guy I know actually being worried about potentially having to swap his shiny iPhone 4 for a superior device so soon after release. I've lost count of the people I've tried to convince that Windows Phone is a great OS. A real shame Microsoft hasn't developed it properly.
  • NetMage - Wednesday, October 5, 2016 - link

    Don't forget the Plus also has 50% more RAM, which is expensive to power.
  • michael2k - Wednesday, September 7, 2016 - link

    From their materials, I expect the A10 to be a pair of Twisters clocked to 2.6GHz (literally 40% faster) paired with two of the A7's Cyclone cores. Shrinking the Cyclone from 28nm to 16nm gives us two die shrinks, and running at half the clock (1.3GHz was the Cyclone's native clock) means they can literally run at 1/5th the power while still being at full throttle.
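
    (The clock math, assuming the A9's Twister shipped at 1.85GHz: 1.85GHz × 1.4 ≈ 2.6GHz, which is where "literally 40% faster" comes from.)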
  • CloudWiz - Wednesday, September 7, 2016 - link

    I highly doubt that they'd reuse Twister for A10 and clock it so high. Rumors point to a 2.4 GHz max clock, which I think is reasonable. Add in some architectural improvements (some say this core is called Hurricane) and you can hit that same 40% number without clocking it too high (similar to how Typhoon was alike to, but still faster than, Cyclone. In fact Typhoon behaved like a Cyclone clocked 200 MHz higher, which is what I believe we'll see here with Hurricane vs. Twister).

    Using a 6-wide OoO design for low power cores seems unlikely, the core is still very large even with two node shrinks and probably wouldn't be ideal for a little core. Something similar to A53 with its 2-wide in order design would be more likely, as stated.
  • Geranium - Thursday, September 8, 2016 - link

    Are you sure Apple's custom ARM cores are a 6-wide OoO design? Did Apple confirm that? I think Apple's custom cores are a 3-wide OoO ARM design with some changes.
  • LiverpoolFC5903 - Thursday, September 8, 2016 - link

    Really doubt it, considering the A9 is 6-wide. Apple CPUs are far beyond any current ARM-based solutions: much more complex, much more powerful and efficient. I absolutely hate iOS with a passion, but you have to give them credit where it is due. Their CPU designs are cutting edge and way more powerful than your average Snapdragon.

    They can come up with such brilliant designs because they can afford to spend millions in R&D with the margins they charge.

    Whatever the reason, I would LOVE to see this SoC paired with Android. I believe the performance would be stellar.
  • michael2k - Thursday, September 8, 2016 - link

    Pretty sure:
    http://www.anandtech.com/show/7910/apples-cyclone-...
  • michael2k - Thursday, September 8, 2016 - link

    Tweaking the Twister to run at 2.6GHz is the trivial solution to hit 40%. It could be a much smaller speed boost; I don't disagree.

    However, I don't see why a 6 wide Cyclone 'low power' core is unlikely. That is also a trivial solution that would hit all their claims without excess engineering work. Just running at half clock would give us half the power consumption, on average. Die shrinks would get us to 1/2 that, or 1/4th the power consumption. Tweaks, extra power gating, changes due to reduction in voltage, and possibly even more underclocking, gets us the rest of the way to 1/5th
  • yhselp - Saturday, September 10, 2016 - link

    Don't forget the jump from 28nm to 16nm is the equivalent of one traditional die shrink, not two.
  • zeeBomb - Wednesday, September 7, 2016 - link

    Dannnng. I missed the live keynote, but I'm really surprised Apple had the audacity to make a "quad core" chip that's like Kryo for the first time ever. And those AirPods!
  • df99 - Wednesday, September 7, 2016 - link

    Thank you for the article.

    I'm someone who always needs better telephone quality.

    1. Do you know which Qualcomm modem the Verizon model is using? Is it the same as the Galaxy S7? (Apple is suggesting download speeds up to 450 Mbits/sec, which suggests this modem.)
    https://www.qualcomm.com/products/snapdragon/modem...

    2. Last year's model added another microphone for improved noise cancellation. Anything about microphone quality or noise cancellation improvements this year?

    3. Last year's model added H.265 compression for FaceTime video over the cell network. Are there any other improvements this year?

    I feel frustrated that the telephone function is not emphasized on the phones. Even with VoLTE on Verizon, in Manhattan, on an iPhone 6s+ I'm still not getting the voice quality of a landline.

    I just ran a speed test from the Upper West Side on Verizon on my iPhone 6s+ and got 58.5 Mbits/sec down and 30.0 Mbits/sec up. I don't think we need more speed; we need better voice quality.
  • Ryan Smith - Wednesday, September 7, 2016 - link

    1) No idea if it's even Qualcomm. We'll find out once someone takes it apart.

    2) Nothing disclosed.

    3) Nothing disclosed.
  • jjj - Wednesday, September 7, 2016 - link

    Apple has always been terrible at signal strength, the Apple press just doesn't like to report on negatives.
    http://mspoweruser.com/lumia-640-lumia-650-thrashe...
  • lilmoe - Wednesday, September 7, 2016 - link

    Yea, no one's being very honest about this... Not to mention the higher-than-average SAR levels iPhones emit.
  • df99 - Wednesday, September 7, 2016 - link

    Thank you. The long report with the details referred to in the article is very interesting. There is quite a difference. I wonder what the reasons are for the bad Apple performance.
  • LukeTim - Friday, September 9, 2016 - link

    "mspoweruser.com"

    Seems impartial.
  • Samus - Wednesday, September 7, 2016 - link

    A lot less exciting than I expected. Still have a 6; probably going to wait for the 7S.

    The Apple Watch is similarly disappointing. It's basically the same exact watch, casing and all, with a waterproof speaker canal, an unnecessary GPS (since the thing is practically useless without an iPhone nearby, which serves as the GPS for the original Watch), and a dual-core CPU - which is puzzling when watchOS 3 beta testers have reported staggering performance gains on the original Watch through what is suspected to be overclocking, at a negligible cost to battery life. Since the dual-core CPU is trickling down to the updated Series 1, I wonder if this is actually going to be a power-saving feature and nothing else.

    Now don't get me wrong: my Apple Watch has broken 3 times, twice from what I suspect was water damage (the haptic engine failed after bathing my kids one day, and another time I was caught in a rain storm while biking), so the IP67 improvement is the biggest change here. But the fact that the watch isn't any thinner, and that they didn't talk about improved battery life, actually renders speculation about the CPU upgrade unnecessary. It's probably going to get the same battery life the current Watch gets (I get exactly a day, getting a power reserve notification every night around 10pm - but I use my watch more than my phone), which means watchOS 3 could put me on the edge of battery life if the current beta stats on the existing Watch carry over (a 2-5x performance boost for a 10-20% battery life trade-off).
  • p1esk - Wednesday, September 7, 2016 - link

    What were you expecting exactly? They delivered a lot more than all the rumors indicated.
  • Samus - Thursday, September 8, 2016 - link

    Wireless charging (the Apple Watch has had it for almost 2 years, and Palm, WinMo and Android have had it damn near a decade)

    True Tone screen for the iPhone (all the iPad Pros have it)

    Thinner case for Apple Watch, as it is among the thickest smart watches

    Some form of smart bands to take advantage of those "pins" hiding underneath the band mounting bracket, akin to what Pebble is doing. Everything from a battery band to a camera band to a GPS band is possible.

    The prices are ridiculous: $169 for wireless earbuds, a $20 increase for the Apple Watch.

    And not to call out the elephant in the room, but why the hell did they spend so little time focusing on iOS 10 when it's claimed, even by Apple, to be the biggest overhaul ever? They spent 4 minutes 18 seconds talking about it and reviewed 4 features. In comparison, the Apple Watch got 5 talking points and the iPhone got double that.

    If that weren't all insulting enough - and I realize this wasn't a Macintosh-focused event - why hasn't the MacBook Pro or Mac mini been updated in 2 years? They owe their customers a roadmap for what the hell is going on with their PC development, because I have heard numerous times that people are either holding out or bailing on the Mac because the machines are literally outdated.
  • michael2k - Friday, September 9, 2016 - link

    Why would the 7S be less disappointing? History tells us that the 7S will have even fewer changes than the 7; more performance, enhancements to existing features, and maybe a couple internal changes not visible to the world (like adding NFC to developers via API or adding another GB of RAM).

    I imagine the GPS on the Watch reduces the battery drain since there is no need to use wifi or BT to communicate with the iPhone. That tradeoff allows them to increase the screen brightness.

    If you can turn the screen brightness down on the new Watch, you'll gain both waterproof abilities and likely increased battery life.
  • adityarjun - Wednesday, September 7, 2016 - link

    I am surprised that there is no True Tone display!
  • jjj - Wednesday, September 7, 2016 - link

    "Apple will not have received the advantages of a die shrink with this generation"

    They got more than that. They get the equivalent of that from using InFO, and on top of that, by using just TSMC they can push a bit further. On the CPU side the entire gain should be pretty much clocks, and 40% might be a bit inflated.
    On the GPU side, it's hard to say how they do the math - whether it's at the same TDP for the GPU, or this gen gets a higher TDP. It's much easier to gain perf if you just make more room for the GPU.
    Do wonder about memory BW; they might not have enough with the cores at about 2.4GHz.

    Not exposing all 4 cores is a mistake, but they were likely afraid to do it all at once. The software ecosystem needs to catch up and thread better; staying 10 years behind is not a good idea, and with the big iPad trying to be a PC, dual-core is a severe limitation. We'll see what they do on 10nm.
  • zeeBomb - Thursday, September 8, 2016 - link

    Was waiting for information about InFO... left dissatisfied. Care to explain?
  • jjj - Thursday, September 8, 2016 - link

    InFO is just the branding TSMC is using for FO-WLP.
    Google FO-WLP and, if you want, FO-PLP, and learn about it as much as you need.

    Here's a pdf that quickly looks at many different packaging technologies with easy to understand pictures http://s3.amazonaws.com/sdieee/1817-SanDiegoCPMTDL...
  • zeeBomb - Thursday, September 8, 2016 - link

    Ohhh I knew it. Seemed funny but once I saw those diagrams... Thank you kind sir!
  • name99 - Thursday, September 8, 2016 - link

    Really?
    We've already gone from "We have no idea whether Apple exposes all the cores" to "Not exposing all 4 cores is a mistake" in just a few hours?
  • jjj - Thursday, September 8, 2016 - link

    If a tree falls in the woods....
    I've been saying for years that Apple needs more cores, as computing hasn't been about few threads for a long while now. The fact that you are less than informed doesn't alter the facts.
    If I tell you the news that the Earth is not flat and the sun doesn't revolve around it, that doesn't mean everything changed overnight; it has always been that way, you just weren't looking.
  • lilmoe - Saturday, September 10, 2016 - link

    "The software ecosystem needs to catch up and thread better"

    Nothing needs to "catch up". A CPU with more cores can run the current ecosystem just fine. Apps just need to be updated to actually take better advantage of it, that's all.
  • CSMR - Wednesday, September 7, 2016 - link

    The lack of a normal headphone port is a big mistake. It's the first Apple device not to be a good audio player - unless you use an external DAC/amp, in which case it's the first Apple audio player to be huge and clunky.
  • Samus - Thursday, September 8, 2016 - link

    I don't think it matters. The adapter is there, and there will likely be 3rd party pass through adapters that allow charging and listening simultaneously.

    It's also debatable whether the DAC in the adapter is superior to the Cirrus Logic crap that has been in the iPhone since they bailed on Wolfson years ago.
  • tabascosauz - Wednesday, September 7, 2016 - link

    "Either way, this is the biggest change to the structure of Apple’s CPU subsystem since A6 and the Cyclone CPU core in 2012, and given Apple's habit of throwing us curveballs on the SoC side, I suspect the answer is not as simple as what we currently believe."

    You mean Swift? Swift was the first generation of Apple custom cores. The A6 featured Swift in 2012. Cyclone was the revision, in A7, 2013.
  • Ryan Smith - Wednesday, September 7, 2016 - link

    No, I meant Cyclone. I just had the wrong generation in my head. Thanks for the heads up!
  • TheAnnouncer - Wednesday, September 7, 2016 - link

    Many errors of fact here. Most notable is the bizarre insistence that using Lightning requires a DAC in the headphones. No it doesn't, which is why the adapter works.
  • Ryan Smith - Wednesday, September 7, 2016 - link

    I always welcome corrections. But everything I have on the subject is that Apple is pushing digital audio out, which means there needs to be a DAC after the lightning port.
  • tipoo - Wednesday, September 7, 2016 - link

    "given Apple’s fondness for developing their own ARM CPU cores and various technical considerations (such as the core interfaces), it may very well be that these are also Apple-designed cores, as opposed to an off-the-shelf solution like ARM’s Cortex-A53."

    Could it even be the Watch's CPU cores? Though maybe those are 32-bit.
  • Pissedoffyouth - Wednesday, September 7, 2016 - link

    What's the bet we'll get the A10 deep dive soon, and the 820 deep dive never? Oh wait, it's "being worked on".
  • Geranium - Thursday, September 8, 2016 - link

    AND now big.LITTLE is better because Apple is using it.
  • Ryan Smith - Thursday, September 8, 2016 - link

    In all honesty, it actually has been canceled, since Andrei is no longer with us.
  • ImSpartacus - Thursday, September 8, 2016 - link

    Whoa, that sucks.
  • beginner99 - Thursday, September 8, 2016 - link

    $650 for a 4.7" phone with 32 GB, probably 2 gb RAM and a 720p screen in 2016? No thanks. And why no 64 GB option? To force most users to 128 GB?

    And Apple goes big.LITTLE. That's why the rumors of high clocks now make sense: because they will hardly ever be used.
  • Glaurung - Thursday, September 8, 2016 - link

    "why no 64 GB option? To force most users to 128 GB?"

    All capacities doubled with no change to price points. You now get 128 GB for the same cost as last year's 64 GB option.
  • phoenix_rizzen - Thursday, September 8, 2016 - link

    Finally, the $100 premium for more storage kind of makes sense. 4x the storage for $100 (32 --> 128 GB) is a better deal than $100 for 2x the storage like the old days (16 --> 32 GB).
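
    Per added gigabyte, that works out to $100 / 96 GB ≈ $1.04, versus $100 / 16 GB = $6.25 in the old days.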
  • ABR - Thursday, September 8, 2016 - link

    256GB, huh? How long will it take to fill, back up, or restore THAT via USB? Maybe 802.11ac Wi-Fi would be a little faster, but still... Yikes!
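
    (Back of the envelope, assuming Lightning still tops out at USB 2.0 speeds of roughly 35 MB/s in practice: 256,000 MB / 35 MB/s ≈ 7,300 seconds, or about two hours for a full transfer.)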
  • Glaurung - Thursday, September 8, 2016 - link

    I have no idea why Apple hasn't shifted to shipping a USB 3-capable charge/sync cable with their devices by now. Maybe the patented logic for running USB 3 is not allowed to be integrated onto the Lightning controller chip?
  • Danvelopment - Thursday, September 8, 2016 - link

    On a positive note, people with new iPhones won't get a say when it comes to music selection at parties and in my car.

    Unless they carry around their own adapter everywhere...but then why not just put it in the phone?
  • Meteor2 - Thursday, September 8, 2016 - link

    Don't you use Bluetooth?
  • Danvelopment - Thursday, September 8, 2016 - link

    I don't have it in my car and I'm not getting a new head unit, and I've been to one gathering where the stereo had Bluetooth, but no one used it because it was too much hassle pairing devices every time.

    We just pulled the cable out and stuck it in the next phone.

    Also I have the biggest, comfiest car in my group and regularly drive long road trips with a full vehicle and an auxiliary cable.
  • name99 - Thursday, September 8, 2016 - link

    So you could acquire the relevant cable dongle for your "comfiest car in the group"... or you could be a dick... Decisions, decisions

    Just FYI, I have a cluster of chargers in my car for people to charge their devices. And I include MORE than just Lightning in the cluster --- because I am NOT a dick about what my friends like to use as their phones...
  • Danvelopment - Thursday, September 8, 2016 - link

    Why can't they carry their own adapter around?
  • Danvelopment - Thursday, September 8, 2016 - link

    Also isn't that adapter >$100? It has the DAC in it.
  • Danvelopment - Thursday, September 8, 2016 - link

    "Let me play something"
    *music stops
    "Oh, give me a second, just pairing. Oh, it rejected. Are you still connected? No? Lets restart the stereo. Still not working, I'll restart my phone.... ... ... Gah, I hate how it's automatically off when the phone first starts. Alright, on now. Found it. Pairing, why is it asking for a code? 00000000, alright paired. And putting the music on now."
  • name99 - Thursday, September 8, 2016 - link

    You have been able to buy a large BLUETOOTH party speaker from Costco, called a Block Rocker for at least a year now. Whatever you are claiming are the supposed difficulties of using these devices, other people have been managing to use them (even in the context of a party) for a long time now.

    Honestly, you people sound like the lame-ass idiots on an infomercial. "Do you find it difficult to put your cup on a table? There's all that sliding and where does it go? Sometimes you even miss the edge of the table! Obviously what you need are cupholders for your chair."

    https://www.buzzfeed.com/julianbrand/40-gifs-of-st...
  • Danvelopment - Thursday, September 8, 2016 - link

    Wait, people are supposed to buy new things to facilitate decisions made by other people? Why can't they buy their own things? For example I have a Panasonic SA-Max700 (look it up), I'm just going to chuck it in the bin for a crappier system so that iPhone users get a say? Nope.
  • name99 - Thursday, September 8, 2016 - link

    Did anyone tell you to "chuck your Panasonic SA-Max700 in the bin"? NO.
    It was pointed out that you have the OPTION of not being a dick to your friends by carrying a small $9 adaptor in your car. That's what SOME of us would do.

    But they're your friends; I guess they know what they got themselves into.
  • Danvelopment - Friday, September 9, 2016 - link

    Why can't they have their own adapter?
  • LukeTim - Friday, September 9, 2016 - link

    I wonder whether it has the GT7600 Plus GPU, as the OpenVX support would probably be useful for some of the processing involved in the new camera functionality.