I wonder what the scrolling performance will be like on the G3 given its resolution and "just" having an 801 chip. I feel like scrolling performance is finally good with the 1080p and the latest generation of processors, but are we going backwards in performance with the leap to 4k?
High DPI phones will hopefully die a silent and quick death. Even 1080p is overkill for me on any screen smaller than 5 inches, and staying at 720p at that size is great for battery life...
I don't know, man. The difference in pixel-pushing load, power consumption, and price between 720p and 1080p isn't big enough to outweigh the noticeably better quality.
Say what you will, but I (with my less-than-perfect vision) can still see pixelation in battery indicators and other small elements. Not to mention native full-HD content, should you want it, and today's SoCs are obviously able to cope with 1080p.
I'm not sure why some people spend as much effort deriding the push for higher pixel densities as they do. Most of it sounds more like sneering for the sake of sneering than having an actual point to make (I'm not saying I know that to be the case here). One should at least make the effort to differentiate oneself from the people who, not that long ago, told us we shouldn't try to improve the quality of what we see by reaching for more than 60 DPI, because that was "good enough".
Human vision isn't the simple mechanism some people think it is. Research has shown a density of over 600 DPI can be useful. I tend to agree that there are other ways to make smartphones better at this point, and it's hard for me to imagine that I would be able to see any benefit beyond 300 DPI personally, but I'm not against increasing the ability of our technology to do more than we can use. Perhaps the technology will lead to improvements in other areas, like responsiveness or fluidity. Seems to me that giving design engineers all the power they could possibly use can't be a bad thing. :)
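For reference, here is a quick density calculation for the panel sizes being debated in this thread (a rough sketch in Python; the diagonal sizes are typical values I'm assuming, not figures from the article):

    import math

    # pixels per inch = diagonal pixel count / diagonal length in inches
    def ppi(width_px, height_px, diag_inches):
        return math.hypot(width_px, height_px) / diag_inches

    print(round(ppi(1280, 720, 5.0)))   # ~294 PPI for a 5" 720p panel
    print(round(ppi(1920, 1080, 5.0)))  # ~441 PPI for a 5" 1080p panel
    print(round(ppi(2560, 1440, 5.5)))  # ~534 PPI for a 5.5" 1440p panel like the G3's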
+1 -- I'm amazed how quickly people have tired of smartphone innovation. Bring on the pixels! I'm still young(ish) and can resolve details quite well. I'm hoping for magazine-level smoothness with no blockiness.
Because if they have to be able to drive BIG displays with the phone they may as well be able to brag about being able to do it ON the phone. And everyone will eventually have to be able to drive BIG screens OFF the phone.
Scrolling performance on my old Galaxy S3 became perfectly snappy -- as soon as I ditched Touchwiz. All these stupid skins are the limiting factor, and there's really no purpose for them any more other than product differentiation.
Yup. That Snapdragon S4 was more than good enough for basic AOSP Android. TouchWiz really killed the performance on that thing. I mean, I know more people with a Nexus 4 than a Nexus 5, simply because the S4 Pro is good enough to run Google's Android. I personally have a Nexus 5 because it's compatible with Sprint, and should have Triband with the whenever-it-comes 4.4.3 update. I used to upgrade my phone every time I had the chance, but I actually don't think I'll be upgrading as long as my Nexus 5 is in good working condition. SoC improvements aren't really that exciting for me anymore.
I am in the same boat. I upgraded to the Nexus 5 from the Galaxy Nexus since it was getting old. I don't see myself upgrading anytime soon. Not only is the SoC more than enough to handle what I want, it has plenty of RAM and the screen is great too. All of this with no bull-sh!t skins like TouchWiz, and you have what I feel to be a great phone.
So I really hope Google does not shut down the Nexus division.
I don't think they will. People I know are divided into 3 groups: iPhone users, Samsung users, and Nexus users. Others do exist, but for the most part, Google's phones seem to be pretty popular.
I'm in the same (happy) boat. Downgraded from the Galaxy S III (rooted, unlocked, CyanogenMod) to the Moto X (Republic Wireless). My phone lasts 20-24 hours with pretty decent usage (screen on 3-4 hours a day, always browsing the internet on this thing...). My Moto X isn't rooted or unlocked, but it's the most enjoyable phone I've ever owned... and I'm a spec freak.
BTW, the reason the phone lasts so long on battery is that Republic Wireless operates over Wi-Fi. When Wi-Fi is on, the cellular modem is off... apparently the cell modem is a power hog on any phone.
Too bad indeed. It seems they couldn't (or didn't have time to) re-engineer it in time to add an LTE modem (since the 805 doesn't include one). I believe it should still run smoothly though... the 801 is still a beast. My Nexus 7 (2013), for example, has a slightly lower resolution than the G3 with a lesser SoC (Snapdragon S4 Pro) and it flies with no lag/stutter at all, so my fingers are crossed. Yes, it's somewhat disappointing, but I'm still looking forward to having the G3 in my hands for the next couple of years.
Which you can get where exactly? I will believe NVIDIA the day they actually DELIVER, as opposed to making empty promises like for, oh, every previous Tegra SoC.
I don't understand why Anand keeps using the Chrome browser for Android devices and Safari for iOS. Safari is very well optimized for iOS (naturally), and Chrome is NOT. For example, Samsung's stock browser on the GS5 is quite optimized and has benchmark results beating the iPhone 5s.
I admit all the other sites have degenerated enough to publish some random 'impression' as a review, but I still feel a strong bias here. We have benchmark results, but they are many times cherry-picked to make Apple products look better. And it is very bad, because now everyone takes the reviews here as "objective".
I like that Chrome is used on the Android devices. It is a browser available for all products. It is the default browser on the pure devices (such as the Nexus devices and even non-Nexus devices running stock Android). If Google doesn't optimize it for Android, then it is their loss, but I personally use it on my Android devices as well.
I don't know if each Android smartphone has the same stock browser but I do know Nexus devices no longer have the stock browser and instead use Chrome. Wouldn't using one browser for all Android devices help with benchmark consistency?
I think you'd be pretty bummed to find out the 'true' speed of old reliable, your TW browser. Have you tried a third-party browser? Chrome? Dolphin? Mercury or Photon? Opera or Skyfire? Next or Firefox? I'm a browser junkie. If you're not upset with me calling the TDub browser crap, drop me a line and we can discuss. I've literally downloaded all of the ones worth a bean over the past six, seven years ;)
The stock browser is normally not Chrome (unless it's a pure Android device or a Sony phone); it's normally the WebKit-based AOSP browser (I haven't checked what it's called).
I find Chrome a power hog (I use Opera Mini most of the time anyway, which is in fact for the most part worse than the stock Android browser, but it strips all scripts from the website, so you get a static page with no CPU use or lag at all).
I'm not sure where you got your S5... I've got an S5 and a Note 3. BOTH are MUCH quicker with Chrome, Dolphin, Mercury... or any of numerous third-party browsers than the crappy Samsung 'stock' "Internet" browser. It's CRAP! Share a link if there's one showing the stock Sammy browser 'beating' the latest iOS device benchmarks on the A7 platform. Sorry, but as a LONG time TouchWiz owner and user, it's definitely the FIRST 'default' option I get rid of. Then, the keyboard ;)

Don't worry. Sounds like you seem to think someone is going to buy their phone based off a Sunspider test only? That doesn't make any sense. While the iOS platforms with the A7 are certainly holding their own eight months after release, the tests I read were definitely not dominated by the iOS silicon right now. I own the iPhone 5s and the Air and can honestly say, regardless of benchmark results or numbers, tests and objective reviews, each iOS device generation compared to the same generation of TouchWiz FLATTENS the latter when it comes to browsing speed, UI fluidity and overall general 'speed'. My wife has the iPhone 5 and the Samsung S5; me, the 5s and Note 3. It's irrelevant which browser you use, iOS is definitely the quicker-populating browser (I actually prefer iCab or Mercury to Safari... regardless of Apple's decision to not allow access to Nitro, they're JUST as quick). While I LOVE my Note for the actual browsing 'experience' in comparison to the iPhone, the intrigue for me is the larger display and the ability to use the stylus for sketching rigging points for clients, credit cards and my old eyes. I love em both, but there's NO WAY, without a link, you're gonna convince me that POS stock Sammy browser beats ANYTHING! Lol. Android. iOS. I'm a fan of both. BUT some things are better on one or the other's OS, UI and overall compatibility with whatever tools you're using.

Sorry. Others have beat me to it. Took too long. Android is MASSIVE... And while Samsung makes up a large percentage of 'sales', they're not all flagships. Tons of 2.xx devices are still on the market and at pay-by-month kiosks, in other countries, etc. Not all Sammys have the same TouchWiz browser, nor do ANY of the other OEMs. Chrome, by default... it's Google's (Android's) true 'stock' browser and it's a helluva lot faster than the crap that comes with TW. I've found Dolphin and Merc to be my faves, simply because Chrome still doesn't allow text resizing.
My Samsung Galaxy S5 has a score of 391ms at Sunspider using the stock browser from Samsung and ART Runtime. This stock browser used to be crap, but the new version of it coming with the Galaxy S5 has been really improved. I used to use Chrome instead of it, but now Samsung's stock browser is far faster.
Browser benchmarks are in no way a good benchmark of CPU performance, even with the same browser on different devices. Browser benchmarks test many things. Raw CPU performance is not one of them.
I think you might not know what you're talking about. The Samsung Android browser just uses the old Android Open Source Project stack. It's not very standards-compliant and pretty slow. A lot of web applications don't even support it anymore.
I am guessing not... These foundries all use the same stuff. If 20nm isn't ready, it isn't ready. There are pretty much TSMC, GlobalFoundries (formerly AMD's fabs), and Samsung, plus Intel, that are capable of producing at volume. Intel is obviously not part of this equation. I am not aware of any of the other three making 20nm chips at volume in 2014. They usually all go in step at the same time. I am 99% sure the Apple A8, or whatever is in the next iPhone/iPad, will still be 28nm.
But how will they make that obligatory 2x jump then? I'm not saying it's impossible on 28nm, but 20nm would help a lot. Also, if it's 28nm, will it be Samsung or TSMC? TSMC has a more mature 28nm process than Samsung; even that could possibly be enough to make the 2x jump. And hell, Apple has billions to spend... Why is 20nm so far behind schedule? Isn't it just Apple, with a secret exclusive multibillion-dollar deal, soaking up TSMC's entire capacity?
OK, so this is the year Qualcomm lost its leadership, as 2014's fastest Android SoC appears to be the Tegra K1: a much faster GPU (26 fps in Manhattan offscreen, 60 fps in T-Rex offscreen) and a faster CPU too (see the Xiaomi MiPad benchmarks). With 20nm Erista and Maxwell coming at the same time as the S810, Nvidia will make the gap even wider next generation...
Does it matter if Nvidia cannot do it without a fan and higher power usage?
The SHIELD is a perfect example of why Nvidia fails to win against Qualcomm in meaningful terms: in the phone/tablet market, performance does not matter. Perf/power, and absolute power, do matter.
Until Nvidia learns to make things in lower power envelopes (the T4i is a decent example) they will lose to Qualcomm in meaningful ways.
On that note, the "amazing" K1 chip will be clocked in the 600 MHz range in the first tablets it comes in... How downclocked will Qualcomm's part be?
Anyhow, if you need absolute performance in the SoC space (aka you are using it as a desktop, or, perhaps a laptop) NVidia is the player to go to. Otherwise, Qualcomm is plain better for phones/tablets.
That's not true; take the Tegra Note for example. The K1 uses an updated A15 that is more power efficient. Then, when the custom Denver chip hits, it will start taking names and cashing checks.
You do realize that people have already done power tests with the TK1 boards right? Or are you pretty much ignoring anything until you get your "product with lines in them" argument. If that's the case, then I'm pretty sure the Snapdragon 805 fits in the same boat.
Qualcomm has a history of products being used by OEMs... NVidia has Tegra 2... tons of large OEM design wins... Tegra 3, many large OEM design wins, but, far less... Tegra 4... no large OEM design wins, until they finally got a win in China.
Nvidia did not lose design wins magically. Tegra consumed more power than they claimed, and may have been slower than they claimed.
Why are you arguing about design wins when I'm talking about your power test argument? Have you not seen the power information from users with a TK1 board or not? Or are you simply ignoring their findings?
Nvidia lost design wins because they lied about their power usage and performance to OEMs repeatedly.
Nvidia had plenty of support with Tegra 2; the OEMs thought the clockspeeds and power usage looked great... The final product was, well, noticeably slower at around the originally promised power. So OEMs grew wary of the next Tegra, but NVidia might have made an honest mistake... They get numbers for Tegra 3... Looks great... They get real Tegra 3... substantially slower, more power (at that slower speed) than told.
Nvidia tries to tell OEMs about their great new Tegra product... and OEMs do not use it; well, major mobile OEMs do not. MS, HP, a few other companies, and later a large Chinese company.
I have no faith in Nvidia as far as Tegra is concerned. Companies that got Tegra 2, 3 (and perhaps Tegra 4 chips) got real chips. They ran at the clockspeeds NVidia promised, at the power Nvidia promised.
Cherry-picking chips is not hard. It was done for large OEMs, in large enough numbers to secure many design wins.
I'm not sure why you think it cannot be done for developer boards.
Tegra K1 does look like a good Tegra product finally coming around though. I do not think it will be as good as Nvidia says, based off of the past, but, I don't think it will be as run-of-the-mill as the rest of the Tegra chips were.
Tegra 4/4i was used by various large OEMs including Asus, HP, Toshiba, Xiaomi, LG, Huawei, and ZTE. Of course, that is just smoke, because it has nothing to do with Tegra K1 performance or power efficiency.
How many TEGRA 4 (Tegra 4i is a different product; it is a fine product) phones are there? One. By an OEM that also used Qualcomm for the same phone... I am sure the battery life tests are from the Qualcomm device (happy to be proven wrong).
Otherwise, you have a bunch of PC manufacturers (not large mobile OEMs...) with tablets/laptops/AIOs. Oh, and 2 of the 11 products are made by NVidia themselves.
Tegra 4i, on the other hand, is what Tegra 3 should have been and then some. It is a fine product.
You just don't get it. Tegra is focused on automotive, embedded, consumer, and gaming products. Mainstream smartphones are NOT a focus now. Tegra will make its way into some high-end differentiated smartphone products in the future, but the lion's share outside of Apple and Samsung will go to Qualcomm and Mediatek. Qualcomm is able to attractively bundle their modem with their SoC, and certain large carriers have legacy WCDMA networks that require Qualcomm's modem tech. Mediatek is the lowest-cost provider. That's life, and it takes nothing away from Tegra K1, which is still a revolutionary product for the ultra mobile space.
QC's lead in mobile chips and their pricing probably account for the leading position, until MediaTek and others start chipping away on prices and performance. The failure of Tegra 3 shows where the price/performance point was; Nvidia knows that, and it is the reason they ventured into automotive and other products, because those needed powerful, higher-power GPU chips as opposed to mobile parts. Except for rendering video in 10-bit, and possibly 120fps video encode, there is no real need for the 805 in a phone. The S5 shows that the 801 is more than capable of all things mobile while still having acceptable battery life. The K1 is a beast in itself, being able to do vision graphics and VR stuff. Not that the 805 cannot do those things, but the K1 is probably better at them, if not in as competitively priced a package. Nvidia's Icera 500 modem is not as popular either; it has gone through carrier certification yet is hardly in any handsets commercially. Nvidia knows this up front, too.
What's the focus then? Testbed devices? It can be as "revolutionary" as you claim (or, more likely, it's just a downclocked desktop part).
And what sort of revolution will a device with no OEM wins cause? I mean, we know there are faster parts in the hardware market as a whole. We also know that some of them use 250 watts of power. So why does a part with higher power usage and higher performance surprise anyone? :)
It was NV's first LTE integration attempt. Carrier qualification takes a long time, and since it's the first NV silicon with integrated LTE, it probably took longer. If NV can continue to develop its LTE and not have any IP issues with QC, I'm sure NV would give QC a run for their money. Think of it this way: QC has been in the game for a while... NV showed up about 5 years ago and was able to give enough competition for TI to leave the phone market (NOT saying NV should take credit for this). And now there's the K1 GPU...
Tegra 2 & 3 were both subsidized. Tegra 4 wasn't, and it was delayed as well; that's why there were fewer design wins. Also the fact that it didn't have an integrated modem. Not because integration lowers the power required; integrating the LTE modem doesn't lower power consumption (Apple's iPhone 5s doesn't have an integrated modem). Integrating the modem reduces OEM costs instead.
More than likely, QC has a stranglehold on LTE, as they're not likely to license out their tech to a competitor. They've been in the phone game longer, so OEMs probably have it easier on the software side. QC Snapdragon SoCs run hot too, just as hot as any other SoC. I've had Tegra devices and SD devices; both run at similar temperatures to the touch, except that the T4 devices don't lag as much as the SD devices (this could be due to stupid TouchWiz).
For you: http://en.miui.com/thread-22041-1-1.html - 5 hours of heavy 3D gaming on a 6,700 mAh battery means the TK1 runs at ~3 W, and 11 hours on video, so excellent numbers when taking the performance into account.
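For what it's worth, here is the back-of-envelope arithmetic behind that ~3 W figure (a rough sketch in Python; the 3.8 V nominal cell voltage and the ~2 W allowance for the display and rest of the system are my assumptions, not numbers from the linked thread):

    capacity_mah = 6700
    nominal_v = 3.8                               # assumed nominal cell voltage
    battery_wh = capacity_mah / 1000 * nominal_v  # ~25.5 Wh of stored energy
    hours_gaming = 5
    avg_device_w = battery_wh / hours_gaming      # ~5.1 W for the whole tablet
    display_and_rest_w = 2.0                      # assumed draw for screen, radios, etc.
    soc_w = avg_device_w - display_and_rest_w     # ~3.1 W left for the SoC
    print(round(avg_device_w, 1), round(soc_w, 1))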
The GPU is running in the mid-600 MHz range (down from the 950 MHz or so Nvidia touted), and the CPU is certainly also downclocked.
Do you have performance numbers for that game? How about speed/power usage on competitor chips? Not enough knowledge to draw large conclusions... Still, it's really odd how NVidia is not talking about the clockspeeds in the tablet... You'd think they would talk up how highly clocked and efficient their chip is...
Why do we care about clock speeds? Is it now a new metric of performance? Is the A7, running at only 1.3 GHz, a slow SoC? Architectural efficiency and final performance results are what we care about. What is important is that the TK1 in the MiPad beats every other Android SoC by a good margin (60 fps in T-Rex vs 40 on the S805), and with good power efficiency. Is it so difficult for the NV haters to admit it?
If I play Asphalt 8 on my Nexus 7 (2013), the battery runs out in only 2 hours at 50% brightness. I can tell you the Tegra K1 + RAM on the TK1 Jetson consumes 6,980 mW running full tilt at 950 MHz, and that's an actively cooled device. Now remember, this is a non-mobile device for developers.
Well, your post shows big ignorance of the products. 1) Tegra 4 was on the 28HPL process while the S800/801/805 use 28HPM, which provides nearly 30% better transistors: apples vs. oranges, and a big advantage to QC. 2) T4 uses the A15r1, which is not very well optimized for power efficiency; TK1 now uses the A15r3, which provides better efficiency. 3) Tegra K1 and the S800/801/805 are made on the same 28HPM process, so it's a fair comparison.
That means T4 vs S800/S801 was an easy win for QC due to the many disadvantages of the T4 design. But TK1 vs S80x shows a completely different story, with both on the same node.
Finally, the TK1 benchmarks are from the Xiaomi MiPad, a 7.9" tablet, no fan here, and it still smokes the S805...
I will believe it when I see them... Also, the amazing "950 MHz" clockspeed of the K1? Guess where it is in the MiPad?
In the 600 MHz range. NVidia has to downclock its parts to fit into tablets, much less phones.
2. Process choice is a manufacturing choice. Nvidia could not get an HPM design? They suffered. Anyhow, Qualcomm will still probably win on perf/watt... which, once again, is what is really important in a phone and in most tablets.
3. Krait "450" cores are the same Krait cores from the 800 (a 2013 product) with more clockspeed. A15r3 is a 2014 product. I can throw meaningless garbage into "fair comparisons" also. You compare the SoC as a whole... The K1 will end up faster than the 805. I am convinced of it. Will it matter? Not unless you are looking at putting a chip into a mini PC or a laptop... or perhaps a mobile game controller with a screen. :)
Depends on how/when Samsung, Qualcomm, Mediatek, Rockchip, etc. introduce their next generation chips and how fast they are.
Apple's will likely beat this on GPU and CPU while using less power... Because, well, Apple has and continues to spend tons of money on optimization. The biggest part is die size... which is a huge advantage Apple has.
Apple is using a PowerVR GPU, and only the GX6650 is comparable with the Tegra K1 (but it has about a 600-650 MHz max frequency and, because of that, fewer GFLOPS).
And what if the K1 cannot run at full clockspeed in phones/tablets due to reasonable clock targets or throttling? Or if the PowerVR GPU adds more units or raises MHz (there's no reason to do either, as it raises power consumption without a partial redesign, and these parts are typically "use X power, get as much speed as possible")? Or if Apple happens to license a GPU architecture from a company they are close with, say, PowerVR... ;)
Apple faster? Proof? All these SoC manufacturers rely on other companies' tech to build their GPUs. Unlike Nvidia, they can't come up with something new if their IP supplier doesn't have it. And for now, like it or not, no GPU available this year will be more powerful than mobile Kepler. Swallow it; you can't do anything about that.
lol. All right, I'll just ignore the fact that Apple made a custom ARMv8 CPU, the fact that it has a GPU team to match its uncore and CPU teams, and the fact that there are mobile GPU manufacturers who would reasonably license their overall design to Apple. Also, Imagination Tech (and ARM, and Qualcomm, and Samsung, and others) could all make GPUs faster than the Tegra K1. But there is no reason to: waste of money, waste of power (in the design), etc.
Oh, and ignore the fact that Apple is willing to spend more money on die. You have to remember that Nvidia is planning to sell these at a profit, which means they need to minimize die area. Typical semiconductor design comes down to Power, Performance, and Area. NVidia's Tegra seems to follow Performance, Area, Power. Tegra K1 might be the first to go Power, Area, Performance (impossible to tell without real retail products...).
Apple, on the other hand, targets Power, Performance, Area. That means that as long as the chip will fit inside the phone, they would be fine making a 200 mm^2 die. Making a larger die means you can reduce power for various reasons. Of course, making a die smaller also allows you to reduce power by shortening distances (this, along with the lack of an inter-die connect and larger, faster caches, is a reason why Maxwell managed to reduce power so much).
I am also using historical precedent:
- NVidia claimed Tegra 2 brought mobile up to parity with the Xbox 360/PS3 (and Tegra K1; not sure about 3 and 4), which, well, Tegra 2 was not, and Tegra K1 will not be (due to bandwidth for the most part, imho; given more bandwidth, it certainly could beat the Xbox 360/PS3).
- Nvidia showed Tegra 4 beating the iPad (did it for Tegra 2 and 3? I don't remember), and it lost to the next iPad.
- Nvidia claimed Tegra 2 was great perf/watt. And Tegra 3, and Tegra 4. They were all bad compared to Qualcomm (and Apple).
I don't take Nvidia's claims for much, because, they stink. Hopefully Tegra K1 fixes it. I would rather we did not have a player in the market "die" (read: move to focusing almost wholly on automotive) especially not it being after that company finally got its act together.
Without going into whether Apple is faster, it's clearly silly to claim that "all the SoC manufacturers rely on other companies' tech to build their GPU". Who is "all the SoC manufacturers"? Qualcomm uses its own GPU, as does nV. Soon enough, so will AMD.
Apple is the only interesting SoC manufacturer that uses someone else's tech for their GPU, and there are plenty of indications (e.g. a large buildup in GPU HW hires) that this is going to change soon --- if not with the A8, then with the A9.
It's hard to have a coherent dialog with you going off on tangents. For this year, it seems the competition from Qualcomm will be the 805, which we now know will not be as performant as the Tegra K1.
How do we KNOW this? I've struggled to find third-party, comprehensive benchmarks for either the MiPad or the TK1, especially ones that include power draw (those numbers some random person threw up in a forum aren't tremendously useful with regard to the MiPad). Also, the Adreno 420's drivers are, apparently, not well optimized. Basically, until AT, Tom's, or someone similar gets their hands on it, I won't feel like I know how things stack up.
There are slides from the IHV presentation for the MiPad showing power usage patterns. Phoronix also did some testing of the TK1 board which shows power usage well below what some seem to be thinking. As for Adreno drivers, they've always been bad and not well optimized.
When did Phoronix release power numbers? The MiPad presentation looked like copy-paste from the Nvidia material. The Adreno drivers aren't great, which should tell you how good that hardware really is. Rob Clark, the lead developer of the open-source freedreno driver, is already at least matching their performance up to OpenGL 2.1 / GL ES 2. He's mentioned that he's found evidence of the hardware supporting advanced GL extensions not advertised by the driver. This may change, as Qualcomm has recently joined Linaro, so they will probably be seeking to work more with upstream. The end result of that process is always better quality. Lastly, don't forget that Adreno is a legacy of Bitboys, and that Qualcomm is no Intel, even though they are the same size. Qualcomm seems to actually be interested in making top-performing GPUs.
Isn't Apple's die known to be the biggest of all the SoCs? They can afford a big, power-sucking SoC because they can optimize for it. That's the only reason iPhones last long on battery... besides being lower clocked, because look how small and dinky that screen is...
Proof is loose... based off of talking with someone who has worked closely with Nvidia and TSMC in the past (Tesla, Fermi, a few Tegra chips).
They have been quite accurate before... When the tablet comes we will see it.
On the other hand, silence tells us a lot also... Where is Nvidia talking about their "950 MHz GPU" in a tablet? I think the 600 MHz clockspeed band (by that, I should clarify I mean between 600 and 699) is still quite impressive... It just shows why the chip won't go into phones...
Well, I don't believe that, because of the benchmarking results. Nvidia promised 60 fps in T-Rex HD, and here it is. And Nvidia also promised 2.5x the performance of Apple's A7 in the Manhattan offscreen test, and here it is (30 fps vs 13.3 fps). Well, almost 2.5x.
When the iPad mini Retina (and other A7 devices) were benchmarked towards the end of January in Manhattan Offscreen 1080p, they had average scores closer to 667 (10.8 fps). Based on that score, the Tegra K1 was offering a score ~2.65x that of the A7.
Sometime around March 10th, the average score for Apple's A7 had changed to 803 (13.0 fps). I assume there was some sort of software update that boosted their Manhattan score.
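To make the comparison concrete, here is how the multiplier shifts with the A7 baseline (a quick sketch in Python using only the fps figures quoted in this thread, which I haven't verified independently):

    k1_manhattan = 30.0   # K1 Manhattan offscreen fps quoted a few comments up
    a7_january = 10.8     # A7 average reported around late January
    a7_march = 13.0       # A7 average reported around March 10th
    print(round(k1_manhattan / a7_january, 2))  # ~2.78x against the January baseline
    print(round(k1_manhattan / a7_march, 2))    # ~2.31x against the March baseline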
So read this: http://en.miui.com/thread-22041-1-1.html - 5 hours of heavy 3D gaming on a 6,700 mAh battery means the TK1 runs at ~3 W, and 11 hours on video, so excellent numbers when taking the leading performance into account.
About 5 hours. Hm, I wish they had run the tests longer... Ideally (for testing purposes, not time purposes) they would have run everything for 20-25%(+) of the tablet's battery life and used that, OR done the tests from 75% or 50% battery.
Using 15 minutes (read: 5%) is a fairly low amount of battery wear... It should be pretty accurate, but I would skew that estimate a little lower (my experience with phones (over 5 phones) and tablets (over 2 tablets) is that they tend to lose the first few percent of battery the slowest, and that gives an inflated battery life).
Extrapolating from the first 5% of battery (from 100% to 95%) on an iPad mini gave me something like 8 hours of usage playing Hearthstone... Same with video watching on a Nook HD+, same with video (and reading webpages) on a 920/1020, same with reading webpages/playing light applications on my Moto G, same with web pages on some cheap Android LG, and same with video/web pages on my old iPhone 4.
I would guess the real battery life is closer to 4-4.5 hours (happy to be proven wrong ^__^) and that the SoC is not running at max clock (for good reason: no need to run max clock when you already hit a game's fps cap, or the refresh rate of the tablet).
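To illustrate why extrapolating from the first few percent inflates the estimate, here is a toy example (a sketch in Python; the "20% slower initial drain" is a made-up assumption just to show the effect):

    minutes_per_5_percent = 15                              # observed first-5% drain time
    naive_runtime_h = minutes_per_5_percent / 5 * 100 / 60  # 5.0 h if drain were perfectly linear
    steady_rate = minutes_per_5_percent / 5 / 1.2           # assume the first 5% drained 20% slower than the rest
    corrected_runtime_h = (minutes_per_5_percent + 95 * steady_rate) / 60
    print(naive_runtime_h, round(corrected_runtime_h, 1))   # 5.0 vs ~4.2 hours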
Doesn't Asus have the TF701T, which uses a Tegra 4 @ 1.9 GHz WITHOUT A FAN? One bad review, because Toshiba doesn't know how to make a tablet, and everyone thinks the T4 runs hot. The fan in the Shield is very quiet and doesn't 'spool up' as if the T4 were overheating; it runs very smoothly and the interface is very smooth. Also, streaming AWAY from your house over good WiFi for PC games is working quite well now.
And at least NV is updating its software....consistently...unlike most OEMs.
Agreed. Also, Denver is in-house where the 810's cores aren't. I would expect CPU tests to show Denver is better on power when taxed, as Qualcomm's in-house 64-bit core won't be around until late 2015 or Q1 2016, since the 810 is H1 2015. The GPU will heavily favor NV (even AMD at some point, if they get in with a good SoC), as nobody else but AMD has 20 years of gaming experience. We have already seen the K1 is pretty good against the 805. Odd that they shied away from battery here...
"Here even NVIDIA's Shield with Tegra 4 cooled by a fan can't outperform the Adreno 420 GPU" Anandtech needs to stop makings dumb NV statements. It's a YEAR old device and won't even be in this race, not to mention it's not getting a ton from the fan anyway which is really there for longevity and temps in the hands for hours (we game hard and for hours so keeps it cool). The fan isn't in there to hit 2.5ghz or something, it's 1.9. This device will be facing K1. Stop taking AMD checks to be their portal site, and you can get back to making unbiased reporting without the little BS NV digs.
So true for the last paragraph! Let's see if Anand will say "TK1's Kepler GPU smokes the Adreno 420, providing twice the performance on the GFXBench 3.0 benchmark with half the memory bandwidth." I'm waiting for it, right Anand?
"{"Here even NVIDIA's Shield with Tegra 4 cooled by a fan can't outperform the Adreno 420 GPU"} Anandtech needs to stop makings dumb NV statements. It's a YEAR old device and won't even be in this race, not to mention it's not getting a ton from the fan anyway which is really there for longevity and temps in the hands for hours.... Stop taking AMD checks to be their portal site, and you can get back to making unbiased reporting without the little BS NV digs."
Reading. Comprehension. Even YOU took the time to quote and post the comment, which specifically relates to an OBJECTIVE benchmark. nVidia isn't ON the market! Unless you buy THEIR pad... and that's extremely niche right now. "Gaming for hours?" Who does THAT on their phone? Even their tablet? An hour, OK... I see that. But there's a FAN for a reason, not just to keep your hands cool. You said it yourself: longevity. Doesn't that DIRECTLY relate to 'cooling' the SoC? The guts? So it can 'live longer'? It wasn't a dig. It certainly wasn't an AMD stamp, whatever that is. Certainly possible it was I that missed the innuendo there.

It's a fact though, bro! nVidia, Intel, AMD... ALL late to the 'mobile game'. Intel has the resources to jump into the fray... head first. nVidia doesn't. They're being very careful while maintaining their, again, slowly but certainly 'niche' dedicated GPU activity. With Intel's iGPU performance envelope and TDP, along with its kinda close association with the CPU ;), the increasing demand for smaller, faster and more portable computing is going to destroy nVidia if the K1 project isn't accepted by more mobile vendors and OEMs. There's a LOT of money in the R&D of these SoCs and, to date, the nVidia 'Tegra' solution scared a LOT of OEMs using or considering using their silicon for graphics.

I think you owe Anand and his crew an apology. I'd be interested as to what your contribution to the world of technology is... it's got to be something incredible! I'm all ears!!! Seriously, for you to disrespect the author of the article as you did... you owe at least an apology. Then, feed the spider. Leave mom's basement. Get a job. Stop playing games all day. And don't 'pick a winner!' You'll NEVER win. It's called gambling. That's why the lights are on in Vegas. It's cool to be a fan of theirs, but to post such a silly comment disrespecting one of the MOST respected and intelligent employees, or Anand himself, is bad juju. Take it back. Get off the 'net for a couple of days. Get some sunshine. Good for ya!
I've played games on my phone (LG G2) for over 4 hours at a time (Puzzle Quest 2 is damned addictive). Once for over 6 hours, although I had to plug it in near the end. :) Anytime I get a new, interesting RPG onto my phone, I'll go through bouts of playing it for 4-6 hours at a time.
And my daughter has played games on our tablet (2012 Nexus 7) for multiple hours at a time, including some Netflix and local video watching. The battery on that thing tends to only last about 4 hours, though. :(
Just because YOU can't see a reason to play games on your phone for over an hour doesn't mean nobody does that.
Hear hear - I play my PSP games emulated on the phone these days. The PSP is too much of a pain to carry around (and too old, tbh), but some of the old RPGs on it are awesome, and yes, I can play them for hours on end.
Market comparisons aside as a given, the K1 is quite different from the Tegra 4 as far as GPU hardware goes. It should have the chance to be used in superphones and mobile devices, if only to put pressure on Qualcomm to fix their OpenGL drivers.
Yeah, it supports DX11.2... which no other mobile GPU supports... Wait, Qualcomm's does... And I wonder how long until ARM's "stock" GPU does... And Imagination Tech's...
Nvidia is most likely not displacing Qualcomm. It could happen, but given OEM relationships, it is unlikely. Even if NVidia has a better product (note: I think they have one that is at best even with Qualcomm for phones), the OEM relations mean a lot. ATI managed to make quite a bit of money, despite NVidia having far better products, due to OEM relationships.
"Yeah, it supports DX11.2.... no other mobile GPU supports... Wait. Qualcomm does... And I wonder how long until ARM "stock" GPU does... And Imagine Tech's..." — Only Tegra K1 supports OpenGL 4.4. DirectX support matters ONLY on Windows. And Tegra K1 supports DirectX 12(because of that: http://blogs.nvidia.com/blog/2014/03/20/directx-12... ).
Unfortunately, OpenGL 4.x will mean little to Android devices until Qualcomm starts actually caring about the API in their drivers. If the K1 actually competes in tablets and superphones, it may have the effect of forcing them to, through competition.
Look, you don't need to keep linking to that old post. I don't need to argue about Adreno blob quality. Nvidia, however, lives and breathes on its software, which seems to be not quite as flawless as some would believe. Again, why does desktop GL matter to Android? It doesn't even support it. The point of having the embedded profile is to expose enough of GL to create a good balance between high performance/graphics quality and efficiency. Going full-blown GL destroys that last advantage (tessellation alone is terrible for battery life).
Great, but you aren't going to see Google including full GL in their builds. The reason? GLES is the most that can be supported while maintaining decent battery.
Hardware-level support for APIs means diddly squat if the drivers are crap. My Galaxy S4 with the Snapdragon 600 supports OpenGL ES 3, but it sucks at it. Nvidia at least has a history of proper OpenGL support in its drivers.
Who said anything about NVIDIA "displacing" Qualcomm? That is pure silliness. NVIDIA offers a "differentiated" product. Any OEM who wants to offer differentiated high end products may consider using Tegra K1. K1 is also a very versatile product, so it will be used in a wide variety of different products.
Companies like Xiaomi that are looking for differentiated products would absolutely consider using Tegra K1 in a high-end smartphone. Considering the performance and perf-per-watt advantages of Tegra K1, there are clearly compelling reasons to use it. But again, like I said above, the lion's share of phones outside of Apple and Samsung will go to Qualcomm (bundled modem) and Mediatek (lowest cost).
At least it's not just a rebrand like the 801; we want 20nm already! On the flip side, I'm really looking forward to the A53 SoCs about to arrive from Qualcomm and Mediatek, the Allwinner A80 quad A15, the Rockchip RK3288 with its quad A17 (not sure it's not A12, we'll see soon I guess), and the quad A17 MediaTek MT6595. The more budget side should be getting a nice perf boost. And while we're at it, have you guys heard anything about Intel investing in/collaborating with Rockchip?
I actually found very detailed power consumption figures for the Jetson Tegra K1... I am quite surprised how low-power it is... I am quite surprised Anand missed that... Here is the link....
Hope this chipset doesn't just come on 2560x1440 screens because the bump in performance isn't going to match the increased demands of that resolution.
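As a rough sanity check on that worry (a sketch in Python; the ~40% figure is the GPU uplift quoted for the Adreno 420, and I'm treating GPU load as scaling linearly with pixel count, which is a simplification):

    pixels_1080p = 1920 * 1080                     # 2,073,600 pixels
    pixels_1440p = 2560 * 1440                     # 3,686,400 pixels
    print(round(pixels_1440p / pixels_1080p, 2))   # ~1.78x more pixels to fill
    # so a ~1.4x GPU improvement does not fully cover the extra fill cost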
The only test you did that reflects real-world gaming (in my experience) is 3DMark, so it's pretty shocking how poorly it does.
40% improved performance might as well mean nothing, because the current chips rarely even run at full power. So basically a 20% power efficiency bump is the only thing that will be realized.
XCOM will drain my Note 3's entire battery in about 2.5 hours, and that's where the 800 runs at full power.
The better shader support is by far the biggest news; DirectX 11-class effects are sorely needed, since shaders in mobile games currently look like DirectX 7. So that's a massive leap.
All the visual fidelity currently comes from running games at very high resolution with high poly counts. I wonder if in the next generation we are going to see developers run games below native resolution; think of what you could do with shaders running at 720p instead of 1440p. Now that you have all the advanced lighting and shadowing effects, why would you want to burn up all your performance on super-high resolutions? See the quick pixel-count comparison below.
The Xbox One and PS4 don't even run most games at 1080p; it's sort of crazy that mobile games do. You aren't going to get PS3-level graphics at 1440p, but you could at 720p.
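A quick pixel-count comparison to back up the render-below-native idea (Python sketch; it assumes shader cost scales linearly with the number of pixels shaded, which ignores fixed per-frame costs):

    pixels_720p = 1280 * 720               # 921,600 pixels
    pixels_1440p = 2560 * 1440             # 3,686,400 pixels
    print(pixels_1440p // pixels_720p)     # 4: rendering at 720p leaves roughly 4x the per-pixel shader budget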
Would you consider including at least one Intel/AMD 'cost-effective' CPU, integrated graphics, and a 'cost-effective' discrete GPU in these charts? A price-per-chip comparison would be very cool as well.
I am really interested in watching the performance gap close over the next year.
Nice GPU features & a moderate performance bump. Not impressive, though. This release shows how badly we need to migrate to the next, 20nm process node. The industry has done all it can to squeeze the heck out of the current 28nm node. It's really frustrating to observe this stagnation of process technology.
IMHO, I do not think the 805 will get much traction in phones. Manufacturing-wise it is more expensive: it needs a separate Gobi MDM to provide LTE connectivity. Instead, the cheaper integrated 800/801, which are MSMs, will continue to be used.
It could very well be a repeat of Q3 2012 - Q2 2013, when the first QC Kraits appeared sans integrated baseband.
That was a brief window of time in which the Note 2 as well as the HTC One X+ sported Exynos and Tegra 3+ variants as the SoC, plus a Gobi MDM for LTE connectivity. So maybe the Note 4 will have the next Exynos version with a Gobi MDM for the US version of the handset.
The 810 is 2x32b LPDDR4-1600: 25.6 GB/s, same as the 805. The 808 is 2x32b LPDDR3-933: 14.9 GB/s, less than the others, but still quite a lot of bandwidth considering the GPU is smaller too.
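For anyone wondering where those figures come from, peak bandwidth is just channels x bus width x transfer rate (a sketch in Python; it assumes the 1600/933 figures are DDR clock rates in MHz, i.e. 3200/1866 MT/s, which is how they line up with the quoted numbers):

    def peak_gb_per_s(channels, width_bits, clock_mhz):
        # two transfers per clock (DDR); bytes per transfer = width_bits / 8
        return channels * (width_bits / 8) * clock_mhz * 2 * 1e6 / 1e9

    print(peak_gb_per_s(2, 32, 1600))  # ~25.6 GB/s for the 810 (2x32b LPDDR4-1600)
    print(peak_gb_per_s(2, 32, 933))   # ~14.9 GB/s for the 808 (2x32b LPDDR3-933)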
The combination of a new 64-bit core and a higher-clocked GPU (note that the 810 shows Adreno 430) will put additional strain on memory bandwidth. I think I'd rather see 4x32b LPDDR3-933.
Considering the hugely more powerful APUs from AMD and Intel are happy with 25.6 GB/s using architectures that are less bandwidth-conscious, I'm sure these Snapdragons will be limited by thermals far before memory.
I wonder what the heuristics for switching graphics from tile-based rendering to direct rendering are like, and whether they are tweaked depending on tablet or phone usage.
Abandoning tile-based rendering to use that giant amount of bandwidth is going to use oodles of power. This is why I doubt you will see NV's Kepler in a phone; it will be restricted to tablets.
True. Remember, NV does have Maxwell shipping in desktop form, so its mobile variant is probably in the testing/optimization phase right now, possibly being reserved for the near-future competition.
No VP9 decoding? And when is Qualcomm going to open source its baseband firmware? Otherwise we can just assume it's backdoored these days, therefore making the whole phone vulnerable to attacks and espionage.
OK, so your excuse, Anand, for not including the Tegra K1 or Exynos 5433 in your benchmarks was that you didn't have them available on May 21. Now you do, as the K1 and Exynos 5433 are shipping while the Snapdragon 805 still is not.
So can you correct this article to show 805's performance vs the real competition? Pitting it against last year's Tegra 4 and Exynos 5 Octa is just wrong.
Wow, I didn't expect that my Tegra K1 is so much better than these SoCs: 50% more graphics performance and even 10% to 20% more CPU performance (the weak part of the Tegra K1).
B.Jay - Wednesday, May 21, 2014 - link
Too bad it seems like the LG G3 might not have this beast... So close LG, so close!erikiksaz - Wednesday, May 21, 2014 - link
I wonder what the scrolling performance will be like on the G3 given its resolution and "just" having an 801 chip. I feel like scrolling performance is finally good with the 1080p and the latest generation of processors, but are we going backwards in performance with the leap to 4k?Impulses - Thursday, May 22, 2014 - link
Still dunno why they're pushing for even higher DPI on phones, seems almost absurd at this point.piroroadkill - Thursday, May 22, 2014 - link
Because people are very, very simple in the head and think bigger numbers are better.Frenetic Pony - Thursday, May 22, 2014 - link
"Now assembled by 100% more unpaid Chinese interns!"Vayra - Monday, June 2, 2014 - link
High DPI phones will hopefully die a silent and quick death. Even 1080p is overkill for me on any screen smaller than 5 inch, and staying at 720p at that size is great for battery life...Teo222 - Thursday, June 5, 2014 - link
I don't know man. The difference in pushing and power consumption and price between 720p and 1080p is not that much to deter from the noticeably better quality.Say what you will but I (with my less than perfect vision) can still see pixelation at battery indicators or other small elements. Not to mention native full HD content should you wish it. and obviously SOCs today able to cope with 1080p.
Sabresiberian - Thursday, May 22, 2014 - link
I'm not sure why some people spend as much effort deriding the push for higher pixel densities as they do. Most of it sounds more like sneering for the sake of sneering than having an actual point to make, to me (I'm not saying I know that to be the case here). One should at least make the effort to differentiate himself from the person telling us we should not try to improve the quality of what we see by reaching for more than 60 DPI not all that long ago, because that was "good enough".Human vision isn't the simple mechanism some people think it is. Research has shown a density of over 600 DPI can be useful. I tend to agree that there are other ways to make smart phones better at this point, and it's hard for me to imagine that I would be able to see any benefit beyond 300 DPI personally, but I'm not against increasing the ability of our technology to do more than we can use. Perhaps the technology will lead to improvements in other areas, like responsiveness, or fluidity. Seems to me that giving designing engineers all the power they could possibly use can't be a bad thing. :)
Egg - Friday, May 23, 2014 - link
Increased power draw, cost, performance issues.Frenetic Pony - Saturday, May 24, 2014 - link
Hey looks reasons!evonitzer - Friday, May 23, 2014 - link
+1 -- I amazed how quickly people have tired of smartphone innovation. Bring on the pixels! I'm still young(ish) and can resolve details quite well. I'm hoping for magazine level smoothness with no blockiness.lmcd - Friday, June 20, 2014 - link
...which is why we need color accuracy and gamut, a greater range of brightness settings, better refresh rates, and less ghosting and less burn-in.But not higher pixel density.
YouInspireMe - Saturday, May 24, 2014 - link
Because if they have to be able to drive BIG displays with the phone they may as well be able to brag about being able to do it ON the phone. And everyone will eventually have to be able to drive BIG screens OFF the phone.pjcamp - Thursday, May 22, 2014 - link
Scrolling performance on my old Galaxy S3 became perfectly snappy -- as soon as I ditched Touchwiz. All these stupid skins are the limiting factor, and there's really no purpose for them any more other than product differentiation.metayoshi - Thursday, May 22, 2014 - link
Yup. That Snapdragon S4 was more than good enough for basic AOSP Android. TouchWiz really killed the performance on that thing. I mean, I know more people with a Nexus 4 than a Nexus 5, simply because the S4 Pro is good enough to run Google's Android. I personally have a Nexus 5 because it's compatible with Sprint, and should have Triband with the whenever-it-comes 4.4.3 update. I used to upgrade my phone every time I had the chance, but I actually don't think I'll be upgrading as long as my Nexus 5 is in good working condition. SoC improvements aren't really that exciting for me anymore.Babar Javied - Thursday, May 22, 2014 - link
I am in the same boat. I upgraded to the Nexus 5 from the Galaxy Nexus since it was getting old. I don't see myself upgrading anytime soon. Not only is the SoC more than enough to handle what I want, it has plenty of RAM and the screen is great too. All of this with no bull-sh!t skins like touch wiz and you have what I feel to be a great phone.So I really hope google does not shutdown the Nexus division.
betam4x - Thursday, May 22, 2014 - link
I don't think they will. People i know are divided into 3 groups: iPhone Users, Samsung users, and Nexus users. Others do exist, but for the most part, Google's phones seem to be pretty popular.betam4x - Thursday, May 22, 2014 - link
I'm in the same (happy) boat. Downgraded from the Galaxy S III (rooted, unlocked, Cyanogenmod) to the Moto-X (republic wireless). My phone lasts 20-24 hours with pretty decent usage (screen is on 3-4 hours a day, always browsing the internet on this thing...) My Moto-X isn't rooted or unlocked, but it's the most enjoyable phone i've ever owned...and i'm a spec freak.betam4x - Thursday, May 22, 2014 - link
btw the reason the phone lasts so long on battery is due to republic wireless operating over wifi. when wifi is on, cellular modem is off...apparently the cell modem is a power hog on any phone.Eddie A - Saturday, May 24, 2014 - link
Too bad indeed. It seems they couldn't (or didn't have time to) re-engineer it in time to add an LTE modem (since the 805 doesn't include one). I believe it still should run smooth though...the 801 is still a beast. My Nexus 7 2013 for example has a resolution which is slightly lower than the G3's resolution with a lesser SoC (Snapdragon S4 Pro) and it flies with no lag/stutter at all, so my fingers are crossed. Yes it's somewhat disappointing but I'm still looking forward to having the G3 in my hands for the next couple of years.Krysto - Sunday, May 25, 2014 - link
Beast? Where? These numbers look absolutely pathetic. 40 percent faster than Adreno 330 will put it right around...HALF of Tegra K1's GPU.jospoortvliet - Monday, May 26, 2014 - link
Which you can get where exactly? I will believe NVIDIA the day they actually DELIVER, as opposed to make empty promises like for, oh , every previous tegra soc?lmcd - Friday, June 20, 2014 - link
Probably half of Tegra K1's power consumption, too. Particularly the Project Denver version.2far - Wednesday, June 25, 2014 - link
well it turns that it will!nerd1 - Wednesday, May 21, 2014 - link
I don't understand why anand keeps using Chrome browser for android devices, and Safari browser for IOS. Safari is very well optimized for IOS (naturally), and chrome is NOT. For example the samsung's stock browser on GS5 is quite optimized and has benchmark results beating iPhone 5s.I admit all other sites has degenerated enough to publish some random 'impression' as review, but I still feel strong bias here. We have benchmark results, but they are many times cherry picked to make apple products looks better. And it is very bad because now everyone takes review here "objective".
Myrandex - Wednesday, May 21, 2014 - link
I like that Chrome is used on the Android devices. It is a browser available for all products. It is the default browser on the pure devices (such as the nexus devices as even none nexus devices running standard Android). If Google doesn't optimize it for Android than it is their loss, but I personally use it on my Android devices as well.anonymous_user - Wednesday, May 21, 2014 - link
I don't know if each Android smartphone has the same stock browser but I do know Nexus devices no longer have the stock browser and instead use Chrome. Wouldn't using one browser for all Android devices help with benchmark consistency?nerd1 - Wednesday, May 21, 2014 - link
We should at least have benchmark results using the stock browser that ships with the device.akdj - Wednesday, May 21, 2014 - link
I think you'd be pretty bummed to find out the 'true' speed of old reliable, your TW browser. Have you tried a third party browser? Chrome? Dolphin? Mercury or Photon? Opera or Skyfire? Next or Firefox? I'm a browser junky. If you're not upset with me calling the TDub browser crap, drop me a line and we can discuss. I've literally downloaded all of the worth a bean over the past six, seven years;)Flunk - Thursday, May 22, 2014 - link
The stock browser for Android is Chrome, Samsung just adds more junk to their images.leexgx - Monday, May 26, 2014 - link
the stock browser is normally not Chrome (unless its Pure Android device or Sony phone) its normally the webkit browser? (not checked what its called)i find chrome a power hog (i use Opera mini most of the time any way that infact is for most part worst then the stock android browser, but strips all scripts from the website so you get a static page no CPU use or lag at all)
akdj - Wednesday, May 21, 2014 - link
I'm not sure where you got your S5...I've got an S5 and Note 3. BOTH are MUCH quicker with Chrome, Dolphin, Mercury...or any numerous third party browsers than the crappy Samsung 'stock' "Internet" browser. It's CRAP! If there's a link you could share with the stock Sammy browser 'beating' the latest iOS device benchmarks on the A7/IT platform. Sorry, but as a LONG time TouchWiz owner and user it's definitely the FIRST omitted 'default' option. Then. Keyboard;)Don't worry. Sounds like you seem to think someone is going to buy their phone based off a Sunspider test only? That doesn't make any sense. While the iOS platforms with the A7 are certainly holding their own eight months after release, the tests I eras were definitely not dominated by the iOS silicon right now. I own the iPhone 5s and Air and can honestly say regardless of benchmark results or numbers, tests and objective reviews, the iOS devices generation compared to the same generation of TouchWiz FLATTEN the latter when it comes to browsing speed, UI fluidity and overall general 'speed'. My wife has the iPhone 5 and the Samsung S5. Me the 5s and Note3. It's irrelevant which browser you use, iOS is definitely the quicker populating browser (I actually prefer iCab IR Mercury to Safari...regardless of Apples' decision to not allow access to Nitro, they're JUST as quick). While I LOVE my Note for the actual browsing 'experience' in comparison to the iPhone, the intrigue for me is the larger display and the ability to use the stylus for sketching rigging points for clients, credit cards and my old eyes. I love em both but there's NO WAY without a link you're gonna convince me that POS stock Sammy browser beats ANYTHING! Lol. Android. iOS. I'm a fan of both. BUT some things are better on one or the other's OS, UI and overall compatibility with whatever tools you're using.
Sorry. Others have beat me to it. Took to long. Android is MASSIVE ...And while Samsung makes up a large percentage of 'sales' they're not all flagships. Tons of 2.xx devices still on the market and pay by month kiosks, other countries, etc. not all Sammy's have the same TouchWiz browser nor do ANY of the other OEMs. Chrome, by default...it's Google (Android's) true 'stock' browser and it's a helluva lot faster than the crap that comes with TW. I've found Dolphin and Merc to be my faves, simply because Chrome still doesn't allow text resizing
J
sachouba - Thursday, May 22, 2014 - link
My Samsung Galaxy S5 has a score of 391ms at Sunspider using the stock browser from Samsung and ART Runtime. This stock browser used to be crap, but the new version of it coming with the Galaxy S5 has been really improved. I used to use Chrome instead of it, but now Samsung's stock browser is far faster.Kidster3001 - Thursday, May 22, 2014 - link
Browser benchmarks are in no way a good benchmark of CPU performance, even with the same browser on different devices. Browser benchmarks test many things. Raw CPU performance is not one of them.Flunk - Thursday, May 22, 2014 - link
I think you might not know what your'e talking about. The Samsung Android browser just uses the old Android Opensource project stack. It's not very standards compliant and pretty slow. A lot of web applications don't ever support it anymore.darkich - Friday, May 23, 2014 - link
Exactly.Galaxy S5 with STOCK (Samsung) browser scores way better than on Chrome.
A record-setting 370ms on Sunspider!
Note 3 gets around 500ms.
Uselless and very biased job, Anand
rubene66 - Wednesday, May 21, 2014 - link
Snapdragon always slow web browsing tegra k1 will be bettertipoo - Wednesday, May 21, 2014 - link
Is the RAM package on package, or is it right in the die as the first image implies?tipoo - Wednesday, May 21, 2014 - link
I'm guessing if that is an actual die shot, it's just the interface to the RAM, as the RAM would have to be bigger.mavere - Wednesday, May 21, 2014 - link
So no 20nm Snapdragons until 2015?Is anyone besides Apple expected to have a 20nm mobile SoC on the market this year?
grahaman27 - Wednesday, May 21, 2014 - link
will apple?Thermogenic - Thursday, May 22, 2014 - link
It remains to be seen how good yields are for 20mm at TMSC. My guess id "not very" since Maxwell launched at 28mm.Kevin G - Thursday, May 22, 2014 - link
The low end Maxwell parts also launched a bit earlier than expected. TSMC 20 nm wouldn't have been ready regardless of yields.The real question is if the bigger Maxwell chips will use 20 nm and when they're ready to ship.
kwrzesien - Thursday, May 22, 2014 - link
I'd really like to see them go ahead and release the medium Maxwell on 28nm for a GTX 760 Ti.
testbug00 - Thursday, May 22, 2014 - link
Three 28nm dies (GM107, GM204 and GM206). One 20nm die, GM200 (210?). GM204/206 will be October at the earliest, more likely later... Hopefully not 2015, but it could end up being 2015 :(
testbug00 - Thursday, May 22, 2014 - link
Maxwell was planned to (mostly) be on 28nm years back. The only Maxwell silicon not on 28nm is the one that would be too large (likely a 10-12 billion transistor chip).
name99 - Thursday, May 22, 2014 - link
OR Apple was willing to pay more than nV to secure access to the limited 20nm capacity...
retrospooty - Thursday, May 22, 2014 - link
I am guessing not... These foundries all use the same stuff. If 20nm isn't ready, it isn't ready. There is pretty much TSMC, UMC (formerly AMD), Samsung, and Intel that are capable of producing at volume. Intel is obviously not part of this equation. I am not aware of any of the other 3 making 20nm chips at volume in 2014. They usually all 3 go in step at the same time. I am 99% sure the Apple A8 or whatever is in the next iPhone/iPad will still be 28nm.
GC2:CS - Thursday, May 22, 2014 - link
But how will they make that compulsory 2x jump then? I'm not saying it's impossible on 28nm, but 20nm would help a lot. Also, if it's 28nm, will it be Samsung or TSMC? Because TSMC has a more mature 28nm process than Samsung; even that could possibly be enough to make that 2x jump. And hell, Apple has billions to spend... Why is 20nm so far behind schedule? Isn't it just Apple, with a secret exclusive multibillion-dollar deal, soaking up all of TSMC's capacity?
tuxfool - Friday, May 23, 2014 - link
Small correction: UMC isn't former AMD. GF (GlobalFoundries) is former AMD.
ArthurG - Wednesday, May 21, 2014 - link
OK, so this is the year Qualcomm lost their leadership, as the fastest Android SoC of 2014 appears to be Tegra K1. Much faster GPU (26fps in Manhattan offscreen, 60fps in T-Rex offscreen) and a faster CPU too (see the Xiaomi MiPad benchmarks). With 20nm Erista with Maxwell coming at the same time as the S810, Nvidia will make the gap even wider next generation...
testbug00 - Wednesday, May 21, 2014 - link
Does it matter if Nvidia cannot do it without a fan and higher power usage? The SHIELD is a perfect example of why Nvidia fails to win against Qualcomm in meaningful terms: in the phone/tablet market, absolute performance does not matter. Perf/power, and absolute power, do matter.
Until Nvidia learns to make things in lower power envelopes (the T4i is a decent example) they will lose to Qualcomm in meaningful ways.
On that note, the "amazing" K1 chip will be clocked in the 600MHz range in the first tablets it comes in... How downclocked will Qualcomm's part be?
Anyhow, if you need absolute performance in the SoC space (aka you are using it as a desktop or perhaps a laptop), Nvidia is the player to go to. Otherwise, Qualcomm is plain better for phones/tablets.
grahaman27 - Wednesday, May 21, 2014 - link
That's not true; take the Tegra Note for example. The K1 uses an updated A15 that is more power efficient. Then when the custom Denver chip hits, it will start taking names and cashing checks.
testbug00 - Thursday, May 22, 2014 - link
When Nvidia gets a product with the lines in them, and power tests are done, I will believe it. Until then, Tegra 2, 3, and 4 all cast doubt on that.
__could it happen__ YES!
__Will it happen__ I DOUBT IT :/ :(
fivefeet8 - Thursday, May 22, 2014 - link
You do realize that people have already done power tests with the TK1 boards, right? Or are you pretty much ignoring anything until you get your "product with lines in them" argument? If that's the case, then I'm pretty sure the Snapdragon 805 fits in the same boat.
testbug00 - Thursday, May 22, 2014 - link
Qualcomm has a history of products being used by OEMs... Nvidia has Tegra 2... tons of large OEM design wins... Tegra 3, many large OEM design wins, but far fewer... Tegra 4... no large OEM design wins, until they finally got a win in China. Nvidia did not lose design wins magically. Tegra consumed more power than they claimed, and may have been slower than they claimed.
fivefeet8 - Thursday, May 22, 2014 - link
Why are you arguing about design wins when I'm talking about your power test argument? Have you seen the power information from users with a TK1 board or not? Or are you simply ignoring their findings?
testbug00 - Thursday, May 22, 2014 - link
Nvidia lost design wins because they repeatedly lied to OEMs about their power usage and performance. Nvidia had plenty of support with Tegra 2; the OEMs thought the clockspeeds and power usage looked great... the final product, well, was noticeably slower at around the original power promised. So OEMs grew wary of the next Tegra, but Nvidia might have made an honest mistake... They get numbers for Tegra 3... Looks great... They get the real Tegra 3... substantially slower, more power (at that slower speed) than told.
Nvidia tries to tell OEMs about their great new Tegra product... and OEMs do not use it, well, major mobile OEMs do not. MS, HP, a few other companies, and later a large Chinese company.
I have no faith in Nvidia as far as Tegra is concerned. Companies that got Tegra 2, 3 (and perhaps Tegra 4) chips got real chips. They ran at the clockspeeds Nvidia promised, at the power Nvidia promised.
Cherry-picking chips is not hard. It was done to large OEMs, in large enough numbers to secure many design wins.
I'm not sure why you think it cannot be done for developer boards.
Tegra K1 does look like a good Tegra product finally coming around, though. I do not think it will be as good as Nvidia says, based off of the past, but I don't think it will be as run-of-the-mill as the rest of the Tegra chips were.
ams23 - Thursday, May 22, 2014 - link
Tegra 4/4i was used by various large OEMs, including Asus, HP, Toshiba, Xiaomi, LG, Huawei and ZTE. Of course, that is just smoke, because it has nothing to do with Tegra K1 performance or power efficiency.
testbug00 - Thursday, May 22, 2014 - link
Here is Nvidia's website showing Tegra devices: http://www.nvidia.com/object/tegra-phones-tablets.... How many TEGRA 4 (Tegra 4i is a different product; it is a fine product) phones are there? One. By an OEM that also used Qualcomm for the same phone... I am sure the battery life tests are on the Qualcomm device (happy to be proven wrong).
Otherwise, you have a bunch of PC manufacturers (not large mobile OEMs...) with tablets/laptops/AIOs. Oh, and 2 of the 11 products are made by Nvidia themselves.
Tegra 4i, on the other hand, well, it is what Tegra 3 should have been and then some. It is a fine product.
ams23 - Thursday, May 22, 2014 - link
You just don't get it. Tegra is focused on automotive, embedded, consumer, and gaming products. Mainstream smartphones are NOT a focus now. Tegra will make its way into some high end differentiated smartphone products in the future, but the lion's share outside of Apple and Samsung will go to Qualcomm and Mediatek. Qualcomm is able to attractively bundle their modem with their SoC, and certain large carriers have legacy WCDMA networks that require Qualcomm's modem tech. Mediatek is the lowest cost provider. That's life, and it takes nothing away from Tegra K1, which is still a revolutionary product for the ultra mobile space.
fteoath64 - Saturday, May 24, 2014 - link
QC's lead in mobile chips and their pricing will probably account for their leading position until MediaTek and others start chipping away on price and performance. The failure of Tegra 3 shows where the price/performance point was; Nvidia knows that, and it is the reason they ventured into automotive and other products, because those needed powerful, higher-power GPU chips as opposed to mobile. Except for rendering video in 10-bit, and possibly 120fps video encode, there is no real need for the 805 in a phone. The S5 shows that the 801 is more than capable of all things mobile while still having acceptable battery life. The K1 is a beast in itself, being able to do vision graphics and VR stuff. Not that the 805 cannot, but the K1 is probably better at it in a competitively priced package. Nvidia's Icera 500 modem is not that popular either; it has gone through carrier certification yet is hardly in any handsets commercially. Nvidia knows this up front, too.
Alexey291 - Tuesday, May 27, 2014 - link
What's the focus then? Testbed devices? It can be as "revolutionary" as you claim (or more likely it's just a downclocked desktop part). And what sort of a revolution will a device with no OEM wins cause? I mean, we know there are faster parts in the hardware market as a whole. We also know that some of them use 250 watts of power. So why does a part with high power usage and higher performance surprise anyone? :)
Ghost0420 - Wednesday, May 28, 2014 - link
It was NV's 1st LTE integration attempt. Carrier qualification takes a long time, and since it's the 1st NV silicon with integrated LTE, it probably took longer. If NV can continue to develop its LTE, and not have any IP issues with QC, I'm sure NV would give QC a run for their $$. Think of it this way: QC has been in the game for a while... NV showed up about 5 years ago and provided enough competition for TI to leave the phone market (NOT saying NV should take credit for this). And now with the K1 GPU...
hahmed330 - Friday, May 23, 2014 - link
Tegra 2 & 3 were both subsidised. Tegra 4 wasn't, and it was delayed as well; that's why there were fewer design wins. Also the fact that it didn't have an integrated modem. Not because integration lowers the power required. Integrating an LTE modem doesn't lower power consumption (Apple's iPhone 5s doesn't have an integrated modem); it reduces OEM costs instead.
Ghost0420 - Wednesday, May 28, 2014 - link
More than likely, QC has a stranglehold on LTE, as they're not likely to license out their tech to a competitor. They've been in the phone game longer, so OEMs probably have it easier on the software side. QC SD SoCs run hot too, just as hot as any other SoC. I've had Tegra devices and SD devices; both run at similar temps to the touch. Except the T4 devices don't lag as much as SD devices (this could be due to stupid TouchWiz).
Flunk - Thursday, May 22, 2014 - link
If they don't get the actual production hardware out there, it doesn't mean much.
ArthurG - Thursday, May 22, 2014 - link
For you: http://en.miui.com/thread-22041-1-1.html
5 hours of heavy 3D gaming on a 6700mAh battery means that the TK1 runs at ~3W
and 11 hours on video
so excellent numbers when taking into account performance
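For reference, here's a minimal back-of-the-envelope sketch of how a ~3W figure like that can be derived. The 3.8V nominal cell voltage and the assumption that roughly 60% of the total drain goes to the SoC+DRAM (rather than the display, radios, etc.) are illustrative assumptions, not measured values:

```python
# Rough sketch of the ~3W estimate quoted above.
# Assumptions (not measurements): 3.8 V nominal cell voltage, ~60% of total
# system drain attributable to SoC + DRAM.
battery_mah = 6700
nominal_voltage_v = 3.8
battery_wh = battery_mah / 1000 * nominal_voltage_v      # ~25.5 Wh of stored energy

hours_gaming = 5
avg_system_power_w = battery_wh / hours_gaming           # ~5.1 W for the whole tablet

soc_share = 0.6                                          # assumed SoC + DRAM share
soc_power_w = avg_system_power_w * soc_share             # ~3.1 W

print(f"Battery energy:        {battery_wh:.1f} Wh")
print(f"Average system power:  {avg_system_power_w:.1f} W")
print(f"Estimated SoC+DRAM:    {soc_power_w:.1f} W")
```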
testbug00 - Thursday, May 22, 2014 - link
The GPU is running in the mid-600MHz range (down from the 950MHz or so Nvidia touted) and the CPU is certainly also downclocked. Do you have performance numbers for that game? How about the speed and power usage of competitor chips? Not enough knowledge to draw large conclusions... Still, it's really odd how Nvidia is not talking about the clockspeeds in the tablet... You'd think they would talk up how highly clocked and efficient their chip is...
kron123456789 - Thursday, May 22, 2014 - link
"The GPU is running in the mid-600 Mhz range" — How do you know that? Where is proof?ArthurG - Thursday, May 22, 2014 - link
Why do we care about clock speeds? Is it now a new metric of performance? Is the A7, running at only 1.3GHz, a slow SoC? Architecture efficiency and final performance results are what we care about. What is important is that the TK1 in the MiPad destroys all other Android SoCs by a good margin (60fps in T-Rex vs 40 on the S805), and with good power efficiency.
Is it so difficult for NV haters to admit it?
hahmed330 - Friday, May 23, 2014 - link
If I play Asphalt 8 on my Nexus 7 (2013), the battery runs out in only 2 hours at 50% brightness. I can tell you the Tegra K1 + RAM on the Jetson TK1 consumes 6980mW running full tilt at 950MHz, on an actively cooled device. Now remember, this is a non-mobile device for developers.
ArthurG - Wednesday, May 21, 2014 - link
Well, your post shows big ignorance of the products. 1/ Tegra 4 was on the 28HPL process while the S800/801/805 use 28HPM, which provides nearly 30% better transistors. Oranges vs apples, and a big advantage to QC.
2/ The T4 uses the A15 r1, which is not very well optimized for power efficiency. The TK1 now uses the A15 r3, which provides better efficiency.
3/ Tegra K1 and the S800/801/805 are made on the same 28HPM process, so it's a fair comparison.
That means that T4 vs S800/S801 was an easy win for QC due to the T4 design's many disadvantages. But TK1 vs S80x shows a completely different story, with both using the same node.
Finally, the TK1 benchmarks are from the Xiaomi MiPad, a 7.9" tablet, no fan here, and it still smokes the S805...
testbug00 - Thursday, May 22, 2014 - link
I will believe it when I see them... Also, the amazing "950MHz" clockspeed of the K1? Guess where it is in the MiPad? In the 600MHz range. Nvidia has to downclock its parts to fit into tablets, much less phones.
2. Process choice is a manufacturing choice. Nvidia could not get an HPM design? They suffered. Anyhow, Qualcomm will still probably smoke them on perf/watt... which, once again, is what is really important in a phone and in most tablets.
3. The Krait "450" cores are the same Krait cores from the 800 (a 2013 product) with a higher clockspeed. The A15 r3 is a 2014 product. I can throw meaningless garbage into "fair comparisons" also. You compare the SoC as a whole... The K1 will end up faster than the 805. I am convinced of it. Will it matter? Not unless you are looking at putting a chip into a mini PC or a laptop... Or, perhaps, a mobile game controller with a screen. :)
Cannot wait for SHIELD 2.
kron123456789 - Thursday, May 22, 2014 - link
Read this: developer.download.nvidia.com/embedded/jetson/TK1/docs/Jetson_platform_brief_May2014.pdf
fivefeet8 - Thursday, May 22, 2014 - link
The MiPad can get the performance numbers they've shown with a GPU clocked at 600MHz? And that's a bad thing?
testbug00 - Thursday, May 22, 2014 - link
Depends on how/when Samsung, Qualcomm, Mediatek, Rockchip, etc. introduce their next generation chips and how fast they are. Apple's will likely beat this on GPU and CPU while using less power... because, well, Apple has spent and continues to spend tons of money on optimization. The biggest part is die size... which is a huge advantage Apple has.
kron123456789 - Thursday, May 22, 2014 - link
Apple is using a PowerVR GPU. And only the GX6650 is comparable with Tegra K1 (but it has a max frequency of about 600-650MHz and, because of that, fewer GFLOPS).
testbug00 - Thursday, May 22, 2014 - link
And what if the K1 cannot run at full clockspeed in phones/tablets, due to setting reasonable clockspeeds or throttling? Or what if the PowerVR GPU adds more units or raises the MHz? (There's no reason to do either, as it raises power consumption (without a partial redesign), and these parts are typically "use X power, get as much speed as possible".)
Or if Apple happens to license a GPU architecture from a company they are close with, say, PowerVR... ;)
ArthurG - Thursday, May 22, 2014 - link
Apple faster? Proof? All these SoC manufacturers rely on other companies' tech to build their GPU. Unlike Nvidia, they can't come up with something new if their IP supplier doesn't have it. And for now, like it or not, no GPU available this year will be more powerful than mobile Kepler. Swallow it, you can't do anything about that.
testbug00 - Thursday, May 22, 2014 - link
lol. All right, I'll just ignore the fact that Apple made a custom ARMv8 CPU, the fact it has a GPU team to match its uncore and CPU teams, and the fact that there are mobile GPU manufacturers who would reasonably license their overall design to Apple. Also, Imagination Tech (and ARM, and Qualcomm, and Samsung, and others) could all make GPUs that were faster than Tegra K1. But there is no reason to. Waste of money, waste of power (in the design), etc. Oh, and ignore the fact that Apple is willing to spend more money on die. You have to remember that Nvidia is planning to sell these at a profit; that means they need to minimize die space. Typical semiconductor design comes down to Power, Performance and Area. Nvidia's Tegra seems to follow Performance, Area, Power. Tegra K1 might be the first to go Power, Area, Performance (impossible to tell without real retail products...)
Apple, on the other hand, targets Power, Performance, Area. That means that as long as the chip will fit inside the phone, they would be fine making a 200mm^2 die. Making a larger die means you can reduce power for various reasons. Of course, making a die smaller also allows you to reduce power by shortening distances (this (and the lack of an inter-die interconnect, larger caches and faster caches) is a reason why Maxwell managed to reduce power so much).
I am also using historical precedent:
-Nvidia claimed Tegra 2 brought mobile up to parity with the Xbox 360/PS3 (and Tegra K1; not sure about 3 and 4), which, well, Tegra 2 was not, and Tegra K1 will not be (due to bandwidth for the most part, imho. Given more bandwidth, it certainly could beat the Xbox 360/PS3).
-Nvidia showed Tegra 4 beating the iPad (did they do it for Tegra 2 and 3? I don't remember) and it lost to the next iPad.
-Nvidia claimed Tegra 2 was great perf/watt. And Tegra 3, and Tegra 4. They were all bad compared to Qualcomm (and Apple).
I don't take Nvidia's claims for much because, frankly, they stink. Hopefully Tegra K1 fixes that. I would rather we did not have a player in the market "die" (read: move to focusing almost wholly on automotive), especially not right after that company finally got its act together.
name99 - Thursday, May 22, 2014 - link
Without going into Apple being faster, it's clearly silly to claim that "all the SoC manufacturers rely on other companies' tech to build their GPU". Who is "all the SoC manufacturers"? Qualcomm use their own GPU, as does nV. Soon enough, so will AMD.
Apple is the only interesting SoC manufacturer that uses someone else's tech for their GPU, and there are plenty of indications (eg a large buildup in GPU HW hires) that this is going to change soon --- if not with the A8, then with the A9.
fivefeet8 - Thursday, May 22, 2014 - link
It's hard to have a coherent dialog with you going on tangents. For this year it seems the competition from Qualcomm will be the 805, which we now know will not be as performant as the Tegra K1.
tuxRoller - Thursday, May 22, 2014 - link
How do we KNOW this? I've struggled to find third-party, comprehensive benchmarks for either the MiPad or the TK1, especially ones that include power draw (those numbers some random person threw up in a forum aren't tremendously useful with regard to the MiPad).
Also, the Adreno 420's drivers are, apparently, not well optimized.
Basically, until AT, Tom's, or someone similar gets their hands on it, I won't feel like I know how things stack up.
fivefeet8 - Friday, May 23, 2014 - link
There are slides from the IHV presentation for the MiPad showing power usage patterns. Phoronix also did some testing of the TK1 board which shows power usage well below what some seem to be thinking. As for Adreno drivers, they've always been bad and not well optimized. https://dolphin-emu.org/blog/2013/09/26/dolphin-em...
tuxRoller - Friday, May 23, 2014 - link
When did Phoronix release power numbers? The MiPad presentation looked like copy-paste from the Nvidia material.
The Adreno drivers aren't great, which should tell you how good that hardware really is. Rob Clark, the lead developer of the open source freedreno driver, is already at least matching their performance up to OpenGL 2.1 / GLES 2. He's mentioned that he's found evidence of the hardware supporting advanced GL extensions not advertised by the driver. This may change as Qualcomm has recently joined Linaro, so they will probably be seeking to work more with upstream. The end result of that process is always better quality.
Lastly, don't forget that Adreno is a legacy of Bitboys, and that Qualcomm is no Intel, even though they are the same size. Qualcomm seems to actually be interested in making top-performing GPUs.
Ghost420 - Friday, May 23, 2014 - link
Isn't Apple's die known to be the biggest of all the SoCs? They can afford to have a big, power-sucking SoC because they can optimize for it. It's the only reason iPhones last long on battery... besides being lower clocked, because look how small and dinky that screen is...
Ghost0420 - Wednesday, May 28, 2014 - link
Exactly, it's DOWNCLOCKED to 600MHz and still spanking the competition.
kron123456789 - Thursday, May 22, 2014 - link
"Guess where it is in the MiPad?In the 600Mhz range. " — Proof? BTW, even if it is, MiPad is still more powerful than S805 MDP.
testbug00 - Thursday, May 22, 2014 - link
Proof is loose... based off of talking with someone who has worked closely with Nvidia and TSMC in the past (Tesla, Fermi, a few Tegra chips). They have been quite accurate before... When the tablet comes we will see it.
On the other hand, silence tells us a lot also... Where is Nvidia talking up their "950MHz GPU" in a tablet? I think the 600MHz (by that, I should clarify, I mean between 600 and 699) clockspeed band is still quite impressive... Just, well, it shows why the chip won't go into phones...
kron123456789 - Thursday, May 22, 2014 - link
Well, I don't believe it, because of the benchmark results. Nvidia promised 60fps in T-Rex HD, and here it is. And Nvidia also promised 2.5x the performance of Apple's A7 in the Manhattan Offscreen test. And here it is (30fps vs 13.3fps). Well, almost 2.5x.
AnandTechUser99 - Thursday, May 22, 2014 - link
NVIDIA had shown us the benchmarks in January. When the iPad Mini Retina (and other A7 devices) had been benchmarked towards the end of January in Manhattan Offscreen 1080p, they had average scores closer to 667 (10.8 fps). Based off of this score, the Tegra K1 was offering a score ~2.65x that of the A7.
Sometime around March 10th, the average score for Apple's A7 had changed to 803 (13.0 fps). I assume there was some sort of software update that boosted their Manhattan score.
Ghost420 - Friday, May 23, 2014 - link
I'm glad that, even if it is downclocked to 600MHz in the MiPad, the benchmarks show it's still kicking QC butt.
Ghost420 - Friday, May 23, 2014 - link
On a side note, this is very good for SoC competition... hopefully.
ArthurG - Thursday, May 22, 2014 - link
So read this: http://en.miui.com/thread-22041-1-1.html
5 hours of heavy 3D gaming on a 6700mAh battery means that the TK1 runs at ~3W
and 11 hours on video
so excellent numbers when taking into account the leading performance
testbug00 - Thursday, May 22, 2014 - link
About 5 hours. Hm, I wish they had run the tests longer... Ideally (for testing purposes, not time purposes) they would have run everything for 20-25%(+) of the tablet's battery life, and used that.
OR
Done the tests from 75% or 50% battery.
Using 15 minutes (read: 5%) is a pretty small amount of battery wear... It should be fairly accurate, but I would skew that estimate a little lower (in my experience, phones (over 5 phones) and tablets (over 2 tablets) tend to lose the first few percent of battery the slowest, which gives an inflated battery life).
According to the first 5% of battery (from 100% to 95%) on an iPad mini, well, it gave me something like 8 hours of usage playing Hearthstone...
Same with video watching on Nook HD+
Same with Video (and reading webpages) watching on 920/1020
Same on reading webpages/playing weak applications on my Moto G
Same with web pages on some cheapass Android LG
Same with video/web pages on my old iPhone 4.
I would guess the real battery life is closer to 4-4.5 hours (happy to be proven wrong ^__^) and the SoC is not running at max clock (for good reason; there's no need to run max clock when you already hit a game's fps cap, or the refresh rate of the tablet).
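To illustrate why extrapolating from the first few percent tends to run high, here is a small sketch with made-up per-interval drain numbers (the percentages below are hypothetical, chosen only to show the effect; they are not measurements from the MiPad):

```python
# Hypothetical drain per successive 15-minute slice, in percent of battery.
# Early slices drain more slowly in this made-up example, mimicking the
# behaviour described in the comment above.
drain_per_slice_pct = [5, 5, 6, 6, 6, 7, 7, 7]

# Naive estimate: extrapolate from the first 15-minute slice only.
naive_hours = (100 / drain_per_slice_pct[0]) * 15 / 60            # 5.0 h

# Estimate using the average drain over the whole (hypothetical) run.
avg_drain = sum(drain_per_slice_pct) / len(drain_per_slice_pct)
averaged_hours = (100 / avg_drain) * 15 / 60                       # ~4.1 h

print(f"Extrapolated from the first 5%: {naive_hours:.1f} h")
print(f"Averaged over a longer run:     {averaged_hours:.1f} h")
```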
Ghost0420 - Wednesday, May 28, 2014 - link
Doesn't Asus have the TF701T, which uses a Tegra 4 @ 1.9GHz, WITHOUT A FAN? One bad review, because Toshiba doesn't know how to make a tablet, and everyone thinks T4 runs hot. The fan in the Shield is very quiet and doesn't 'spool' up as if the T4 was overheating; it runs very smoothly and the interface is very smooth. Also, streaming PC games AWAY from your house on good WiFi is working quite well now. And at least NV is updating its software... consistently... unlike most OEMs.
TheJian - Wednesday, May 21, 2014 - link
Agreed. Also, Denver is in-house where the 810 isn't. I would expect CPU tests to show Denver is better on power when taxed, as Qualcomm's in-house core won't be around until late 2015 or Q1 2016, since the 810 is H1 2015. The GPU will heavily favor NV (even AMD at some point, if they get in with a good SoC), as nobody else but AMD has 20 years of gaming experience. We have already seen the K1 is pretty good against the 805. Odd they ran from battery here... "Here even NVIDIA's Shield with Tegra 4 cooled by a fan can't outperform the Adreno 420 GPU"
Anandtech needs to stop making dumb NV statements. It's a YEAR old device and won't even be in this race, not to mention it's not getting a ton from the fan anyway, which is really there for longevity and temps in the hands for hours (we game hard and for hours, so it keeps it cool). The fan isn't in there to hit 2.5GHz or something; it's 1.9. This device will be facing the K1. Stop taking AMD checks to be their portal site, and you can get back to making unbiased reporting without the little BS NV digs.
ArthurG - Thursday, May 22, 2014 - link
So true for the last paragraph! Let's see if Anand will say "TK1's Kepler GPU smokes the Adreno 420, providing twice the performance on the GFX 3.0 bench with half the memory bandwidth".
I'm waiting for it, right, Anand?
ams23 - Thursday, May 22, 2014 - link
Technically the difference in memory bandwidth is closer to 50%, due to differences in memory operating speeds for these two SoCs.
akdj - Thursday, May 22, 2014 - link
"{"Here even NVIDIA's Shield with Tegra 4 cooled by a fan can't outperform the Adreno 420 GPU"}Anandtech needs to stop makings dumb NV statements. It's a YEAR old device and won't even be in this race, not to mention it's not getting a ton from the fan anyway which is really there for longevity and temps in the hands for hours.... Stop taking AMD checks to be their portal site, and you can get back to making unbiased reporting without the little BS NV digs."
Reading. Comprehension. Even YOU took the time to quote and post the comment, which specifically relates to an OBJECTIVE benchmark. nVidia isn't ON the market! Unless you buy THEIR pad... and that's extremely niche right now... "Gaming for hours?" Who does THAT on their phone? Even their tablet?? An hour, OK... I see that. But there's a FAN for a reason. Not just to keep your hands cool. You said it yourself: longevity. Doesn't that DIRECTLY relate to 'cooling' the SoC? The guts? So it can 'live longer'? It wasn't a dig. It certainly wasn't an AMD stamp, WhateverTH that is. Certainly possible it was I that missed the innuendo there. It's a fact though, bro! nVidia, Intel, AMD... ALL late to the 'mobile game'. Intel has the resources to jump into the fray... head first. nVidia doesn't. They're being very careful while maintaining their, again... slowly but certainly 'niche' dedicated GPU activity. With Intel's iGPU performance envelope and TDP, along with its kinda close association with the CPU ;)... the increasing demand for smaller, faster and more portable computing is going to destroy nVidia if the K1 project isn't accepted by more mobile vendors and OEMs. There's a LOT of money in the R&D of these SoCs, and to date, the nVidia 'Tegra' solution has scared off a LOT of OEMs using or considering using their silicon graphically. I think you owe Anand and his crew an apology. I'd be interested in what your contribution to the world of technology is... it's got to be something incredible! I'm all ears!!! Seriously, for you to disrespect the author of the article as you did... you owe at least an apology. Then, feed the spider. Leave mom's basement. Get a job. Stop playing games all day. And don't 'pick a winner!' You'll NEVER win. It's called gambling. That's why the lights are on in Vegas. It's cool to be a fan of theirs, but to post such a silly comment disrespecting one of the MOST respected and intelligent employees, or Anand himself, is bad juju. Take it back. Get off the 'net for a couple of days. Get some sunshine. It's good for ya!
phoenix_rizzen - Friday, May 23, 2014 - link
I've played games on my phone (LG G2) for over 4 hours at a time (Puzzle Quest 2 is damned addictive). Once for over 6 hours, although I had to plug it in near the end. :) Anytime I get a new, interesting RPG onto my phone, I'll go through bouts of playing it for 4-6 hours at a time. And my daughter has played games on our tablet (2012 Nexus 7) for multiple hours at a time, including some Netflix and local video watching. The battery on that thing tends to only last about 4 hours, though. :(
Just because YOU can't see a reason to play games on your phone for over an hour doesn't mean nobody does that.
Alexey291 - Tuesday, May 27, 2014 - link
Hear hear - I play my PSP games emulated on the phone these days. The PSP is too much of a pain to carry around (and too old, tbh) but some of the old RPGs on it are ossum, and yes, I can play them for hours on end.
kron123456789 - Thursday, May 22, 2014 - link
Actually, it's 30fps in Manhattan Offscreen)) http://gfxbench.com/device.jsp?benchmark=gfx30&...
sachouba - Thursday, May 22, 2014 - link
Having amazing benchmark scores is good, but Nvidia's SoCs still aren't compatible with a lot of apps...
kron123456789 - Thursday, May 22, 2014 - link
What apps, for example?
tviceman - Thursday, May 22, 2014 - link
So Qualcomm will continue to have the better phone SoC in the 805, while Nvidia will have the better tablet, set-top, and Chromebook SoC in the TK1.
ArthurG - Thursday, May 22, 2014 - link
Without an integrated modem in the S805, I'm not sure it's better than the TK1 for superphones. Let's wait for power consumption figures...
testbug00 - Thursday, May 22, 2014 - link
How many phones used the T4 (not the T4i, which is a good product!) again? One. Nvidia either cannot, or does not, offer a compelling solution in phones.
I would say why, but you would scream "that is not true", as the only evidence is in how OEMs have acted and in design win counts.
fivefeet8 - Thursday, May 22, 2014 - link
Market comparisons aside, the K1 is quite different from the Tegra 4 as far as GPU hardware goes. It should have the chance to be used in superphones and mobile devices, if only to put pressure on Qualcomm to fix their OpenGL drivers.
testbug00 - Thursday, May 22, 2014 - link
Yeah, it supports DX11.2... which no other mobile GPU supports... Wait. Qualcomm does... And I wonder how long until ARM's "stock" GPU does... And Imagination Tech's... Nvidia is most likely not displacing Qualcomm. It could happen, but given OEM relationships, it is unlikely. Even if Nvidia has a better product (note: I think they have one that is at best even with Qualcomm for phones), the OEM relations mean a lot. ATI managed to make quite a bit of money despite Nvidia having far better products, due to OEM relationships.
kron123456789 - Thursday, May 22, 2014 - link
"Yeah, it supports DX11.2.... no other mobile GPU supports... Wait. Qualcomm does... And I wonder how long until ARM "stock" GPU does... And Imagine Tech's..." — Only Tegra K1 supports OpenGL 4.4. DirectX support matters ONLY on Windows. And Tegra K1 supports DirectX 12(because of that: http://blogs.nvidia.com/blog/2014/03/20/directx-12... ).tuxRoller - Friday, May 23, 2014 - link
Why does OpenGL 4.4 matter? Who would consume it? http://richg42.blogspot.com/2014/05/the-truth-on-o...
fivefeet8 - Friday, May 23, 2014 - link
Unfortunately, OpenGL 4.x will mean little to Android devices until Qualcomm starts actually caring about the API in their drivers. If the K1 actually competes in tablets and superphones, it may have the effect of forcing them to, through competition. https://dolphin-emu.org/blog/2013/09/26/dolphin-em...
tuxRoller - Friday, May 23, 2014 - link
Look, you don't need to keep linking to that old post. I don't need to argue about Adreno blob quality. Nvidia, however, lives and breathes on its software, which seems to be not quite as flawless as some would believe. Again, why does GL matter to Android? It doesn't even support it. The point of having the embedded profile is to expose enough of GL to create a good balance between high performance/graphics quality and efficiency. Going full-blown GL destroys that last advantage (tessellation alone is terrible for battery life).
kron123456789 - Friday, May 23, 2014 - link
Serious Sam 3 for Tegra K1 is using OpenGL 4 (and Croteam has an Android build that supports OpenGL 4).
tuxRoller - Friday, May 23, 2014 - link
Great, but you aren't going to see Google including GL in their builds. The reason? GLES is the best of what can be supported while maintaining decent battery life.
fivefeet8 - Thursday, May 22, 2014 - link
Hardware-level support for APIs means diddly squat if the drivers are crap. My Galaxy S4 with the Snapdragon 600 supports OpenGL ES 3, but it sucks at it. Nvidia actually has a history of at least nominal OpenGL support in their drivers.
ams23 - Thursday, May 22, 2014 - link
Who said anything about NVIDIA "displacing" Qualcomm? That is pure silliness. NVIDIA offers a "differentiated" product. Any OEM who wants to offer differentiated high end products may consider using Tegra K1. K1 is also a very versatile product, so it will be used in a wide variety of different products.
ams23 - Thursday, May 22, 2014 - link
Companies like Xiaomi that are looking for differentiated products would absolutely consider using Tegra K1 in a high end smartphone. Considering the performance and perf-per-watt advantages of Tegra K1, there are clearly compelling reasons to use it. But again, like I said above, the lion's share of phones outside of Apple and Samsung will go to Qualcomm (bundled modem) and Mediatek (lowest cost).
phoenix_rizzen - Friday, May 23, 2014 - link
Snapdragon S4 Pro didn't have an integrated modem (APQ80xx) and it sold quite well into phones like the LG Optimus G / Nexus 4.
jerrylzy - Thursday, May 22, 2014 - link
The memory interface of Snapdragon 805 should be 2 x 64bit instead of 4 x 32bit...
Zaydax - Thursday, May 22, 2014 - link
Just noticed: First page of the review says 32 nm. Aren't these all 28nm? As always, great review!
Zaydax - Thursday, May 22, 2014 - link
Never mind. Totally read that wrong. It said 32-bit...
jjj - Thursday, May 22, 2014 - link
At least it's not just a rebrand like the 801; we want 20nm already! On the flip side, I'm really looking forward to the A53 SoCs about to arrive from Qualcomm and Mediatek, the Allwinner A80 quad A15, the Rockchip RK3288 with its quad A17 (not sure it's not an A12, we'll see soon I guess) and the quad A17 MediaTek MT6595. The more budget side should be getting a nice perf boost.
And while we're at it, have you guys heard anything about Intel investing in/collaborating with Rockchip?
hahmed330 - Thursday, May 22, 2014 - link
I actually found very detailed power consumption figures for the Jetson Tegra K1... I am quite surprised how low power it is... I am quite surprised Anand missed that... Here is the link: http://developer.download.nvidia.com/embedded/jets...
Also for even more geeky details including the schematics...
https://developer.nvidia.com/jetson-tk1-support
hahmed330 - Thursday, May 22, 2014 - link
660mW at idle... 3660mW running at iPhone 5s speed... At full load at 950MHz, 6980mW... These numbers are SoC+DRAM...
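As a rough sanity check, those SoC+DRAM figures can be turned into battery-life ceilings for a hypothetical tablet. The 20 Wh battery and the flat 2 W allowance for the display, radios and the rest of the system are assumptions for illustration only:

```python
# Rough battery-life ceilings implied by the quoted Jetson TK1 SoC+DRAM power
# figures. Battery capacity (20 Wh) and the 2 W rest-of-system draw are assumed.
battery_wh = 20.0
rest_of_system_w = 2.0

scenarios = [("idle", 660), ("iPhone 5s-level load", 3660), ("full load @ 950MHz", 6980)]
for label, soc_mw in scenarios:
    total_w = soc_mw / 1000 + rest_of_system_w
    print(f"{label:>22}: ~{battery_wh / total_w:.1f} h on a 20 Wh battery")
```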
henriquen - Thursday, May 22, 2014 - link
"Tegra K1 performance measured on Jetson TK1 platform running LINUX"Ryan Smith - Thursday, May 22, 2014 - link
Memory bandwidth! Sweet, sweet memory bandwidth!
bradleyg5 - Thursday, May 22, 2014 - link
Hope this chipset doesn't just come on 2560x1440 screens, because the bump in performance isn't going to match the increased demands of that resolution. The only test you did that reflects real-world gaming (in my experience) is 3DMark, so it's pretty shocking how poorly it does.
40% improved performance might as well mean nothing, because the current chips rarely even run at full power. So basically a 20% power efficiency bump is the only thing that will be realized.
XCOM will drain my Note 3's entire battery in about 2 1/2 hours, where the 800 runs at full power.
The better shader support is by far the biggest news; DirectX 11 effects are sorely needed, as shaders in games currently look like DirectX 7. So that's a massive leap.
All the visual fidelity currently comes from running games at very high resolution with high poly counts. I wonder if in the next generation we are going to see developers run games below native resolution; think of what you could do with shaders running at 720p instead of 1440p. Now that you have all the advanced lighting and shadowing effects, why would you want to burn up all your performance on super high resolutions?
The Xbox One and PS4 don't even run most games at 1080p; it's sort of crazy that mobile games do. You aren't going to get PS3-level graphics at 1440p, but you could at 720p.
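For a sense of scale, the pixel-count arithmetic behind that suggestion looks like this (shading cost is assumed to scale roughly linearly with pixels shaded, which is a simplification):

```python
# Pixel counts for common mobile render resolutions. Shader work scales roughly
# with pixels shaded, so rendering at 720p instead of 1440p is ~4x less work.
resolutions = {
    "720p  (1280x720)":  1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
}

base = resolutions["720p  (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 720p")
```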
bradleyg5 - Thursday, May 22, 2014 - link
Charts! http://imgur.com/a/fkUhf
phoenix_rizzen - Friday, May 30, 2014 - link
Broken links!
hardeswm - Thursday, May 22, 2014 - link
Anand,
Would you consider including at least one Intel/AMD 'cost-effective' CPU, integrated graphics, and a 'cost-effective' discrete GPU in these charts? A price-per-chip comparison would be very cool as well.
I am really interested in watching the performance gap close over the next year.
Thanks!
texasti89 - Thursday, May 22, 2014 - link
Nice GPU features & a moderate performance bump. Not impressive, though. This release shows how badly we need to migrate to the next 20nm process node. The industry has done all it can to squeeze the heck out of the current 28nm node. It's really frustrating to observe this stagnation of process technology.
rocketbuddha - Thursday, May 22, 2014 - link
IMHO, I do not think the 805 will get much traction in phones. Manufacturing-wise it is more expensive, and it needs a separate Gobi MDM to provide LTE connectivity. Instead, the cheaper integrated 800/801, which are MSMs, will continue to be used. It could very well be a repeat of Q3'12-Q2'13, when the first QC Kraits appeared sans the integrated baseband.
That was a brief window of time in which the Note 2 as well as the HTC One X+ sported Exynos and Tegra 3+ variants as the SoC, plus a Gobi MDM for LTE connectivity. So maybe the Note 4 will have the next Exynos version with a Gobi MDM for the US version of the handset.
Tanclearas - Thursday, May 22, 2014 - link
Is it really correct that Snapdragon 808/810 will revert back to a 2 x 32b memory interface? That doesn't seem right.
frostyfiredude - Thursday, May 22, 2014 - link
The 810 is 2x32b LPDDR4 1600: 25.6GB/s, same as the 805.
The 808 is 2x32b LPDDR3 933: 14.9GB/s, less than the others but still quite a lot of BW considering the GPU is smaller too.
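For anyone wondering where those figures come from, here's the standard peak-bandwidth arithmetic. This assumes the quoted memory speeds are DRAM clocks in MHz doubled for DDR signaling, and the 805 line assumes the 4x32b interface mentioned earlier running at an 800MHz DRAM clock (an assumption, not a figure stated in this thread):

```python
# Peak theoretical bandwidth = bus width (bytes) * effective transfer rate.
# Assumes the quoted memory speeds are DRAM clocks in MHz, doubled for DDR.
def peak_bandwidth_gbs(channels, bus_width_bits, dram_clock_mhz):
    transfers_per_sec = dram_clock_mhz * 1e6 * 2          # two transfers per clock (DDR)
    bytes_per_transfer = channels * bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer / 1e9

print(peak_bandwidth_gbs(2, 32, 1600))  # Snapdragon 810, 2x32b LPDDR4-1600 -> 25.6 GB/s
print(peak_bandwidth_gbs(2, 32, 933))   # Snapdragon 808, 2x32b LPDDR3-933  -> ~14.9 GB/s
print(peak_bandwidth_gbs(4, 32, 800))   # Snapdragon 805, 4x32b @ 800MHz (assumed) -> 25.6 GB/s
```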
Tanclearas - Friday, May 23, 2014 - link
The combination of a new 64-bit core and a higher-clocked GPU (note that the 810 shows Adreno 430) will put additional strain on the memory bandwidth. I think I'd rather see 4x32b LPDDR3 933.
frostyfiredude - Friday, May 23, 2014 - link
Considering the hugely more powerful APUs from AMD and Intel are happy with 25.6GB/s using architectures which are less bandwidth-conscious, I'm sure these Snapdragons will be limited by thermals far before memory.
vortmax2 - Thursday, May 22, 2014 - link
Bring on the Note 4 with this!
Arbie - Friday, May 23, 2014 - link
Yes, but can it run Crysis?
Krysto - Sunday, May 25, 2014 - link
No. But Tegra K1 probably can, not just because it's twice as fast, but because it has a much closer-to-PC API than the Adreno 420.
errorr - Friday, May 23, 2014 - link
I wonder what the heuristics are like for switching graphics from TBDR to direct rendering, and whether they are tweaked depending on tablet or phone usage. Abandoning TBDR to use that giant amount of bandwidth is going to use oodles of power. This is why I doubt you will see NV Kepler in a phone; it will be restricted to tablets.
fteoath64 - Saturday, May 24, 2014 - link
True. Remember, NV does have Maxwell shipping in desktop form, so its mobile variant is probably in the testing/optimization phase right now, possibly being reserved for the near-future competition.
Krysto - Sunday, May 25, 2014 - link
Denver + Maxwell at 16nm FinFET, now THAT'S a beauty to behold. Hopefully Nvidia doesn't have any more delays, though.
Krysto - Sunday, May 25, 2014 - link
No VP9 decoding? And when is Qualcomm going to open-source its baseband firmware? Otherwise we can just assume it's backdoored these days, making the whole phone vulnerable to attacks and espionage.
Keermalec - Saturday, August 2, 2014 - link
OK, so your excuse, Anand, for not including the Tegra K1 or Exynos 5433 in your benchmarks was that you didn't have them available on May 21. Now you do, as the K1 and Exynos 5433 are shipping while the Snapdragon 805 still is not. So can you update this article to show the 805's performance vs the real competition? Pitting it against last year's Tegra 4 and Exynos 5 Octa is just wrong.
HibikiTaisuna - Tuesday, August 19, 2014 - link
Wow, didn't expect that my Tegra K1 is so much better than these SoCs. 50% more graphics performance and even 10% to 20% more CPU performance (the weak part of the Tegra K1).