39 Comments
hahmed330 - Friday, August 8, 2014 - link
Blimey! Pretty impressive....
lilmoe - Friday, August 8, 2014 - link
That SoC impresses me like no other. I was initially concerned about the previous battery life measurements, but it seems to be controllable, and amazingly so.
NVidia really screwed up their reputation with Tegra 3, but it seems it's getting back on track. Now for it to actually hit more devices. I'm putting my money on Surface 3.
kruzovsky - Friday, August 8, 2014 - link
The Tegra 3 was actually pretty fast for the time; it just gained a bad rep due to the horrible eMMC ASUS had put on their tablets.
HarryATX - Friday, August 8, 2014 - link
I would agree. I once owned an HTC One X international, and the graphics performance was just not ideal; it even lagged behind the A5. It was a disappointment, as graphics should be nVidia's strength, but I guess they are doing much better with the Tegra 4 and 5 now.
hung2900 - Friday, August 8, 2014 - link
Tegra 2, 3, and 4 were all over-advertised and crappy.
However, the new Tegra sounds very promising.
Muyoso - Friday, August 8, 2014 - link
This tablet and SoC are getting me really excited for the next Nexus tablet, which is rumored to be using the 64-bit version.
Anonymous Blowhard - Friday, August 8, 2014 - link
Delicious graphy goodness this early in the morning? Must be my birthday.
Given that scant few games are really going to push the K1 to its limits at this point, I don't think there's much to be concerned about right now in terms of battery life. At the Peasant-Standard 30fps it holds up quite well.
nathanddrews - Friday, August 8, 2014 - link
1. I'm not seeing the K1 SHIELD in the AT Tablet Bench. Will it be added soon?
2. Is there any possibility that the GPU part of the K1 could be used in an Atom tablet running full Windows? Will Erista support x86/64? Basically I want a kickass tablet like SHIELD, but with full Windows.
JoshHo - Friday, August 8, 2014 - link
Whoops, I've released those results; they should be visible now.
The GPU IP block could be used in another SoC, but it's a bit unlikely that NVIDIA would license their GPU to Intel, or that Intel would allow NVIDIA to manufacture x86 reference designs or acquire an x86 ISA license.
nathanddrews - Friday, August 8, 2014 - link
Ah, it's under "Mobile 14", not "Tablet".
This is what concerns me:
http://www.anandtech.com/bench/product/1212?vs=129...
The K1 offers CPU performance and battery life competitive with Intel, but I don't want to have to use Android to get the GPU performance I want. I currently have several Android devices (phones and tablets), but I'm just getting tired of Android's limitations. The problem is that Windows tablets pretty much all suck for GPU performance. Sure, I can use one as a Steam streamer (which is great), but that isn't always ideal. Do I just need to wait for Broadwell? Will AMD, which already has an x86 license, ever build what I want?
Wish list:
10-12" IPS
1080-1600p display (4K?)
Atom Z3770 CPU performance (minimum)
K1 GPU performance (minimum)
Windows 8/9 (non-RT)
5 (heavy) to 10 (light) hours battery life
gonchuki - Friday, August 8, 2014 - link
AMD already has what you want: the Mullins APU is better than an Atom on the CPU side, and similar to the K1 on the GPU side (when compared at a 4.5W TDP, as the Shield has a higher TDP and gets as hot as pancakes).
They just haven't gotten any design wins yet; that's the sad part of the story.
nathanddrews - Friday, August 8, 2014 - link
Please link me ASAP!
kyuu - Sunday, August 10, 2014 - link
If you look at the full Shield Tablet review, you'll see that the results from the "AMD Mullins Discovery Tablet" are the only ones from a mobile SoC to approach the K1's GPU results. But the K1 has a TDP of 5-8W (11W at full load), as opposed to Mullins' 4.5W TDP. So given that, I'd say Mullins is roughly equal with the K1 on the GPU front when you look at power/performance.
Of course, Mullins has the advantage that it's an x86 design, so the GPU performance won't go largely to waste like it does with the K1 (though at least you can use it for emulators on Android). The disadvantage is, of course, that AMD doesn't seem to have any design wins, so you can't actually buy a Mullins-equipped tablet.
That last point makes me rather irritated, since I'd love nothing more than to have a tablet of Shield's caliber/price (though preferably ~10" and with a better screen) powered by Mullins and running Win8.1.
ToTTenTranz - Friday, August 8, 2014 - link
If it's in nVidia's interest to sell GPU IP right now (as they stated last year), why wouldn't they sell it to Intel?
Sure, there was some bad blood during the QPI licensing suit, but that was over 3 years ago, and hard feelings don't tend to take precedence over good business decisions in such large companies.
Truth be told, nVidia is in the best position to offer the best possible GPU IP for Intel's SoCs, since PowerVR seems so reluctant to offer GPUs with full DX11 compliance.
nathanddrews - Friday, August 8, 2014 - link
Let alone DX12...
It could probably go both ways, couldn't it? NVIDIA could license x86 from Intel, or Intel could use NVIDIA's GPU, if it makes business sense, that is.
frenchy_2001 - Friday, August 8, 2014 - link
Intel will NOT license x86. It is their "crown jewels", and they battled hard enough to limit who could use it after an initial easy share (to start the architecture).
If Intel could put their pride aside and use the K1 in their Atoms or even laptop processors, this would be a killer.
Don't get me wrong, Intel HD cores have grown by leaps and bounds, but they lag in both hardware performance (they have a much smaller silicon area, too) and, particularly, driver development.
If Intel wanted to kill AMD (hint: they don't) at their own game (APU), they would license Kepler and integrate it.
Imagine an Atom or even Haswell/Broadwell (or beyond) with an integrated NV GPU.
TheinsanegamerN - Friday, August 8, 2014 - link
Dream laptop right there: a dual-core i5 with the K1's GPU, maybe running at a higher clockspeed. Integrated, simple, but decently powerful without breaking the bank, with good battery life and drivers to boot.
ams23 - Friday, August 8, 2014 - link
Josh, isn't there a feature on the Shield tablet where CPU/GPU clock operating frequencies get reduced when the battery life indicator is < ~ 20%? After all, it takes more than 110 looped runs (!) of the GFXBench 3.0 T-Rex Onscreen test to see any significant reduction in performance, and during that time, the peak frequencies and temps are pretty consistent and well maintained.
Note that if you look at the actual GFXBench 3.0 T-Rex Onscreen "long term performance" scores (which are likely based on something closer to ~ 30 looped runs of this benchmark), the long term performance is consistently very high at ~ 56fps, which indicates very little throttling during the test: http://gfxbench.com/subtest_results_of_device.jsp?...
JoshHo - Friday, August 8, 2014 - link
To my knowledge there isn't such a mechanism active. It may be that there are lower power draw limits as the battery approaches ~3.5V in order to prevent tripping the failsafe mechanisms in the battery.
ams23 - Friday, August 8, 2014 - link
If I recall correctly, it was mentioned somewhere on the GeForce forums (in the Shield section) by an NV rep that CPU/GPU frequencies get reduced or limited once the battery life indicator starts to get below ~ 20%.
Thank you for the extended testing!
JoshHo - Friday, August 8, 2014 - link
Interesting. I see in the settings menu that there is an option to enable CPU/GPU throttling and FPS caps after the battery drops to a certain level, but I've made sure to keep that off for all of these tests.
HighTech4US - Saturday, August 9, 2014 - link
It would be easy to test whether the OS or some setting is throttling the CPU/GPU when the battery is getting low (< 20%): when you start to see the drop-off around loop 110, just plug in the charger and see if the throttling goes away as the battery recharges.
Looking forward to an update in the article if this proves to be true.
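A minimal sketch of that kind of check, assuming an adb-connected device: poll the battery level, charging state, and GPU clock while the benchmark loops, then watch whether the clocks recover the moment the charger goes in. The GPU clock node below is a hypothetical Tegra K1 debugfs path (it varies by device and kernel and usually needs root); the dumpsys battery output is standard Android.

```python
# Illustrative logging sketch only (not the article's methodology).
# Polls battery level, charging state, and GPU frequency over adb so a
# late-test clock drop can be correlated with battery % or charger state.
import subprocess
import time

# Assumed Tegra K1 debugfs clock node; varies by device/kernel, may need root.
GPU_FREQ_NODE = "/sys/kernel/debug/clock/gbus/rate"


def adb_shell(cmd: str) -> str:
    """Run a shell command on the attached device and return trimmed stdout."""
    result = subprocess.run(["adb", "shell", cmd], capture_output=True, text=True)
    return result.stdout.strip()


def battery_status() -> dict:
    """Parse the key/value fields from 'dumpsys battery' (e.g. level, AC powered)."""
    status = {}
    for line in adb_shell("dumpsys battery").splitlines():
        key, sep, value = line.partition(":")
        if sep:
            status[key.strip()] = value.strip()
    return status


if __name__ == "__main__":
    while True:
        batt = battery_status()
        freq = adb_shell(f"cat {GPU_FREQ_NODE}")
        print(f"{time.strftime('%H:%M:%S')} "
              f"battery={batt.get('level', '?')}% "
              f"ac={batt.get('AC powered', '?')} "
              f"gpu_freq={freq}")
        time.sleep(10)
```

Plotting gpu_freq against battery level, and against the moment the charger is attached, would show directly whether the late-test slowdown tracks battery percentage rather than temperature.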
nathanddrews - Friday, August 8, 2014 - link
I'm not sure if that's automatic. I think there's a manual frame rate cap you can use to preserve battery life, though I might be confusing that with GeForce Experience.
bradleyg5 - Friday, August 8, 2014 - link
How does performance mode impact GPU clock speeds? My Note 3 is driving me nuts lately; so many games are being released that won't run the GPU over 320MHz. I would say the VAST majority of 3D mobile games being released throttle the GPU to between 210MHz and 320MHz.
bradleyg5 - Friday, August 8, 2014 - link
Here is evidence of the annoying effect: this game DOES NOT run at a consistent 30fps, yet look how aggressive the power management is: http://imgur.com/VJEJcVm
Johnny_k - Friday, August 8, 2014 - link
You should also include the fps of the other devices under test so we can get some comparison of perf/watt.
JoshHo - Friday, August 8, 2014 - link
I've added the graph, but this is unchanged from the original graph in the SHIELD Tablet review.
Beerfloat - Friday, August 8, 2014 - link
"We see max temperatures of around 85C, which is edging quite close to the maximum safe temperature for most CMOS logic. The RAM is also quite close to maximum safe temperatures. It definitely seems that NVIDIA is pushing their SoC to the limit here"What does the product's datasheet have to say about how close to the limit this really is?
UpSpin - Friday, August 8, 2014 - link
I also would like to know his source.
Typical operating temperatures for microcontrollers (MIPS, ARM, PIC) are:
Junction temperature: -40 to +125 degrees Celsius.
Maybe he is confusing it with the ambient temperature, which typically maxes out at 85 degrees Celsius.
JoshHo - Friday, August 8, 2014 - link
TjMax is not quite the same as the maximum sustainable safe temperature. Operating in that region will make leakage worse and reduce the effective lifetime of a part. In addition, the operating temperature for lithium ion batteries is no higher than 50C, and operating at such high temperatures can seriously affect the usable lifetime of a battery: http://www.portal.state.pa.us/portal/server.pt/doc...
UpSpin - Friday, August 8, 2014 - link
I don't understand why you talk about the battery temperature when we are talking about the temperature of the die in the SoC. Depending on the thermal design, the two can be nearly independent of each other.
The temperature in every region has an influence on the leakage current, so I don't understand why you bring up this topic, especially because you probably have no idea how large that influence is, so we can just guess. The same goes for the lifetime.
Just for clarification:
We are talking about 'the maximum safe temperature for most CMOS logic', no more, no less.
And according to lots of microcontroller datasheets, the safe temperature is up to +125C.
Of course, I don't know what it is for HKMG.
JoshHo - Friday, August 8, 2014 - link
It's possible to thermally isolate the battery and the board, but in most devices this is not done, as both parts tend to share a metal midframe to aid in heat dissipation. As a result, it's not possible to simply ignore battery temperature and focus on SoC temperature. It's likely that both the battery and the SoC, at the maximum temperatures observed in this test, are at the highest safe level.
Maximum safe temperature in most datasheets for something like a CPU or GPU would be the point where the device is shut off and/or reset, not a point to throttle to. While it's fully acceptable to run something like a CPU up to 100C continuously with a TjMax of 105C, the MTBF will be noticeably shorter than if the same CPU were run at 70C or less.
Exceeding TjMax is far from the only way to damage an IC with heat. Thermal cycling from high to low temperatures is also a concern, and other components on the board will have reduced lifetime from high temperatures.
I have no doubt NVIDIA has carefully throttled this SoC and ensured that the MTBF of this device is within acceptable range, but it is still quite a high temperature.
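To put a rough number on that MTBF point (an illustration only, not a figure from NVIDIA or the article; the ~0.7 eV activation energy is an assumed typical value), the Arrhenius acceleration factor commonly used in semiconductor reliability estimates compares wear-out rates at two junction temperatures:

```latex
% Arrhenius acceleration factor (illustrative; E_a ~ 0.7 eV assumed,
% k = 8.617e-5 eV/K, temperatures in kelvin).
AF = \exp\!\left[\frac{E_a}{k}\left(\frac{1}{T_1}-\frac{1}{T_2}\right)\right]
   \approx \exp\!\left[\frac{0.7}{8.617\times 10^{-5}}
     \left(\frac{1}{343}-\frac{1}{373}\right)\right] \approx 6.7
```

Under these assumptions, running continuously at 100C (373 K) instead of 70C (343 K) accelerates temperature-driven wear-out by roughly a factor of 6-7, which is the sense in which the MTBF above is "noticeably shorter".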
Beerfloat - Friday, August 8, 2014 - link
Josh,
From this article:
http://www.anandtech.com/show/7457/the-radeon-r9-2...
"The 95C maximum operating temperature that most 28nm devices operate under is well understood by engineering teams, along with the impact to longevity, power consumption, and clockspeeds when operating both far from it and near it. In other words, there’s nothing inherently wrong with letting an ASIC go up to 95C so long as it’s appropriately planned for."
"AMD no longer needs to keep temperatures below 95C in order to avoid losing significant amounts of performance to leakage. From a performance perspective it has become “safe” to operate at 95C."
When talking about 85C you stated "such temperatures would be seriously concerning in a desktop PC". Are you saying Ryan is likely too optimistic about that desktop device's 95C lifespan?
JoshHo - Friday, August 8, 2014 - link
For a desktop it's usually fully possible to keep temperatures well below 80C by throwing more surface area and CFM at the problem. The same page also cites a cost to longevity, and when upgrade cycles for desktop parts can greatly exceed the warranty period, allowing ~95C core temperature can be much more expensive than louder fan noise or a custom cooling solution.
GC2:CS - Friday, August 8, 2014 - link
Looking at all of this:
We've got a chip that runs fast, but also consumes insane amounts of power for a mobile device. It runs so hot that it has to throttle (for whatever reason), even though it happily runs at potentially damaging temperatures, even with an integrated magnesium heat spreader, even when running an "uncapped" test that is capped by the display refresh rate at performance noticeably below the off-screen tests.
So hot it can't be put into a phone (1440p phones would be happy).
Only when running at 30fps, and losing any significant advantage over the competition, could we say that the battery life falls into tablet class. So what's the difference between this Tegra and an Adreno 330 that gets a 7W power budget and a heat spreader? Where are the comparisons? How does the iPad mini with Retina display compare, for example?
Everything I see is a chip with far higher maximum power draw than the competition, and that's all.
ams23 - Friday, August 8, 2014 - link
If you look at the actual testing, the Shield tablet is able to maintain steady temps and steady performance for > 110 (!) continuous GFXBench benchmark loops even in max performance mode, which is pretty amazing for a thin, fanless 8" tablet. So the end-of-test throttling does not appear to be related to heat, but is most likely due to the very low battery capacity left at the end of the test, which triggers lower CPU/GPU clock operating frequencies.
With the 30fps framerate cap, the performance of the Shield tablet in the T-Rex Onscreen test is roughly 1.5x higher than the iPad Air's. With an uncapped framerate, the performance of the Shield tablet in the T-Rex Onscreen test is > 2.5x higher than the iPad Air's.
sonicmerlin - Friday, August 8, 2014 - link
Wth? I check this site almost every day and I never saw the Shield Tablet review. Did it show up on the front page?
The Von Matrices - Friday, August 8, 2014 - link
You need to look further down on the front page. Sometimes two or more articles get posted on the same day, in which case the more recent article gets the large image while the second article gets the small image below it, making it seem like that article is very old when in fact it could have been posted just a second before the top article.
przemo_li - Saturday, August 9, 2014 - link
How do you test FPS at the end of the test run?
How can you make sure that the CPU/GPU frequencies aren't being skewed by the OS (to preserve battery life)?
In other words, how can you separate GPU performance from OS behavior?
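One common way to take OS power management out of the equation (a sketch under assumptions, not necessarily what AnandTech does): pin the CPU frequency governor to "performance" over adb before the run, so any remaining slowdown has to come from thermal or battery limits rather than normal frequency scaling. The cpufreq sysfs paths below are standard Linux; writing to them requires a rooted device, and any GPU governor node is vendor-specific and not handled here.

```python
# Sketch: pin every CPU core's cpufreq governor to "performance" over adb.
# Assumes a rooted device with a working "su -c"; standard Linux cpufreq paths.
import subprocess


def adb_su(cmd: str) -> str:
    """Run a command as root on the attached device (redirects run inside su)."""
    result = subprocess.run(["adb", "shell", f'su -c "{cmd}"'],
                            capture_output=True, text=True)
    return result.stdout.strip()


def pin_cpu_governors(governor: str = "performance") -> None:
    """Set the scaling governor for every CPU core and print the result."""
    cpus = adb_su("ls -d /sys/devices/system/cpu/cpu[0-9]*").split()
    for cpu in cpus:
        node = f"{cpu}/cpufreq/scaling_governor"
        adb_su(f"echo {governor} > {node}")
        print(node, "->", adb_su(f"cat {node}"))


if __name__ == "__main__":
    pin_cpu_governors()
```

With the governors pinned (and the device kept on a charger or at a known battery level), a drop in benchmark FPS can be attributed to thermal throttling or battery-level limits rather than the OS's normal frequency scaling.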