65 Comments
grahaman27 - Monday, June 23, 2014 - link
Both Nvidia and Qualcomm say their mobile chips will support DX12. The way I see it, Qualcomm's only advantage is that it will be included in nearly every phone. Nvidia's K1 supports full OpenGL 4.4 and tessellation, along with Nvidia-specific features that are used in desktop games.
ravyne - Monday, June 23, 2014 - link
Devices with lower feature levels will be able to support D3D12, just like lower feature-level devices can support D3D11.
K1 is Kepler-based, and therefore feature-level 11_0, which is still great for a mobile GPU.
nVidia Maxwell, as it appears in the Geforce 750 and 750ti and as the basis for next-gen nVidia GPUs, is feature-level 11_2; otherwise, nVidia's recent GPUs have all been 11_0. AMD's GCN 1.1 parts, as in the 290/290x and 260/260x, are feature-level 11_2, and GCN 1.0 (7970, 280x) is 11_1.
So the article is a bit mistaken in saying they've beaten nVidia to feature-level 11_2 support, because of the 750 and 750ti.
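For anyone wondering what a feature level actually means to an app: it's the single label you negotiate with the runtime at device creation. Here's a rough, untested C++ sketch of asking the D3D11 runtime which level the hardware exposes (the same mechanism carries over to D3D12's feature levels); error handling is trimmed, and you'd link against d3d11.lib:

#include <cstdio>
#include <d3d11.h>

int main() {
    // List levels from highest to lowest; the runtime returns the best one the GPU supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
    };
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL obtained;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &obtained, &context);

    if (SUCCEEDED(hr)) {
        // A Kepler-class part like K1 would report 0xb000 (feature level 11_0) here.
        printf("Feature level: 0x%x\n", obtained);
        context->Release();
        device->Release();
    }
    return 0;
}

So a feature-level 11_0 chip simply never hands back the 11_1 entry, even though the same D3D11 (and later D3D12) runtime still runs on it.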
mczak - Monday, June 23, 2014 - link
Maxwell (at least gm107) still appears to be feature level 11_0 only.
TheJian - Friday, June 27, 2014 - link
How important is a DirectX discussion in an OpenGL Android/iOS world anyway? I don't quite get why they're bragging about DX. They think WinRT will become a hot commodity or something? Isn't 11.2 support Win8.1+ only? I can understand why any GPU designer would ignore that market.
http://www.guru3d.com/news_story/directx_11_2_anno...
If guru3d's statements are wrong now, someone give a link please (not directed at you, grahaman27). They also note it's running on a 770, so if it can't do 11.2, why use it (it at least does the tiled resources part)?
http://msdn.microsoft.com/en-us/library/windows/ap...
8.1 required it seems. So again, I'd ask Qcom, "who cares?"
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA...
PCper maxwell doesn't support 11.2
http://www.pcmag.com/article2/0,2817,2453551,00.as...
PCmag, maxwell DOES support 11.2...LOL.
"The GTX 750 Ti supports DirectX 11.2 and OpenGL 4.4, and 16 lanes of PCI Express 3.0."
RE: the regular 750 "It also supports DirectX 11.2 and OpenGL 4.4, and 16 lanes of PCI Express 3.0." well, hmmm...
Don't really care who's right; 8.1 is not important to me (ever). Devs won't use its features unless they backport them to Win7, or unless Win9 is a HUGE hit, which I doubt, as they seem hellbent on just doing more of the same again.
OpenGL ES 3.1 (most important on mobile) and OpenGL 4.4 are, I'd think, far more important than bragging about DX 11_2, which, if used on mobile, would bring you down to under 30 fps for sure.
I still haven't seen games that REQUIRE Qcom hardware, even from their snapdragongaming page. None of the ones I checked mention them at all, or any optimizations specifically for them, unlike Tegra THD games, which show unique effects etc. on Tegras.
https://www.youtube.com/watch?v=afFb2JU45gA&ht...
Outerra engine demo running OpenGL 3.3, appearing to do the same crap as DX 11.2?
http://www.outerra.com/wfeatures.html
I really don't get why anyone is talking DX anything for mobile. Android doesn't use it last I checked, and nobody cares about MS on mobile (ok, a small few do). Considering NV owns 65% of discrete (and these are gamers), I don't see 11.2 as an issue anyway. MS used a 770 in their DX 11.2 demo, so whatever is missing can't be that bad (I've read just the 2D crap isn't 11.2 but the 3D stuff is, but who knows). If I read correctly, tiled resources can be done on 11_0 and 11_1 hardware (which maybe explains the 770 doing it in the demo). Again though, who cares about DX when Android and iOS make up, what, 95% of the world of mobile?
Voltism - Monday, June 23, 2014 - link
It's only fair the 420 GPU has blazing speeds and high benchmarks.
extide - Monday, June 23, 2014 - link
bhahaha lol ^^
lkb - Tuesday, June 24, 2014 - link
Given its sheer amount of Rasta operation units, the blazin speed should be justified.
AnandTechUser99 - Monday, June 23, 2014 - link
This is definitely a nice leap for Qualcomm, but Tegra K1 is still quite a bit more powerful.
GFXBench 3.0 Manhattan (Offscreen):
Xiaomi MiPad (Tegra K1 32-bit version) - 28.1 Fps
Qualcomm MDP/T (Snapdragon 805) - 17.7 Fps
Apple iPad Air (A7) - 13.0 Fps
Flunk - Monday, June 23, 2014 - link
The question is, which has better power consumption, and does the K1 include baseband? Because the point is mute if the power consumption is terrible, and it will never make it into phones without built-in baseband.
jwcalla - Monday, June 23, 2014 - link
I doubt we'll see TK1 in many phones. Tablets and Chromebooks and purpose-built Android gaming devices are a different story though.
I think nvidia has pretty much given up on the phone market.
ArthurG - Monday, June 23, 2014 - link
Well, TK1 is already in the Xiaomi MiPad (big, big design win), is supposed to be in the Nexus 9 with the 64-bit Denver version, and will soon be in the Acer CB5 Chromebook, just to name a few.
p3ngwin1 - Monday, June 23, 2014 - link
so no PHONES like he said then.
Laststop311 - Tuesday, June 24, 2014 - link
Fewer, better-performing cores is better. Bump that single-threaded performance up.
AnandTechUser99 - Monday, June 23, 2014 - link
According to Xiaomi, the MiPad will offer 11 hours of video playback on a 6700mAh battery. However, I would wait for an actual test to see the real world power consumption.
Like Tegra K1, Snapdragon 805 does not have an integrated baseband.
name99 - Monday, June 23, 2014 - link
Video playback does not engage the GPU. All that proves is that the SoC is capable of doing the most basic "power down unused pieces when it makes sense".
The more relevant sort of benchmark would be something like "how long does the system last playing game X as compared to another device playing game X".
A different (but relevant) sort of benchmark would test battery life under "normal" usage conditions, the idea being to test how efficient the GPU is at handling not demanding tasks but the sort of basic compositing and animation that makes up the UI.
fivefeet8 - Tuesday, June 24, 2014 - link
Video playback does use some of the GPU for rendering and decoding of HD streams. BSplayer on Android also has a hardware render backend for supported GPUs, which surprisingly increases battery life.
gonchuki - Tuesday, June 24, 2014 - link
Accelerated video decoding is done by fixed function hardware. It's embedded on the GPU but it's totally independent of the programmable parts of the GPU, which is what actually matters when you really want to compare power and performance.
haardrr - Tuesday, June 24, 2014 - link
A 'mute' point would be to not mention it at all, as it would be unusable... (I am not sure you can use it in this way, but I think you mean...)
a 'moot' is a good idea even if the disadvantages outweigh the advantages of using it...
haardrr - Tuesday, June 24, 2014 - link
a 'moot' point is a... (bad proof-reading)
tuxRoller - Monday, June 23, 2014 - link
At what power usage?
bengildenstein - Monday, June 23, 2014 - link
An official Nvidia whitepaper indicates that the K1 SoC with RAM uses 7W of power in GFXBench 3.0 (not including other hardware -- e.g. screen, radio, etc.). This is ridiculously high power draw, certainly not fit for something like a smartphone and even pushing it for a tablet.
http://developer.download.nvidia.com/embedded/jets... (page 13)
tuxRoller - Monday, June 23, 2014 - link
That's pretty high.
I'll wait for AT's results.
bengildenstein - Monday, June 23, 2014 - link
I erred: that power draw figure was for the dev board, which does not use low-power, mobile-optimized DRAM. The peak power consumption in GFXBench 3.0, at around 3.85W, is still a bit high for a smartphone (though maybe not a phablet), but it is very well suited to a tablet.
GC2:CS - Tuesday, June 24, 2014 - link
Do you really think that 2 GB of DDR3L has a power impact of 3.13 watts?
tuxRoller - Tuesday, June 24, 2014 - link
But, as you say, that doesn't include radios, screen, ports, etc.
AnandTechUser99 - Monday, June 23, 2014 - link
That's for the Tegra K1 Jetson development board.
From that document:
"Tegra K1 Performance Per Watt on GFXBench 3.0 Manhattan Offscreen"
It says Tegra K1 achieves: "7.29 FPS/Watt"
(28.1 FPS) / (7.29 FPS/Watt) = ~3.85 W
bengildenstein - Monday, June 23, 2014 - link
Indeed. Thanks for the correction. I didn't read the footnotes that claimed that the DRAM used was not low power!
GC2:CS - Tuesday, June 24, 2014 - link
The problem is that power consumption increases much faster than linearly as you increase the clock speed (dynamic power scales roughly with V²·f, and voltage has to rise along with frequency). So if Tegra can get 7.29 FPS at 1 W, it has to go a lot higher than 3.85 W to get 28.1; it's far from a linear relation.
jwcalla - Monday, June 23, 2014 - link
It's like you didn't read any of the footnotes.
bengildenstein - Monday, June 23, 2014 - link
Fair point. Thanks for the correction!
eddman - Tuesday, June 24, 2014 - link
I think you guys are misinterpreting the figures.
In figure 5, it says "Tegra K1 delivers higher performance while consuming same power as A7".
What I gather from that is that K1 will consume less power than an A7 if performance was to be the same, IINM.
If that turns out to be the case and since A7 is efficient enough to be in a phone, then it should be possible to put a K1 in a phone too.
wicketr - Monday, June 23, 2014 - link
Damn. 4x faster than my M7 and my device is barely 12 months old. That's insane.
jwcalla - Monday, June 23, 2014 - link
Title: Qualcomm Launches GPU for Android Gaming
In this article: Direct3D this, Direct3D that.
Not to pick on the author because Qualcomm loves to talk about D3D in all their announcements... but why? D3D has practically no adoption in mobile at this time -- none; and even MS seems to be de-emphasizing Win RT for their tablets.
From a mobile perspective, what really matters is how good their OpenGL/ES implementations are.
ArthurG - Monday, June 23, 2014 - link
Very true, this article talks mainly about D3D... in Android gaming!
And regarding Qualcomm's OpenGL, it's subpar, broken, and full of paper-spec features without any dev support, even in their own forum... compared to Nvidia, it's a joke.
Ryan Smith - Monday, June 23, 2014 - link
The Direct3D talk is due to the fact that this is how we classify GPU feature sets at the moment. Direct3D offers a larger feature set than OpenGL ES 3.1, so once you exceed 3.1's feature set you need another way to classify them.
jwcalla - Monday, June 23, 2014 - link
Wouldn't it be better to specify the features by name like multidraw indirect or bindless textures or whatever? How many people know what feature sets constitute D3D 11_2?
Ryan Smith - Tuesday, June 24, 2014 - link
At the end of the day more people will understand and appreciate something like D3D FL 11_2 than they would draw indirect, scatter/gather, or bindless resources.
This problem will eventually go away when OpenGL ES catches up and these features become part of the standard.
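To illustrate why the by-name route gets unwieldy, here's a rough, untested C++ sketch (it assumes an existing GL 4.x context and a loader such as GLEW, which are my assumptions, not something from the article) that probes a few of those features as individual GL extensions:

#include <cstdio>
#include <cstring>
#include <GL/glew.h>

// Walk the indexed extension list (core since GL 3.0) looking for one extension by name.
static bool hasExtension(const char* name) {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char* ext = reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0) return true;
    }
    return false;
}

void reportFeatures() {
    // Every "feature by name" becomes another line in this laundry list.
    printf("multi draw indirect: %d\n", hasExtension("GL_ARB_multi_draw_indirect"));
    printf("bindless textures:   %d\n", hasExtension("GL_ARB_bindless_texture"));
    printf("sparse textures:     %d\n", hasExtension("GL_ARB_sparse_texture"));
}

A feature level collapses that whole checklist into one label, which is why it's handy shorthand even when the hardware will never actually run Direct3D.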
TheJian - Friday, June 27, 2014 - link
K1 supports OpenGL 4.4 also. So it's classified above Qcom for Android purposes now, correct? Direct3D talk is pointless in a MOBILE world ruled by Android/iOS. Once again taking a shot at NV for, well, no reason but that AMD portal ;) Let me know when DX works on Android or iOS :) Until then only OpenGL ES 3.1/OpenGL 4.4 matter, right? Or are you saying WinRT has a huge market now? I must have missed that if it happened overnight.
Madpacket - Tuesday, June 24, 2014 - link
+1
Like, who really cares about D3D features for an ASIC that will power Android devices? The 5 people who bought RT devices?
ArthurG - Monday, June 23, 2014 - link
Is it an advertorial for QC or what?
Only one mention of Tegra K1, and just to say that the Adreno feature set is higher (which is not true). Why don't you say that the 420 doesn't support full OpenGL 4.4 like K1? Why don't you say that the 805 is limited to Android and maybe WinRT, when K1 supports all of that plus Linux (Ubuntu)? Why don't you say that the Adreno drivers s*ck big time compared to Nvidia's gold standard?
But more important, why don't you say that the Adreno 420 is completely destroyed in performance by K1?
Very, very slanted article :(
T1beriu - Monday, June 23, 2014 - link
Dude, it's an announcement, not a review. Chill.
tuxRoller - Monday, June 23, 2014 - link
Um, given the Freedreno drivers you could certainly run Fedora on an 805 dev board.
Also, let's wait and see the power consumption numbers when loaded.
tuxRoller - Monday, June 23, 2014 - link
One more thing. Those were prerelease drivers used in that article. I'm not sure how much difference that'll make, though.
ArthurG - Tuesday, June 24, 2014 - link
Thanks for mentioning Freedreno's existence; it proves again how bad the official Adreno drivers are!
tuxRoller - Tuesday, June 24, 2014 - link
You don't know the half of it.
http://bloggingthemonkey.blogspot.com/2014/06/fire...
Though I don't expect AT to talk about this, it's still important info for consumers.
tuxRoller - Monday, June 23, 2014 - link
This article focused on the DX features, but given the minuscule Windows Phone share I don't see how significant it is.
bernstein - Monday, June 23, 2014 - link
@RyanSmith: You should double emphasize that last sentence. While functionality is the same, it is nowhere near as fast as an HD 4400 in a Core i3 or the PS4 GPU, let alone a GTX 780/R9 290X. So while an ARM-compiled Battlefield 4 might run, I doubt you'd get playable framerates - even on ultra-low settings.
wintermute000 - Monday, June 23, 2014 - link
Not to mention, hilariously, they're dealing with higher res most of the time than the PC crowd - with all the 1440p+ high-end devices coming out, far outnumbering the % of PC gamers rocking high-DPI monitors.
juhatus - Tuesday, June 24, 2014 - link
Yep, the age-old "Does it run Crysis?" still stands. With 4K screens it's hilariously absurd.
It does, with just 0-3 fps slides..
darkich - Tuesday, June 24, 2014 - link
darkich - Tuesday, June 24, 2014 - link
What is hilarious is you guys thinking about resolution in mobile from a desktop perspective.
According to you, none of this Android and iOS gaming should be possible, yet we see GTA San Andreas and the like running flawlessly at 1440p.
Miraculous!
tuxRoller - Wednesday, June 25, 2014 - link
They aren't running at 1440p natively but at some fraction thereof, say 720p, and scaled up.
The nice thing about high-DPI screens is that you can have your cake and eat it too, depending on what's important to you.
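Mechanically it's just an off-screen render target at the lower resolution plus a scaled blit to the native surface. A minimal OpenGL ES 3.0-style sketch (setup, error checks and the actual draw calls omitted; the 720p/1440p sizes are only the example from above):

#include <GLES3/gl3.h>

// Assumes sceneFbo was created earlier as a 1280x720 framebuffer with color and
// depth attachments, and that an ES 3.0 context is current on this thread.
void presentUpscaled(GLuint sceneFbo) {
    // 1. Render the game into the low-resolution off-screen target.
    glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
    glViewport(0, 0, 1280, 720);
    // ... scene draw calls go here ...

    // 2. Scale the result onto the device's native 2560x1440 surface.
    glBindFramebuffer(GL_READ_FRAMEBUFFER, sceneFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);  // default (window) framebuffer
    glBlitFramebuffer(0, 0, 1280, 720,          // source rectangle
                      0, 0, 2560, 1440,         // destination rectangle
                      GL_COLOR_BUFFER_BIT, GL_LINEAR);
}

The UI can still be composited at native resolution afterwards, which is why text stays sharp even when the 3D scene underneath is upscaled.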
darkich - Wednesday, June 25, 2014 - link
^ that was my very point.
tuxRoller - Friday, June 27, 2014 - link
Ergh, yeah. I mistook your point 😊
eddman - Tuesday, June 24, 2014 - link
A sort of off-topic question; did AT ever write about the Mali-T7xx series of GPUs?
Wonder how Adreno 4xx will fare against them.
Mali-T760 seems to be the GPU that will give K1 a hard time.
Any thoughts on that?
Ryan Smith - Tuesday, June 24, 2014 - link
Although we're not in a position to talk about performance, we'll be talking about Mali's architecture very soon.
przemo_li - Tuesday, June 24, 2014 - link
+1
And folks, switch to OpenGL versions. (Especially since one can get very good per-extension explanations of what the new stuff in a given OGL version does.)
DX FL... even Nvidia doesn't care about them that much ;)
darkich - Tuesday, June 24, 2014 - link
According to the AnTuTu 3D test, the Mali T760 clearly outperforms the Adreno 420.
http://www.gsmarena.com/exynos_5433_gets_benchmark...
Another advantage of Mali 760 is the scalability, so the MP16 variant could in theory outperform even the K1 GPU
tuxRoller - Tuesday, June 24, 2014 - link
According to AnTuTu, the Tegra 4 outperforms the Snapdragon 801, iirc.
darkich - Wednesday, June 25, 2014 - link
The AnTuTu 3D test runs at native resolution, so it is no wonder that Tegra 4 at 720p indeed performs better than Snapdragon 801 at 1080p.
daku123 - Wednesday, June 25, 2014 - link
Nvidia doesn't much care about FP16 from Tegra 4 onwards. That is why they perform poorly compared to their Imagination Tech/Adreno counterparts on some Android benchmarks. They mostly want their GPU cores to be more flexible (in other words, they want them to work on all sorts of devices, including AIOs and cars; and desktop GPUs of course). I am pretty sure that 32-bit operations are much faster on Tegra K1 than on any other existing or next-year mobile SoC.
tuxRoller - Friday, June 27, 2014 - link
Ah, that would explain it. Thanks for the info.
Really makes AnTuTu pretty useless if you want to compare SoCs, though.
danjw - Tuesday, June 24, 2014 - link
D3D11? Ok, that matters to the small install base of people with Windows Phones. I guess it allows them to get design wins for Windows Phones.
OpenGL seems to be the way the game industry needs to move. It allows them to run on Windows, OSX, Android, iOS and on any console, while D3D is limited to Windows and Xbox. What is the point?
przemo_li - Tuesday, June 24, 2014 - link
"any console"Not a single non-android/non-linux console support full modern OpenGL nor any modern OpenGL ES.
Willardjuice - Wednesday, June 25, 2014 - link
Btw, I don't believe feature level 11.2 is a thing; it's 11.1 with different tiers of support for tiled resources. I'll give you some slack, Ryan, since it was Qualcomm PR that messed up. ;)
Adel - Wednesday, October 8, 2014 - link
If the CPU has 4 cores, could someone tell me how many cores are used for the Adreno GPU, and are they dedicated to GPU use only or are they shared??
If the total number of cores for both CPU and GPU in the Snapdragon chipset is 4, that means the Apple A8 chipset has the advantage, since it has 6 cores in total, 4 of which are dedicated to GPU use only!!