wolfman3k5 - Tuesday, February 15, 2011 - link
Does this thing help hair growth?
michael2k - Tuesday, February 15, 2011 - link
Testosterone is known to cause hair loss.
vol7ron - Tuesday, February 15, 2011 - link
Actually it's believed to be DHT buildup, but there are a lot of breakthroughs that can regrow hair straight from skin/stem cells without needing the hair follicle. They did it on rats, but it grew back colorless; human testing is either underway or coming soon.
As for Nvidia, could they move up the release date for phones? My contract's up in August ;)
descendency - Wednesday, February 16, 2011 - link
Colorless isn't a problem. Hair dye... duh.
Saidas - Wednesday, February 16, 2011 - link
Yes....but only on your back.
Camikazi - Tuesday, February 15, 2011 - link
How is there not a picture of Superman there?
quiksilvr - Wednesday, February 16, 2011 - link
Because chesty military girl is cheaper.
MeanBruce - Wednesday, February 16, 2011 - link
Chesty? Nah she's barely a C-cup. ;)
Ronakbhai - Tuesday, February 15, 2011 - link
She's not Superman. She's not even Jor-El's wife!
GeorgeH - Tuesday, February 15, 2011 - link
Some guy in tights reversing time by rotating the planet backwards is as believable as perfect scaling from 2 to 4 cores, so I suppose "Kal-El" is an appropriate name.
jjj - Tuesday, February 15, 2011 - link
does this " The architecture will first ship in a quad-core, 40nm version" mean Kal-El will get a 28 nm version too?softdrinkviking - Wednesday, February 16, 2011 - link
I think the assumption is that a future iteration of the architecture introduced with Kal-El will be a die shrink (probably the 2x one that will be implemented in 2012).
At least that is what I am assuming until Nvidia proves me wrong.
michael2k - Tuesday, February 15, 2011 - link
Even if they can't sustain it, I love that they're trying. Is this the powerhouse behind the NGP?
nafhan - Wednesday, February 16, 2011 - link
No, the NGP sounds like it'll be using an SGX GPU. So, that means it'll probably use Samsung or TI.
Lucian Armasu - Wednesday, February 16, 2011 - link
The only other company that has announced a quad core chip for this year is Freescale. I believe they have a 1 GHz quad core Cortex A9 chip.
dleonseng - Tuesday, February 15, 2011 - link
First paragraph: "...NVIDIA may not have the entire market but it has enough of it to be take{N} seriously."
NCM - Wednesday, February 16, 2011 - link
Re "...are simply due to learnings it had in the design of Tegra 2s..." (second page)"Learning" means the process of acquiring knowledge, not the things one has learned; it doesn't have a plural form.
Instead try: "...are simply due to things it learned in the design of Tegra 2..."
DarkUltra - Tuesday, February 15, 2011 - link
Who cares about four cores? I want the Android UI running on the GPU. Most people who see an Android phone will immediately do two things: swipe through home screens and scroll through the app list. If they see lag and choppiness they inevitably compare it to the iPhone and assume that the phone as a whole is not as fast and responsive; end of story for them.
The restrictions of the iPhone and Apple's iTunes lock-in barely make it worth it.
A5 - Tuesday, February 15, 2011 - link
Honeycomb supposedly helps with that, and presumably Tegra 3 will be running that or Ice Cream.
dcollins - Wednesday, February 16, 2011 - link
Isn't Android 2.3 GPU accelerated?
Also, part of the problem is that the carriers' software doesn't have great performance. My Droid 2 is running Liberty 1.5 and the home screens, app drawer, and contacts all scroll smoothly. Not so with the stock ROM, which stuttered a lot and got slower over time. I suggest anyone with a Droid 2 or Droid X check out Liberty.
B3an - Wednesday, February 16, 2011 - link
Android 2.3 finally, FINALLY has GPU acceleration. So the Samsung Galaxy S2 will have it, for instance.
I totally agree with you though, Google should have had GPU acceleration a long time ago; most people I know who have iPhones complain about the choppiness of Android when scrolling/animating.
Windows Phone 7 actually has the smoothest OS of them all, though; it's so well polished. MS got it nearly perfect on the first release. Android has taken many versions to fix this, and it's still not as good, but at least GPU acceleration is there now.
Lucian Armasu - Wednesday, February 16, 2011 - link
Only Honeycomb has full hardware acceleration. Android 2.3 has better garbage collection that has improved the speed of the UI a bit.
AbRASiON - Wednesday, February 16, 2011 - link
Someone listen to this guy!
I love my Android phone, but god damnit even a half decent one is a little laggy and sluggish at points.
The iPhone sucks in a lot of silly ways, but it always has a smooth and consistent feel, even if it fakes it - they are clever about it.
BlueScreenJunky - Wednesday, February 16, 2011 - link
Well, according to the sales figures that doesn't prevent Android phones from selling well, so I don't see a problem from Google and the manufacturers' point of view.
And from my point of view that's absolutely no problem at all: since I like my phone to be responsive and I don't need 50 widgets, I only have one home screen (so I can't swipe through anything) and I disabled every single animation using Spare Parts...
So there's nothing left to accelerate really, and now when I use a phone with the animations enabled I'm like "Why do I have to look at the page sliding or zooming in before I can actually begin to read? It's just a waste of time!"
Aloonatic - Monday, February 21, 2011 - link
I guess you just eat protein and vitamin pills etc.? What's the point in making tasty food and chewing it? It's just a waste of time.
I guess it's good that you are happy to live in a bland little world, but most people aren't. Whether you realise it's an issue or not, one of the reasons why many people love LOVE (LOVE!!!!!) their Apple devices, and why Apple still has a strong presence in the market in spite of their price premium, is that they have realised that the swooping, expanding, sliding animations make using their devices a far more fun experience, rather than just a dull utilitarian one.
You might be happy with what you have, but I think you're in a teeny, tiny minority when the entire market (i.e. not just techies) is taken into account, and that's where the money is to be made. Expect Android and WinMo to try to emulate Apple as much as possible, because if they don't, they will always struggle to lure people, their wallets, and the personal data they might want to mine away from Apple.
softdrinkviking - Wednesday, February 16, 2011 - link
If extra cores make browsing buttery-smooth, I'm all for it.
They shouldn't be drawing extra power from your phone unless they are getting some tangible use anyway (that is, assuming some level of power-gating).
I like GPU acceleration too, but there is no reason to ditch a wider architecture because its effects are not immediately discernible by people browsing phones in the Verizon store.
tviceman - Tuesday, February 15, 2011 - link
Nvidia's venture into the mobile space is strikingly similar to the trajectory of its first products. In terms of success:
Tegra 1 = NV1 (not successful at all)
Tegra 2 looks like it has become the RIVA 128 - a huge turnaround and very competitive.
Tegra 3 (Kal-El) could end up being the RIVA TNT/TNT2, and the transition to 28nm (probably Wayne) could propel Nvidia to being the #1 mobile CPU manufacturer, much like what GeForce did for Nvidia.
djgandy - Wednesday, February 16, 2011 - link
Well, they haven't sold anything yet. It's business as usual at Nvidia: delivering lots of hype and marketing.
Considering they struggle to keep their desktop parts running cool and with low power, maybe people should focus on the key aspect of a mobile chip, battery life, rather than salivating over cherry-picked CPU benchmarks and potential GPU performance which is already out there in pocket-melting form.
bplewis24 - Wednesday, February 16, 2011 - link
They already have an article discussing Tegra 2's battery life performance.
Brandon
tim851 - Tuesday, February 15, 2011 - link
What does Nicolas Cage's son have to do with all of this?
softdrinkviking - Wednesday, February 16, 2011 - link
lol
softdrinkviking - Wednesday, February 16, 2011 - link
Nicolas Cage's son is actually named "Kal-El." Seriously.
samirsshah - Tuesday, February 15, 2011 - link
how to compete.
j.harper12 - Tuesday, February 15, 2011 - link
Nvidia... I literally just committed myself to buying a new smartphone a year, despite only having a full upgrade every two years. Am I really going to have to buy a new smartphone every six months now?!? Really?!?! Not cool Nvidia... actually, pretty cool, but vicious as far as budgets go.
Here I was pining over the Optimus 3D...
MrSpadge - Wednesday, February 16, 2011 - link
You know, "new stuff exists" does not equal "have to buy it" ;pMrS
medi01 - Wednesday, February 16, 2011 - link
How many gigaherzopuxelshmixels does your current phone have?
How often do you have to charge it?
mesiah - Wednesday, February 16, 2011 - link
Sounds like Best Buy's buyback plan is in your future :D
AuDioFreaK39 - Wednesday, February 16, 2011 - link
Can't wait to have a 24-CPU core Nvidia Tegra mobile processor based on Iron Man architecture in 2014!! \m/
bplewis24 - Wednesday, February 16, 2011 - link
Pffft, while you're buying that I'll be waiting to purchase the 32-core SoC believed to be released in Q1 2015.
Gonemad - Wednesday, February 16, 2011 - link
As long as you don't need an arc reactor to power it, I'm game too.
If Nvidia can do SoC chips, can it do full-blown non-mobile CPUs too? Just asking... I guess it's just like a nuclear arsenal: it could do it in 2 years or less, it has all the tools, but it won't tell you if it did.
Lucian Armasu - Wednesday, February 16, 2011 - link
They're working on a customized ARM chip for servers, called Project Denver, which will be released in 2013. It's mostly focused on performance, so they will make it as powerful as an ARM chip can get around that time. It will also be targeted at PCs.
Enzo Matrix - Wednesday, February 16, 2011 - link
Superman < Batman < Wolverine < Iron Man?Kinda doing things in reverse order aren't you, Nvidia?
ltcommanderdata - Wednesday, February 16, 2011 - link
It's interesting that nVidia's Coremark slide uses a recent GCC 4.4.1 build on their Tegras but uses a much older GCC 3.4.4 build on the Core 2 Duo. I can't help but think nVidia is trying to find a bad Core 2 Duo score in order to make their own CPU look more impressive.
Lucian Armasu - Wednesday, February 16, 2011 - link
I think you missed their point. They're trying to say that ARM chips are quickly catching up in performance with desktop/laptop chips.
jordanp - Wednesday, February 16, 2011 - link
On the Tegra roadmap chart... it looks like that curve is leveling off at 100 on STARK... reaching the limit boundary of the 11nm node?!!
Lucian Armasu - Wednesday, February 16, 2011 - link
I think they'll reach 14 nm in 2014. IBM made a partnership with ARM to make 14 nm chips.
beginner99 - Wednesday, February 16, 2011 - link
... in a mobile phone? Most people only have 2 in their main PC. Agreed, those 2 are much more powerful, but still, it will end up the same as on the PC side: tons of cores and only niche software using them.
I still have a very old "phone only" mobile. Yesterday I had some time to kill and looked at a few smartphones. And I saw exactly what someone described here: they all seemed laggy and choppy, except the iPhone. I'm anything but an Apple fanboy (more like the opposite), but if I were a consumer with zero knowledge, just by playing with the phones I would choose to buy an iPhone.
jasperjones - Wednesday, February 16, 2011 - link
Did anyone look at the fine print in the chart with the Coremark benchmark?
Not only do they use more aggressive compiler flags for their products than for the T7200, they also use a much more recent version of gcc. At the very least, they are comparing apples and oranges. Actually, I'm more inclined to call it cheating...
Visual - Wednesday, February 16, 2011 - link
This looks like Moore's Law on steroids.
I guess (hope?) it is technically possible, simply because for a while now we've had the reverse thing - way slower progress than Moore's Law predicts. So for a brief period we may be able to do some catch-up sprints like this.
I don't believe it will last long though.
Another question is if it is economically feasible though. What impact will this have on the prices of the previous generation? If the competition can not catch up, wouldn't nVidia decide to simply hold on to the new one instead of releasing it, trying to milk the old one as much as they can, just like all other greedy corporations in similar industries?
And finally, will consumers find an application for that performance? It not being x86 compatible, apps will have to be made specifically for it and that will take time.
I for one can not imagine using a non-x86 machine yet. I need it to be able to run all my favorite games, i.e. WoW, EVE Online, Civ 5 or whatever. I'd love a lightweight 10-12 inch tablet that can run those on max graphics, with a wireless keyboard and touch pad for the ones that aren't well suited for tablet input. But having the same raw power without x86 compatibility will be relatively useless, for now. I guess developers may start launching cool new games for that platform too, or even release the old ones on the new platform where it makes sense (Civ 5 would make a very nice match to tablets, for example), but I doubt it will happen quickly enough.
Harry Lloyd - Wednesday, February 16, 2011 - link
I'm sick of smartphones. All I see here is news about smartphones, just like last year all I saw was news about SSDs.
Doesn't anything else happen in this industry?
theagentsmith - Wednesday, February 16, 2011 - link
The Mobile World Congress is happening right now in the nice city of Barcelona... almost every company involved in the mobile electronics sector is showing off new products; that's why you see only news about smartphones!
R3MF - Wednesday, February 16, 2011 - link
nvidia, you have not lost the magic!
Dribble - Wednesday, February 16, 2011 - link
@40nm the power draw would be too high for a phone, so I don't suppose there's much point having this processor in one until 28nm arrives.
However, for the new tablet market you have larger batteries, so you can target them with a higher power draw SoC (it's still going to be much, much smaller than any x86 chip, and I expect the big screen will still be sucking most of the power).
Impressive that they got it working the first time; it puts a lot of pressure on competitors who are still struggling to catch up with Tegra 2, let alone compete with this.
SOC_speculation - Wednesday, February 16, 2011 - link
Very cool chip, lots of great technology. But it will not be successful in the market.
A 1080p high-profile decode onto a tablet's SXGA display can easily jump into the 1.2GB/s range. If you drive it over HDMI to a TV and then run a small game or even a nice 3D game on the tablet's main screen, you can easily get into the 1.7 to 2GB/s range.
Why is this important? A 533MHz LPDDR2 channel has a max theoretical bandwidth of 4.3GB/s. Sounds like enough, right? Well, as you increase the frequency of DDR, your _actual_ bandwidth drops relative to the theoretical number due to latency issues. In addition, across workloads, the actual bandwidth you can get from any DDR interface is between 40 and 60% of the theoretical max.
So that means the single channel will get between 2.5GB/s (60%) and 1.72GB/s (40%). Trust me, ask anyone who designs SoCs; they will confirm the 40 to 60% bandwidth figure.
So the part will be restricted to use cases that current single core/single channel chips can do.
So this huge chip with 4 cores, 1440p capable, probably 150MT/s 3D, has an Achilles heel the size of Manhattan. Don't believe what Nvidia is saying (that dual channel isn't required). They know it's required but for some reason couldn't get it into this chip.
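For readers who want to check the arithmetic, here is a minimal sketch of the commenter's numbers. It assumes a single 32-bit LPDDR2 channel and treats the 40-60% efficiency range purely as the commenter's rule of thumb; none of these figures come from NVIDIA.

```python
# Back-of-the-envelope LPDDR2 bandwidth check (assumptions: 32-bit channel,
# double data rate; the 40-60% efficiency window is the commenter's rule of thumb).
clock_hz = 533e6                              # 533 MHz LPDDR2 clock
bus_bytes = 4                                 # assumed 32-bit (4-byte) channel
peak_gbs = clock_hz * 2 * bus_bytes / 1e9     # ~4.26 GB/s theoretical peak

for efficiency in (0.40, 0.60):
    print(f"{efficiency:.0%} efficiency: {peak_gbs * efficiency:.2f} GB/s usable")
# ~1.7 GB/s at 40% and ~2.6 GB/s at 60% - which is why the 1.7-2 GB/s video +
# HDMI + 3D scenario described above would leave little headroom on one channel.
```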
overzealot - Monday, February 21, 2011 - link
Actually, as memory frequency increases, bandwidth and latency improve.
araczynski - Wednesday, February 16, 2011 - link
So if I know that what I'm about to buy will be outdated by a factor of two or five not even a year later, I'm not very likely to bother buying at all.
kenyee - Wednesday, February 16, 2011 - link
Crazy how fast stuff is progressing. I want one... at least this might justify the crazy price of a Moto Xoom tablet.... :-)
OBLAMA2009 - Wednesday, February 16, 2011 - link
It makes a lot of sense to differentiate tablets from phones by giving them much faster CPUs, higher resolutions and longer battery life. Otherwise why get a tablet if you have a cell phone?
yvizel - Wednesday, February 16, 2011 - link
" NVIDIA also expects Kal-El to be somewhere in the realm of the performance of a Core 2 Duo processor (more on this later)."I don't think that you referred to this statement anywhere in the article.
Can you elaborate?
Quindor - Wednesday, February 16, 2011 - link
Seems to me NVidia might be pulling a Qualcomm, meaning they are going with what they have and are trying to stretch it out longer and wider before giving us the complete redesign/refresh. You can see this quite clearly at the MWC right now.
Not a bad strategy as far as I can tell right now. The only threat that I see is that Qualcomm is actually scheduled to release their new core design around the time Nvidia will be releasing the Kal-El.
So who's going to win that bet? ;) More IPC vs. raw GHz/cores. Quite a reversed world too, if you ask me, because Qualcomm was never big on IPC and went for the 1GHz hype.
Hopefully NVidia doesn't make the same mistakes as in the GPU market, building such revolutionary designs that they actually design "sideways" from the market, making their GPUs fantastic in certain areas which might not take off at all.
Mind you, I'm an NVidia fan... but it won't be the first time NVidia releases a revolutionary architecture that isn't as efficient as they thought it would be. ;)
Khato - Wednesday, February 16, 2011 - link
The only portion of the design that could be considered 'new' is the 1.5x GPU, but given NVIDIA's expertise in that area it's not too surprising that they'd have no issues executing that. The actual core changes consist of using another pre-made component, the MPE, per core and then doubling the number of cores... In other words, I'd be shocked if they got back first silicon and it -didn't- work flawlessly. That's kinda the point of licensing a design that's already fully tested and simply needs to be 'plugged in'.
As for the performance metrics demonstrated... The 'gaming' result is most likely due to the improved graphics, which is unquestionably NVIDIA's strength. The "Coremark 1.0" results meanwhile are yet more amusing. If that Kal-El score is indicative of final frequency performance, then I'd expect it to still be running at 1GHz, because Coremark is an unrealistic benchmark that scales linearly with the number of cores. It's also basically just an integer benchmark (more information is available on their site). In other words, that benchmark implies zero per-core performance increase for Kal-El over Tegra 2.
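To make the "scales linearly with cores" point concrete, here is a tiny sketch of the arithmetic using made-up scores; NVIDIA's actual Coremark numbers are not reproduced here.

```python
# Hypothetical scores, purely to illustrate the argument: if Coremark scales
# almost linearly with core count, dividing by the number of cores shows
# whether any per-core improvement is left over.
tegra2_score, tegra2_cores = 5000.0, 2   # made-up dual-core Tegra 2 score
kalel_score, kalel_cores = 10000.0, 4    # made-up quad-core Kal-El score

gain = (kalel_score / kalel_cores) / (tegra2_score / tegra2_cores)
print(f"Per-core gain: {gain:.2f}x")
# 1.00x here: the total score doubled only because the core count doubled,
# which is exactly the "zero per-core increase" scenario described above.
```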
supergoodness - Wednesday, February 16, 2011 - link
Being a HUGE Superman fan I have newfound love for NVIDIA since all these processors are codenamed superheroes.Kal-El – Superman
Wayne – Batman
Logan – Wolverine
Stark – Ironman
I guess I am easily entertained :)
ssiu - Wednesday, February 16, 2011 - link
That is going to cause a serious "Osborne effect" for all the Tegra 2 tablets that are almost available now, if one can get something "5x better" in 6 months.
sarge78 - Wednesday, February 16, 2011 - link
Looking forward to your analysis!
rs2 - Wednesday, February 16, 2011 - link
So nVidia is claiming better performance than a Core 2 Duo, with power consumption that is suitable for a smartphone? I find that a little hard to believe. Both Intel and AMD are still at least a couple of iterations away from that mark with their current low-power offerings, so nVidia's claims seem a bit suspect, if you ask me.
dagamer34 - Saturday, February 19, 2011 - link
The Core 2 Duo nVidia is comparing its chip to is about 5 years old and was originally built on the 65nm process. Tegra 3 will be on a 40nm process. Going from 65nm to 40nm gives you about 2.56x the number of transistors to work with. Also, the T7200 wasn't exactly the fastest that Intel had to offer at the time, while the Tegra 3 is bleeding edge.
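As a sanity check on that figure, ideal area scaling gives roughly the same number (this assumes perfect scaling, which real processes never quite achieve):

```python
# Ideal-scaling estimate: transistor density grows with the square of the
# linear shrink. 65 nm -> 40 nm is about a 1.6x linear shrink, hence ~2.56x
# density; the exact ratio gives ~2.64x. Real shrinks fall somewhat short.
old_nm, new_nm = 65.0, 40.0
print(f"Exact ratio:   {(old_nm / new_nm) ** 2:.2f}x")  # ~2.64x
print(f"Rounded (1.6): {1.6 ** 2:.2f}x")                # 2.56x, the quoted figure
```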
Bonesdad - Wednesday, February 16, 2011 - link
can they put Tony Stark above Bruce Wayne??? And Logan....really...
T2k - Wednesday, February 16, 2011 - link
Seriously: other than shills like Anand etc., does anyone else do these kinds of slavish pieces?
I mean full of BS PR, nothing concrete, only marketing shit for NV?
fr500 - Thursday, February 17, 2011 - link
We need a Zotac motherboard/nettop with this and a custom Android interface; use Office Live or whatever and you have a more than adequate system for daily usage, multimedia and even casual gaming (and not so casual with Gameloft ripoffs).
tkafafi - Thursday, February 17, 2011 - link
I was planning on getting a tablet this year, but I'm waiting to see the iPad 2 specs.
Seeing how fast the SoC chips are evolving is making me wonder whether I'd be better off waiting for next year. I guess that is always the case in the gadget world :)
From the looks of it, 2012 tablet features could include:
* Quad core ARM processors
* High resolution screens (the iPad 3 is rumored to be 4x this year's model)
* LTE? The Xoom is already committed to an LTE upgrade.
Hmmmm .... decisions decisions ....
heinzr - Thursday, February 17, 2011 - link
How can Nvidia claim that Kal-El offers 5x Tegra performance without getting challenged? The Coremark result is better by less than 2x.
tecknurd - Thursday, February 17, 2011 - link
Where are the power consumption numbers? All the reviews that I read still did not post those numbers, and I was hoping to see power consumption numbers here, but did not find any. Saying that Kal-El will have the same power consumption as Tegra 2 does not mean anything to me, because I do not know how much power a Tegra 2 realistically consumes. I know how much the BeagleBoard consumes, which is around 3 to 5 watts. I expect the PandaBoard to consume about double. If they aren't going to state the power consumption, I will just predict that the Kal-El's will be around 15 watts. Is this wrong or am I right?
Wilco1 - Friday, February 18, 2011 - link
At 40nm, a dual core 2GHz Cortex-A9 uses 2W. So 4 1.5GHz Cortex-A9 cores will use ~2.5-3W. So with the GPU, a total of ~4-5W seems reasonable.
While that seems like a lot, consider that the lowest power Atom needs 2W per core at 1.5GHz, and that you'll need 4 of those plus a fast GPU to get similar performance.
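A rough sketch of how that estimate falls out. The voltage-scaling assumption is mine, chosen only to reproduce the quoted range, not something stated in the comment:

```python
# Dynamic CPU power scales roughly with V^2 * f. Start from the quoted 2 W for
# a dual-core 2 GHz Cortex-A9 (1 W per core) and scale to 4 cores at 1.5 GHz.
w_per_core_2ghz = 2.0 / 2
freq_scale = 1.5 / 2.0
for v_scale in (1.0, 0.9):        # no voltage drop vs. an assumed 10% drop
    total = 4 * w_per_core_2ghz * freq_scale * v_scale ** 2
    print(f"4 cores @ 1.5 GHz (V x{v_scale}): {total:.2f} W")
# ~3.0 W and ~2.4 W respectively - in line with the ~2.5-3 W CPU estimate above,
# before adding the GPU and the rest of the SoC for the ~4-5 W total.
```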
ioannis - Friday, February 18, 2011 - link
surely Parker should have a place in that chart, don't you think? Or is he more of an AMD/ATI colour-themed hero?
IamEzio - Friday, February 18, 2011 - link
The NGP has a quad core Cortex A9..
dagamer34 - Saturday, February 19, 2011 - link
Engadget's video around 3:45 has the rep clearly stating that Kal-El will support high-profile H.264. It'll also handle a Blu-ray rip (and I don't really know of any commonly used video that's higher quality than that right now).