krumme - Monday, April 23, 2012 - link
There is a reason Intel is bringing 14nm to the Atoms in 2014.

The product here doesn't make sense. It's expensive and no better than the one before it, except for better gaming - that is, if the drivers work.
I don't know if the SB notebooks I have in the house are the same as the ones Jarred has. Mine didn't bring a revolution, but solid battery life, like the Penryn notebook and Core Duo I also have. In my world they're all more or less the same once you add an SSD for normal office work.
Loads of utterly uninteresting benchmarks don't mask the facts. This product excels where it's not needed, and fails where it should excel most: battery life.
Tri-gate is mostly a failure right now. There is no need to call it otherwise, and the "preview" reads 10% like a press release in my world. At least tri-gate is not living up to expectations. Sometimes that happens with technology development; it's a wonder it normally goes so smoothly for Intel, and a testament to their huge expertise. When the technology matures and Intel makes better use of it in the architecture, we will see huge improvements. Spare the praise until then; this is just wrong and bad.
JarredWalton - Monday, April 23, 2012 - link
Seriously!? You're going to mention Atom as the first comment on Ivy Bridge? Atom is such a dog as far as performance is concerned that I have to wonder what planet you're living on. 14nm Atom is going to still be a slow product, only it might double the performance of Cedar Trail. Heck, it could triple the performance of Cedar Trail, which would make it about as fast as Core 2 CULV from three years ago. Hmmm.....

If Sandy Bridge wasn't a revolution, offering twice the performance of Clarksfield at the high end and triple the battery life potential (though much of that is because Clarksfield was paired with power hungry GPUs), I'm not sure what would be a revolution. Dual-core SNB wasn't as big of a jump, but it was still a solid 15-25% faster than Arrandale and offered 5% to 50% better battery life--the 50% figure coming in H.264 playback; 10-15% better battery life was typical of office workloads.
Your statement regarding battery life basically shows you either don't understand laptops, or you're being extremely narrow-minded about Ivy Bridge. I was hoping for more, but we're looking at one set of hardware (i7-3720QM, 8GB RAM, 750GB 7200RPM HDD, switchable GT 630M GPU, and a 15.6" LCD that can hit 430 nits), and we're looking at it several weeks before it will go on sale. That battery life isn't a huge leap forward isn't a real surprise.
SNB laptops draw around 10W at idle, and 6-7W of that is going to everything besides the CPU. That means SNB CPUs draw around 2-3W at idle. This particular IVB laptop draws around 10W at idle, and all of the other components (especially the LCD) will easily draw at least 6-7W, which means once again the CPU is using 2-3W at idle. IVB could draw 0W at idle and the best we could hope for would be a 50% improvement in battery life.
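To put rough numbers on that ceiling, here's a minimal sketch (Python) using the ballpark wattages above; the 56Wh pack is an assumed stand-in for a typical 6-cell battery, not a measured spec:

```python
# Upper bound on idle battery-life gains from CPU improvements alone.
# All wattages are the rough figures from the comment above; the 56 Wh
# pack is an assumed stand-in for a typical 6-cell battery.
battery_wh = 56.0
platform_idle_w = 10.0   # whole-laptop idle draw (LCD, chipset, RAM, HDD...)
cpu_idle_w = 3.0         # CPU's share of that at idle, roughly 2-3 W

baseline_h = battery_wh / platform_idle_w                   # ~5.6 h
magic_cpu_h = battery_wh / (platform_idle_w - cpu_idle_w)   # CPU at 0 W

gain = magic_cpu_h / baseline_h - 1
print(f"baseline: {baseline_h:.1f} h, zero-watt CPU: {magic_cpu_h:.1f} h "
      f"(+{gain:.0%})")   # ~+43%; call it ~50% as the absolute ceiling
```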
As for the final comment, 22nm and tri-gate transistors are hardly a failure. They're not the revolution many hoped for, at least not yet. Need I point out that Intel's first 32nm parts (Arrandale) also failed to eclipse their outgoing and mature 45nm parts? I'm not sure what the launch time frame is for ULV IVB, but I suspect by the time we see those chips 22nm will be performing a lot better than it is in the first quad-core chips.
From my perspective, to shrink a process node, improve performance of your CPU by 5-25%, and keep power use static is still a definite success and worthy of praise. When we get at least three or four other retail IVB laptops in for review, then we can actually start to say with conviction how IVB compares to SNB. I think it's better and a solid step forward for Intel, especially for lower cost laptops and ultrabooks.
If all you're doing is office work, which is what it sounds like, you're right: Core 2, Arrandale, Sandy Bridge, etc. aren't a major improvement. That's because if all you're doing is office work, 95% of the time the computer is waiting for user input. It's the times where you really tax your PC that you notice the difference between architectures, and the change from Penryn to Arrandale to Sandy Bridge to Ivy Bridge represents about a doubling in performance just for mundane tasks like office work...and a lot of people would still be perfectly content to run Word, Excel, etc. on a Core 2 Duo.
usama_ah - Monday, April 23, 2012 - link
Tri-gate is not a failure; this move to tri-gate wasn't expected to bring any crazy amount of performance benefit. Tri-gate was necessary because of the limitations (leakage) of ever smaller transistors. Tri-gate has nothing to do with the architecture of the processor per se; it's more about how each individual transistor is created at such a small scale. Architectural improvements are key to significant improvements.

Sandy Bridge was great because it was a brand new architecture. If you have been even half-reading what they post on AnandTech, Intel's tick-tock strategy dictates that this move to Ivy Bridge would be small improvements BY DESIGN.
You will see improvements in battery life with the NEW architecture AFTER Ivy Bridge (when Intel stays at 22nm), the so-called "tock," called "Haswell." And yes, tri-gate will still be in use at that time.
krumme - Monday, April 23, 2012 - link
As I understand tri-gate, it provides the opportunity for even finer-grained control of power for the individual transistor, by using different numbers of gates. If you design your architecture to the process (using that opportunity - as IB does not, but the first 22nm Atom apparently does), there should be "huge" savings.

I assume by BY DESIGN you mean "by process," btw.
In my world, process improvement is key to most industrial production, with tools often being the weak link. The process decides what is possible in your design. That's why Intel has spent billions "just" mounting the right equipment.
JarredWalton - Monday, April 23, 2012 - link
No, he means Ivy Bridge is not the huge leap forward by design -- Intel intentionally didn't make IVB a more complex, faster CPU. That will be Haswell, the 22nm tock to the Ivy Bridge tick. Making large architectural changes requires a lot of time and effort, and making the switch between process nodes also requires time and effort. If you try to do both at the same time, you often end up with large delays, and so Intel has settled on a "tick tock" cadence where they only do one at a time.

But this is all old news and you should be fully aware of what Intel is doing, as you've been around the comments for years. And why is it you keep bringing up Atom? It's a completely different design philosophy from Ivy Bridge, Sandy Bridge, Merom/Conroe, etc. Atom is more a competitor to ARM SoCs, which have roughly an order of magnitude less compute performance than Ivy Bridge.
krumme - Monday, April 23, 2012 - link
- Intel is speeding up Atom development, not using depreciated equipment for the future.
- Intel invests heavily to get into new business areas and has done so for years.
- Haswell will probably be slimmer on the CPU part.
The reason they do so is that the need for CPU power outside of the server market is stagnating. New third-world markets are emerging. And everything is turning mobile - it's all over your front page now, I can see.
The new Atom will probably be adequate for most (like, say, Core 2 CULV). Then they will have the perfect product. It's about mobility and price and price. Haswell will probably be the product for the rest of the mainstream market, leaving even less room for the dedicated GPU.
IB is an old-style desktop CPU, maturing a not-quite-ready 22nm tri-gate process. It was designed to fight a Bulldozer that never arrived. That's why it doesn't impress. And you can tell Intel knows it, because the mobile lineup is so slim.
The market has changed. AMD's share price has rocketed even though their high-end CPU failed, because the Atom-sized Bobcat and the old-technology Llano could enter the new market. I could not have imagined the success of Llano. I didn't understand the purpose of it, because Trinity was coming so soon after. But the numbers speak for themselves. People buy a user experience where it matters, at the lowest cost - not PCMark, encoding times, zip, unzip.
You have to use new benchmarks. And they have to be reinvented again. They have to make sense. Obviously the CPU has to play a lesser role and the rest a bigger one. You have a very strong team, if not the strongest out there. Benchmark methodology should be at the top of your list and take up a lot of your development time.
JarredWalton - Monday, April 23, 2012 - link
The only benchmarks that would make sense under your new paradigm are graphics and video benchmarks - well, and battery life as well - because those are the only areas where a better GPU matters. Unless you have some other suggestions? Saying "CPU speed is reaching the point where it really doesn't matter much for a large number of people" is certainly true, and I've said as much on many occasions. Still, there's a huge gulf between Atom and Core 2, and there are many tasks where CULV would prove insufficient.

By the time the next Atom comes out, maybe it will be fixed in the important areas so that stuff like YouTube/Netflix/Hulu all work without issue. Hopefully it also supports at least 4GB RAM, because right now the 2GB limit along with bloated Windows 7 makes Atom a horrible choice IMO. Plus, margins are so low on Atom that Intel doesn't really want to go there; they'd rather figure out ways to get people to continue paying at least $150 per CPU, and I can't fault their logic. If CULV became "fast enough" for everyone, Intel's whole business model goes down the drain.
Funny thing is that even though we're discussing Atom and by extension ARM SoCs, those chips are going through the exact same rapid increases in performance. And they need it. Tablets are fine for a lot of tasks, but opening up many web sites on a tablet is still a ton slower than opening the same sites on a Windows laptop. Krait and Tegra 3 are still about 1/3 the amount of performance I want from a CPU.
As for your talk about AMD share prices, I'd argue that AMD share prices have increased because they've rid themselves of the albatross that was their manufacturing division. And of course, GF isn't publicly traded and Abu Dhabi has plenty of money to invest in taking over CPU manufacturing. It's a win-win scenario for those directly involved (AMD, UAE), though I'm not sure it's necessarily a win for everyone.
bhima - Monday, April 23, 2012 - link
I figure Intel wants everyone to want their CULV processors since they seem to charge the most for them to the OEMs, or are the profit margins not that great because they are a more difficult/expensive processor to make?

krumme - Tuesday, April 24, 2012 - link
Yes - video and gaming are what matter for the consumer now; everything else is okay as it will - hopefully - be in 2014. What matters is SSD, screen quality, and everything else - just not CPU power. It just needs to get far less space. The CPU getting so much space is just old habit for us old geeks.

AMD getting rid of the GF burden has been in the plan for years. It's known and cannot influence the share price. Basically, the (late) move to a mobile focus, and the excellent execution of those consumer-shaped (not reviewer-shaped) APUs, is part of the reason.
The reviewers need to shift their mindset :) - btw, it's my impression Dustin is more in line with what the general consumer wants. Ask him if he thinks the consumer wants a new SSD benchmark with 100 hours of 4K reading and writing.
MrSpadge - Monday, April 23, 2012 - link
No, the finer granularity is just a nice side effect (which could probably be used more aggressively in the future). However, the main benefit of tri-gate is more control over the channel, which enables IB to reach high clock speeds at comparatively very low voltages, and at very low leakage.

krumme - Tuesday, April 24, 2012 - link
Whatever the benefit is, we don't see it now. Failure - hands down.
JarredWalton - Tuesday, April 24, 2012 - link
FUD, hands down.

BSMonitor - Monday, April 23, 2012 - link
http://www.anandtech.com/show/4313/intel-announces...

At these power levels, the benefit is not as noticeable.
The benefit comes in the extreme low power envelope. None of the mobile processors released today are of that variety.
mgoldshteyn - Monday, April 23, 2012 - link
So much for lighter laptops with Ivy Bridge.

mgoldshteyn - Monday, April 23, 2012 - link
With a mere 6-cell battery, to boot!

JarredWalton - Monday, April 23, 2012 - link
Lighter laptops are a design decision by the OEM, not the CPU. Putting in switchable graphics and all the other stuff adds weight, but ASUS chose to go for a more affordable product rather than spending a lot of time and money on industrial design and weight. I don't think you'll see IVB end up being heavier on average compared to SNB, but there's no inherent reason for it to be lighter either. Use more efficient and lighter cooling materials along with lighter materials for the chassis and you could certainly get a 15.6" IVB laptop down to 4.5 lbs., but you could do that with SNB as well (e.g. the Sony VAIO SE).

mabellon - Monday, April 23, 2012 - link
That's because Intel has only launched the desktop line and high-end mobile chips. The launch of the CPUs destined for ultrabooks - the super-efficient (~17W) IVB chips - was delayed.

--------
The initial release includes 13 quad-core processors, most of which will be targeted at desktop computers.
Further dual core processors, suitable for ultrabooks - thin laptops - will be announced "later this spring".
[http://www.bbc.com/news/technology-17785464]
gorash - Monday, April 23, 2012 - link
Nice... if only MacBooks had those specs with that price. I don't really need the optical drive though.

dwade123 - Monday, April 23, 2012 - link
And these companies continue to make crappy laptops. Seriously, with power-efficient Ivy Bridge and no discrete GPU, they sure have terrible battery life. This is why MacBooks are among the better laptops out there and deserve to be the model that others copy.

xpsuser - Sunday, May 13, 2012 - link
I have an HP DVT8 (weighs a ton with the regular battery). I bought it for the 18" screen and Blu-ray player - unfortunately the HP software (for Blu-ray/DVD playback) is full of bugs! I got the Dell XPS 17 about a year ago. Dell knows how to make a laptop - it has the extended battery (lasted about 5.5 hrs new), it is light (I can easily carry it with one hand - could barely do that with the HP!), and they use Cyberlink PowerDVD for viewing Blu-rays/DVDs (no problems!). I like the Dell but I don't like the lack of choices - by that I mean I can't opt out of their anti-virus choice, etc.

JarredWalton - Monday, April 23, 2012 - link
Temperature is related to the amount of cooling and the speed of the fans. The N56VM runs very quiet -- I don't have numbers, but it never got really loud and I'd guess it maxes out at around 35dB. As for temperatures, I just did some load testing to see what sort of temperatures we get. The i7-3720QM hits 86-89C on the four cores with various stress tests.

Is that hot? Sure. But again, you can't compare temperatures in a vacuum; the Sony VAIO SE reaches similar temperatures on a dual-core SNB CPU, but the fan in the VAIO is much, much louder than the N56VM's. ASUS should probably bump the fan speed up a notch, IMO, but it's one of the quietest laptops I've tested under load.
GDSquared - Monday, April 23, 2012 - link
I'm certainly no expert, but if Intel made it so that the integrated GPU could ALSO supplement a discrete GPU, every gamer on the planet would want one.

Surely there are some functions that could be off-loaded to an integrated GPU and thereby free up discrete GPU resources?
Failing that, NVIDIA could at the very least toss a gazillion dollars Intel's way to let the integrated GPU handle PhysX!
Zink - Monday, April 23, 2012 - link
Even AMD hybrid crossfire doesn't work well. It would probably be a driver disaster.

JarredWalton - Monday, April 23, 2012 - link
I think you mean that NVIDIA would want a bunch of money from Intel in order to let them license PhysX for their IGP (assuming it could handle the workload, which I'm not at all sure it could!). PhysX currently needs something around the level of a GTX 460 before it's really useful and won't seriously drop performance. As much as HD 4000 is an improvement over HD 3000, the GTX 460 still has about five times the compute and shader performance.

Zink - Monday, April 23, 2012 - link
Even AMD hybrid crossfire doesn't really bring much benefit. It would probably be a huge driver fiasco.

A5 - Monday, April 23, 2012 - link
It would also be slower and draw slightly more power.

Angengkiat - Monday, April 23, 2012 - link
Hi Jarred,

Can you please help us verify whether the notebook supports triple-display (1 internal, 2 external) output, since it is using the HM77 chipset? Thanks!
Regards
EK
JarredWalton - Tuesday, April 24, 2012 - link
Hi Angengkiat,

I just checked and this laptop does not support triple displays. You can connect two external displays and disable the internal display, but it appears ASUS did not include the necessary third TMDS transmitter or whatever.
Angengkiat - Sunday, April 29, 2012 - link
Thanks for your reply! :)

Angengkiat - Sunday, April 29, 2012 - link
I wonder if this applies to all Ivy Bridge notebooks (or HM77-powered ones), because my VAIO Z with NVIDIA GT 325 graphics can't support dual output. :(

JarredWalton - Tuesday, May 1, 2012 - link
Ivy Bridge is technically capable of supporting three displays, but it needs three TMDS transceivers in the laptop (or on the desktop motherboard) to drive the displays simultaneously. Some laptop makers will likely save $0.25 or whatever by only including two, but others will certainly include the full triple head support.

JarredWalton - Thursday, May 10, 2012 - link
Just a quick correction, in case anyone is wondering:

For triple displays, Ivy Bridge needs to run TWO of the displays off of DisplayPort, and the other can be LVDS/VGA/HDMI/DVI. I can tell you exactly how many laptops I've seen with dual DP outputs: zero. Anyway, it's an OEM decision, and I'm skeptical we'll see 2xDP any time soon.
JarredWalton - Tuesday, April 24, 2012 - link
"I'm not sure what your point is, at all"? You cannot be serious. Either you have no understanding of thermodynamics, or you're just an anonymous Internet troll. I don't know what your problem is, rarson, but your comments on all the Ivy Bridge articles today are the same FUD with nothing to back it up.Ivy Bridge specifications allow for internal temperatures of up to 100C, just like most other Intel chips. At maximum load the chip in the N56VM hits 89C, but it's doing that with the fan hardly running at all and generating almost no noise compared to other laptops. Is that so hard to understand? A dual-core Sandy Bridge i7-2640M in the VAIO SE hits higher temperatures while generating more noise. I guess that means Sandy Bridge is a hot chip in your distorted world view? But that would be wrong as well. The reality is that the VAIO SE runs hot and loud because of the way Sony designed the laptop, and the N56VM runs hot and quiet because of the way ASUS designed the laptop.
The simple fact is Ivy Bridge in this laptop runs faster than Sandy Bridge in other laptops, even at higher temperatures than some laptops that we've seen. There was a conscious decision to let internal CPU temperatures get higher instead of running the fans faster and creating more noise. If the fan were generating 40dB of noise, I can guarantee that the chip temperature wouldn't be 89C under load. Again, this is simple thermodynamics. Is that so difficult to understand?
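(For the curious: that "simple thermodynamics" can be sketched as a first-order steady-state model. All constants below are illustrative assumptions, not measurements from the N56VM.)

```python
# First-order steady-state model: T_core = T_ambient + P * R_th, where the
# effective thermal resistance R_th (degrees C per watt) falls as the fan
# spins up. All constants are illustrative assumptions, not measured values.
def core_temp_c(power_w, fan_fraction, t_ambient_c=30.0,
                r_quiet=1.4, r_full=0.7):
    r_th = r_quiet - (r_quiet - r_full) * fan_fraction
    return t_ambient_c + power_w * r_th

for fan in (0.25, 0.50, 1.00):
    print(f"fan at {fan:.0%}: ~{core_temp_c(45.0, fan):.0f} C")
# fan at 25%: ~85 C  -- quiet, near the 86-89C observed on this laptop
# fan at 50%: ~77 C
# fan at 100%: ~62 C -- cooler, but at 40dB-class noise levels
```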
How do we determine what Ivy Bridge temperatures are like "in general"? How do you know that it's a "hot chip"? You don't, so you're just pulling stuff out of the air and making blanket statements that have no substance. It seems you either work for AMD and think you're doing them a favor with these comments (you're not), or you have a vendetta against Intel and you're hoping to make people in general think Ivy Bridge is bad just because you say so (it's not).
mtoma - Tuesday, April 24, 2012 - link
I really don't want to play dumb - but if I get an honest answer I'll be pleased: Jarred said that the panel used in Asus N56VM is an LG LP156WF1. OK - how can I find the display type in a specific laptop? I have a Lenovo T61 and... I need help. I want to know the manufacturer, display type, viewing angles. Thanks!JarredWalton - Tuesday, April 24, 2012 - link
I use Astra32 (www.astra32.com), a free utility that will usually report the monitor type. However, if the OEM chooses to overwrite the information in the LCD firmware, you'll get basically a meaningless code. You can also look at LaptopScreen.com and see if they have the information/screen you need (http://www.laptopscreen.com/English/model/IBM-Leno...
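If you'd rather skip extra utilities, the same panel identity can usually be read from the EDID data Windows already holds. Here's a minimal sketch using the third-party wmi Python package; as noted above, if the OEM overwrote the LCD firmware strings you'll still get a meaningless code, and for viewing angles you'd still consult a site like LaptopScreen.com:

```python
# Read the panel's EDID identity via WMI (WmiMonitorID in root\wmi).
# Requires the third-party 'wmi' package: pip install wmi
import wmi

def decode(codepoints):
    # EDID strings come back as arrays of integer code points, NUL-padded.
    return "".join(chr(c) for c in codepoints if c).strip()

for mon in wmi.WMI(namespace="root\\wmi").WmiMonitorID():
    print("Manufacturer:", decode(mon.ManufacturerName or []))    # e.g. 'LGD'
    print("Product code:", decode(mon.ProductCodeID or []))
    print("Name:", decode(mon.UserFriendlyName or []))            # e.g. 'LP156WF1'
```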
AUO 10.1 "SD + B101EVT03.2 1280X800 Matte Laptop Screen Grade A +I hope to help you!!!
AUO BOE CMO CPT IVO 10.1 14.0 15.6 LED CCFL whoalresell
Wholesale Laptop Screens www.globalresell.com
Spunjji - Thursday, April 26, 2012 - link
Calm down there. His comment is pointing out that measuring the temperatures of this laptop will tell you nothing about how hot mobile Ivy Bridge is as a platform. We need more information. It looks like it's not as cool as Intel marketing wants everyone to believe, but we just don't know yet.

JarredWalton - Thursday, April 26, 2012 - link
The real heart of the matter is that more performance (IVB) just got stuffed into less space. 22nm probably wasn't enough to dramatically reduce voltages and thus power, so the internal core temperatures are likely higher than SNB in many cases, even though maximum power draw may have gone down.

For the desktop, that's more of a concern, especially if you want to overclock. For a laptop, as long as the laptop doesn't get noisy and runs stable, I have no problem with the tradeoff being made, and I suspect it's only a temporary issue. By the time ULV and dual-core IVB ship, 22nm will be a bit more mature and have a few more kinks ironed out.
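As a back-of-the-envelope illustration of "more performance in less space" (the die areas are the commonly cited figures for the quad-core parts, assumed here rather than taken from this preview):

```python
# Same ballpark power, smaller die => higher power density => hotter cores
# for a given cooler. Die areas are commonly cited figures (assumptions).
tdp_w = 45.0
dies_mm2 = {"Sandy Bridge 4C (32nm)": 216.0, "Ivy Bridge 4C (22nm)": 160.0}

for name, area in dies_mm2.items():
    print(f"{name}: {tdp_w / area:.2f} W/mm^2")
# ~0.21 vs ~0.28 W/mm^2: roughly 35% more heat per unit area at equal TDP.
```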
raghu78 - Wednesday, May 2, 2012 - link
Even though you have mentioned that a 45W Llano would have improved the gaming performance, it would have been better to include such a configuration in your testing. Given that you were testing a 45W high-end next-gen Core i7 product, which itself skews the balance in Intel's favour given the vast difference in CPU processing capability, the least you could have done was include a similar-wattage AMD Llano SKU. The result would be that, other than Batman and Skyrim, the rest would all be better on the HD 6620G. As they say, "a picture is worth a thousand words." All your charts cannot be undone by a small note at the end of the charts. The damage has been done.

My opinion is that objective comparisons can only be made under similar parameters. It's even more critical in the notebook market, which has strict thermal restrictions. The desktop market is slightly less restrictive, except for HTPCs, which need 65W or lower processors. When the comparisons for 35W Trinity are made, they should be against 35W Ivy Bridge Core i3 and Core i5. By benching an Ivy Bridge Core i7 with a 45W rating and comparing it with a 35W Trinity, we aren't making a fair and objective comparison. Also, the fact that the Ivy Bridge Core i7 and Trinity are not in the same price segment makes things worse. I hope my comments are not taken negatively.
JarredWalton - Thursday, May 10, 2012 - link
Welcome to the party; you're unfortunately a week or more late. In an ideal world, you're correct: we'd compare 35W to 35W and 45W to 45W, and we'd keep all other components (RAM, SSD, etc.) as close to the same as possible. With laptops, many of those items are completely out of our control. For instance, we have never had the chance to test a laptop with a 45W Llano APU. Yup: NEVER. I don't make enough money to go out and purchase hardware for testing, and AMD apparently doesn't deem the 45W TDP chips important enough to send out for reviews. If you want to complain, complain to AMD and their partners; I can only test what I'm given.
AUO 10.1 "SD + B101EVT03.2 1280X800 Matte Laptop Screen Grade A +I hope to help you!!!
AUO BOE CMO CPT IVO 10.1 14.0 15.6 LED CCFL whoalresell
Wholesale Laptop Screens www.globalresell.com
xpsuser - Sunday, May 13, 2012 - link
Intel/ASUS comments that these laptops will probably be in the range of $1100 to $1300? What? Amazon offers an ASUS laptop with one of the low-end processors/Blu-ray/12GB RAM/750GB+256GB SSD/17" 1080p for $1700?! Well, so much for the fairy tale!

leovande321 - Wednesday, May 15, 2013 - link
AUO 10.1 "SD + B101EVT03.2 1280X800 Matte Laptop Screen Grade A +I hope to help you!!!
AUO BOE CMO CPT IVO 10.1 14.0 15.6 LED CCFL whoalresell
Wholesale Laptop Screens www.globalresell.com
TybeeJoe - Wednesday, May 16, 2012 - link
Whenever I see "prepayment via bank transfer, Western Union" and the like, I get REALLY nervous. Has anyone had any experience with this guy?

mira - Sunday, May 20, 2012 - link
Hello guys... I want to know how to turn on the keyboard lighting for the ASUS N56VM series...
I tried pressing Fn + F3/F4... but it still did not work...
Please, somebody help me... Thank you very much...
jadedcorliss - Wednesday, June 20, 2012 - link
According to Intel's site, it sounds like newer drivers improve Portal 2 and deal with the memory leak in StarCraft 2. Worth an update note for this? The site also mentions that patching Battlefield 3 and other games helps as well.
The laptop is out now. Please review it.
AUO 10.1 "SD + B101EVT03.2 1280X800 Matte Laptop Screen Grade A +I hope to help you!!!
AUO BOE CMO CPT IVO 10.1 14.0 15.6 LED CCFL whoalresell
Wholesale Laptop Screens www.globalresell.com