I consider any laptop with a dedicated card with at least 16 SPs and at least 512MB of dedicated memory to be a gaming laptop; 16 SPs IS the ABSOLUTE minimum, but that should be enough to run everything except maybe Crysis (which I really hate anyway) at 720p or higher with playable frame rates. Who cares about eye candy, as long as the game runs smoothly? Desktops are for eye candy; laptops and consoles are just for gaming.
Man, as someone who hasn't used a desktop as a personal computer for the last 4 years, the move to a laptop was a very difficult one. You have the convenience but lack the performance. Now couple this processor with two SSDs in RAID and 8GB of RAM in a 64-bit Windows 7 laptop and you finally have a beast of a machine in your probably burning lap.
I'd love to get that setup and finally not feel as though I'm losing out to a desktop in any way. The only true limitation is Crysis, but seriously, that game sucked anyway!
I have to say the mobility right now is more of a draw than a higher level of eye candy. Now mind you, I have both laptop and desktop so if I really crave eye candy, I can go to the desktop room and game.
But with two little ones, I find that my 'gaming time' is often measured in 20 min spans here and there, and that being able to surf or get some work done wherever the kids happen to be is a benefit that I very much enjoy. So I can run Witcher on my old 7950M, windowed @ 1620 and have the settings lower and be "ok" with that.
Mind you, I do crave a bit more oomph, a more modern machine, but I can bide my time. The mobility is very nice, and I don't LAN (no time!). Having SLI or a higher-end mobile chip simply means the laptop is "acceptable" for a longer period of its life.
I won't argue the bang for the buck. Mobile gaming is pricey and not cost effective. But the mobility is nice, the space taken up by a machine I can throw in the closet is also nice. And within some limits, lower res or lower eye candy is acceptable as payment for that mobility.
Now I just need USB 3 (USB changes only happen every 10 years or so) and then I might consider upgrading.
When checking my laptop for Ubuntu vs. XP battery life, I accidentally ran my first XP test with my standard undervolt on; it didn't seem to impact battery life any.
Am I the only one who thinks these chips are ridiculously overpriced? I would never drop more than $350 on a CPU even in a desktop, so it just doesn't seem economical to pay $1K for a laptop processor, especially if it's only running at a 2.0GHz base clock.
The cheaper options seem really underwhelming and like others have said, the thermal output of these chips just doesn't make sense for a laptop.
The big problem with gaming laptops is that they aren't balanced. The display is always at a higher res than the CPU/GPU can drive.
I'd get a gaming laptop if it can drive all current gen games at max settings at the native res of the panel it comes with. Even if that res is <1080p.
If it can't do that, then I've spent a lot of money on something that's already behind the desktop I can get for cheaper.
Shouldn't the quad-core mobiles be 32nm and the dual cores 45nm? I know that's not the case, but what was Intel thinking? It doesn't even look like there's a refresh of the quads to 32nm in the spring.
Crazy, cuz they look like good chips with a shrink.
I get the impression that Intel isn't really interested in making a low power quad core for laptops. My guess is that they see dual core + HT as the solution for users who need good battery life, and anyone needing a quad-core "portable" won't be running off battery (or will have a much larger battery). I don't agree with that; I think there is a small but growing percentage of users who could benefit from quad-core performance on a notebook and still want/need good battery runtime. It's not for mainstream users yet, but it will be in a few years. For now, we have to choose between extra CPU power, extra battery life, or extra weight.
After my Alienware M17, which I'm VERY happy with, I'd get this one, veeery sweet machine and cheap too, just $1k. Why bother with a stationary desktop when this beautiful Asus lappy can get everything done, every game including Crysis?
Jarred just can't understand the beauty of running your games literally anywhere. Sad, just sad. Grow up AT guys, desktop is the past and lappys are the future :P
I review laptops, I run tests on them, and gaming performance is still at least two generations behind desktops. That $1000 ASUS is a good gaming laptop, to be sure, and a much better value than a $3000+ Alienware. However, a single GTX 260M is a 9800 GTX desktop part equivalent (or GTS 250 if you prefer the latest name change).
There's a big reason I don't test with AA enabled on *any* of the laptops: they can't handle it. Heck, these $3000+ laptops struggle to run 1080p at times. Is it childish to think that my 30" display with 4870X2 provides a far more compelling experience than a laptop?
If you love gaming laptops, I've got full reviews of these three (and your M17x) in the works. And for every comment like yours stating that I need to grow up and get past the desktop, I'll get 10 comments on my high-end reviews saying, "Who in their right mind buys these things!?" The answer is "Pirks" apparently, along with people that want to go to LAN parties (a very small minority of gamers, considering the millions playing games) who also have the money to spend on gaming laptops (an even smaller minority).
Anyway, I loved the Gateway FX series kicking off the $1300 gaming laptop era, and the ASUS G50 series (including your linked laptop) is great as well... for what it is. A convenient laptop to me usually doesn't get an hour of battery life, have an extremely hot bottom (no, not like that...), or weigh 12 pounds. Okay, that ASUS probably weighs 7 pounds and gets 90 minutes of battery life, and it's not as toasty as a Clevo DTR setup. That's why it will sell a lot more units than the W87CU, though.
The problem is, Intel's "gaming laptops" slide includes stuff like this VAIO (http://www.bestbuy.com/site/olspage.jsp?skuId=9379...). GeForce 9600M is NOT a gaming solution by any stretch of the imagination. It wasn't even a gaming solution when it launched, with its paltry 32 SPs. It can handle WoW at moderate detail settings and resolution, sure, and it's faster than any IGP, but Crysis and many other games will still struggle.
My main laptop has a Quadro FX 770M (roughly equivalent to the 9600M GT you linked to) powering a 15.4" 1920x1200 screen and I have to disagree with you a little bit there.
It's very true that when playing games like Far Cry 2 zebras look more like dirty white horses and some of the jagged edges could put your eye out, but at the end of the day the gaming experience isn't really all that different from my desktop. Other titles are similar; yes, graphics sliders sometimes need to be set to the low end of the scale, but most games are still very playable and you often really don't miss the "shiny" graphics options.
At this point, it'd take a really solid implementation of something like AMD's Eyefinity to make me want to build another super high-end gaming desktop. Maybe I'm just getting old, but current advances in graphics technology just don't seem to wow as much as they did 10 years ago - subjectively they often feel more like baby steps than giant leaps.
The reason I bring all this up is to suggest that a short paragraph on your subjective gaming impressions going between the laptops and your desktop might be a good idea. Just noting that AA is off often isn't enough for us mortals that don't have a few $3000 laptops and $2000 towers just lying around. ;)
Are the 45W/55W TDPs for Clarksfield really that much of a concern compared to previous generation CPUs? Penryn processors may have had 35W/45W TDPs, but they also had separate northbridges, with the PM45 rated at 7W. Clarksfield, with its integrated memory and PCIe controllers, basically has the northbridge integrated and absorbs its TDP rating. Once the northbridge is taken into account, the 45W/55W TDP of Clarksfield is only 3W higher than previous generation Penryn+northbridge combinations.
I do agree, though, that the low clock speeds, relatively high TDPs, and high prices of Clarksfield make it unlikely that we will see quad cores break into the mainstream mobile market in this generation. While it may not be needed on the desktop, where the thermal constraints are relaxed, I think a 32nm quad-core Westmere derivative is definitely needed in the mobile market. It's unfortunate that we'll likely have to wait at least another year before we see a 32nm mobile quad core with Sandy Bridge.
You're correct: a current C2Q at 2.0-2.53GHz with a 45W TDP plus PM45 + ICH9M (9.5W combined TDP) is 54.5W TDP, vs. a Core i7 mobile (45W/55W TDP) plus PM55 (3.5W TDP) at 48.5W/58.5W TDP, each without IGP. Put the C2Q with an NVIDIA 9400M (G) chipset (12W TDP) and it looks a bit better at 57W TDP including a good IGP, but it's still in the same power range as the i7 and, as shown in the article, it's notably slower.
However, you're overlooking something. Intel currently offers C2Ds at 1.6GHz with a 20W TDP and 2.13GHz with a 17W TDP, indicating that they should be able to make 20W and 34W TDP C2Qs on their current 45nm process. In fact, they've got a Xeon L5408 (2.13GHz quad core) @ 40W TDP, so clearly they can get to the numbers I suggest. Couple those "possible" CPUs with an NVIDIA 9400M (G) chipset @ 12W TDP, and you're looking at a 1.6GHz C2Q w/ 32W TDP or 2.13GHz C2Q w/ 46W TDP for the complete CPU, chipset, and a good IGP. Or, go with the PM45+ICH9M (9.5W TDP combined) for 29.5W or 43.5W TDP for complete systems without GPU.
Compare that to a mobile i7 + PM55 at 48.5W - 58.5W TDP, without GPU, that's 34%-64% more peak load power than what Intel could theoretically do with C2Q today with similar base clock speeds. Yes, the i7 has power gating that will allow it to use lower power at idle, and it has Turbo mode which will allow it to perform better when only 1 or 2 cores are in use, but Turbo mode will still use more peak power than these hypothetical C2Q. Of course, you can turn off Turbo mode, but then you're back to performance much closer to the C2Q.
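The platform TDP arithmetic above can be checked with a quick script (a sketch using only the spec-sheet figures quoted in this thread; these are nominal TDP sums, not measured power draw):

```python
# Platform TDP totals, using the nominal figures quoted above.
# These are spec-sheet TDP sums, not measured power draw.
platforms = {
    "C2Q (45W) + PM45/ICH9M (9.5W)":          45 + 9.5,   # 54.5W
    "Core i7 mobile (45W) + PM55 (3.5W)":     45 + 3.5,   # 48.5W
    "Core i7 mobile (55W) + PM55 (3.5W)":     55 + 3.5,   # 58.5W
    "Hypothetical 1.6GHz C2Q (20W) + 9400M":  20 + 12,    # 32W, incl. IGP
    "Hypothetical 2.13GHz C2Q (34W) + 9400M": 34 + 12,    # 46W, incl. IGP
}
for name, watts in platforms.items():
    print(f"{name}: {watts}W")

# Peak-power premium of the i7 platforms over the hypothetical
# C2Q + PM45/ICH9M combos (29.5W and 43.5W, both without GPU):
low_end  = 48.5 / (20 + 9.5) - 1   # ~0.64, i.e. "64% more"
high_end = 58.5 / (34 + 9.5) - 1   # ~0.34, i.e. "34% more"
print(f"{high_end:.0%} to {low_end:.0%} more peak power")
```

The 34%-64% range quoted above falls straight out of those two ratios.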
Compared to what Intel actually offers today, the new chips are an improvement for those who need quad core performance in a portable. However, compared to what they've shown that they could offer today if they chose to, the Clarksfield chips don't look like they're much of an improvement. If Intel applied the power gating and/or turbo features to the C2Q, Clarksfield might not look like an improvement at all. Of course, since Intel isn't doing that, it's all speculation.
Bottom line, Clarksfield gives more performance in a notebook, but at a notable cost in power usage (and a corresponding cost in battery life) vs. what Intel could do today using Core 2 based systems. If battery life and weight are important to you, Clarksfield is no big deal, and it leaves you waiting for Arrandale or lower power Clarksfield CPUs. If top performance is your concern and you can live with shorter battery life and/or more weight, then Clarksfield gives you a new option.
These are not replacements for the ultra low voltage stuff, or even for the mobile Pxxxx Core 2 Duos.
These processors are replacements for the high-end Core 2 Duos that have a 35W TDP or the Core 2 Quads that have a 45W TDP.
Please do compare them with what they are aiming to replace.
Yes, the C2D 1.6GHz at 20W is much lower, but now compare that with the power you get from a Core i7-720.
It completely depends on what you do. If you use 2-4 cores for your daily work, then I think the performance per watt is way better with the Core i7 for what you get. The i7 is done with its work way faster than the C2D, so it can drop back to idle much quicker. And looking at the review, it seems that it saves more power in idle than the Core 2 Duos!
"It's good to finally see an official Nehalem CPU for the mobile sector. Power gate transistors have the potential to seriously improve battery life, and we can't wait to see that sort of technology begin making its way into CPUs as well as processors. In terms of performance, things are a little bit of a mixed bag."
Don't you mean GPUs as well as processors? :p
I'm still waiting for Arrandale; I'll be skipping this generation and upgrading in one or two years.
I have no idea what I was even trying to say on that one. Hmmm.... Dragon at 8AM after working for 20 hours straight is NOT my friend! :-) Oh, wait: "just getting" without the comma should work.
As for Arrandale, it's due out in Q1 2010, so not too far off. 32nm and dual-core + Hyper-Threading should be very compelling I think.
Dual core with hyperthreading should be better. I also don't think laptops will replace desktops for gaming. Is Intel crazy? That's just such a strange prediction, I don't know why they would think it's even remotely possible.
I think they should have waited for 32nm before releasing a Nehalem for mobile. Really, they have no competition from AMD worth speaking about, and 32nm could have been done right - with an integrated IGP if desired, and low enough power use that it's not one hour and out.
A micro-ATX setup with a handle, a Bloomfield and a real video card would probably be better for the vast majority of gamers than this one. You don't get true portability, but with only one hour of life, you don't really get it with this either.
The gloom and doom for desktops is always overstated. For one, laptops are really only comfortable for women and weak, pencil-necked men. If you have any size, the keyboard is a nightmare, and none come with the natural keyboard which men's shoulder width really begs for.
On top of this, mouse movement is a pain in the neck with that replacement. A mouse is just more comfortable. Then you have to worry about power, which kind of ruins a lot of the fun with computers - just kind of doing what you feel like and relaxing. Who wants to worry about power? Then, you're limited by screens. Of course, you can dock these things, and use them as desktops, in which case they only have disadvantages (although not as many) in this role, and no advantages.
So, I think desktops will always be around, and always be the preferred tool even if you have both.
I think one reason laptops sell pretty well is, ironically, because they are unreliable, and need replacement much more often. They are also more difficult to upgrade, also necessitating replacement rather than upgrade. So, the reasons aren't all good.
To be fair, it's a desktop replacement CPU, like the many they've released before. Yes, it will replace desktops for gaming, but only for some. DTR CPUs are also used for high-end mobile workstations. It's not for everybody; get a grip. They don't push everything because there's no competition from AMD.
"A micro-ATX setup with a handle, a Bloomfield and a real video card would probably be better for the vast majority of gamers than this one. "
You have to be kidding, Lynnfield offers far superior power management and consumption plus better overall performance than Bloomfield. Something you would actually want in a micro-ATX setup, instead of a heat generator like Bloomfield. Even the Phenom/Athlon II would make for a better micro-ATX gaming platform than the energy sucking Bloomfield.
Before you get on your high horse about Bloomfield and overclocking, it's not going to make any difference in a micro-ATX system compared to Lynnfield, except to make the temperatures unbearable.
How is that next cut and paste article coming along for Toms? Are you going to do the history of lawnmowers on this one?
Dude, are you gay or something? What is your obsession with me?
Wouldn't your current boyfriend be upset if he saw your obsession?
Micro-ATX would, naturally, be plugged in. What kind of an idiot are you? Battery life wouldn't matter in this context. The micro-ATX would be for easy movement from one place to another. Some of them are really small, light (so even you could move it), and easily transported. I wasn't implying you'd want to use it from a battery.
Didn't you understand that within the context? Clearly, you're a moron. And a gay, obsessive one. Get a life. You need attention, clearly, but not from me. Find someone else.
What is your obsession with posting negative comments to every article at AT? They have proven you wrong every step of the way. Why they even wasted their time is beyond me but I have to hand it to them for even paying attention to you.
You obviously do not make these same posts at Toms. I just read through the comments on the last six or seven articles, especially the CPU-related ones, and you did not make a single comment, even though their conclusions, test methods, and information are nearly the same as AT's when comparing Lynnfield vs. Bloomfield. Why is that?
You brought up the Bloomfield micro-ATX setup and I was just replying to one not so bright idea of yours about Bloomfield once again being superior. The last thing you want in a micro-ATX system you will be lugging around to LAN parties is a Bloomfield cpu and X58 chipset. Heat is thy enemy and this platform has ridiculous power consumption compared to Lynnfield or Phenom II. Considering the alternatives available I would say Bloomfield should be one of the last ones to suggest.
My post wasn't negative. I agreed with the author. I don't agree with Intel, and neither did he. What is your problem, besides liking me?
They didn't prove anything, at all. You're just too stupid to see through the weird benchmarking. Maybe I stopped reading before they posted their 'proof'. But, really, Anand's apple to apple benches proved my point. In their first article, they were saying they were the same or better. Then, all of the sudden, the Bloomfield is 3.5% faster, normalized! Sometimes almost 10%, on real world benchmarks. Fancy that! Although, this time they screwed up by making the Bloomfield uncore faster, so it's not 100% accurate.
Tom's also showed a lot of advantages of the Bloomfield. This site just rubbed me the wrong way because they were doing whatever they could to make the Lynnfield look better than it really is. Is 3% a big deal? Who knows? Maybe it is, maybe it isn't. That's a matter of perspective. But, when I see 0%, or -1%, and I know it's just not so, that's where someone has to say something. If they were arguing 3% isn't so important, then, so be it. But, show the 3%, instead of hiding it behind bogus setups that hide it.
You wouldn't understand because you're simple. But, life isn't simple. Very few things are good or bad, completely.
By the way, why would power matter more for micro-ATX than for anything else? The Bloomfield is king of the hill. I'm not crazy about the power use, really, I'm not, but, for a gaming platform, I'd want the best. I'd put up with the additional power use.
For a computer I'd put in the kitchen and would only surf on, I'd probably be much more inclined to look at power. Of course, even then I wouldn't consider the Lynnfield. I'd get a Core 2 or Pentium, a G45, and get all the performance I needed.
Also, you probably didn't notice, because Gary hid it, that the voltage needed to overclock the Lynnfield was considerably higher than the Bloomfield. That makes me a little nervous. But, still, I agree completely the x58 needs to go on a diet. I wish Intel would move it down to 45nm. It's a high end platform, it deserves it.
So, it's not the size, it's the application. Hmmmm, that could have a different context, but, remember we're talking about computers here.
"They didn't prove anything, at all. You're just too stupid to see through the weird benchmarking. Maybe I stopped reading before they posted their 'proof'. But, really, Anand's apple to apple benches proved my point. In their first article, they were saying they were the same or better. Then, all of the sudden, the Bloomfield is 3.5% faster, normalized! Sometimes almost 10%, on real world benchmarks. Fancy that! Although, this time they screwed up by making the Bloomfield uncore faster, so it's not 100% accurate."
That's nice, but the processors as they ship are not 'clock normalised'. As foretold, the benefit of extra clockspeed usually outweighs the benefit of playing with the memory subsystem. We all knew this and benchmarks confirm it, yet again. Many people don't give a shit about overclocking, you know, even if their major-OEM BIOS would allow them to do it. If I didn't need ECC, which causes me to buy Xeons, I'd definitely choose Lynnfield on P55 over Bloomfield on X58, for any kind of comparable price. Not only is performance better, but power consumption is also better. Now, if you don't think that websites should benchmark at stock speeds, then maybe you should just move your ass to a specialist overclocking forum?
TA152H,
You stated - "Tom's also showed a lot of advantages of the Bloomfield. This site just rubbed me the wrong way because they were doing whatever they could to make the Lynnfield look better than it really is."
Fact -
1. AT's numbers are in alignment with everyone else on the web, including Toms. They are not trying to make Lynnfield look better than it is, it is just better in most cases than Bloomfield, so get over it.
You have harped on and on about clock for clock numbers, overclocked results, and all sorts of stuff that the people here at AT have provided. Even after providing proof, you still launch personal attacks at the editors and other readers because the numbers do not agree with your warped view of the world.
Fact -
1. Toms has not provided clock for clock comparisons, overclocked comparisons, or i7/860 comparisons. Neither have they equalized memory settings or shown results in several applications that AT added after the first review to give an additional look at each of the processors.
2. You did not comment at Toms (where you freelance apparently) about any of these items that you complained about at AT. Why is that?
You mentioned that Toms showed a lot of advantages for Bloomfield. Actually reading the review it was very few and far between, just like at most sites.
Fact -
1. Straight Quote from the summary at Toms and I do not see where they are crazy about Bloomfield except for the workstation crowd or those that have to have six cores on the desktop -
"...Now that we’ve had a couple of weeks with final hardware the Core i5 and Core i7 processor families are even more fascinating.
To begin, they make it much harder to recommend LGA 1366-based Core i7s. We know the i7-900-series is supposed to be higher-end, and it’s hard to ignore the fact that next year we’ll see hexa-core Gulftowns that drop right into our X58 motherboards. But seriously...
Alright, so the Core i5-750, specifically, is priced well. What is there to like about it? Reasonable power consumption, a base clock rate comparable to Intel’s Core i7-920, a more-aggressive Turbo Boost able to take the chip to 3.2 GHz in single-threaded workloads, CrossFire and SLI compatibility—it’s a pretty compelling list, actually.
...More attractive for the folks who stand to benefit from Hyper-Threading is Core i7-860. Its price tag puts it in the realm of Core i7-920, its Turbo Boost helps make it faster, and a complementary motherboard is going to cost you between $75 and $50 less."
Explain to us why those comments did not warrant a response from you at Toms in the same manner that you have posted here. They provided the same type of conclusion, only half the benchmarks and none of the followup by the guys here at AT.
Yet, you never once complained at your site about what you thought was a serious enough problem to call the editors here idiots and to make other personal attacks on their work. Why they have not banned you by this point is beyond me.
I did send an email to Chris Angelini tonight requesting him to read your posts here at AT and asking him if this is the type of employee Toms is proud to have on their staff. You have gone way beyond the norm in your continued attacks on this site and hopefully Chris will be a stand up guy and address this issue with you immediately.
"Also, you probably didn't notice, because Gary hid it, that the voltage needed to overclock the Lynnfield was considerably higher than the Bloomfield. "
The voltages were never hidden, they were right there in the gallery along with the Uncore and memory subtimings as stated in the text. Regarding the uncore rates, they make very little difference, if any, except in the prized SuperPi benches.
Even though Lynnfield needed additional voltages to overclock, the total power consumption and thermals were still lower than Bloomfield. Say what you want, but Bloomfield is not significantly better in any regard for desktop users than Lynnfield. In fact, for most, the opposite is true.
uATX is smaller, and the cases are smaller, and so more heat is definitely a concern. You also are placing way more stock on places where Bloomfield wins and pooh-poohing any areas where Lynnfield wins or ties. There's no benchmarking silliness going on, except in your expectations.
Places where Bloomfield clearly wins: WinRAR (super memory intensive, since it has to search for matching patterns when compressing). WME... does anyone really care about WME when we have x264 being clearly superior? And that's it. Everything else is within 5%, which is one speed bin.
To say that we tried to hide the additional voltage required for overclocking Lynnfield is another fabrication. There's a whole section (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) where that is specifically addressed, as well as being mentioned in the conclusion about the "stock voltage overclocking". If you overclock, I hardly think voltages and power are your primary concern. You want to stay stable, and people have been boosting Intel CPU voltages far more than what we've done on Lynnfield.
Personally, Lynnfield makes more sense, simply from the financial aspect. If you want top performance, go for Bloomfield. If you want better performance without dropping a load, Lynnfield wins, even when you factor in overclocking. Only the extreme fringe is really concerned with more than Lynnfield offers. You're having fits over differences of 5% or less in most cases (DDR3-1066 vs. DDR3-1333, non-Turbo overclocked performance, etc.)
The only truly bad thing about the Lynnfield platform right now is that we're missing a decent IGP solution, and Intel is going to be the sole chipset provider for a good long time. Those are problems with Bloomfield as well, and they're not even serious concerns since you can easily add an inexpensive GPU.
That's my take anyway, but then I'm still perfectly happy with my desktop (overclocked) QX6700/Q6600 systems that I use for work and gaming. The headaches of trying to get everything transferred to a new system aren't worth the minor performance increases I'd see. Most of the time, my PC is waiting for me to finish typing/dictating/mousing.
One other possible reason for X58 is that the extra PCIe lanes could be used for USB3 or SATA3 controllers. If you are only using a single graphics card, you still have another x16 for a controller, while with P55 it's x8, and it also drops the graphics card to x8.
I'd like to know how multithreaded apps perform (4-8 threads), given that the CPU will definitely clock down under this kind of load. How does the mobile i7 perform compared to the mobile C2Q in this situation?
Can you run Valve's particle or map compilation benchmarks?
I'll see if I can dig out the old Valve SMP tests... or are there newer versions? The files I have seem rather old and outdated (pre-EP2) so I'd prefer a test that's current if you have anything.
I'd be perfectly happy with the 820QM along with lower-end graphics. Maybe a nice 4830 or GeForce 9800? Something lower-power-using, but still adequate for most non-ultra-high-end games.
My problem is that I like to have only a laptop as my primary computer; and I do things like edit HD video, so four cores are what I want; yet I want it to be portable and have decent battery life, too. (Yup, I have a MacBook Pro.) So the idea of the quad-core Clarksfield is perfect. It's a low-power-draw chip when not in full load, yet can ramp up very well for high-load situations; both low-threaded and multi-threaded. Now I want a GPU that is similarly dual-natured. Something with very low idle power, but which can ramp up to high-power when needed.
(I only do my serious high-load work when plugged in, so power draw at load doesn't matter so much, it's power draw at idle and low-load that matters to me.)
Power draw is too high for my liking; I was expecting better than this. I'm more interested in Arrandale due to the above; quad core is overkill for most on a laptop anyway.
And someone bring an S-IPS panel to a laptop already!
The problem with an IPS panel in a laptop is that IPS panels need brighter backlighting, because the technology lets less light through than a TN panel. Thus higher power consumption.
Looks like the next decent launch of laptop chips will be 32nm. This hot and overpriced chip reminds me of the ol' crappy Pentium 4-Ms that were around prior to being destroyed by Banias.
No, they don't. The Clarksfield CPUs are 55W or 45W TDP. The current Core 2 Quad mobile CPUs are 45W TDP. You're getting more performance for similar maximum power usage to the C2Q, and lower idle power, so it's definitely an improvement overall.
The current line of Core 2 Duo mobile CPUs tops out at 35W TDP. Switch to 45nm and step down in speed and you can get down to 28W @ 2.8G, down to 25W @ 2.66GHz, down to 17W @ 2.13GHz, or down to 10W @ 1.6GHz, all as Core 2 Duos.
Lower voltage (and lower TDP) versions of the Core i7 mobile CPUs may show up in the future, but right now, they definitely use more power than Core 2 Duo mobile CPUs, and are similar to current Core 2 Quads.
The 920 is just an Extreme proc, so let's take the 820; that's a 45W proc. But do remember that it has much more stuff built in than the Core 2 Duo!
It has the memory controller, and it has the PCIe controller.
The big question here is: what does the platform of a Core 2 Duo/Quad (the CPU including the complete chipset) use in power compared to the 820/PM55 combo? That's the question people have to ask.
I find the current setup really good. Anand was a bit wrong with the first 2 battery life pictures. He corrected it with the relative battery life picture (the first 2 shouldn't even be shown; they are completely irrelevant), and you see there that the outcome is pretty good.
So no, they are not similar to the current Core 2 Quads. You can't compare them 1 to 1.
About desktops vs. laptops: around me (friends, co-workers, family) there are almost NO desktops anymore. I only have 1 desktop, and that isn't used as a real desktop; it's used as a Media Center below the TV. Almost everybody is using laptops; it's easily 5 laptops for every 1 desktop.
I wouldn't say the first two battery life pictures are "wrong" - they tell you what the current W87CU will get with the default battery. I don't even know if there's an extended capacity battery available. To ship this sort of system with a puny 42Whr battery is at best very weird. The battery casing is actually very large too, so I don't know why they didn't go with at least a 9-cell ~65Whr battery. That would boost battery life by 50%.
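The ~50% figure follows from simple proportional scaling (a sketch assuming runtime scales linearly with battery capacity at the same average power draw):

```python
# Runtime scales roughly linearly with battery capacity (WHr)
# if the average power draw stays the same.
stock_whr    = 42   # battery the W87CU ships with
upgraded_whr = 65   # hypothetical 9-cell battery

boost = upgraded_whr / stock_whr - 1
print(f"Estimated runtime boost: ~{boost:.0%}")
```

65/42 works out to about 55% more capacity, which is where the "boost battery life by 50%" ballpark comes from.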
The review compares apples and oranges here.
We have here an article that compares a new processor with other (older) processors. This is not a review of the W87CU laptop.
If by chance a completely different laptop with a 90Whr battery had been used, that picture would suddenly look totally different!
So it's completely random.
So the only way to look at it is the real power usage of the complete laptop, not how fast that laptop runs out of its battery.
Did you even read the article, or did you just skim it? Even though the chips have a higher TDP, the new Clarksfield chips use less power than the old quads. Look at the last four charts of the article and tell me that the 920XM was not the most efficient processor. Keep in mind that this chip is not only the most power-hungry of all the Clarksfields, but it is also paired with a GTX 280M, a 17" screen, and a 42 watt-hour battery.
To compare, the new ASUS CULV-based notebook gets 4.93 minutes/Whr, but it's clocked 300MHz lower than the quad will ever go, the CULV platform has a G210M GPU, and then there's the whole dual vs. quad with Turbo Boost and Hyper-Threading thing. To put it in more reasonable terms, let's compare the new Clarksfield with my laptop, the Studio XPS 16. Jarred tested the Studio XPS with a P8600 and an HD 3670; I have a P8700 and an HD 4670 (which is much better than the HD 3670, by the way), so the power use differs a little, but they're close. Anyway, the Studio XPS 16 uses 36 watts at idle, just like the 920XM machine. At full load, the XPS 16 uses 93 watts, or 110 watts at max brightness, and let me tell you, it is bright. The 920XM machine manages 90 watts with a full CPU load and 143 watts with a max load on the CPU and GPU.
What does all this mean? It means that the 920XM will not just be a desktop replacement, but a desktop replacement that you can actually use under light load away from the outlet (things like internet, music, word processing), whereas before you needed an A/C adapter if you wanted to use your laptop longer than 30 min. That's my two cents.
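As a sanity check on the minutes-per-Whr figure used above, a rough sketch (pairing that efficiency with a 42Whr pack is hypothetical, just for illustration):

```python
# Normalizing runtime by battery capacity lets you compare laptops
# with different packs: efficiency = runtime (min) / capacity (Whr).
def min_per_whr(runtime_min, capacity_whr):
    return runtime_min / capacity_whr

# The ASUS CULV machine's quoted 4.93 min/Whr would imply, on a 42Whr pack:
runtime = 4.93 * 42  # ~207 minutes, about 3.5 hours
```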
I read the article, apparently more carefully than you did. The Core 2 Quad that it's compared to has 2x GTX 280M GPUs, an HDD, and an 18" screen, compared to the 920XM with a single GTX 280M, an SSD, and a 17" screen, and the screens are different resolutions (which will impact CPU/GPU load). We don't know if the GTX 280M GPUs were running at the same clock rate, or if they used the same memory type or the same memory frequency, all of which affect both idle and load power usage. Also, as neither of these machines was optimized for battery life, we have no information about the efficiency of their power supply systems.
The bottom line is that a comparison of the power consumption on these machines is NOT a direct comparison of the CPU and chipset efficiency. Find a test with two laptops that differ only in the CPU and chipset, then we'll see which one actually uses less power.
"Max brightness" on two different displays with no measurement of the actual brightness, screen size, screen type, and lighting type is meaningless as a point of comparison. Two different laptop displays at max brightness can draw significantly different amounts of power based upon the factors I mentioned.
The idle power of the Clarksfield CPUs is very promising, and that might make it a better CPU for battery powered devices, but that's not a valid conclusion to draw from the tests because the machines had too many other differences.
Don't be so quick to jump to conclusions. While TDP ratings are NOT power utilization ratings, they are an indicator of the maximum power demands of the CPU. Since the 920XM has a 55W TDP but its slower versions have a 45W TDP, it's predictable that the 920XM exceeds 45W under load, while the QX9300 has a 45W TDP and remains under 45W under load. Intel's TDP numbers are a good guideline for maximum load power.
Regarding the "max brightness", that was actually incorrect. I used new values for the laptops with the LCDs always at 100 nits. It's still not a perfect comparison, since one LCD at 100 nits might use 3W and another could use 8W depending on size and backlight technology, but it's closer than using the max brightness (as your comments indicate).
FWIW, Core 2 Quad QX9300 and the i7-920XM appear similar in max power draw but with the i7 part having far better idle power. A C2Q with two 10W CPUs would end up running at 1.6GHz tops and use ~20W TDP, but idle power still won't be as good as the 920XM because of Power Gate. What would be really interesting would be Power Gate tech moved into Core 2 CPUs, but that ain't gonna happen anytime soon I suspect. :)
Can the QX9300 on the Eurocom system be clocked down to 2.0GHz? Can you disable Turbo and HT on the Core i7-920XM on the Clevo W87CU system? If so, you could compare a C2Q @ 2.0GHz vs a 920XM @ 2.0GHz (with and without HT). Run CPU (not GPU) intensive tasks and see how they perform and how much load power they each use. That should give a good indication of the relative instructions per clock of the two architectures as well as the performance/watt.
There is no need to repeat any of the gaming tests, just a couple single threaded and multithreaded CPU intensive tests and idle and full CPU load power usage. It might let us put to rest the lingering questions of whether C2D or Core i7 is a better core architecture for mobile systems.
Granted, there are still other system differences that we can't eliminate, but as long as SLI is not enabled on the Eurocom system, we can get them to be fairly close.
Agreed. Aside from better power management (power gating and turbo mode), I'm not yet convinced that the Core i7 is more power efficient than C2. I don't expect to ever see it, but a C2Q with the power management of the i7 might make an excellent laptop CPU.
As for the "max brightness" comment, I was addressing the other poster's reply about tests of the Dell Studio XPS 16. I didn't know the 100 nit level was used there, but as noted, power can still vary significantly.
Arrandale is what we want, really: dual-core with HyperThreading. That should cut maximum CPU power use down substantially, and there will be 25W and 35W parts (and likely 17W as well). Restricting Turbo modes to lower clocks will also help. Right now, Clarksfield is max performance within a much greater thermal envelope than most laptops allow.
*I* want a quad core laptop! No question, dual core is kind of anemic these days. I mean, it's been silly to go dual core on the desktop for YEARS, yet we're still mostly stuck with dual core on notebooks :-/
I'm really more interested in how those 1.6 and 1.73GHz parts do versus faster-clocked Core 2 Duos and Quads. The clock speed obviously is kind of frighteningly low, so I sort of need to see benchmarks showing that 1.6 or 1.73 actually gives you a competent system (I'm sure it does, but...).
And yeah, I game on my 2.4Ghz Penryn dual core with mobile Geforce 9650GT. I'd like better, but a desktop isn't an option for me anymore, so I'll just upgrade my notebook as needed :)
While Arrandale is promising, I would be similarly interested in a 25W C2Q. Since they can make 10W and 17W C2Ds, they should certainly be able to make 25W and 35W C2Qs. Arrandale should be faster when running 1 or 2 threads, but a 1.6GHz C2Q @ 20W TDP (2 x SU9600) should perform as well or better when running all 4 cores. As a bonus, the C2Q could work with the Nvidia 9400M chipset, for very good IGP performance; add an optional discrete GPU for those who want something faster. Until Intel demonstrates that they can actually deliver a good IGP, Arrandale doesn't sound all that wonderful. Just a thought.
Let me clarify a bit. I would be far more interested in the Core i# CPUs if they didn't have an Intel GPU built in and if I had the option of a good non-Intel chipset. Since Intel and Nvidia seem to be in a pissing contest over the licensing that would allow Nvidia to build an i#-compatible chipset, the future of a low power CPU, chipset, and GPU (that doesn't suck) looks questionable for Arrandale.
For those of us who don't need a discrete GPU, but want decent graphics performance AND excellent battery life, an all-Intel solution does not look promising. At best, it looks like a 25W Arrandale with an Intel chipset and a discrete low power ATI or Nvidia GPU.
While HT on the Core i# CPUs is better than HT on P4, it's still nowhere near the benefit of doubling the real cores. I would rather have a non HT (e.g. Core i5) based quad core than Arrandale.
The 55% market share is laptops, but they don't mention whether those people also own a desktop - or more importantly, build their own desktop.
When you consider that more and more of the people who want a desktop are enthusiasts who build their own, and that those numbers aren't going to be counted in desktop sales, which only count the pre-built big-box manufacturers like Dell, etc., you realize that chart means little.
So in reality the chart is a great marketing tool: It's "true" in one sense, but it doesn't tell the whole story.
Pre-built machines from Dell, HP, Apple, etc. account for the vast majority of systems sold. Custom built computers are a niche. I suspect custom built computers would be lost in the margin of error.
What means little is your phantom statistic "when you consider that more and more of the people who want a blah blah blah."
That's your opinion. You have no evidence to back that up. My opinion is that you are very wrong and that most people just buy the cheapest prebuilt rig they can find.
They are talking about CPUs sold. If 55% of the CPUs sold are mobile, it's a good bet that about 55% of the systems those CPUs are being put into are laptops.
Hrel - Thursday, October 22, 2009 - link
I consider any laptop with a dedicated card that has at least 16 SPs and at least 512MB of dedicated memory to be a gaming laptop; 16 SPs IS the ABSOLUTE minimum, but that should be enough to run everything except maybe Crysis (which I really hate anyway) at 720p or higher with playable frame rates. Who cares about eye candy? As long as the game runs smoothly. Desktops are for eye candy; laptops and consoles are just for gaming.
MonicaS - Wednesday, September 30, 2009 - link
Man, as someone who hasn't used a desktop as a personal computer for the last 4 years, the move to a laptop was a very difficult one. You get the convenience but lack the performance. Now couple this processor with two SSDs in RAID and 8 gigs of RAM in a 64-bit Windows 7 laptop and you finally have a beast of a machine in your (probably burning) lap.

I'd love to get that setup and finally not feel as though I'm losing out to a desktop in any way. The only true limitation is Crysis, but seriously, that game sucked anyway!
Can't wait!
Monica S
AnnonymousCoward - Friday, September 25, 2009 - link
Heh, "Gamers Are Going Mobile". My video card is the size of some laptops. And I'm not playin on no 15" screen.

FXi - Thursday, September 24, 2009 - link
I have to say the mobility right now is more of a draw than a higher level of eye candy. Now mind you, I have both laptop and desktop, so if I really crave eye candy, I can go to the desktop room and game.

But with two little ones, I find that my 'gaming time' is often measured in 20 minute spans here and there, and being able to surf or get some work done wherever the kids happen to be is a benefit that I very much enjoy. So I can run The Witcher on my old 7950M, windowed @ 1620, with the settings lower, and be "ok" with that.
Mind you, I do crave a bit more oomph, a more modern machine, but I can bide my time. The mobility is very nice, and I don't LAN (no time!). Having SLI or a higher-end mobile chip simply means the laptop is "acceptable" for a longer period of its life.
I won't argue the bang for the buck. Mobile gaming is pricey and not cost effective. But the mobility is nice, the space taken up by a machine I can throw in the closet is also nice. And within some limits, lower res or lower eye candy is acceptable as payment for that mobility.
Now I just need USB 3 (USB changes only happen every 10 years or so) and then I might consider upgrading.
strikeback03 - Thursday, September 24, 2009 - link
When checking my laptop for Ubuntu vs. XP battery life, I accidentally ran my first XP test with my standard undervolt on; it didn't seem to impact battery life any.

ambientmf - Thursday, September 24, 2009 - link
Am I the only one who thinks these chips are ridiculously overpriced? I would never drop more than $350 on a CPU even in a desktop; a $1K laptop processor just doesn't seem economical, especially if it's only running at a 2.0GHz base.

The cheaper options seem really underwhelming, and like others have said, the thermal output of these chips just doesn't make sense for a laptop.
cjb110 - Thursday, September 24, 2009 - link
The big problem with gaming laptops is that they aren't balanced. The display is always at a higher res than the CPU/GPU can drive.

I'd get a gaming laptop if it could drive all current-gen games at max settings at the native res of the panel it comes with. Even if that res is <1080p.
If it can't do that, then I've spent a lot of money on something that's already behind the desktop I can get for cheaper.
Mugur - Thursday, September 24, 2009 - link
... if there is one? I mean, that 1.6GHz part looks very nice: quad-core with HT and Turbo.

I think someone could make a decent notebook, not a desktop replacement, out of a 720QM.
FXi - Thursday, September 24, 2009 - link
Shouldn't the quad core mobiles be 32nm and the dual cores 45nm? I know that's not the case, but what was Intel thinking? It doesn't even look like there's a refresh of the quads to 32nm in the spring.

Crazy, cuz they look like good chips with a shrink.
gstrickler - Thursday, September 24, 2009 - link
I get the impression that Intel isn't really interested in making a low power quad core for laptops. My guess is that they see dual core + HT as the solution for users who need good battery life, and anyone needing a quad core "portable" won't be running off battery (or will have a much larger battery). I don't agree with that; I think there is a small but growing percentage of users who could benefit from quad core performance on a notebook and still want/need good battery runtime. It's not for mainstream users yet, but it will be in a few years. For now, we have to choose between extra CPU power, extra battery life, or extra weight.

Pirks - Thursday, September 24, 2009 - link
People will keep buying gaming laptops in droves 'cause they are very convenient - you can game wherever you want and you are not bound to your big ol' immovable desktop tower. And they are cheap too. Take a look at this one for example: http://www.bestbuy.com/site/olspage.jsp?skuId=9366...

After my Alienware M17, which I'm VERY happy with, I'd get this one. Veeery sweet machine, and cheap too, just $1k. Why bother with a stationary desktop when this beautiful Asus lappy can get everything done, including every game, including Crysis?
Jarred just can't understand the beauty of running your games literally anywhere. Sad, just sad. Grow up AT guys, desktop is the past and lappys are the future :P
JarredWalton - Thursday, September 24, 2009 - link
I review laptops, I run tests on them, and gaming performance is still at least two generations behind desktops. That $1000 ASUS is a good gaming laptop, to be sure, and a much better value than a $3000+ Alienware. However, a single GTX 260M is the equivalent of a desktop 9800 GTX (or GTS 250 if you prefer the latest name change).

There's a big reason I don't test with AA enabled on *any* of the laptops: they can't handle it. Heck, these $3000+ laptops struggle to run 1080p at times. Is it childish to think that my 30" display with 4870X2 provides a far more compelling experience than a laptop?
If you love gaming laptops, I've got full reviews of these three (and your M17x) in the works. And for every comment like yours stating that I need to grow up and get past the desktop, I'll get 10 comments on my high-end reviews saying, "Who in their right mind buys these things!?" The answer is "Pirks" apparently, along with people that want to go to LAN parties (a very small minority of gamers, considering the millions playing games) who also have the money to spend on gaming laptops (an even smaller minority).
Anyway, I loved the Gateway FX series kicking off the $1300 gaming laptop era, and the ASUS G50 series (including your linked laptop) is great as well... for what it is. A convenient laptop to me usually doesn't get an hour of battery life, have an extremely hot bottom (no, not like that...), or weigh 12 pounds. Okay, that ASUS probably weighs 7 pounds and gets 90 minutes of battery life, and it's not as toasty as a Clevo DTR setup. That's why it will sell a lot more units than the W87CU, though.
The problem is, Intel's "gaming laptops" slide includes stuff like this VAIO: http://www.bestbuy.com/site/olspage.jsp?skuId=9379... GeForce 9600M is NOT a gaming solution by any stretch of the imagination. It wasn't even a gaming solution when it launched, with its paltry 32 SPs. It can handle WoW at moderate detail settings and resolution, sure, and it's faster than any IGP, but Crysis and many other games will still struggle.
GeorgeH - Thursday, September 24, 2009 - link
My main laptop has a Quadro FX 770M (roughly equivalent to the 9600M GT you linked to) powering a 15.4" 1920x1200 screen, and I have to disagree with you a little bit there.

It's very true that when playing games like Far Cry 2, zebras look more like dirty white horses and some of the jagged edges could put your eye out, but at the end of the day the gaming experience isn't really all that different from my desktop. Other titles are similar; yes, graphics sliders sometimes need to be set to the low end of the scale, but most games are still very playable, and you often really don't miss the "shiny" graphics options.
At this point, it'd take a really solid implementation of something like AMD's Eyefinity to make me want to build another super high-end gaming desktop. Maybe I'm just getting old, but current advances in graphics technology just don't seem to wow as much as they did 10 years ago - subjectively they often feel more like baby steps than giant leaps.
The reason I bring all this up is to suggest that a short paragraph on your subjective gaming impressions going between the laptops and your desktop might be a good idea. Just noting that AA is off often isn't enough for us mortals that don't have a few $3000 laptops and $2000 towers just lying around. ;)
ltcommanderdata - Wednesday, September 23, 2009 - link
Is the 45W/55W TDP for Clarksfield really that much of a concern over previous generation CPUs? Penryn processors may have had 35W/45W TDPs, but they also had separate northbridges, with the PM45 rated at 7W. Clarksfield, with its integrated memory and PCIe controllers, basically has the northbridge integrated and absorbs its TDP rating. Once the northbridge is taken into account, the 45W/55W TDP of Clarksfield is only 3W higher than previous generation Penryn+northbridge combinations.

I do agree, though, that the low clock speeds, relatively high TDPs, and high prices of Clarksfield make it unlikely that we will see quad cores break into the mainstream mobile market in this generation. While it may not be needed on desktop where the thermal constraints are relaxed, I think a 32nm quad core Westmere derivative is definitely needed in the mobile market. It's unfortunate that we'll likely have to wait at least another year before we see a 32nm mobile quad core with Sandy Bridge.
gstrickler - Thursday, September 24, 2009 - link
You're correct: a current C2Q at 2.0-2.53GHz with a 45W TDP plus PM45 + ICH9M (9.5W combined TDP) is 54.5W TDP, vs. a mobile Core i7 (45W/55W TDP) + PM55 (3.5W TDP) at 48.5W/58.5W TDP, each without IGP. Put the C2Q with an Nvidia 9400M (G) chipset (12W TDP) and it looks a bit better at 57W TDP including a good IGP, but it's still in the same power range as the i7 and, as shown in the article, it's notably slower.

However, you're overlooking something. Intel currently offers a C2D @ 1.6GHz with 20W TDP and 2.13GHz w/ 17W TDP, indicating that they should be able to make 20W and 34W TDP C2Qs on their current 45nm process. In fact, they've got a Xeon L5408 (2.13GHz quad core) @ 40W TDP, so clearly they can get to the numbers I suggest. Couple those "possible" CPUs with an Nvidia 9400M (G) chipset @ 12W TDP, and you're looking at a 1.6GHz C2Q w/ 32W TDP or 2.13GHz C2Q w/ 46W TDP for the complete CPU, chipset, and a good IGP. Or, go with the PM45+ICH9M (9.5W TDP combined) for 29.5W or 43.5W TDP for complete systems without GPU.
Compare that to a mobile i7 + PM55 at 48.5W - 58.5W TDP, without GPU, that's 34%-64% more peak load power than what Intel could theoretically do with C2Q today with similar base clock speeds. Yes, the i7 has power gating that will allow it to use lower power at idle, and it has Turbo mode which will allow it to perform better when only 1 or 2 cores are in use, but Turbo mode will still use more peak power than these hypothetical C2Q. Of course, you can turn off Turbo mode, but then you're back to performance much closer to the C2Q.
Compared to what Intel actually offers today, the new chips are an improvement for those who need quad core performance in a portable. However, compared to what they've shown that they could offer today if they chose to, the Clarksfield chips don't look like they're much of an improvement. If Intel applied the power gating and/or turbo features to the C2Q, Clarksfield might not look like an improvement at all. Of course, since Intel isn't doing that, it's all speculation.
Bottom line, Clarksfield gives more performance in a notebook, but at a notable cost in power usage (and a corresponding cost in battery life) vs. what Intel could do today using C2 based systems. If battery life and weight are important to you, Clarksfield is no big deal, and it leaves you waiting for Arrandale or lower power Clarksfield CPUs. If top performance is your concern and you can live with shorter battery life and/or more weight, then Clarksfield gives you a new option.
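The platform TDP sums in that comparison reduce to simple addition; a quick sketch using the figures quoted in this comment (the C2Q parts are hypothetical, as noted):

```python
# CPU + chipset TDP totals (watts), no GPU, from the comment above.
c2q_low  = 20 + 9.5   # hypothetical 1.6GHz C2Q + PM45/ICH9M = 29.5W
c2q_high = 34 + 9.5   # hypothetical 2.13GHz C2Q + PM45/ICH9M = 43.5W
i7_low   = 45 + 3.5   # mobile i7 45W + PM55 = 48.5W
i7_high  = 55 + 3.5   # mobile i7 55W + PM55 = 58.5W

# The "34%-64% more peak load power" claim:
print(round(i7_high / c2q_high - 1, 2))  # 0.34
print(round(i7_low  / c2q_low  - 1, 2))  # 0.64
```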
jcompagner - Friday, September 25, 2009 - link
But you are overlooking something: these are not replacements for the ultra low voltage stuff, or even replacements for the mobile Pxxxx Core 2 Duos.
These processors are replacements for the high end Core 2 Duos that have a 35W TDP or the Core 2 Quads that have a 45W TDP.
Please do compare them with what they are aiming to replace.
Yes, the C2D 1.6GHz at 20W is much lower, but now compare that with the performance you get from a Core i7 720.
It completely depends on what you do. If you use 2-4 cores for your daily work, then I think the performance per watt is way better with the Core i7 for what you get. The i7 finishes its work much faster than the C2D, so it can get back to idle much quicker. And looking at the review, it seems to save more power at idle than the Core 2 Duos!
gstrickler - Thursday, September 24, 2009 - link
In the 2nd paragraph, it should be "C2D @ 1.6GHz with 10W TDP"

justme2009 - Wednesday, September 23, 2009 - link
"It's good to finally see an official Nehalem CPU for the mobile sector. Power gate transistors have the potential to seriously improve battery life, and we can't wait to see that sort of technology begin making its way into CPUs as well as processors. In terms of performance, things are a little bit of a mixed bag."

Don't you mean GPUs as well as processors? :p
I'm still waiting for Arrandale; I'll be skipping this generation and upgrading in one or two years.
rbbot - Wednesday, September 23, 2009 - link
simply for the increased RAM capacity in a normal mainstream chassis.

strikeback03 - Wednesday, September 23, 2009 - link
So when are we going to see Arrandale? At these prices I'm not so interested in Clarksfield.

strikeback03 - Wednesday, September 23, 2009 - link
Also, probably a Dragon error:

"If you're after optimal gaming performance, obviously just kidding, the fastest CPU or the fastest GPU alone won't cut it in every situation."
bottom of page 7.
JarredWalton - Wednesday, September 23, 2009 - link
I have no idea what I was even trying to say on that one. Hmmm... Dragon at 8AM after working for 20 hours straight is NOT my friend! :-) Oh, wait: "just getting" without the comma should work.

As for Arrandale, it's due out in Q1 2010, so not too far off. 32nm and dual-core + Hyper-Threading should be very compelling, I think.
strikeback03 - Thursday, September 24, 2009 - link
Yeah, "just getting" is what I assumed it was.

So no Arrandale before Christmas? That's too bad.
TA152H - Wednesday, September 23, 2009 - link
Finally, a review conclusion I can agree with.

Dual core with hyperthreading should be better. I also don't think laptops will replace desktops for gaming. Is Intel crazy? That's just such a strange prediction, I don't know why they would think it's even remotely possible.
I think they should have waited for 32nm before releasing a Nehalem for mobile. Really, they have no competition from AMD worth speaking about, and 32nm could have been done right - with an integrated IGP if desired, and low enough power use that it's not one hour and out.
A micro-ATX setup with a handle, a Bloomfield and a real video card would probably be better for the vast majority of gamers than this one. You don't get true portability, but with only one hour of life, you don't really get it with this either.
The gloom and doom for desktops is always overstated. For one, laptops are really only comfortable for women and weak, pencil-necked men. If you have any size, the keyboard is a nightmare, and none come with the natural keyboard which men's shoulder width really begs for.
On top of this, mouse movement is a pain in the neck with that replacement. A mouse is just more comfortable. Then you have to worry about power, which kind of ruins a lot of the fun with computers - just kind of doing what you feel like and relaxing. Who wants to worry about power? Then, you're limited by screens. Of course, you can dock these things, and use them as desktops, in which case they only have disadvantages (although not as many) in this role, and no advantages.
So, I think desktops will always be around, and always be the preferred tool even if you have both.
I think one reason laptops sell pretty well is, ironically, because they are unreliable, and need replacement much more often. They are also more difficult to upgrade, also necessitating replacement rather than upgrade. So, the reasons aren't all good.
Penti - Saturday, September 26, 2009 - link
To be fair, it's a desktop replacement CPU, like many they've released before. Yes, it will replace desktops for gaming, but only for some. DTR CPUs are also used in high-end mobile workstations. It's not for everybody; get a grip. They don't do everything just because of competition from AMD.

goinginstyle - Wednesday, September 23, 2009 - link
The cut and paste king is at it again.

"A micro-ATX setup with a handle, a Bloomfield and a real video card would probably be better for the vast majority of gamers than this one."
You have to be kidding, Lynnfield offers far superior power management and consumption plus better overall performance than Bloomfield. Something you would actually want in a micro-ATX setup, instead of a heat generator like Bloomfield. Even the Phenom/Athlon II would make for a better micro-ATX gaming platform than the energy sucking Bloomfield.
Before you get on your high horse about Bloomfield and overclocking, it's not going to make any difference in a micro-ATX system compared to Lynnfield, except to make the temperatures unbearable.
How is that next cut and paste article coming along for Toms? Are you going to do the history of lawnmowers on this one?
TA152H - Wednesday, September 23, 2009 - link
Dude, are you gay or something? What is your obsession with me?

Wouldn't your current boyfriend be upset if he saw your obsession?
A micro-ATX box would, naturally, be plugged in. What kind of an idiot are you? Battery life wouldn't matter in this context. The micro-ATX box would be for easy movement from one place to another. Some of them are really small, light (so even you could move it), and easily transported. I wasn't implying you'd want to use it from a battery.
Didn't you understand that within the context? Clearly, you're a moron. And a gay, obsessive one. Get a life. You need attention, clearly, but not from me. Find someone else.
goinginstyle - Wednesday, September 23, 2009 - link
What is your obsession with posting negative comments on every article at AT? They have proven you wrong every step of the way. Why they even wasted their time is beyond me, but I have to hand it to them for even paying attention to you.

You obviously do not make these same posts at Tom's. I just read through the comments on the last six or seven articles, especially the CPU related ones, and you did not make a single comment, even though their conclusions, test methods, and information are nearly the same as AT's when comparing Lynnfield vs. Bloomfield. Why is that?
You brought up the Bloomfield micro-ATX setup and I was just replying to one not so bright idea of yours about Bloomfield once again being superior. The last thing you want in a micro-ATX system you will be lugging around to LAN parties is a Bloomfield cpu and X58 chipset. Heat is thy enemy and this platform has ridiculous power consumption compared to Lynnfield or Phenom II. Considering the alternatives available I would say Bloomfield should be one of the last ones to suggest.
TA152H - Wednesday, September 23, 2009 - link
You're really not that stupid, are you?

My post wasn't negative. I agreed with the author. I don't agree with Intel, and neither did he. What is your problem, besides liking me?
They didn't prove anything, at all. You're just too stupid to see through the weird benchmarking. Maybe I stopped reading before they posted their 'proof'. But, really, Anand's apple to apple benches proved my point. In their first article, they were saying they were the same or better. Then, all of the sudden, the Bloomfield is 3.5% faster, normalized! Sometimes almost 10%, on real world benchmarks. Fancy that! Although, this time they screwed up by making the Bloomfield uncore faster, so it's not 100% accurate.
Tom's also showed a lot of advantages of the Bloomfield. This site just rubbed me the wrong way because they were doing whatever they could to make the Lynnfield look better than it really is. Is 3% a big deal? Who knows? Maybe it is, maybe it isn't. That's a matter of perspective. But, when I see 0%, or -1%, and I know it's just not so, that's where someone has to say something. If they were arguing 3% isn't so important, then, so be it. But, show the 3%, instead of hiding it behind bogus setups that hide it.
You wouldn't understand because you're simple. But, life isn't simple. Very few things are good or bad, completely.
By the way, why would power matter more for micro-ATX than for anything else? The Bloomfield is king of the hill. I'm not crazy about the power use, really, I'm not, but, for a gaming platform, I'd want the best. I'd put up with the additional power use.
For a computer I'd put in the kitchen and would only surf on, I'd probably be much more inclined to look at power. Of course, even then I wouldn't consider the Lynnfield. I'd get a Core 2 or Pentium, a G45, and get all the performance I needed.
Also, you probably didn't notice, because Gary hid it, that the voltage needed to overclock the Lynnfield was considerably higher than the Bloomfield. That makes me a little nervous. But, still, I agree completely the x58 needs to go on a diet. I wish Intel would move it down to 45nm. It's a high end platform, it deserves it.
So, it's not the size, it's the application. Hmmmm, that could have a different context, but, remember we're talking about computers here.
Inkie - Thursday, September 24, 2009 - link
"They didn't prove anything, at all. You're just too stupid to see through the weird benchmarking. Maybe I stopped reading before they posted their 'proof'. But, really, Anand's apple to apple benches proved my point. In their first article, they were saying they were the same or better. Then, all of the sudden, the Bloomfield is 3.5% faster, normalized! Sometimes almost 10%, on real world benchmarks. Fancy that! Although, this time they screwed up by making the Bloomfield uncore faster, so it's not 100% accurate."

That's nice, but the processors as they ship are not 'clock normalised'. As foretold, the benefit of extra clockspeed usually outweighs the benefit of playing with the memory subsystem. We all knew this and benchmarks confirm it, yet again. Many people don't give a shit about overclocking, you know, even if their major-OEM BIOS would allow them to do it. If I didn't need ECC, which causes me to buy Xeons, I'd definitely choose Lynnfield on P55 over Bloomfield on X58, for any kind of comparable price. Not only is performance better, but power consumption is also better. Now, if you don't think that websites should benchmark at stock speeds, then maybe you should just move your ass to a specialist overclocking forum?
goinginstyle - Thursday, September 24, 2009 - link
TA152H, you stated - "Tom's also showed a lot of advantages of the Bloomfield. This site just rubbed me the wrong way because they were doing whatever they could to make the Lynnfield look better than it really is."
Fact -
1. AT's numbers are in alignment with everyone else on the web, including Toms. They are not trying to make Lynnfield look better than it is, it is just better in most cases than Bloomfield, so get over it.
You have harped on and on about clock for clock numbers, overclocked results, and all sorts of stuff that the people here at AT have provided. Even after providing proof, you still launch personal attacks at the editors and other readers because the numbers do not agree with your warped view of the world.
Fact -
1. Toms has not provided clock for clock comparisons, overclocked comparisons, or i7/860 comparisons. Neither have they equalized memory settings or shown results in several applications that AT added after the first review to give an additional look at each of the processors.
2. You did not comment at Toms (where you freelance apparently) about any of these items that you complained about at AT. Why is that?
You mentioned that Toms showed a lot of advantages for Bloomfield. Actually reading the review, those advantages were few and far between, just like at most sites.
Fact -
1. Straight Quote from the summary at Toms and I do not see where they are crazy about Bloomfield except for the workstation crowd or those that have to have six cores on the desktop -
"...Now that we’ve had a couple of weeks with final hardware the Core i5 and Core i7 processor families are even more fascinating.
To begin, they make it much harder to recommend LGA 1366-based Core i7s. We know the i7-900-series is supposed to be higher-end, and it’s hard to ignore the fact that next year we’ll see hexa-core Gulftowns that drop right into our X58 motherboards. But seriously...
Alright, so the Core i5-750, specifically, is priced well. What is there to like about it? Reasonable power consumption, a base clock rate comparable to Intel’s Core i7-920, a more-aggressive Turbo Boost able to take the chip to 3.2 GHz in single-threaded workloads, CrossFire and SLI compatibility—it’s a pretty compelling list, actually.
...More attractive for the folks who stand to benefit from Hyper-Threading is Core i7-860. Its price tag puts it in the realm of Core i7-920, its Turbo Boost helps make it faster, and a complementary motherboard is going to cost you between $75 and $50 less."
Explain to us why those comments did not warrant a response from you at Toms in the same manner that you have posted here. They provided the same type of conclusion, only half the benchmarks and none of the followup by the guys here at AT.
Yet, you never once complained at your site about what you thought was a serious enough problem to call the editors here idiots and to make other personal attacks on their work. Why they have not banned you by this point is beyond me.
I did send an email to Chris Angelini tonight requesting him to read your posts here at AT and asking him if this is the type of employee Toms is proud to have on their staff. You have gone way beyond the norm in your continued attacks on this site and hopefully Chris will be a stand up guy and address this issue with you immediately.
Gary Key - Thursday, September 24, 2009 - link
"Also, you probably didn't notice, because Gary hid it, that the voltage needed to overclock the Lynnfield was considerably higher than the Bloomfield."

The voltages were never hidden; they were right there in the gallery along with the Uncore and memory subtimings, as stated in the text. Regarding the uncore rates, they make very little difference, if any, except in the prized SuperPi benches.
Even though Lynnfield needed additional voltages to overclock, the total power consumption and thermals were still lower than Bloomfield. Say what you want, but Bloomfield is not significantly better in any regard for desktop users than Lynnfield. In fact, for most, the opposite is true.
JarredWalton - Wednesday, September 23, 2009 - link
uATX is smaller, and the cases are smaller, so more heat is definitely a concern. You are also placing way more stock on places where Bloomfield wins and pooh-poohing any areas where Lynnfield wins or ties. There's no benchmarking silliness going on, except in your expectations.

Places where Bloomfield clearly wins: WinRAR (super memory intensive, since it has to search for matching patterns when compressing). WME... does anyone really care about WME when we have x264 being clearly superior? And that's it. Everything else is within 5%, which is one speed bin.
To say that we tried to hide the additional voltage required for overclocking Lynnfield is another fabrication. There's a whole section (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...) where that is specifically addressed, as well as being mentioned in the conclusion about the "stock voltage overclocking". If you overclock, I hardly think voltages and power are your primary concern. You want to stay stable, and people have been boosting Intel CPU voltages far more than what we've done on Lynnfield.
Personally, Lynnfield makes more sense, simply from the financial aspect. If you want top performance, go for Bloomfield. If you want better performance without dropping a load, Lynnfield wins, even when you factor in overclocking. Only the extreme fringe is really concerned with more than Lynnfield offers. You're having fits over differences of 5% or less in most cases (DDR3-1066 vs. DDR3-1333, non-Turbo overclocked performance, etc.)
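For context on the "one speed bin" framing, here's the quick arithmetic (the ~133 MHz step size is an assumption based on the bus multiples Intel used in this era, not a figure from the article):

```python
# One "speed bin" around 2.66 GHz, assuming ~133 MHz steps between adjacent
# models (an illustrative assumption). That works out to about 5%, which is
# why a 5% benchmark difference is roughly one model number.
bin_mhz = 133
base_mhz = 2660
bin_pct = bin_mhz / base_mhz * 100
print(f"One speed bin around 2.66 GHz is about {bin_pct:.1f}%")
```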
The only truly bad thing about the Lynnfield platform right now is that we're missing a decent IGP solution, and Intel is going to be the sole chipset provider for a good long time. Those are problems with Bloomfield as well, and they're not even serious concerns since you can easily add an inexpensive GPU.
That's my take anyway, but then I'm still perfectly happy with my desktop (overclocked) QX6700/Q6600 systems that I use for work and gaming. The headaches of trying to get everything transferred to a new system aren't worth the minor performance increases I'd see. Most of the time, my PC is waiting for me to finish typing/dictating/mousing.
strikeback03 - Thursday, September 24, 2009 - link
One other possible reason for X58 is that the extra PCI-E lanes could be used for USB3 or SATA3 controllers. If you are only using a single graphics card, you still have another x16 for a controller, while with P55 it's x8, and it also drops the graphics card to x8.

Cat - Wednesday, September 23, 2009 - link
I'd like to know how multithreaded apps perform (4-8 threads), given that the CPU will definitely clock down under this kind of load. How does the mobile i7 perform compared to the mobile C2Q in this situation? Can you run Valve's particle or map compilation benchmarks?
Cat - Wednesday, September 23, 2009 - link
I know the CINEBENCH scores do provide some insight, but it's not representative of the workload I'll be using.

JarredWalton - Thursday, September 24, 2009 - link
I'll see if I can dig out the old Valve SMP tests... or are there newer versions? The files I have seem rather old and outdated (pre-EP2) so I'd prefer a test that's current if you have anything.

Anonymous Freak - Wednesday, September 23, 2009 - link
I'd be perfectly happy with the 820QM along with lower-end graphics. Maybe a nice 4830 or GeForce 9800? Something lower-power-using, but still adequate for most non-ultra-high-end games.

My problem is that I like to have only a laptop as my primary computer; and I do things like edit HD video, so four cores are what I want; yet I want it to be portable and have decent battery life, too. (Yup, I have a MacBook Pro.) So the idea of the quad-core Clarksfield is perfect. It's a low-power-draw chip when not in full load, yet can ramp up very well for high-load situations, both low-threaded and multi-threaded. Now I want a GPU that is similarly dual-natured. Something with very low idle power, but which can ramp up to high power when needed.
(I only do my serious high-load work when plugged in, so power draw at load doesn't matter so much, it's power draw at idle and low-load that matters to me.)
james jwb - Wednesday, September 23, 2009 - link
Power draw is too high for my liking; I was expecting better than this. More interested in Arrandale due to the above - quad-core is overkill for most on a laptop anyway.

And someone bring an S-IPS panel to a laptop already!
Roland00 - Thursday, September 24, 2009 - link
Problem with an IPS panel in a laptop is that IPS panels need brighter backlighting, due to the technology letting less light through than a TN panel. Thus higher power consumption.

strikeback03 - Thursday, September 24, 2009 - link
That's fine, but make it an option for those of us who would happily give up some battery life for a better panel.

Pneumothorax - Wednesday, September 23, 2009 - link
Looks like the next decent launch of laptop chips will be 32nm. This hot & overpriced chip reminds me of the ol' crappy Pentium 4-M's that were around prior to being destroyed by Banias.

Exar3342 - Wednesday, September 23, 2009 - link
You realize these use the same power as most existing dual-cores, right?gstrickler - Wednesday, September 23, 2009 - link
No, they don't. The Clarksfield CPUs are 55W or 45W TDP. The current Core 2 Quad mobile CPUs are 45W TDP. You're getting more performance for similar maximum power usage to the C2Q, and lower idle power, so it's definitely an improvement overall.

The current line of Core 2 Duo mobile CPUs tops out at 35W TDP. Switch to 45nm and step down in speed and you can get 28W @ 2.8GHz, 25W @ 2.66GHz, 17W @ 2.13GHz, or 10W @ 1.6GHz, all as Core 2 Duos.
Lower voltage (and lower TDP) versions of the Core i7 mobile CPUs may show up in the future, but right now, they definitely use more power than Core 2 Duo mobile CPUs, and are similar to current Core 2 Quads.
jcompagner - Thursday, September 24, 2009 - link
You are calculating it wrong. The 920 is an Extreme part, so let's take the 820; that's a 45W part. But do remember that it has much more built in than the Core 2 Duo!
It has the memory controller, and it has the PCI-E controller.
The big question is: what does the complete Core 2 Duo/Quad platform (the CPU including the complete chipset) use as power compared to the 820/PM55 combo? That's the question people have to ask.
I find the current setup really good. Anand was a bit off with the first two battery life charts. It was corrected with the relative battery life chart (the first two shouldn't even be shown; they are completely irrelevant), and there you see the outcome is pretty good.
So no, they are not similar to the current Core 2 Quads. You can't compare them 1 on 1.
About desktops vs. laptops: around me (friends, co-workers, family) there are almost NO desktops anymore. I only have one desktop, and it isn't used as a real desktop; it's used as a Media Center below the TV. Almost everybody is using laptops; it's easily 5 laptops for every 1 desktop.
So I don't get it: who is buying all those desktops?
JarredWalton - Thursday, September 24, 2009 - link
I wouldn't say the first two battery life pictures are "wrong" - they tell you what the current W87CU will get with the default battery. I don't even know if there's an extended capacity battery available. To ship this sort of system with a puny 42Whr battery is at best very weird. The battery casing is actually very large too, so I don't know why they didn't go with at least a 9-cell ~65Whr battery. That would boost battery life by 50%.

jcompagner - Friday, September 25, 2009 - link
The review compares apples and oranges here. We have an article that compares a new processor with other (older) processors. This is not a review of the W87CU laptop.
If, by chance, a completely different laptop with a 90Whr battery had been used, that picture would suddenly look totally different!
So it's completely random.
So the only way to look at it is the real power usage of the complete laptop, not how fast that laptop would run out of its battery.
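At a fixed power draw, runtime scales linearly with battery capacity, which is all the earlier 42Whr-vs-65Whr point amounts to. A quick sketch (the capacities are the ones mentioned above; the average draw figure is purely illustrative):

```python
# Battery runtime scales linearly with capacity at a fixed average power
# draw. 42 Whr (stock) and ~65 Whr (hypothetical 9-cell) come from the
# discussion above; the 25 W average draw is an illustrative assumption.

def runtime_hours(capacity_whr, avg_draw_w):
    return capacity_whr / avg_draw_w

stock = runtime_hours(42, avg_draw_w=25)
nine_cell = runtime_hours(65, avg_draw_w=25)

boost_pct = (nine_cell - stock) / stock * 100
# ~55%, in line with the roughly 50% estimate quoted earlier; note the
# draw figure cancels out, so only the capacity ratio matters.
print(f"Runtime boost from 42 -> 65 Whr: {boost_pct:.0f}%")
```

Which is also why comparing runtime across laptops with different batteries says nothing about platform efficiency unless you normalize for capacity.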
Someguyperson - Wednesday, September 23, 2009 - link
Did you even read the article, or did you just skim it? Even though the chips have a higher TDP, the new Clarksfield chips use less power than the old quads. Look at the last four charts of the article and tell me that the 920XM was not the most efficient processor. Keep in mind that this chip is not only the most power-hungry of all the Clarksfields, but it is also paired with a GTX 280M, a 17" screen, and a 42 watt-hour battery.

To compare, the new ASUS CULV-based notebook gets 4.93 Minutes/WHr, but it is clocked 300 MHz lower than the quad will ever go, the CULV platform has a G210M GPU, and there's the whole dual vs. quad w/ Turbo Boost and Hyper-Threading thing. To put it in more reasonable terms, let's compare the new Clarksfield with my laptop, the Studio XPS 16. Jarred tested the Studio XPS with a P8600 and an HD 3670; I have a P8700 and an HD 4670 (which is much better than the HD 3670, by the way), which use a little less power, but they are close. Anyway, the Studio XPS 16 uses 36 watts at idle, just like the 920XM machine. At full load, the XPS 16 uses 93 watts, or 110 watts with max brightness, and let me tell you, it is bright. The 920XM machine manages 90 watts with a full CPU load and 143 watts with a max load on the CPU and GPU.
What does all this mean? It means that the 920XM will not just be a desktop replacement, but a desktop replacement that you can actually use under light load away from the outlet (things like internet, music, word processing), whereas before you needed an A/C adapter if you wanted to use your laptop longer than 30 min. That's my two cents.
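The Minutes/WHr metric quoted above is just measured runtime divided by battery capacity, which factors battery size out of the comparison. A sketch (the 4.93 figure is the one quoted for the CULV notebook; the Clarksfield runtime is an assumed value for illustration, not a measurement):

```python
# Minutes-per-watt-hour normalizes battery life by battery capacity, so
# machines with different batteries can be compared on platform efficiency.

def minutes_per_whr(runtime_minutes, capacity_whr):
    return runtime_minutes / capacity_whr

culv = 4.93  # quoted above for the ASUS CULV notebook
# Assumed illustrative runtime for the 42 Whr Clarksfield system, NOT a
# measured figure from the article.
clarksfield = minutes_per_whr(runtime_minutes=145, capacity_whr=42)

print(f"CULV: {culv:.2f} min/WHr, Clarksfield (assumed): {clarksfield:.2f} min/WHr")
```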
gstrickler - Wednesday, September 23, 2009 - link
I read the article, apparently more carefully than you did. The Core 2 Quad it's compared to has 2x GTX 280M GPUs, an HDD, and an 18" screen, compared to the 920XM with a single GTX 280M, an SSD, and a 17" screen, and the screens are different resolutions (which will impact CPU/GPU load). We don't know if the GTX 280M GPUs were running at the same clock rate or if they used the same memory type or memory frequency, all of which affect both idle and load power usage. Also, as neither of these machines was optimized for battery life, we have no information about the efficiency of their power supply systems.

The bottom line is that a comparison of the power consumption on these machines is NOT a direct comparison of the CPU and chipset efficiency. Find a test with two laptops that differ only in the CPU and chipset, then we'll see which one actually uses less power.
"Max brightness" on two different displays with no measurement of the actual brightness, screen size, screen type, and lighting type is meaningless as a point of comparison. Two different laptop displays at max brightness can draw significantly different amounts of power based upon the factors I mentioned.
The idle power of the Clarksfield CPUs is very promising, and that might make it a better CPU for battery powered devices, but that's not a valid conclusion to draw from the tests because the machines had too many other differences.
Don't be so quick to jump to conclusions. While TDP ratings are NOT power utilization ratings, they are an indicator of the maximum power demands of the CPU. Since the 920XM has a 55W TDP, but its slower versions have a 45W TDP, it's predictable that the 920XM exceeds 45W under load while the QX9300, with its 45W TDP, remains under 45W under load. Intel's TDP #'s are a good guideline for maximum load power.
JarredWalton - Wednesday, September 23, 2009 - link
Regarding the "max brightness", that was actually incorrect. I used new values for the laptops with the LCDs always at 100 nits. It's still not a perfect comparison, since one LCD at 100 nits might use 3W and another could use 8W depending on size and backlight technology, but it's closer than using the max brightness (as your comments indicate).

FWIW, Core 2 Quad QX9300 and the i7-920XM appear similar in max power draw but with the i7 part having far better idle power. A C2Q with two 10W CPUs would end up running at 1.6GHz tops and use ~20W TDP, but idle power still won't be as good as the 920XM because of Power Gate. What would be really interesting would be Power Gate tech moved into Core 2 CPUs, but that ain't gonna happen anytime soon I suspect. :)
gstrickler - Thursday, September 24, 2009 - link
Can the QX9300 on the Eurocom system be clocked down to 2.0GHz? Can you disable Turbo and HT on the Core i7-920XM on the Clevo W87CU system? If so, you could compare a C2Q @ 2.0GHz vs a 920XM @ 2.0GHz (with and without HT). Run CPU (not GPU) intensive tasks and see how they perform and how much load power they each use. That should give a good indication of the relative instructions per clock of the two architectures as well as the performance/watt.

There is no need to repeat any of the gaming tests, just a couple single-threaded and multithreaded CPU intensive tests plus idle and full CPU load power usage. It might let us put to rest the lingering questions of whether C2D or Core i7 is a better core architecture for mobile systems.
Granted, there are still other system differences that we can't eliminate, but as long as SLI is not enabled on the Eurocom system, we can get them to be fairly close.
gstrickler - Wednesday, September 23, 2009 - link
Agreed. Aside from better power management (power gating and turbo mode), I'm not yet convinced that the Core i7 is more power efficient than C2. I don't expect to ever see it, but a C2Q with the power management of the i7 might make an excellent laptop CPU.

As for the "Max brightness" comment, I was addressing the other poster's reply about tests of the Dell Studio 16. I don't know whether the 100 nit level was used there, but as noted, power can still vary significantly.
JarredWalton - Wednesday, September 23, 2009 - link
Arrandale is what we want, really: dual-core with Hyper-Threading. That should cut maximum CPU power use down substantially, and there will be 25W and 35W parts (and likely 17W as well). Restricting Turbo modes to lower clocks will also help. Right now, Clarksfield is max performance within a much greater thermal envelope than most laptops allow.

Wolfpup - Thursday, October 15, 2009 - link
*I* want a quad core laptop! No question dual core is kind of anemic anymore. I mean it's been silly to go dual core on the desktop for YEARS, yet we're still stuck mostly with dual core on notebooks :-/

I'm really more interested in how those 1.6 and 1.73GHz parts do versus faster-clocked Core 2 Duos and Quads. The clock speed obviously is kind of frighteningly low, so I sort of need to see benchmarks showing that 1.6 or 1.73 actually gives you a competent system (I'm sure it does, but...)
And yeah, I game on my 2.4Ghz Penryn dual core with mobile Geforce 9650GT. I'd like better, but a desktop isn't an option for me anymore, so I'll just upgrade my notebook as needed :)
gstrickler - Wednesday, September 23, 2009 - link
While Arrandale is promising, I would be similarly interested in a 25W C2Q. Since they can make 10W and 17W C2Ds, they should certainly be able to make 25W and 35W C2Qs. Arrandale should be faster when running 1 or 2 threads, but a 1.6GHz C2Q @ 20W TDP (2 x SU9600) should perform as well or better when running all 4 cores. As a bonus, the C2Q could work with the Nvidia 9400M chipset for very good IGP performance; add an optional discrete GPU for those who want something faster. Until Intel demonstrates that they can actually deliver a good IGP, Arrandale doesn't sound all that wonderful. Just a thought.

gstrickler - Wednesday, September 23, 2009 - link
Let me clarify a bit. I would be far more interested in the Core i# CPUs if they didn't have an Intel GPU built in and if I had the option of a good non-Intel chipset. Since Intel and Nvidia seem to be in a pissing contest over the licensing that would allow Nvidia to build an i#-compatible chipset, the future of a low power CPU, chipset, and GPU (that doesn't suck) looks questionable on Arrandale.

For those of us who don't need a discrete GPU, but want decent graphics performance AND excellent battery life, an all-Intel solution does not look promising. At best, it looks like a 25W Arrandale with an Intel chipset and a discrete low-power ATI or Nvidia GPU.
While HT on the Core i# CPUs is better than HT on P4, it's still nowhere near the benefit of doubling the real cores. I would rather have a non HT (e.g. Core i5) based quad core than Arrandale.
yacoub - Wednesday, September 23, 2009 - link
55% market share is laptops, but they don't mention if those people also own a desktop - or more importantly, build their own desktop.

When you consider that more and more of the people who want a desktop are enthusiasts who build their own, and those numbers aren't going to be counted in desktop sales, which only count pre-built big-box manufacturers like Dell, etc., you realize that chart means little.
So in reality the chart is a great marketing tool: It's "true" in one sense, but it doesn't tell the whole story.
jordanclock - Thursday, September 24, 2009 - link
Pre-built machines from Dell, HP, Apple, etc. account for the vast majority of systems sold. Custom-built computers are a niche. I suspect custom-built computers would be lost in the margin of error.

pervisanathema - Wednesday, September 23, 2009 - link
What means little is your phantom statistic "when you consider that more and more of the people who want a blah blah blah."

That's your opinion. You have no evidence to back that up. My opinion is that you are very wrong and that most people just buy the cheapest prebuilt rig they can find.
7Enigma - Thursday, September 24, 2009 - link
Agreed. We enthusiasts are in the vast minority.

Phynaz - Wednesday, September 23, 2009 - link
They are talking about CPUs sold. If 55% of the CPUs sold are mobile, it's a good bet that about 55% of the systems those CPUs are being put into are laptops.

yacoub - Wednesday, September 23, 2009 - link
Where does it state "CPUs sold" on that chart? Also, then it would be only Intel data.

More likely it is what it says: a statement about total mobile clients (aka systems) sold as a percentage of total PC sales.