43 Comments


  • mmatis - Wednesday, September 1, 2010 - link

    Perhaps you should make up for your failure by offering an additional giveaway to your readers. I suspect most of them would be satisfied with a complete home theater system...
    }:-]
  • iwod - Wednesday, September 1, 2010 - link

    I don't know if I'm disappointed, since I've always wanted a SUPER efficient integrated GPU inside the CPU while a more powerful one sits outside. Hence I really like the idea of a southbridge + GPU, which I've been calling for for nearly two years.

    The GPU inside the CPU would (hopefully) be enough to handle hardware video decode, graphics acceleration for web browsing, desktop acceleration, etc. And in this scenario, not supporting OpenCL wouldn't matter, because that job is left to the other GPU. The Intel GPU wouldn't have to waste transistors on compute and could focus solely on graphics.

    But I am disappointed in some ways, because I saw leaks suggesting Intel is spending nearly half the chip on the GPU (including the memory controller). If that's the case, then Intel's GPU is not really power/transistor efficient.

    A GT 220 (the 330M in the MacBook) is roughly 2-4 times faster than a 5450, which in turn is roughly on par with the Intel GPU.
  • IntelUser2000 - Wednesday, September 1, 2010 - link

    You can't count the memory controller as part of the GPU because it's shared. The GPU portion only takes ~40 mm² or so. That will turn out to be the most efficient GPU in terms of die size.
  • mino - Thursday, September 2, 2010 - link

    Umm, maybe.
    But counting microelectronics efficiency by mm² alone is kind of pointless.

    Also, when you remove the memory controller and video block from the 5450's 73 mm² (at 40 nm!), you get a comparable if not smaller GPU die.
  • Wayne321 - Thursday, September 2, 2010 - link

    In the case of desktop chips, how difficult would it be to write a driver that uses only the integrated GPU during certain workloads (or by user choice) and passes the video feed to the monitor through the dedicated video card while keeping it almost completely powered down? Considering how much power video cards draw even at idle, there's a good case for saving power when the integrated graphics can do the job.
  • iwod - Thursday, September 2, 2010 - link

    NVIDIA Optimus already lets you do exactly that in its drivers (only on laptops at the moment...).
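
    For the curious, the idea behind it (just an illustrative sketch of the policy, not NVIDIA's actual driver code - the app names and function here are made up) is that the IGP always owns the display, and the dGPU only powers up for profiled apps, with its rendered frames copied into the IGP's framebuffer:

        # Illustrative Optimus-style switching policy (hypothetical, simplified).
        # The IGP always drives the panel; the dGPU renders only for apps on a
        # profile list, and its finished frames are copied to the IGP for display.
        DEMANDING_APPS = {"game.exe", "3dmark.exe"}  # made-up profile entries

        def choose_renderer(active_app: str) -> str:
            if active_app in DEMANDING_APPS:
                return "dGPU"  # wake the discrete GPU, copy frames to the IGP
            return "IGP"       # dGPU stays fully powered down

        print(choose_renderer("notepad.exe"))  # -> IGP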
  • Drag0nFire - Wednesday, September 1, 2010 - link

    Thank you for fessing up to the mistake. Your high journalistic standards are what make this site so amazing! Many other sites wouldn't make such a big point of correcting a published article...

    Keep up the great work!
  • anandreader106 - Thursday, September 2, 2010 - link

    Now if only Anand could somehow convince DailyTech (which doesn't have any journalistic standards whatsoever) to do the same for their articles. That would be amazing!!!
  • DigitalFreak - Thursday, September 2, 2010 - link

    Hear, hear!
  • bah12 - Thursday, September 2, 2010 - link

    It would indeed. On DT, if you even dare to point out a typo, you (usually) get downrated and flamed. AT welcomes the correction and thanks the user for finding it, the way a true journalism site should.

    Case in point.
    http://www.dailytech.com/Plugin+Electric+Vehicle+S...

    The picture caption says Nissan Volt instead of Leaf, and it's still not corrected 24 hours later even though it was pointed out in the comments.
  • Taft12 - Thursday, September 2, 2010 - link

    Hallelujah! Someone give this man a +6!
  • anactoraaron - Wednesday, September 1, 2010 - link

    I have been curious about integrated graphics power consumption with Arrandale/SB in the notebook realm... I LOVE the Core 2-era notebooks with Optimus and their ~12 hr potential battery life (a.k.a. yesterday's ASUS UL series). But it seems like Arrandale doesn't really get there on battery life, and I am wondering if there's some way for the integrated graphics to have the same "power gating" that the CPUs have (is this already the case?).
    What I would like to see is an improvement to Optimus (NVIDIA) and switchable graphics (AMD) for on-die graphics as it relates to power gating. It seems to me the next logical step is being able to completely turn off the integrated portion, especially now that SB shares resources with the CPU when using discrete graphics. Or being able to turn off one core when not doing anything 3D, etc.
    Is this already happening and I've missed it... or am I one of many who think this is needed?
  • Randomblame - Wednesday, September 1, 2010 - link

    I thought that performance was too good to be true for the low-end Intel graphics. I was hoping we'd see more performance on the mobile front; the possibility of up to 50% more oomph over what we saw in your benchmarks was awesome, but I guess we all just forgot these are Intel integrated graphics. You can't expect too much. The results in your benchmarks are still respectable considering the source, and if we want more performance we can still go the discrete route.

    I sure hope the next die shrink brings at least a doubling of execution units for the graphics, then I'll be happy.

    You know what would be very interesting? A method of accessing and using these extra GPU transistors for compute, CUDA-style. A lot of people are going to be running discrete graphics...
  • Randomblame - Wednesday, September 1, 2010 - link

    I wonder if all of these chips are made with the extra execution units, with half of them disabled to improve yields when they don't work. They are still in the sampling stage right now; they have to get it right before they can start making native GT1 parts to save silicon. Once they start making GT1 parts, perhaps we will see a version with the graphics disabled altogether as another way of improving yields.
  • s44 - Wednesday, September 1, 2010 - link

    "the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1"

    This seems backwards, at least from the consumer perspective. Although there's a need for a minimally powerful HTPC part, won't any enthusiast desktop user who wants shader power just buy a GPU?
  • beginner99 - Thursday, September 2, 2010 - link

    Most enthusiasts won't buy this LGA 1155 stuff anyway and will wait until Q3 2011 for the real thing. But in general I agree with you.

    Why Intel does this is simple, though: it would make life even more complicated for normal consumers if a slower CPU had a better GPU.
  • IntelUser2000 - Thursday, September 2, 2010 - link

    Oh yeah, because 3/4-channel memory and 2/4 extra cores will help in games, which is the biggest reason for enthusiasts to buy.
  • shiznit - Thursday, September 2, 2010 - link

    +1

    I consider myself an enthusiast but I also like to save money, and I doubt I'm alone.

    A $150 Core i5 750 @ 4.0 GHz with $70 of 2x2GB RAM and a $100 motherboard (~$320 all in) feeds my 5870 just as well as an i7 920 X58 platform that costs a lot more.

    enthusiast != big spender
  • richardginn - Thursday, September 2, 2010 - link

    If it was GT2 power we looked at in the preview, then you're looking at finding it in a $300-plus CPU.

    I'd say at that price point you'll also be buying a $200 video card anyway, which defeats the purpose of owning GT2 graphics, right?
  • IntelUser2000 - Thursday, September 2, 2010 - link

    "The desktop Sandy Bridge GPU rollout is less clear. I've heard that the enthusiast K-SKUs will have GT2 graphics while the more mainstream parts will have GT1. I'm not sure this makes sense, but we'll have to wait and see."

    Wait, does that mean there's still a possibility the tested part was GT1? The tested part was apparently an i5 2400, which doesn't have a K variant at all.

    @ those complaining about K parts with graphics:

    It doesn't mean only the "K" variants have it; it probably means all the CPUs that have K options have it. The i5 2500 has a K version as well.
  • ssj4Gogeta - Thursday, September 2, 2010 - link

    It was an engineering sample, not the retail 2400. The only thing it had in common with the 2400 was the clock speed.
  • ssj4Gogeta - Thursday, September 2, 2010 - link

    It would make sense if we had switchable graphics on the desktop.
  • teq_zombie - Wednesday, September 1, 2010 - link

    Interesting way to put it.

    Another way to think about it: you colluded with one of the very few Intel customers who have legitimate access to these chips in order to gain unauthorized access.

    At the very least, you did NOT acquire the chip on your own.
  • ggathagan - Wednesday, September 1, 2010 - link

    Try not to be too much of an a$$hat.

    If you resist the urge to take that single sentence outside of the context of the whole paragraph, it's not too difficult to discern that Anand is referring to whether or not they received the chip directly from Intel as part of an Intel-sanctioned review.
  • MadMan007 - Wednesday, September 1, 2010 - link

    Kind of a shame if only the 'K' CPUs will have the good graphics, and kind of ironic too. Most enthusiasts who get the K CPUs will have discrete cards as well and won't even care about a 6 vs. 12 EU IGP.

    Personally, I'm almost fine with the Clarkdale IGP since I don't care much about newer games at all (btw Anand, when are you going to test older games running on IGPs, or perhaps give us an idea of which older GPU is equivalent to a modern IGP? We all know IGPs are too weak to provide a good experience at decent resolutions/settings in newer games, but there are loads of older games that are just as good), though the IGP speed of SB would be welcome.

    I just don't want to shell out for a K-series CPU; I was hoping to get the cheapest 4c/8t option with 12 EUs and OC it. Being somewhat 'forced' to buy the K CPUs doesn't sit well with me :/
  • mino - Thursday, September 2, 2010 - link

    I too would love HW sites to pay more attention to older games - especially compatibility with Intel IGP drivers.

    Aside from flawless Diablo II, I have seen BSODs, hard freezes, screen corruption, and all other sorts of crap from the latest Intel video drivers.

    While it would be nice to get 10 FPS in current games and benchmarks, I do not care about them on an IGP. No one should.
  • ssj4Gogeta - Thursday, September 2, 2010 - link

    Anand said in his article that overclocking will be very limited on the non-K variants.

    I just wish we had switchable graphics on the desktop too; then the IGP in the K-series chips would make sense.
  • Regenweald - Thursday, September 2, 2010 - link

    I have no interest in Intel integrated graphics on a desktop, but Sandy Bridge seems mightily impressive for a laptop - the 12 EU version, that is. The bottom of the barrel has finally risen to an acceptable height. I'm excited to see the SB and Llano form factors next year.
  • ET - Thursday, September 2, 2010 - link

    Thanks for the update. The IGP mostly matters on the laptop side, where a user can't add a discrete card and is stuck with what's available, so it's good that Intel is using the higher performing part there. I think that this performance level is a good baseline.
  • Khato - Thursday, September 2, 2010 - link

    Hopefully all the graphics performance questions revolving around Sandy Bridge will be answered then. Best of all would be if we also get to see what the turbo mode performance looks like on a GT2 part.
  • ClagMaster - Thursday, September 2, 2010 - link

    Apology is accepted.

    This was a black-op test of a motherboard vendor's sample. You have done this before with Lynnfield and Conroe sample CPUs, and despite small disparities between the previews and the final products, you were accurate in your preliminary performance assessments. I am surprised the contact who provided you with this sample did not know whether it was a GT1 or GT2 processor.

    As you pointed out in response to an earlier observation, this is a preliminary test to be supplemented by a more comprehensive (and final) test report when the hardware is released.
  • krumme - Thursday, September 2, 2010 - link

    Thank you, Anand - respect.
    Thank you for posting it on the front page.
    Doing this sort of preview is surely a delicate balance, and dangerous in many ways. I don't think we need to know how you acquired the CPU; nothing has to be said about "formal" - who knows what formal even means here?
    I never believed for a second that the preview was sanctioned in any formal way by Intel. But nonetheless, I was still left with a voice saying "favor" in my ear when I read it. It's okay, we get the preview, but this was over my personal limit. When I read the review, I got the clear impression that the message was that SB was OK for low- and mid-range gaming. It clearly isn't, and it doesn't have to be. It looks to be a very fine CPU.
    Again - thank you for the update on the front page.
    Best regards
  • mcturkey - Thursday, September 2, 2010 - link

    Between the thoroughness of your reviews and your honesty when something you report winds up being incorrect, this is far and away the best site for hardware and technology reviews. Keep up the good work!
  • Stuka87 - Thursday, September 2, 2010 - link

    When I first read the preview I did find it interesting that Intel would provide chips so early on that were not yet complete. Now it all makes sense.

    I too was surprised by the graphics performance if those were indeed GT1 chips. But even if they are GT2s, that is still a HUGE jump over what Intel currently offers. And it means the GT1 should still easily outperform what is offered currently as well.

    And as others said, thanks for being so honest with us, Anand. It's one of the main reasons I visit this site every day. I don't feel I have to worry about anything being twisted or biased one way or the other. If you make a mistake, you correct it. And that means a lot to us readers.
  • justaviking - Thursday, September 2, 2010 - link

    Anand,

    Adding my voice to the others, thank you for the front-page correction.

    It is good that the error stings. That means you care. But don't apologize too much. I also believe you were quite clear that many things were uncertain and there was much speculation in the original article. For that matter, even your update here includes speculation.

    It must be a difficult balance: do you publish a speculative report, or do you delay giving us the information we crave? I vote for early reporting, as long as it's clear that's what we're getting.

    I have always appreciated these things about AnandTech ("You" being all your authors):
    - You present data and analytical results as much as possible
    - You add your own subjective analysis and opinion
    - You clearly distinguish between objective and subjective comments
    - You give the "why" behind opinions
    - You give us a look behind-the-scenes
    - And, as far as I can tell, you strive to be honest and forthright in your articles, as proven today

    Thanks again for the prompt update. This is why I recommend AnandTech.com on nearly every online forum I participate in.
  • Vamp9190 - Thursday, September 2, 2010 - link

    So what is the best path to upgrade from an older Q6600 Kentsfield CPU and motherboard?

    Wait for Sandy Bridge and go 1155? Wait until then and buy an i7 930 on 1366 for less than it costs today (but how much less?)?

    Wait until Q3-Q4 of next year for LGA 2011?

    Oh, and I want to be able to OC to 4.0 GHz+ (on water if needed), but spend ~$300 for the CPU and ~$200-250 for the motherboard.

    Thoughts?

    Thanks.
  • tatertot - Thursday, September 2, 2010 - link

    If you are fairly sure this was a 12 EU sample, and certain the turbo was disabled, that does leave one other relevant question:

    What speed was the iGPU running at?

    12 EU quad-core mobile parts seem to mostly run at 650 MHz base / 1300 MHz turbo, while 6 EU high-end desktop parts seem to be 850 MHz base / 1350 MHz turbo.

    So for your non-turboing part, was it 850? 650?

    thx
  • ibudic1 - Thursday, September 2, 2010 - link

    Dear Anand,

    I would respectfully disagree with your logic in attributing performance per watt and per dollar to the CPU.

    It is misleading to say that performance/(watt, cost) of CPUs is what matters.

    You say: "But clock for clock isn't what matters, it's performance per dollar and performance per watt that are most important" -> This simply is not true for almost any user, save supercomputing centers.

    The cost of the CPU - especially in a laptop - is a fraction of the cost of the whole system. Purchasing a laptop that may need to be replaced sooner because of worse fundamental performance will in the end cost more than one with a better CPU.

    To wit: buying a laptop now with a low-performing but inexpensive CPU might force me to buy a completely new machine in, say, one year. A laptop, as you know, has many more components than the CPU alone.

    On the other hand, buying a laptop with a more powerful, even disproportionately more expensive, CPU is less expensive in the long run, since one would not need to replace the entire system for much longer - say, a year and a half.

    This is, of course, much less important in the case of servers and supercomputing centers, where the cost of the CPUs is the main cost of the installation. Replacing an LCD screen in a supercomputing center or on a server is a trivial cost - not worth mentioning.

    Having to replace an entire laptop because of one component is NOT trivial.

    It is a losing proposition to look at price/performance of the CPU in a laptop and make your purchase accordingly.

    However, if your goal is to use your influence to help AMD stay competitive, I agree with you. Personally, I will look out for my own interest and buy whatever has the best price/performance long term - and at the moment, AMD is just not competitive, at least in the desktop/laptop market. For servers they can be a better choice.

    So thank you, Anand, for giving an unfair advantage to AMD, which will put pressure on Intel to drive their prices down. :)
  • ClagMaster - Thursday, September 2, 2010 - link

    I would respectfully disagree - the performance/power ratio is an essential metric for a laptop. Double the performance at the same power is a real gain in performance.

    I certainly use this metric for deciding when it's time to upgrade either a CPU or a graphics card. I limit CPU power to 95 W and discrete graphics card power to 65 W. Unless I get double the performance at the same power level, the component is not worth upgrading.

    However, a 2x increase in performance usually requires a complete overhaul of the system to realize the full performance potential.
  • 7Enigma - Thursday, September 2, 2010 - link

    This is a pretty big disappointment, since we have no clue about the difference between the 6 EU and 12 EU parts. The original review now doesn't have much validity.

    Under the incorrect assumption that this was a 6 EU GPU, and given that turbo was disabled, we could be relatively sure the preview showed the BASE performance, with the combination of the extra 6 EUs and turbo giving an increase. Now, since we don't know the impact of having 6 vs. 12 EUs (a 20% increase? 50%?), nor the actual effect of turbo on the GPU, there really is no way to guess at the performance level of the 6 EU part.
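
    Just to illustrate how wide the error bars are, here's a toy calculation (the measured figure and scaling factors are all hypothetical, and linear-with-EU scaling is a big assumption - real scaling depends on how bandwidth-bound the workload is):

        # Toy bounds for the unknown 6 EU part, assuming the preview numbers
        # came from a 12 EU sample at base clocks (turbo disabled). All
        # numbers here are made up for illustration only.
        measured_fps_12eu = 30.0  # hypothetical preview result

        for scaling in (0.5, 0.7, 0.9):  # compute-bound .. bandwidth-bound
            est_6eu = measured_fps_12eu * scaling
            print(f"6 EU estimate at {scaling:.0%} of 12 EU perf: {est_6eu:.1f} fps")

    Even under those simple assumptions the estimate spans a huge range, which is exactly the problem.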

    I thank you, Anand, for the update, but I kind of wish I hadn't read the original preview... it was like being teased with a 'Vette and given a beater. :(
  • 7Enigma - Monday, September 6, 2010 - link

    On second thought, the original article does give a baseline for the laptop chips (without turbo, so actual performance should hopefully be slightly better). Not too shabby, but the difference would have been even greater if the original data had been for the 6 EU part.
  • Donnie257 - Friday, September 3, 2010 - link

    No way am I disappointed, since I'll never use an IGP except in a laptop anyway. I'm more interested in the higher end of the mid-range models, such as my current i7 930. There is no way for me to be disappointed when I'm not going to buy this for games. Who in the #^!! buys an IGP for games? I'm more impressed with the performance gains on what are already the best processors on the planet LOL! The only thing I see as important for an IGP is desktop 2D/3D GUI acceleration. With so little competition, Intel could be ripping us off even more, like in the bad old days when the Athlon 64/X2 ruled the roost! Now even Apple Macs can rest a little easier.

    Thanks, Anand, and keep up the good work!

    Site mods, PM me my old password? It's been a long time since I posted here as Donnie27; I don't want anyone to think I'm hiding LOL!

    Posted by saaya at Xtremesystems.org: "wtg anand! hope you get an extra bonus from intel for this pr stunt..."

    Same site and thread, spursindonesia: "gotta scrub & massage the hand that feed you, cause that's where company money comes all along, right?"

    Posted by MACMAC, post #22: "Please...this is Anandtech.com, Intel absolutely gave the thumbs up for this article."

    G.Foyle: "Nice preview, but I'm tired of this Anand/Intel politic BS."

    god_43: "lol ignore lists are more useful everyday! anand isn't always straight when it comes to Intel. ill wait till the cpu comes out...not that i would buy it...just curious."

    400+ posts and at least a third of them seem to be negative posts from AMD fans. These posts range from thinly veiled slants and spin to downright attacks and FUD! Thank goodness for folks like Hornet and Jumping Jack keeping it real. :) In fact, they're the only reason I kept reading the thread along with 19,000 others. If AMD and their fans don't like the results, they need to put their products where their spin or mouth is, instead of posting faked, photoshopped Bulldozer die shots LOL!
  • katleo123 - Tuesday, February 1, 2011 - link

    Good post about Sandy Bridge.
    For more info visit http://www.techreign.com/2010/12/intels-sandy-brid...
