
  • TwiSparkle - Wednesday, January 9, 2013 - link

    Assuming that there will be an i5 that ships with the same GPU, and that there is a Surface Pro 2 (or some other piece of hardware that is as compelling) running said i5, I am seriously excited. I want a portable gaming machine that can do more than just 2D/indie games, and it looks like I could play Skyrim comfortably on that.
  • althaz - Wednesday, January 9, 2013 - link

    I'm already getting a Surface Pro (once they become available; it fulfills my tablet needs and, more importantly, can run Starcraft 2 and Football Manager), but a ULV i5 with comparable video performance to what is described above would probably get me to upgrade within a year, if battery life and/or size/weight improves as well.
  • Krysto - Wednesday, January 9, 2013 - link

    You actually think you'll be getting the GT3 GPU in a low-power Surface Pro? GT3 is meant for higher-end, more power-consuming laptops, like those 35W Core i7 laptops.

    A future Surface Pro would use GT1, which I think is 3x weaker, and as The Verge has noticed, their "low-power CPU" will be a lot weaker, too (sub-1GHz).

    You're not going to magically get a dual-core 1.8GHz Haswell with a GT3 GPU in a 7W machine (which would still be too high for mobile power consumption anyway).
  • IntelUser2000 - Wednesday, January 9, 2013 - link

    "You actually think you'll be getting the GT3 GPU in a low-power Surface Pro? GT3 is meant for higher-end, more power-consuming laptops, like those 35W Core i7 laptops."

    Yea, you do. Only the quad-core and Ultrabook-bound ULT parts have GT3. The regular dual cores have GT2, and so do the desktop chips.

    But of course, the GT3 in the quad core will be faster than the ULT one.
  • wsw1982 - Thursday, January 10, 2013 - link

    I remember reading an article from SemiAccurate which claimed the GT3 runs at a slower clock but with double the stream processors of the GT2 part, and as a result consumes less power than the GT2. I'm not sure if that was the case. But anyway, the GT650M was kind of a high-end graphics part in 2012. If the IGP of Haswell has similar performance, I'm going to update my laptop. I don't care about touch or ultra-light designs, because I run most of my work in VMs and touch doesn't work there anyway - as long as it gets good battery life. I can currently squeeze 6.5 hours out of my Lenovo ThinkPad W520.
  • MrSpadge - Thursday, January 10, 2013 - link

    I'd rather have it in a Lenovo Helix II.
  • hescominsoon - Wednesday, January 9, 2013 - link

    Until a third party can report the numbers and do their own tests, as far as I'm concerned this was two videos and not the actual game engines.
  • Hulk - Wednesday, January 9, 2013 - link

    Do you really think Intel would pull a fast one like that? Why? It's not like they are on the verge of going out of business and they need this or it's all over. I just think the risk/reward on that one doesn't make sense. Especially when they know the results will be verified in a few months.
  • Gc - Wednesday, January 9, 2013 - link

    Risk? Did it hurt them last year?
    http://www.anandtech.com/show/5333/intel-sort-of-d...
    http://www.anandtech.com/show/5391/intel-demos-dx1...
    If anything, it gave them more press attention, at least on enthusiast sites that like to point out such things.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Reviewers can smell these things from how the company's people act and react, and there are thousands of those interactions and plenty of excited leakers who "drop hints".
    So the answer is, this time it's obviously real.
    I can even smell it reading the review.
    I don't understand how you people are so emotional and negative, and yet cannot pick up on the emo groove of articles from reviewers.
  • Krysto - Wednesday, January 9, 2013 - link

    Intel has been doing a lot of misleading lately. Here:

    http://www.theverge.com/2013/1/9/3856050/intel-can...

    But it's not the first time. They did it with the "3D chips" for IVB: most people understood from their marketing that it would be BOTH 40% more efficient AND 37% more powerful, when in fact it was an "either/or" thing, and in the end they compromised between the two and got a lot less of each - 20% better energy efficiency and, I think, 15% higher performance.

    Then they did it with Medfield, too, announcing it as a "2GHz processor" (it was really 1.3GHz). And I think with the recent ones, too, like Lexington. They announced it as a 1.2GHz processor, but it's probably more like 500MHz, with a turbo boost that isn't used most of the time because it eats a lot of power. This is probably why The Verge also found it very slow:

    http://www.theverge.com/2013/1/8/3850734/intels-at...

    Lexington looks like no competitor to the Cortex A7. I would distrust Intel's marketing by default until proven otherwise, rather than trust what they say now.
  • IntelUser2000 - Wednesday, January 9, 2013 - link

    " They did it with the "3D chips" for IVB, which most people understood from their marketing that it will be BOTH 40% more efficient and 37% more powerful,"

    Uninformed people did, the rest did not.

    TSMC, GlobalFoundries, and Samsung's foundries all claim the same thing. But you need to know that each metric comes separately.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    I read your linked article several times. Let's break down how honest you are... in actually evaluating the information, which is why, of course, all you do is make broad-stroke accusations.

    The Intel chips in question in your article:

    a 17W part that may be limited to 13W

    a 13W part that may be limited to 7W

    So, you and The Verge KNOW IT IS a 2GHz processor, but you want the cheapo version that is limited by the demanding and POOR crowd, so you ignore the top-speed implementations and claim they "lied".
    Instead, you and your pocketbook lied.

    Same on the version that can be limited to 7W.

    Intel produces the chips they claimed, and ALSO offers low-wattage control on those same chips, and you call Intel a liar because you don't want to PAY for the top-end implementations.

    Sorry, Charlie - Intel didn't lie. Mr. Cheapo lied though; that's your wallet's name.
  • Spunjji - Wednesday, January 9, 2013 - link

    http://www.anandtech.com/show/5333/intel-sort-of-d...

    hescominsoon is wrong about this, of course. Pretty sure this is just the in-game time demo. But yeah, Intel still do that video bullshit.
  • Spunjji - Wednesday, January 9, 2013 - link

    Oh dear... superfluous post. Sorry!
  • hescominsoon - Thursday, January 10, 2013 - link

    A timedemo isn't real gaming, so it means nothing. I hope I'm wrong, but heck, ALL of the graphics companies have been scamming for quite some time.
  • Spunjji - Friday, January 11, 2013 - link

    I wouldn't say it means nothing. It's a standardised indication of game engine performance. It doesn't relate well to actual gaming performance, but it at least allows for cross-model comparisons.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    What a bunch of crap.

    Anand actually played the demo setups in the real games the demo runs are 'emulating', after verifying the in-game settings, and found that "THEY RELATE DIRECTLY" to actual game playing.

    Nothing like ignoring the real-world test in this case in order to broadly brushstroke the "not so intelligent" cliché that is supposed to make you "seem smart", when it is immensely STUPID in this case.

    A stupid idiotic talking point that does not here apply.

    In between your stupid talking points, we get the rest of the whiners claiming the game runs so very fast - THUS "PROVING" real-world performance in fact is directly related, right? Right....

    And of course with those attacks in mind, we see someone added "it's only really a 630LE it's taking on" and "the laptop lid is closed so it's throttling", "it's an 80W Intel processor", "only the demo was tested", "it's (Haswell) overclocked to the moon".....

    The whiners lie a lot more than Intel ever dreamed of doing.
    Zero standards for the whining haterz, and the ultimate standard for the attacked Intel.

    I call you know what.
  • Spunjji - Monday, January 14, 2013 - link

    ...jeez.

    *In this article*, Anand stated that he "ran" the games independently of the demo. Forgive me for not assuming that meant "played". You can interpret it how you want, but in the language of the article there is no good reason to assume he played the games.

    So, what are you claiming here? You appear to be referring to other articles that I'm not aware of, then ranting about things I never said, whilst providing no evidence of your own.
  • CeriseCogburn - Monday, January 14, 2013 - link


    For not assuming? What else can it mean, you idiot?
  • DigitalFreak - Wednesday, January 9, 2013 - link

    "Until a third party can report the numbers and do their own tests as far as i'm concerned this was two videos and not the actual game engines."

    Moron

    "Note that I confirmed all settings myself and ran both games myself independently of the demo."
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Yes, but the AMD fanboys and haters cannot grasp true reality; THEIR sourpussed idiocy is all that really counts.
    LOL

    It is actually AMAZING how much the stupid are willing to share.
  • Spunjji - Thursday, January 10, 2013 - link

    Blah blah blah blah blah.
  • dragonsqrrl - Wednesday, January 9, 2013 - link

    Did you even read the article? The game was without a doubt running in real time. The only question is the actual frame rate and more precise performance measurements, which Intel wouldn't allow.
  • madmilk - Wednesday, January 9, 2013 - link

    The inclusion of the GT 650M in this comparison is meaningless without precise data, though. The GT 650M has no trouble exceeding 60fps in Dirt 3 at 1080p, High, no AA. Haswell might be half the performance, 40fps, and it would still look perfectly OK to most human viewers as long as the minimum frame rate isn't too bad.
  • Spunjji - Wednesday, January 9, 2013 - link

    Bang on.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    You've banged your head too much.
  • Spunjji - Friday, January 11, 2013 - link

    Please be constructive. If you can point out what was wrong with the preceding comment that I agreed with, I'll happily hear it.
  • ironargonaut - Friday, January 11, 2013 - link

    So, you're saying Anand wasn't smart enough to check the FPS? Anand said he couldn't report the numbers, he never said he didn't see them. Perhaps you should ask instead of assume.
  • mikato - Friday, January 11, 2013 - link

    Uhhh, what exactly are you replying to? The one that said they were just videos?
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Oh well, if you had actually been following along you'd know the answer to your own Q.

    Here's a constructive comment for Spunjji -

    EXPECT FAR BELOW 15 FPS for a minimum frame rate dip on MUCH HIGHER desktop hardware.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Yes, that's exactly what they said and are saying - that Anand is full of it, and/or a duped idiot and a liar, right up there in the article.
    Which I wouldn't mind if they actually had a point, read the article, and used their brain to think for a few minutes before doing that...

    When they are eventually proven wrong, there will be no apology forthcoming.
  • Spunjji - Monday, January 14, 2013 - link

    I have never said any of that. I have said that I don't feel the article and the video provided give me a good enough indication to make *any* judgement beyond that this could be potentially promising *if the hardware performs just the same as this in a thermally constrained environment*.

    Is that clear enough for you now?
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    You said "tested each independently from the demo" meant "did not play".

    That's what you said jerkoff.
  • Spunjji - Monday, January 14, 2013 - link

    "subjectively the two looked to deliver very similar performance"

    I'm making my assumptions based on that. I would bold the word SUBJECTIVELY if I could, but here it is in caps instead. You're making your assumptions based on... what?
  • CeriseCogburn - Monday, January 14, 2013 - link

    Have a look, idiot

    http://www.tomshardware.com/reviews/origin-pc-eon1...
  • ironargonaut - Tuesday, January 15, 2013 - link

    You assume I am commenting on that comment; the suggestion is and has been that the whole comparison is "worthless" because the framerate was/is not known by Anand. The word subjective would lead one to believe this may be true. But why assume? You are commenting on an article written by Anand on AnandTech. Ask the man.
    I asked a clarifying question. To be honest, a rhetorical one, to emphasize that I believe Anand to be intelligent. What "assumption" did you assume I made?
  • dragonsqrrl - Wednesday, January 9, 2013 - link

    The game was clearly running in real time. The comment I responded to claimed it wasn't. You raise a good argument that may very well be true, but I don't see how it's relevant to the point I was making.
  • CeriseCogburn - Friday, January 11, 2013 - link

    http://www.youtube.com/watch?v=KkKIWq_CsLA&fea...
  • dragonsqrrl - Friday, January 11, 2013 - link

    Might need to get a little less ambiguous there pro.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    The whiners might need to click the link and find out - which, given there's no comment from anyone actually having done that, makes them CLUELESS, lazy, and incorrect, as usual.

    Would you like to fill in the blanks?

    Oh wait, you dared not click it either.

    I gave it to the person WHO WAS REASONABLE in their commentary, as thus it will likely not have been wasted.

    I suspect you did click it, and viewed it, and saw how it shows what so many here do not want to believe.
  • kyuu - Monday, January 14, 2013 - link

    Lol. That video you linked proves nothing at all and contradicts no one's points. Are you for real?
  • Spunjji - Monday, January 14, 2013 - link

    Unfortunately, Cerise is indeed for real.
  • CeriseCogburn - Monday, January 14, 2013 - link

    Unfortunately, you're all goofballs.
  • CeriseCogburn - Monday, January 14, 2013 - link

    http://www.tomshardware.com/reviews/origin-pc-eon1...
  • iwod - Wednesday, January 9, 2013 - link

    And we don't know the clock speed of the 650M either, which could be lower.
  • usteg - Thursday, January 10, 2013 - link

    Then it would've been a GT640M/GT645M (since those are basically downclocked versions of the GT650M, though if that's the case it's more likely to be a GT640M, since there seems to be no GDDR5 version of the GT645M anywhere).

    I have no doubt that they compared it to the GT650M, but the previous commenter raised a valid point that the GT3 could've been running at around 40fps+ in comparison to the GT650M.
  • ImSpartacus - Friday, January 11, 2013 - link

    The "650M" in the rMBP15 runs clocks higher than a 660M.

    http://en.wikipedia.org/wiki/Comparison_of_Nvidia_...

    So the definition of a 650M is pretty messy. I expect that this is the DDR3 version, but perhaps with the GDDR5's lower core clocks? If Intel found such a weird 650M implementation, it would probably barely outperform a GK107 640M LE (the shittiest mobile GK107).
  • CeriseCogburn - Sunday, January 13, 2013 - link

    What a bunch of crap.

    The 650M DDR3 is a SCREAMER in a laptop and NO AMD iGPU is close to it.
    Oh yeah, I know from personal hands-on experience.

  • ImSpartacus - Sunday, January 13, 2013 - link

    Perhaps at 720p, but this is 1080p. The lack of memory bandwidth would cripple the 650M. Between Crystalwell and potentially overclocked RAM, it's clear that GT3e's numbers could be artificially inflated.
  • CeriseCogburn - Monday, January 14, 2013 - link

    Another one that requires PROOF due to absolute ignorance and being INCORRECT too.

    http://www.tomshardware.com/reviews/origin-pc-eon1...

    LOL - just make it up in your imaginary bandwidth limited gourd

    You people are CLUELESS.

    Do you see that? 1920x1080, no AA, 57.6 fps
  • ciparis - Wednesday, January 9, 2013 - link

    You didn't read the article very well. Anand confirmed the settings, and ran both games himself separately from the demo.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    AND he also said he cannot report the "numbers".
    Instead, he gave the not-so-vague "impression".

    No matter; the braindead haterz will always deny reality even after it is 100% obvious, and then they switch to some other gigantic, inconsistent, far-out whining.
  • dragonsqrrl - Thursday, January 10, 2013 - link

    ummm, what?
  • CeriseCogburn - Friday, January 11, 2013 - link

    Don't worry, you'll be clueless forever, even after the Haswell release.
  • dragonsqrrl - Friday, January 11, 2013 - link

    No, your comments are just genuinely poorly written and difficult to follow. I'm honestly uncertain what you were trying to say.
  • Spunjji - Friday, January 11, 2013 - link

    Cerise is off the meds again...
  • CeriseCogburn - Friday, January 11, 2013 - link

    If either of you read the article, paid attention, or had two watts upstairs to use, you'd follow along just fine.
  • Spunjji - Monday, January 14, 2013 - link

    I do. You're making more assumptions than the people you're criticising for making assumptions. Chill out.
  • CeriseCogburn - Monday, January 14, 2013 - link

    You idiots can call foul on your conspiracy-theory gourd assumptions, and I can call you freaking clueless, and correct you, as others HAVE DONE here.

    If you didn't open your big piehole and spew your disdain, you wouldn't encounter resistance.

    Deal with it.

    http://www.tomshardware.com/reviews/origin-pc-eon1...

    Deal with that too idiot.
  • blue_urban_sky - Thursday, January 10, 2013 - link

    From the article:

    "Note that I confirmed all settings myself and ran both games myself independently of the demo."
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Umm... what the heck do you mean?

    (Yes, that was a well-deserved sarcastic attack against the Intel haterz here.)
  • Khato - Wednesday, January 9, 2013 - link

    If only they'd actually demonstrated performance in one of the games that IVB doesn't already do well in. Note that IVB is comparable to Trinity in Dirt 3 and a number of other games, but then in Civ 5 and a few others it drops down to as little as half the performance... those are the games that they need to improve upon, and if HSW doesn't improve there then it'll continue to be sub-par.
  • Spunjji - Wednesday, January 9, 2013 - link

    Indeed. It's unlikely that it will improve either, given that the underlying architecture is not likely to have radically changed.
  • StevoLincolnite - Wednesday, January 9, 2013 - link

    Can't forget the drivers are horrible too.
    Makes AMD's look gold-plated in comparison.
  • Spunjji - Wednesday, January 9, 2013 - link

    AMD's *are* gold-plated by comparison ;)
  • DanNeely - Wednesday, January 9, 2013 - link

    Which doesn't change the fact that it's the sort of "gold" plating that will turn your skin green if you wear it continuously.
  • Spunjji - Wednesday, January 9, 2013 - link

    No green skin from my drivers yet! ;) Granted, I'm using a laptop without Enduro...
  • mikato - Friday, January 11, 2013 - link

    Where can I download these gold plated drivers? free download? :)
  • DanNeely - Wednesday, January 9, 2013 - link

    The top-end Haswell IGP is supposed to be a 40-core model vs 16 in IVB. Even if they didn't touch the architecture at all, that would be a nice boost.
  • Spunjji - Wednesday, January 9, 2013 - link

    Not necessarily - there could be other limitations it butts up against internally. If they *have* changed the architecture then the cores may be individually weaker.

    Probably going to be a fair bit more powerful, though, all told. If you're prepared to pay for it.
  • DanNeely - Wednesday, January 9, 2013 - link

    I doubt it. They offered a 40-core IVB GPU, but when only Apple expressed an interest they decided to shelve it.
  • DanNeely - Wednesday, January 9, 2013 - link

    ... It's possible that was the same 40-core IGP they're offering for Haswell, but I doubt they wouldn't have planned to bake some level of improvement into it.
  • CeriseCogburn - Friday, January 11, 2013 - link

    Exactly - this isn't AMD we're talking about, otherwise
    Spunjji's theory would be 100% spot on!

    Thanks, AMD, for frying Spunjji's brain.
  • Spunjji - Monday, January 14, 2013 - link

    Quite true. That doesn't necessarily exclude anything I've said previously, though. Historically, increasing the number of pipelines in a graphics engine does not lead to a direct proportional increase in performance unless the original solution was over-engineered or commensurate tweaks are made to the rest of the engine.

    That 40-core solution would presumably butt up even harder against the chips' thermal limits, which would mean it would give better maximum frame-rates but not solve the issue of throttling giving uneven performance in Ultrabooks where it is arguably of the greatest value.

    So, what we're saying here is we don't know how this will pan out yet. Maybe that's just because AMD fried my brain.

    *cough*
  • CeriseCogburn - Thursday, January 10, 2013 - link

    I have to comment that Intel having matched Trinity in a few titles is a hurdle most here were screaming, not so long ago, "would never happen".

    Now Intel is barking up a 650M tree, which is rather impressive given the loserdom they have occupied for at least a decade in this area.

    I don't see a gargantuan win, but HD3000 could get one by, HD4000 was that much better, and now we see this article has the smell of EXCITEMENT around the required skepticism.

    So it appears a large movement forward has been accomplished. A leap, if you will.
    That is not bad - and compared to some of AMD's tricks, aka epic fail or fall back, it's good news.
  • Spunjji - Thursday, January 10, 2013 - link

    I actually agreed with the vast majority of what you said here. o_O
  • mikato - Friday, January 11, 2013 - link

    Took his meds now.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    You mean "didn't run into another spew from a lying sack of crap totally biased commenting amd fanboy so actually gave his own opinion instead ".

    Which is correct. LOL Yes of course it is. A grade far above the others here.
  • Spunjji - Monday, January 14, 2013 - link

    Thanks for ruining that conversation for anyone that wasn't you.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    You made a mess with your first reply, hypocrite.
  • Shadowmaster625 - Thursday, January 10, 2013 - link

    Yeah, but none of it means anything if Intel charges a $200 premium for this chip vs the cheapest one. If it costs less to go with a cheaper Intel chip + discrete, OEMs will do that.
  • Spunjji - Friday, January 11, 2013 - link

    Based on that logic, I doubt Intel would charge quite that much of a premium unless they have a distinct advantage elsewhere (e.g. power / ease of integration). So it's probably going to be of comparable cost. Could be good, could be bad, all conjecture at this stage.
  • CeriseCogburn - Friday, January 11, 2013 - link


    This chip will be offered with discrete graphics as well - certainly standalone too.

    It will smoke the crap out of anything AMD has on the CPU side.

    So many will buy it, in both configurations, for a lot of money.

    The crybaby AMD fan haterz will cry and hate still.
  • fabarati - Wednesday, January 9, 2013 - link

    So, what are the actual max settings possible on the GT 650m?
  • madmilk - Wednesday, January 9, 2013 - link

    http://www.youtube.com/watch?v=RslBoz5XeJQ

    shows a GDDR5 GT 650M running Dirt 3 at 1080p, probably no AA, and Ultra settings at 40-45fps.

    Intel's use of High and Medium settings stinks of trickery.
  • jwcalla - Wednesday, January 9, 2013 - link

    Well actually that video shows 1600x900 resolution, but 8x QCSAA.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    But the no AA in the video is really showing us where it's at, which is kind of easyville.
  • jwcalla - Thursday, January 10, 2013 - link

    If by "no AA" you mean "8xCQSAA", then yeah.
  • CeriseCogburn - Friday, January 11, 2013 - link

    By no AA I meant the video here, not the linked youtube.

    DUH.
  • jjj - Wednesday, January 9, 2013 - link

    You know very well that they'll downclock the sh*t out of it, and just saying GT3 means nothing without clocks - but why mention it.
  • Spunjji - Wednesday, January 9, 2013 - link

    Given that Intel's current chips are already thermally constrained in thin form factors, I'm not inclined to believe for one minute that their CPU will perform as well inside an Ultrabook as it does set up in that tower chassis...

    That said, I am still intrigued. Could be good.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Well, the article did contain the caveat "by the time" it is shipped in such a configuration, so that admits, in my book, that optimizations and power-containment tweaks are still needed to accomplish that.

    I believe that; on the other hand, though, there is no laptop to show right now, even though it is mentioned it's a laptop vendor board (in testing for production, it seems).

    So not really too shabby.
  • Spunjji - Friday, January 11, 2013 - link

    There is always room for that sort of optimisation, it's true. Their process should have matured somewhat by now. I've just been left wary by how much the Ultrabook form factor hobbled HD4000 performance. The early reviews led with desktop chips and desktop cooling and made it look very competitive with discrete solutions, but that didn't hold up in power-constrained environments.

    I guess we'll wait and see. Call me cautiously optimistic about this? Anything that makes AMD/nVidia raise their game is good in my book.
  • CeriseCogburn - Friday, January 11, 2013 - link

    nVidia is already winning the game, and so is Intel on the CPU side.

    Anything that makes AMD raise its game from pathetic is good.

    FTFY
  • Spunjji - Monday, January 14, 2013 - link

    ...and you call me the fanboy? :/
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    You got the truth from me; what did we get, continuously, from you?

    We did not get the truth. Fanboy.
  • Batmeat - Wednesday, January 9, 2013 - link

    I really don't care about the graphics performance as I will always have an independent card for gaming. However, mobile gaming will be interesting, as currently any NVIDIA-based mobile graphics chipset will run you some serious $$$. What will pricing really be like?
  • Spunjji - Wednesday, January 9, 2013 - link

    Brutal, is the answer. Unless there's a sea change at Intel, GT3 is only going to appear on the "high-end" i7 chips with the accordingly inflated price tags for 200MHz extra clock speed and some features not disabled.

    I would *like* to believe otherwise, but I do not.
  • refurb82 - Wednesday, January 9, 2013 - link

    How about showing me OpenGL as well as DX9 stuff... I feel the past year or two for Intel has been more about driver issues than strictly the performance of their IGPs....
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Yeah, but oh well, AMD has survived with ATI crapeons doing that for some years now.
    Before, Intel was crap hardware and lacking drivers; now it's just crap drivers, like AMD.
  • ericore - Wednesday, January 9, 2013 - link

    There is no doubt in my mind that Intel is using heavily overclocked memory, probably 2500MHz - remember, this is 1080p; you absolutely need memory bandwidth. The fact that they put it in a desktop means they overclocked on air as far as she goes. So I don't disagree that Dirt 3 runs great on Haswell GT3 on the desktop, but think laptops - no provider is going to ship you fast RAM like that, and you certainly won't be able to overclock it, as there is never such an option in laptops. Don't forget about the SSD they threw in there. That means you can play 50% of the games (the less good half) at 1080p on medium to low settings. The system would be better suited to lower resolutions, 720p being the sweet spot.
    FYI, the desktop version of the GTX 650 achieves 33 FPS in Dirt 3 at 1080p on medium settings, according to AnandTech.

    Intel is too easy to read for me.
  • karasaj - Wednesday, January 9, 2013 - link

    Well, just FWIW... an SSD doesn't affect gaming framerates other than loading times.
  • CeriseCogburn - Friday, February 1, 2013 - link

    That's not true in many games.
    I'm not saying it can't happen, but I've seen plenty of higher framerates in games from SSDs.
    This occurs because some games access, for instance, lots of small files while in the game; hence, the frame rates get a boost vs traditional hard drives.

    FTFY - you're welcome.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    A few people I know are thrilled to death with the GT645 with DDR3, not even the GDDR5, so we can poo-poo Intel all we want; that won't change the fact that, outside the highest-standards bitchfest in here all the time, the general gaming public is going to be very pleased.
  • ironargonaut - Friday, January 11, 2013 - link

    "remember this is 1080p you absolutely need memory bandwidth."

    Did you factor this piece of info from the article into your comment?
    "GT3 with embedded DRAM"
  • jwcalla - Wednesday, January 9, 2013 - link

    Wait, what was the point of this article? Or was Intel due to run another one?

    No numbers is kinda meaningless. For all we know the 650M is CPU-bound in this game at those settings.
  • CeriseCogburn - Friday, January 11, 2013 - link

    The point of this article is that Intel's new APU is smoking the h-e-double-toothpicks out of everything the sorry and sad, bankrupt AMD has.... to the point of AMD fanboy humiliation and raging, as the responses clearly show.

    God Bless Intel
  • CeriseCogburn - Friday, January 11, 2013 - link

    Did you turn up the volume and hear the discussion concerning the equivalence of the CPU on the laptop boards?

    No, of course you didn't, so you blabbed... again, in pure ignorance...

    Now it fits you, though it did not fit me: blah blah blah blah
  • Krysto - Wednesday, January 9, 2013 - link

    Anand, did you catch this?

    http://www.theverge.com/2013/1/9/3856050/intel-can...

    I've tried to warn you before that this is what Intel will do with the "10W Haswell", too. They will make it much weaker in performance to get a lower TDP. They've been misleading enough by not saying that, and letting people believe (even you guys, actually) that it's a chip that is just as fast, but much more efficient, down to 10W.

    But what's even more annoying about their misleading marketing is that they won't even admit to it, and they promote them as dual-core 1.5GHz chips or such, when in fact they run at 800MHz, by using the turbo-boost speed instead of the real, default speed - which is something Intel hadn't done until very recently, and which just shows their desperation in the fight against ARM.

    Anyway, I just hope next time you try to question Intel's press releases more, instead of taking them at face value, at least before you get a chance to test the chips yourself. Because if you don't do that, then a lot of people reading this site, and thinking that "you know what you're saying" will take that false information and spread it to other sites.

    So question Intel's PR and don't take it at face value anymore. That's something you should do with every company of course, but you should especially watch out for Intel, as they've been up to no good lately, when they started becoming desperate about ARM.
  • wsw1982 - Thursday, January 10, 2013 - link

    You really demand that a tablet-level IGP have GT650M-level performance? Then ask NVIDIA about the IGP in Tegra 4 - that's what the 8W Haswell is targeting...
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Did someone say Tegra 4?

    http://www.wired.com/gadgetlab/2013/01/ces-2013han...

    STREAMS games from your PC over Wi-Fi

    Another great nVidia invention coming soon

    PROJECT SHIELD (to be renamed before release, according to reports)
  • tygrus - Wednesday, January 9, 2013 - link

    Hey Anand,

    Any idea of average frame rates, or a subjective view of the fluidity of the demo?
    Are you saying the GT3e was better, the same, or worse than the GT650M in this demo?
    Do you think the settings and game selected would create a very low bar for Intel to get over? E.g. the 650M could have averaged 10fps faster than the GT3e but you can't visually discern this (i.e. without an app to measure fps).
  • madmilk - Wednesday, January 9, 2013 - link

    GT 650M should be well over 60fps at these settings. So yeah, not a very useful visual benchmark...
  • CeriseCogburn - Thursday, January 10, 2013 - link

    It appeared to me the 650M was a bit faster.

    The kicker is the 650M is darn good and a lot of the public would be pleased gaming on it, so this all bodes all too well for Intel.

    Hence the entire thread has been whining and moaning and shrieking "not fair", "fake", "it's a lie", "marketing"... on and on...

    I'm getting a BIG kick out of it.

    Intel was smart enough to use an nVidia product to whoop on, instead of crossing into the raging AMD fanboy arena by taking on some AMD chunkage. Very wise, Intel. I thank Intel for sparing us.
    LOL
  • jwcalla - Thursday, January 10, 2013 - link

    If your takeaway from this article is that Intel "whooped" the 650M, then that's evidence that ambiguous articles like these create a misleading narrative.

    You're precisely the type of gullible moron that Intel targeted with this article.

    I'm sure the GT3e is great, but let's not get carried away by these demos that are strictly orchestrated by the manufacturer... and maybe actually wait for something concrete before heralding the doom of nvidia.
  • CeriseCogburn - Friday, January 11, 2013 - link

    I'm the gullible idiot that took this site's FOUNDER at his word - the one who personally tested the two configurations and VERIFIED they performed "very similarly".

    Thanks, though, to all of you who have, unwittingly and in absolutely driven, biased hatred, claimed the master of this website is a completely clueless, lying, duped fool.

    Good job, I'm more than certain you've got a handle on reality.....

    N O T

    BTW your stupid pic is a perfect fit

    I mean you people are truly so dumbed down, it is unbelievable.
  • Spunjji - Friday, January 11, 2013 - link

    I'm not sure how much Anand plays games these days, but for my part I wouldn't personally verify that I can tell the difference between, say, 80fps and 50fps while watching a time demo. They would look "very similar" to me. The devil is in the details.
  • CeriseCogburn - Friday, January 11, 2013 - link

    ANOTHER FRIKKIN MORON

    Did you read the article? Heck no!

    Did you notice him say he played the games separately from the demo?

    HECK NOOOOOOOOOOOOOOOOOOOO

    Are you an idiot?

    Since your fps detection is so bad, I'll just assume that's why you can't read articles either.
  • CeriseCogburn - Friday, January 11, 2013 - link

    LOL - my quote, first line of what you responded to: "It appeared to me the 650M was a bit faster."

    Fanboy problems in the brain?
  • AmdInside - Wednesday, January 9, 2013 - link

    Without data, the comparison is meaningless. The only thing this tells me is that Haswell can run a game that came out in 2007.
  • AmdInside - Wednesday, January 9, 2013 - link

    I mean 2011
  • CeriseCogburn - Thursday, January 10, 2013 - link

    LMAO- if Intel made a 4 year mistake we'd never hear the end of it.
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Wait a minute there, buddbo.

    What it means to me is I can hardly wait to have one.

    ( I'll actually wind up with a lesser sleeper model, and so will others I push it on, who will be grateful no doubt. )

    It means Intel is a power to be reckoned with that AMD is going to have a hard time going up against. Yeah it means that a LOT.
  • tipoo - Wednesday, January 9, 2013 - link

    I wonder how much the GT3 without eDRAM will improve over the HD4000? How big a factor is it in the performance jump? And the lower end SKUs?

    I've often wondered why graphics cards didn't have a bit of eDRAM to help with certain operations, as it seems to allow consoles to get away with much slower main memory (see the Wii U).
  • Spunjji - Wednesday, January 9, 2013 - link

    With PCs it has traditionally not been die-area-efficient to do that where the screen resolution target is unknown. If you don't have enough eDRAM then it will be of little benefit at high resolutions; too much of it burns die space and costs more. Better to have a larger quantity of general purpose RAM with a high-bandwidth bus and high clocks and whatever size caches work best with your estimated texture workload.

    In this instance, however, Intel finally have die area to burn and a usage case (low-end bandwidth-constrained graphics) where it may actually make some sense.
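
    A rough back-of-the-envelope sketch of why the target resolution matters when sizing that pool (my own assumptions: 32-bit color and three full-size buffers, nothing Intel has published):

        # Approximate render-target footprint at common resolutions,
        # assuming 4 bytes per pixel and front + back + depth buffers.
        def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
            return width * height * bytes_per_pixel * buffers / (1024 ** 2)

        for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
            print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB")
        # 1280x720: ~11 MB, 1600x900: ~16 MB, 1920x1080: ~24 MB

    So a pool sized around 720p render targets covers noticeably less of the working set at 1080p, which is the trade-off described above.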
  • wsw1982 - Thursday, January 10, 2013 - link

    I think the eDRAM is on-package, not on-die.
  • Spunjji - Friday, January 11, 2013 - link

    Ahhh, I may well have been mistaken about that bit. I know it's on-package with the Wii and 360, but I assumed Intel were going on-die with this.

    Comments about the expense of manufacturing still apply, but definitely less so if that's the case.
  • CeriseCogburn - Friday, January 11, 2013 - link


    I think the hater just spews whatever his gourd dreams up on the spot - as that last attempt at bashing he spewed proves again.
  • Spunjji - Monday, January 14, 2013 - link

    I explained why it makes technical sense for Intel to do this now and not before. Define what part of that was "hating" and/or incorrect.
  • Rumpelstiltstein - Wednesday, January 9, 2013 - link

    This could be Atom vs. GTX 680 and it wouldn't make a difference so long as it runs smoothly on the Atom.

    Dirt 3 isn't a particularly demanding game, so without performance numbers this tells us absolutely nothing.
  • Zink - Wednesday, January 9, 2013 - link

    The Asus UX51 is probably hitting over 60 fps at these settings, based on benchmarks I can find, so the comparison is useless. If the Intel is hitting around 30fps - just enough to look smooth - that is right where we would expect performance to be. They could have used a GTX 680M and the comparison would have looked exactly the same.
  • CeriseCogburn - Thursday, January 10, 2013 - link


    Excuse me, vsync WAS OFF as per the article. This is also as per the article:

    " Intel wouldn't let us report performance numbers, but subjectively the two looked to deliver very similar performance. Note that I confirmed all settings myself and ran both games myself independently of the demo."

    So we have a MASSIVELY experienced gamer / bench tester / reviewer / comparer telling you retarded doubting Thomases exactly what occurred - and you pretty much ignore it, then directly refute it straight out of your imagination.

    Thanks, it's been great fun watching the clueless, insanely biased haterz pull another gigantic round of pinheadisms for page after page.
  • jwcalla - Thursday, January 10, 2013 - link

    You really shat all over this thread. I'm not sure why you brought up vsync in response to his comment. Vsync on / off has nothing to do with the point he made, which is that after a certain framerate you can't visually tell a difference, so when performance "subjectively looks similar", we can't really draw any conclusions.

    As long as the Intel is hitting the necessary framerate for smoothness, what conclusion can be made without numbers?
  • CeriseCogburn - Friday, January 11, 2013 - link

    You're still a FREAKING IDIOT who is the real shatter of big doo-doo.
    You're WRONG
    WRONG
    WRONG
    and wrong again

    Run the Dirt 3 game yourself, YOU IDIOT - at these settings... then report back the framerate, YOU RETARD.

    Thanks for being so stupid it's unbelievable - you frikkin experts know the framerate, right?

    R O F L - YOU DON'T, BUT I DO.
  • Spunjji - Friday, January 11, 2013 - link

    ...damn.
  • mikato - Friday, January 11, 2013 - link

    Haha this is just getting funny now.
  • Medallish - Friday, January 11, 2013 - link

    This is like watching Alex Jones being interviewed about Gun control.
  • Spunjji - Monday, January 14, 2013 - link

    Only somehow less coherent..?
  • CeriseCogburn - Friday, January 11, 2013 - link

    After the two AMD fanboys declare the test irrelevant, they run into the GPU reviews and scream about 3 frames with the "winning" AMD fanboys in 2 games at framerates nearing 100.

    Whatever, doofuses.

    "Intel wouldn't let us report performance numbers"

    OH, GUESS WHO SAW THE PERFORMANCE NUMBERS, YOU BRAINDEAD MONKEYS.

    Why must you torture sentient, elite humans who can actually think and face reality without a thousand excuses?
  • iwod - Wednesday, January 9, 2013 - link

    My guess is that Intel isn't even anywhere close to the GT650M hardware-wise (I would be surprised if they were).

    Not to mention Intel's graphics drivers are absolutely appalling.
  • CeriseCogburn - Friday, January 11, 2013 - link


    So AMD's crap breaking, broken, re-broken, never-correct drivers are okay too, then.
    Good to know.
  • Spunjji - Monday, January 14, 2013 - link

    Fanboy.
  • CeriseCogburn - Friday, February 1, 2013 - link

    Enduro dip****

    Case closed. Idiot liar fanboy, mirror, you.
  • duploxxx - Thursday, January 10, 2013 - link

    Great to see this performance; now the big Q will be: what CPU will ship with what kind of GPU?

    Knowing Intel in the past, they screwed around a lot with GT1/GT2 in different CPU models, totally unbalanced.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Wow, like the first reasonable comment, even with the poo-pooing.
  • Rontalk - Thursday, January 10, 2013 - link

    Yeah, Anandtech and Intel....!

    This is how AMD Trinity ran Dirt 3 a year ago:
    http://www.youtube.com/watch?v=lsmTDb-Mlws
  • yankeeDDL - Thursday, January 10, 2013 - link

    How does this compare with AMD's existing APUs?
  • CeriseCogburn - Thursday, January 10, 2013 - link

    It whips the crap out of them - as the 650M is a DISCRETE separate chip, not a GPU in a CPU.

    Yep, it smacks the ever-loving crap out of all Trinity.

    That's why they had to go with the 650M / nVidia - they were being nice and didn't want to rub the AMD fanboys' noses in it.
  • Rontalk - Thursday, January 10, 2013 - link

    Are you joking? Because a big desktop 80W+ Intel APU was capable of running an old game fluently?
    The AMD Temash (less than 5W) APU is capable of running the newest Dirt Showdown at 1080p fluently:
    http://www.youtube.com/watch?v=CV-U50Viv_k

    So forget your AMD fanboy crap talk; it is obvious Intel is nowhere near AMD ;)!
  • CeriseCogburn - Friday, January 11, 2013 - link

    Hey, AMD fanboy brainfarter.....

    THIS HERE ARTICLE: "Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3"

    MOBILE REF BOARD

    Hello?

    Oh sorry, I interrupted your gigantic AMD Radeon brainfart. Yeah, you got it 100% incorrect, but dats otay.
  • nicolbolas - Sunday, January 13, 2013 - link

    They could use a mobile ref board that is using the 57-watt CPU.

    Hello, it might be modified to support a desktop CPU.

    We don't know....

    Either way, it is still a huge jump for Intel and should force dGPUs to get a lot better.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    And monkeys could fly from betwixt your buttock cheeks.

    In this case, IT'S NOT AN INTEL HD4000, and it's NOT an 80W CPU, which is what butthead said to begin with "as a rebuttal".

    Now you're coming up with your IMAGINARY 57W CPU scenario.

    Got a link for a 57-watt Haswell, butthead?
  • Spunjji - Friday, January 11, 2013 - link

    Not the best example... that frame rate looked pretty shitty, and the architecture in Temash isn't directly comparable to existing AMD APUs either.
  • yankeeDDL - Thursday, January 10, 2013 - link

    CeriseCogburn, sounds to me that if there's a fanboy here, it's you.
    Trinity trounces several low-end discrete graphics chips and outpaces any Ivy Bridge iGP by more than 2X.
    Since with Haswell the core count went from 16 to 40 (2.5X), I expect Haswell to be slightly better than Trinity, at least on paper.
    I could not find any direct comparison of Trinity's HD7660G vs the GT650M or anything related.
  • Rontalk - Thursday, January 10, 2013 - link

    The Trinity 7660G is equal to the GT630M, but Intel's GT3 will be nowhere close to the GT650M. A higher-TDP Haswell is probably going to be faster than Trinity - like the 57W Core i7 4930MX will outperform the 35W Trinity - but in the same TDP or ULV category, Trinity will keep its leading position.
  • CeriseCogburn - Friday, January 11, 2013 - link

    " Trinity 7660G equal with GT630M"

    That's why the AMD fanboy "couldn't find any information" - LOL

    It's right here on this site, but the AMD fanboy is clueless.

    He might try using the search engine here.

    The 630M WHIPS THE CRAP OUT OF TRINITY

    http://www.anandtech.com/show/5831/amd-trinity-rev...

    Nice to give the AMD fanboy such a big break
  • silverblue - Saturday, January 12, 2013 - link

    It all depends on the titles, but sadly, there's nothing Trinity's 7660G can do about the GT630M, at least not when packaged with the A10-4600M (the i7 in that comparison is an absolute beast, which makes the comparison even more lopsided). It's close in places, but there are some titles that really benefit from CPU grunt, and that's really where AMD is currently falling flat on its face.

    Trinity's one advantage is low power usage - an i7 with a GT630M won't last anywhere near as long on the same battery - but you have to question whether, despite the extra gameplay time, performance will be good enough for the games you're playing.
  • whyso - Thursday, January 10, 2013 - link

    Fanboy much?

    Anandtech did an article on Trinity. They found that it was a grand total of <<AMD fanboy drumroll here>> 25% faster than an i7's HD 4000.

    http://www.anandtech.com/show/5831/amd-trinity-rev...

    More benchmarks here. At low settings (the only settings that either GPU can really play games at) the two are quite comparable in many games. Sometimes Trinity is better by 2x, sometimes it is better by only 10%.
    http://www.notebookcheck.net/Computer-Games-on-Lap...

    Trinity is better than the HD 4000. But if the Haswell GT3 is twice as good as the HD 4000 (not unbelievable; as you said yourself, 40 EUs vs 16 EUs), Haswell GT3 will beat Trinity by a much larger margin than Trinity beats the HD 4000.
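
    (A quick sketch of that arithmetic, taking the two ratios above as given - they're estimates, not measurements:)

        # If GT3 ends up ~2.0x the HD 4000 and Trinity is ~1.25x the HD 4000 (figures above),
        # then GT3 vs Trinity follows directly from the ratio of the two:
        gt3_vs_hd4000, trinity_vs_hd4000 = 2.0, 1.25
        gt3_vs_trinity = gt3_vs_hd4000 / trinity_vs_hd4000
        print(f"GT3 ~{gt3_vs_trinity - 1:.0%} faster than Trinity")  # ~60%, vs Trinity's 25% lead over HD 4000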
  • stickmansam - Thursday, January 10, 2013 - link

    The Anand review was showing an i7 + HD4000 vs an AMD A10. There will never, ever be a situation where, price-wise, the two would even be close together. The A10 would probably be matched against an i3 and at most an i5. So expect the 10% difference between the HD4000 and AMD to actually be bigger in most situations, since the i3/i5 is weaker than the i7, so there's less difference in CPU-bound games.

    The GT3 is also running a small amount of dedicated RAM off-package, which means it probably won't see the light of day outside of the $700+ range. This means a potential drop of 20% perf (judging by how desktop cards fare going from GDDR5 to DDR3).

    I expect the GT3 (without dedicated RAM) to pretty much match Trinity's perf and the GT3 with it to beat it by about 20% or so.

    It will be interesting to see how the GT3, with or without dRAM, will do against Richland (optimized Trinity) or Kaveri (an APU with Steamroller [which could potentially match Intel CPU perf] and the GCN arch).
  • whyso - Thursday, January 10, 2013 - link

    You can buy an i7 notebook for $699.
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    For $749, you can currently get an i7-3632 with a 640m
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    The cheapest Trinity A10 notebook on Newegg is $679.
    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Note: you can get them cheaper other places. Trinity seems to run about $500 to $800.

    http://www.futureshop.ca/en-CA/product/acer-acer-1...

    So the prices are close, but not that close. From a performance/$ viewpoint, the $749 i7 + 640M has about a 50% better GPU (http://www.notebookcheck.net/Computer-Games-on-Lap...). CPU-wise, the i7 is about twice as fast.

    http://www.tomshardware.co.uk/a10-4600m-trinity-pi...

    The Sandy Bridge i5 beats Trinity at CPU tasks almost all the time. The Ivy Bridge i7 has twice the cores, a more efficient architecture, and a higher turbo.

    So from a price/performance perspective, the i7 + 640M beats the Trinity + 7670M.

    Trinity offers dual graphics, true, but the drivers for it suck.
    Despite getting over 2k 3DMark 11 points, in games it performs just about as well as the integrated graphics.
    http://www.notebookcheck.net/Computer-Games-on-Lap...

    You can see that the HD 4000 is about as good as the A8 graphics. The hybrid CrossFire on average is worse than just using the dedicated card, so big whoop to that.

    GT3 will be interesting; it should be substantially better than Trinity.
    The difference between the HD4000 in different CPUs is quite small: about 800 3DMark 11 points for a quad-core i7 compared to about 650 points for a ULV i5. Standard i5s run about 50-100 MHz slower than the i7s (1100 MHz vs 1150-1200 MHz). AMD has stated that Kaveri will be about 20-40% better than Trinity GPU-wise. GT3 could be really close, as that figure is likely slightly inflated (aren't they always?).

    Kaveri will be nowhere close to Intel i7 CPU performance.

    (The Trinity CPU sucks majorly.) Trinity + 7970M is worse than i7 + 660M (http://www.notebookcheck.net/Review-MSI-GX60-Noteb...) gameplay-wise because of CPU bottlenecks.

    Far Cry 3 bottlenecked to 32 fps
    Hitman 19 fps
    Need for Speed 24 fps
    World of Tanks 34 fps
    GW2 18 fps
    Sleeping Dogs 35 fps
    This is AVERAGE fps, not minimum.

    AMD needs at least a 35% CPU power increase to make these games playable. Kaveri I can see getting a 35% increase, but 2x, no way. http://techreport.com/news/24164/amd-shares-latest...
    About a 20% increase, so Hitman and GW2 will still be bottlenecked below 30 fps average.
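
    (For what it's worth, the rough arithmetic behind those bottleneck numbers, assuming a 30 fps playability target and roughly linear scaling with CPU speed - both assumptions mine, not from the linked reviews:)

        # CPU-side speedup needed to lift each bottlenecked average to 30 fps.
        bottlenecked_avg_fps = {
            "Far Cry 3": 32, "Hitman": 19, "Need for Speed": 24,
            "World of Tanks": 34, "GW2": 18, "Sleeping Dogs": 35,
        }
        target = 30
        for game, fps in bottlenecked_avg_fps.items():
            needed = max(target / fps - 1, 0)  # fraction of extra CPU throughput required
            print(f"{game}: +{needed:.0%} CPU needed")
        # Hitman (+58%) and GW2 (+67%) need far more than a ~20% uplift to reach 30 fps.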
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Thanks for shutting them up.

    Here is an nVidia 645M for $699 in an MSI lappy.... LOL

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    Oh boom !

    It's finished, Trinity
  • nicolbolas - Sunday, January 13, 2013 - link

    Trinity laptops with the A10-4600 start at $500.

    This would be the end of all Trinity besides the A10, but I doubt that the CPU + eRAM will cost anything under $300, probably more like $400+.

    What it means is that AMD is releasing (was forced to?) an updated Trinity that is up to 40% faster on the GPU (probably 10-15% in real life) and up to 15% faster on the CPU (probably all from clock gains).

    Oh, and it isn't called Trinity.

    So I guess either way Trinity dies XD

    Also, I remember you asking for people to be constructive or something like that, and you have not been. :(

    If you did say that, can you please TRY to do it also :)

    PS. Intel's drivers are much worse than AMD's, which are a lot better than a few years ago. NVIDIA's are still the best, but not as far ahead.

    Also, remember that the HD4000 only (officially) supported 100 games when it was released.... :(
  • Spunjji - Monday, January 14, 2013 - link

    Do not expect Cerise to be consistent, non-hypocritical, or indeed constructive. I applaud you for asking, but I have tried before.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    Do not expect me to not be all of those, AND to denounce your lies by proving them as such, which is why you're such a sourpuss.

    If you want better behavior, don't openly lie and attack and presume and predict the future based upon your imagination, and then whine when you lose doing it.
  • nicolbolas - Sunday, January 13, 2013 - link

    .... did you REALLY just do that?

    here,
    http://shop.amd.com/us/All/Search?NamedQuery=visio...

    The links for the five A10 laptops (all 4600) vary from:
    PcMall, Office Depot, and Newegg.

    so, you can get an A10-4600 laptop for $500 before taxes/shipping/etc.

    -Q
  • CeriseCogburn - Tuesday, January 15, 2013 - link


    I forgot you AMD fanboys are so cheap and poor and foolish that you'll spend $500 on a lousy piece of crap to get you by in shame, instead of $699 for a super screamer that will never disappoint you.
  • Rontalk - Thursday, January 10, 2013 - link

    Anandtech did unfair tests, using Trinity with 4GB of CL11 RAM when it is known how sensitive it is to memory. 8GB of CL9 DDR3 almost doubles Trinity's frame rates and makes it capable of running many games on High settings and some games on Ultra. And when you turn the details up with AA and OA on, the difference between the HD4000 and 7660G is twofold.
    In other words, Anandtech did not let Trinity's GPU work, only the CPU - and of course a 45W TDP i7 will show similar results to a 35W TDP A10 Trinity.
  • whyso - Thursday, January 10, 2013 - link

    Double, no way.

    http://www.tomshardware.com/reviews/a10-5800k-a8-5...

    Yes, this is desktop, but I'd expect mobile to be less affected by memory bandwidth because mobile Trinity runs at much lower speeds and requires less bandwidth.

    8 GB is not going to change anything over 4 GB. The Trinity notebook is using 1600 MHz RAM; there is not that much room for improvement going to CL9.

    I agree completely that the difference between Intel and Trinity increases as AA and resolution increase. However, many of the games that Anand tested in that review are running in the 30-40 fps range. It doesn't matter if Trinity is twice as fast as the HD 4000 at 1080p with high details and AA if it only gets 10 fps, because no one is going to be using that setting.
  • Rontalk - Thursday, January 10, 2013 - link

    See here a proper mobile Trinity review with 8GB of DDR3 1600MHz CL9 RAM. Check the game settings to see what it is capable of; check the 3DMark11 score, which is 200 more than what Anandtech tested:
    http://forum.notebookreview.com/hardware-component...

    Stop being an Intel fanboy ;)!
  • whyso - Thursday, January 10, 2013 - link

    Not being a fanboy, being realistic. The claim of "almost 2x frame rates" is wrong. 200/1100 is about 18% better. In fact, if there was one fanboy here it would be you, for making the two-times claim in the first place.

    I'd ignore 3DMark 11 when looking at AMD APUs. Despite the 630M getting a slightly lower 3DMark 11 score (1037 vs 1057 for the GPU), it is significantly better than the 7660G.
    http://www.notebookcheck.net/Computer-Games-on-Lap...

    I've already posted links in my previous posts about the poor drivers in hybrid CrossFire.

    The Anandtech review is perfectly legitimate. Anandtech used the notebook AMD GAVE them for benchmarking purposes. A $500 Trinity A10 notebook is very likely not going to be using 8 GB of 1600 MHz CL9 RAM. Anandtech looked at Trinity the way Trinity is going to be presented and sold, and the majority of Trinity A10 notebooks will be similar to the Anandtech one.
  • Rontalk - Friday, January 11, 2013 - link

    You, realistic? You are a joke who cannot realize anything. Do you think Notebookcheck always used 8GB of 1600 CL9 RAM? With what graphics settings? Very high settings were only applied at 1080p... Anandtech clearly used crap memory, even if AMD gave them the laptop. How do you explain the 3DMark 11 score of 1338 points, then? Don't you think that if 3DMark 11 measures 200 more points, then frame rates will be significantly improved in games too? Have you seen the linked real-world tests?
    Did you know turning details to high or ultra on AMD costs minimal FPS loss, while Intel graphics will die?
    Can you realize something?
  • Rontalk - Friday, January 11, 2013 - link

    Read the review here and see what they write about this memory too: "By moving from one CL11 dimm to two CL9 dimms in my laptop (A10-4600M apu with integrated 7660G Radeon), I doubled my gaming framerate! Low voltage helps extend your battery life."

    Source:
    http://www.newegg.com/Product/Product.aspx?Item=20...
  • Spunjji - Friday, January 11, 2013 - link

    Dude, you seem to have missed the bit there where they changed to dual-channel memory at the same time as making the CAS-11 to CAS-9 change.

    Now, which of those things do we think made the *real* difference? And why do you automatically take their "double" claim at face value?

    Answers on a postcard to:

    The Real World,
    Population,
    Not You.
  • CeriseCogburn - Friday, January 11, 2013 - link

    That's how it goes, whyso... they claim 200%, it's really 18%, yet you are the fanboy for correcting them.

    Then they attack with more stupidity. Again and again they lie, are wrong, mistaken, or spew AMD brainfarts so gigantic that no one can find even a massively cheating review that comes close to backing their claim of the moment.

    Just remember though, since you don't agree with their wild-eyed lies, and since you don't denounce the hands-on review above by this site's founder and namesake, you are the fanboy.

    LOL

    Good luck.
  • Rontalk - Saturday, January 12, 2013 - link

    I claimed the 7660G is twice as fast as the HD 4000 - 100% means that, no? If Anandtech measured the 7660G 20% faster than the HD 4000, then with another 18% (CL9 1600MHz) it's 38% faster. At that point it's possible to give the GPU a real job and let it work at High or Ultra settings, or with AA and AO on. The difference will jump to 100%!
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Okay, that's a crazy stretch, and I think you also used a Newegg verified-purchaser review from someone just spewing his "fanboy, I bought it" take...

    I do appreciate the explanation, and your kindness in its delivery, but it's the initial BS that is the real problem.

    So we give your little AMD APU every break and upgrade in the world and then declare it faster by your 100% or 200%, yet the above article is a big fat lie after Anand did the hands-on that every hater decided to ignore 100%, or rather 200%?

    Yeah, whatever....

    Did find this though:

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    I'm not risking that crashing piece of crap - if it was $200 or $300 dent and scratch refurbished I might put up with it.

    Anyway there's a good argument for you - DUAL AMD GRAPHICS $699 !

    I have not tried one of these hands on. If they don't spit and crash like mad...
  • nicolbolas - Sunday, January 13, 2013 - link

    Also, I ordered a MacBook Pro and it came with a HUGE dent in it... all products have some proportion of damaged units. How many depends more on the OEM than on the CPU/GPU maker 0.o

    And the RAM part is crazy; plus, you can't find 1866 in laptops.

    I wouldn't be surprised if the 7660G is at least 70-80% faster with AA (playable) or 100%+ faster on Ultra/High settings (unplayable).
  • nicolbolas - Sunday, January 13, 2013 - link

    I see the product, but why are you saying "in case it is crashing"?

    I see only 2 reviews, which is not enough to make a good sample of the people buying them, but I see no problem.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    Look around a little instead of playing "I'm on the internet and too stupid to surf".

    " I purchased this laptop last year September but recently my computer has problems during start up.Sometimes the window log in screen hangs till you restart the machine.the problem has escalated till now the laptop screen does not display, I wonder what could be the problem.my friends too are complaining of similar cases.... "
    " Got this for Xmas, trying to find a better alternative so we can exchange it - suggestions?.."
    " Deja vu, again intentionally badly built notebooks with AMD APUs. Intel must have paid much for the manufacturers to suffocate their competitor...What AMD driver version and crossfire profiles were you using for testing?"
    LOL- GAMES WOULD NOT RUN

    http://www.notebookcheck.net/Review-HP-Pavilion-g7...

    They have a bad rep across the entire internet mr clueless.

    http://www.notebookcheck.net/AMD-Radeon-HD-7660G-H...

    Read the first two paragraphs above.

    It cannot match the 650M in the article above: " The performance of the Dual Graphics solution depends greatly on the driver support for the used games. In some games the performance may even degrade by 10-15% compared to using only the APU graphics card. Only in best case scenarios the performance of a GT 640M is possible. Therefore, due to the micro stuttering and performance problems, Dual Graphics may impose more problems than bring performance gains. "

    LOL- now you might know why...
  • CeriseCogburn - Friday, January 11, 2013 - link

    Braindead fartboy much ?

    Somehow you morons jumped down to the HD 4000, which isn't even what's being reviewed here....

    I guess you people are so emotionally corrupted by AMD fanboy brainfarts that you just cannot follow along....

    THIS HERE ARTICLE " Intel used a mobile customer reference board in a desktop chassis featuring Haswell GT3 "

    MOBILE REF BOARD
  • Wwhat - Thursday, January 10, 2013 - link

    "Intel wouldn't let us report performance numbers"

    See, at that point you say, 'OK, we'll do a four-line blurb then, without video and pictures.'
  • Spunjji - Thursday, January 10, 2013 - link

    I wouldn't exactly call this an in-depth review..! Be fair to them; it's usually better to present the information you're given with context than to omit it entirely.
  • CeriseCogburn - Thursday, January 10, 2013 - link

    Wow, another CEO boss from the ranks of the angry Intel-hating proletariat.

    Was it wrong for the dude to check all the settings himself and then run the game on both, to make as certain as possible that it wasn't all a big fat sham?

    Sorry, fascist, I'm glad the dude is out there, doing what the dude does.
  • Wwhat - Friday, January 11, 2013 - link

    Do you even look at what you're commenting on?
  • CeriseCogburn - Sunday, January 13, 2013 - link

    Yes I do, I read it all over and over again.

    You should read all my comments too.
  • R3MF - Thursday, January 10, 2013 - link

    In short, is this going to be available to the average Joe buying a 2013 ultrabook from Best Buy - with every bog-standard i3 and i5 ultrabook getting the same GPU as the premium i7 SKUs - or will it be reserved for people dropping $1,500 on a premium ultrabook?
  • thebluephoenix - Thursday, January 10, 2013 - link

    Pentiums and Celerons: GT1, 10 shaders.
    Standard mobile CPUs: GT2, aka HD 4600, 20 shaders.
    High-end CPUs: GT3 - 40 shaders and Crystalwell eDRAM (~64MB).
  • R3MF - Thursday, January 10, 2013 - link

    My worry is that the everyday Haswell i5 will only have the GT2 GPU, and that Intel will break with the Core-series tradition of ensuring that all ultra-mobile parts come with the high-end GPU.

    If they have broken with this tradition it will be a great shame, and there are rumours that they have, due to the expense of GT3.

    From the PoV of someone with an i5-3217U ultrabook today, say $750, there would be little interest in Haswell (GT2) and little incentive to upgrade.
  • mrdude - Thursday, January 10, 2013 - link

    Don't forget the persistent throttling...

    Although a GT3 +eDRAM Haswell ULV chip sounds fantastic for a gaming Ultrabook, if the thing is still limited to 17W and doesn't have room to stretch its legs then all you've got is great hardware without the performance to match.

    That's the only way Intel is going to get their Haswell and Ivy chips to below that 17W threshold: throttling. With HD4000 Ivy 17W ULVs there was some serious stuttering in gaming which the desktop and 45W mobile parts didn't have. Increasing the EUs and adding eDRAM is going to make that even worse. What you saw with the A15 throttling is what's going to happen on Intel's new low power chips (ULV included). Drop the clock speeds and throttle more aggressively to reach the target TDP.

    A lot of mobile companies are selling us chips with high turbo-clock claims, but if the clock only sits there for a fraction of a second, who cares? It's false advertising, imo :P For a GPU that's a pretty big deal, especially if your frame rates are all over the god damn place - the toy governor sketched below shows why.
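
    To illustrate the point, here is a toy sketch (TypeScript, with invented clocks and step sizes; the real power-control firmware is far more sophisticated) of the kind of power-limit governor being described:

        // Toy TDP-limited boost governor: back off when over budget, creep up when under it.
        function nextClockMHz(currentMHz: number, measuredWatts: number, tdpWatts: number): number {
          const baseMHz = 800;    // guaranteed base clock (invented figure)
          const boostMHz = 1150;  // advertised boost clock (invented figure)
          const stepMHz = 50;
          if (measuredWatts > tdpWatts) {
            return Math.max(baseMHz, currentMHz - stepMHz); // throttle
          }
          return Math.min(boostMHz, currentMHz + stepMHz);  // boost
        }

    Under a sustained GPU load the measured power exceeds a 17W budget almost immediately, so the clock settles well below the boost figure - which is why a big turbo number says little about sustained frame rates.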
  • R3MF - Thursday, January 10, 2013 - link

    So we potentially lose either way.

    On the one hand, Intel may start tiering the GPU tech, removing a decent performance improvement for the very large majority of third-gen ultrabook users.

    On the other, even GT3 with eDRAM may not actually provide much of a boost given the 17W limit, combined with the fact that Haswell is also made on 22nm.

    I'm feeling like my Ivy Bridge Samsung Series 9 has a much longer useful life ahead of it than previously anticipated.
  • mrdude - Thursday, January 10, 2013 - link

    They can improve perf-per-watt, but that TDP headroom isn't going to grow - it will shrink, given the move to mobile and slimmer/sleeker products.

    Intel is also stating that they're going to get around the memory bottleneck that plagues modern APUs by adopting eDRAM. While it's great for performance, it's also a much costlier solution than a discrete GPU.

    For gaming in slim/small form factors that TDP threshold is the biggest hurdle. An HD4000 in a 35W/45W laptop chip performs much better than the HD4000 attached to a 17W ULV processor, not to mention provides much more fluid frame rates. Even if you increase the performance with a GT3+eDRAM variant you're still bottlenecked by that TDP and have no choice but to resort to throttling.

    14nm Broadwell could potentially alleviate that a bit, but it's still not a cure. The only remedy that could purge this issue is a cheap exotic form of cooling that allows for a higher TDP in a smaller form factor. I have more hope in exotic cooling methods than I do the diminishing returns of die shrinks.
    http://www.gizmag.com/ge-dual-piezo-cooling-jet/25...
    Short of advances like ^ that, there isn't much Intel or anybody else can do.

    Although implementing it would be a grand idea ;)
  • Spunjji - Friday, January 11, 2013 - link

    Increasing the EUs isn't necessarily going to make it worse. If they can bring said EUs down in voltage to hit a power/performance sweet spot, more EUs running slower will perform better than fewer running faster at a less efficient point on the curve (quick arithmetic below).

    Similarly I don't think the potential benefits of eDRAM can be ignored, *if* they implement it well. I would like to think they will...

    But yes, your general thrust here is correct. They're probably not going to get miracles out of this.
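
    The back-of-the-envelope version (a TypeScript sketch; the EU counts match GT2/GT3, but the clock and voltage figures are invented purely to show the shape of the f*V^2 trade-off):

        // Dynamic power scales roughly with (number of units) * frequency * voltage^2.
        const relativePower = (units: number, freq: number, volts: number): number =>
          units * freq * volts * volts;

        const narrowFast = relativePower(20, 1.0, 1.0);  // 20 EUs at full clock and voltage -> 20.0
        const wideSlow   = relativePower(40, 0.6, 0.85); // 40 EUs, 60% clock, lower voltage -> ~17.3
        console.log({ narrowFast, wideSlow });

    In that made-up example the wide-and-slow configuration offers roughly 20% more raw throughput (40 x 0.6 = 24 vs 20 x 1.0 = 20) for slightly less power - assuming the workload actually scales across the extra EUs, which games generally do.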
  • CeriseCogburn - Friday, January 11, 2013 - link

    It's pretty easy to call the Sandy Bridge a miracle cpu with a straight face.

    Then you can scream brutal pricing, oh wait you already did that, as well for this unreleased Haswell.

    LOL

    Just last night, the HD2000 in an i3-2120 on an H67 - with just a single SATA 6Gb/s 300GB VelociRaptor and 2x2GB of 1333 - impressed several people to the point that they couldn't believe how fast the OS and surfing were (my install, of course, because I am the best). And then, even though we had Angry Birds-type gamers, a few runs of Stronghold had them begging for more - the HD2000 at 1920x1080 smoked right through it without a single glitch, slowdown, or driver problem, and pulled a 14,512 on the 3DMark2001 SE bench, which it completed when AMD GPUs often do not.

    The system response was so lightning fast even without an SSD (the VelociRaptor was only pulling 125-130 on the AS SSD read and write tests) that I was laughing in disbelief at the lightning-quick speed.

    That's what Intel delivers ( with a master like myself at the setup wheels ).

    BTW the owner hates amd (from personal experience instead of some corporate OWS profit whine) as much as you people hate Intel, so don't worry it wasn't me selling Intel, but prior desperately trying to recommend an AMD build which was met with anger and "not in my house!" type attitude...

    See, it's rough out there, like in here, only people in the real world are the exact opposite of you people, and some of us deal with them all the time.

    Bottom line - installation was so fast the whole room erupted in laughter, applause and disbelief for quite some time - the HD2000 gamed admirably and above required par for the not so into games besides angry birds needs.
    The system just SCREAMED speed the internet pages loaded so fast the pages just blinked from one to the other with no loading even visible, miraculous indeed.

    That's what Intel delivers. It's frikkin unbelievable.
    CHEAP SB laptops too for some time, creating addicts, literally.

    I guess the ridiculous complainers here like you will be disappointed, whining, wailing about the price, and generally claiming everything is a lie even after Haswell hits...
    W H A T E V E R.

    I can hardly wait for Haswell.

    I cannot expect the same from any AMD setup - I could easily get caught in another endless AMD NIGHTMARE with their crashing pieces of crap (it almost always happens). It's not worth it. Apple has an "it just works" reputation, something AMD desperately needs.
    DESPERATELY.

  • nicolbolas - Sunday, January 13, 2013 - link

    I just built a 5800K system for myself. It took maybe 15 minutes to get everything together, and installing Windows 7 from a flash drive was fast too.

    The graphics have no problems, and nothing else is running wrong. Much better drivers than my Quadro NVS 140M (which is a horrible card to start with).

    It is probably just about as fast as your i3 (only because of the 1+ GHz clock-speed advantage it has), but beats it for gaming.

    That being said, I don't know how you can truly hate AMD, especially if you love Intel. Had the Athlon not dominated/slightly beaten (depending on the time) the P4, Intel might not have gone back to the Core architecture - at least not when they did; we might have had another few years of P4 :S
  • Spunjji - Monday, January 14, 2013 - link

    You don't understand... Cerise's unsubstantiated claims are worth more than yours. Because that is how debating works!
    /sarcasm
  • CeriseCogburn - Tuesday, January 15, 2013 - link


    See above you for who spews a pile of crap and why - because there is a debate rebuttal, one I left for your new buddy.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    How does your theory of competition work now? Since Intel has dominated AMD's CPUs for some time, why hasn't AMD done a wonderful job of surpassing Intel, spurred on by the competition, as you claim Intel did only because AMD dominated Intel in the Athlon-vs-P4 days?

    Does it only work when AMD dominates Intel, and not when Intel dominates AMD?

    Maybe your theory is another pile of crap - oh wait, it is; we have proof of that.

  • Spunjji - Monday, January 14, 2013 - link

    Thank you for collecting together so many assumptions about who I am and what I do into one nice long rant including some seriously egregious shit. It was informative. I can now ignore you happily.
  • R3MF - Saturday, January 12, 2013 - link

    "Similarly I don't think the potential benefits of eDRAM can be ignored, *if* they implement it well. I would like to think they will..."

    That is all fine and well, but the original point of my question was to ask whether the average Joe would actually have access to GT3 with eDRAM, or

    whether GT3 would be reserved for the well-heeled spending $1,500-plus on an i7 ultrabook...
  • mrdude - Saturday, January 12, 2013 - link

    Almost certainly the higher end and uber expensive models. eDRAM implementation isn't cheap so expect to pay a pretty penny for it. There will only be a small set of SKUs with GT3+eDRAM.

    Going forward, I think it's currently the most logical solution for CPU/GPUs as dual channel RAM is probably going to go the way of the dodo bird.
  • Spunjji - Monday, January 14, 2013 - link

    I'm pretty sure it's the latter. Cerise thinks otherwise. I'm going on past Intel history, Cerise is going on... hmm.
  • nicolbolas - Sunday, January 13, 2013 - link

    The odds are this part (with eDRAM) will only be in parts that usually cost over 300-400 dollars.

    If an OEM wants to, they might be able to get GT3 into an ultrabook for 1250, maybe 1150:
    look at the i7 model and add 150-250 more dollars (~50 for GT3 over GT2 + 100-200 for eDRAM); however, the big issue is that Intel might only offer the eDRAM on the highest-end models....

    If they offer it with all i7s and it is only 150 more dollars or so for GT3 AND eDRAM, then it could be as low as 1050/1100 without discounts (based on Newegg).
  • thebluephoenix - Thursday, January 10, 2013 - link

    It's a DDR3 version of the GT 650M, I presume, not GDDR5 like in the MacBook Pro 15 Retina or Alienware M14x R2.
  • shiznit - Thursday, January 10, 2013 - link

    Anand, did they tell you how much eDRAM the chip has? Is it on-die or on-package?

    Will the regular mobile SKU (13" MacBook Pro) have the eDRAM?

    Thanks.
  • Wolfpup - Thursday, January 10, 2013 - link

    I might have to stop insulting Intel.

    Of course we'll have to see what parts this actually ships in, how it actually performs, how the drivers are (for new AND old games), and also how many corners they cut with image quality...like right now it's not even a valid comparison, given Intel's doing less work than Nvidia and AMD to get their worse FPS on Ivy Bridge.

    Of course even if this is all true, I'll STILL be disappointed as they're now blowing an enormous number of transistors on a GPU that should be *optional*. All of them could be spent on CPU, or even just making the chip cheaper.
  • rootheday - Thursday, January 10, 2013 - link

    I am tired of people pulling out their old conceptions about Intel's graphics drivers - yes, 3 years ago Intel had a lot of game compatibility issues.

    Not so any more. As far as I know, Ivy Bridge works with basically any recent or older game. The few remaining issues are mostly with a couple of old games that have coding bugs (e.g. Fallout 3) or that don't understand that "dedicated" graphics memory doesn't mean anything on integrated parts (e.g. GTA IV, Empire: Total War, PES 2009...).

    re "corners cut with image quality" and "not even valid ... Intel's doing less work" - show me side by side screen shots or youtube videos where there is any difference in image quality between Ivybridge vs AMD vs NVidia with identical game settings. On Sandybridge the anisotropic filtering quality was lower, but Ivybridge fixed that. Intel doesn't do the sort of game profile "cheats" to texture or render tartget formats that AMD and NVidia do.
  • nicolbolas - Sunday, January 13, 2013 - link

    The real problem was that Intel only officially supported around 100 games (even unsupported ones often work, however). That means that if an unofficially supported game doesn't work, or stops working, you have no way of knowing if or when Intel will fix it.

    The driver-quality problem is still overstated, as you said, but it's not on the level of the old "AMD's (ATI's) drivers are horrible vs. Nvidia's" situation.
  • rootheday - Monday, January 14, 2013 - link

    Full disclosure: I work for Intel in the graphics driver team.

    There is a lot of misunderstanding about that list of 100 or so games. I am assuming you are referring to this page:
    http://www.intel.com/support/graphics/intelhdgraph...

    This list isn't about "officially supported" games. Rather it is about performance (playable frame rate).

    If you read the wording on the page carefully you will see that this list is about which games were known to deliver a playable experience INCLUDING ON HD2500 (GT1). The list that is playable on HD4000 was much longer and included more demanding games but unfortunately only one web page was published to cover both HD2500 and HD4000 so the list there was culled to those that were playable on the lowest common denominator. Obviously we wouldn't include any games on the list if they had compatibility issues - that is a given.

    We take compatibility very seriously.

    Besides the list above (and the longer list I mentioned for HD4000), where we test for both functionality and performance, there are also hundreds of other older games where we run automated tests for compatibility, but the testing approach impacts frame rate such that we can't use that data to make performance claims.

    Intel also has application engineers and our testing labs working with game developers to test hundreds of new games each year on Intel hardware and drivers before the games are released to ensure they are compatible and provide feedback to the game developers on how to tune them for performance.

    We take any reports of game compatibility issues seriously for current hardware. Due to resource constraints and the code freeze for releases we may not be able to fix user-reported issues immediately, but I can assure you that quite a number of issues reported to communities.intel.com have been addressed within a couple of driver releases this last year. When the issue turns out to be a game bug, we contact the game developer to see if they will issue a patch. In a few cases, we have found the issue to be an OEM BIOS bug and have referred users to the motherboard website for a BIOS update.

    Message to the gaming community: Intel wants to deliver the best gaming experience we can. Please let us know of any issues you see, providing good steps on how to reproduce the issue.
  • nathanddrews - Thursday, January 10, 2013 - link

    You people act as though Intel must either:

    1. Make a 5W chip for a $500 laptop that gets over 60fps in stereoscopic mode whilst playing BF3 @ 1080p with ultimate details and max AA/AF.

    or

    2. Not even try.

    What is Intel to do? Everyone hates on them because their IGP sucks, so they improve it to the point where it can play a relatively modern and popular game at 1080p with high settings and fluid framerates. WTF are you complaining about?

    Wait, nevermind, I don't really want to know.
  • silverblue - Thursday, January 10, 2013 - link

    This is the fourth generation Core-based IGP and I think it's fair to say that over the past four years, Intel has certainly come a long way in terms of performance. If they can get their drivers sorted - and it's not as if they lack the resources to do so - then Haswell could be a very solid performer without even taking into account CPU prowess.
  • kyuu - Thursday, January 10, 2013 - link

    Uh, we'd like actual numbers, not a demo that is supposed to imply comparable performance to a 650M when there are obviously so many ways in which it could be - and almost certainly is - misleading.

    Regardless, I'm not interested in an Intel graphics solution until their drivers stop being such absolute garbage. Raw power with crap drivers is meaningless.
  • CeriseCogburn - Friday, January 11, 2013 - link

    And thus we get to the AMD problem sideways - crap drivers, which people like you have totally ignored and, quite the opposite, fanatically endorsed anyway.
    LOL - total hypocrisy.
    Sweet justice!

    Intel drivers are better than AMD drivers.
  • nicolbolas - Sunday, January 13, 2013 - link

    LOLOLLOLOLLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOL

    ARE YOU LOOKING AT WHAT YOU ARE TYPING/WRITING??????

    PLEASE, PLEASE, PLEASE, WAKE UP.

    I did not take you for someone so biased toward one side that they were ignorant of the facts. The facts are that AMD/ATI drivers have improved dramatically, to the point where they are almost up to Nvidia's...

    I don't know if I can even take you seriously any more.
  • kyuu - Monday, January 14, 2013 - link

    Considering I've used an AMD graphics solution in my desktop for years with absolutely zero driver issues, I'm going to have to go ahead and relegate your opinion to the "crap" category.
  • CeriseCogburn - Tuesday, January 15, 2013 - link

    You've used a single amd graphics card in a single system, probably a single OS in a single install, and that means you know.

    You people are insane and stupid.
  • nicolbolas - Sunday, January 13, 2013 - link

    Intel loves to ignore how the HD 4000 gets weaker with weaker CPUs, yet they usually only use the top-end i7 when comparing it to other GPUs...

    Also their SDP stuff..... that made me go from "meh, Intel" to "-_- Intel, fail"
  • sockfish - Thursday, January 10, 2013 - link

    Just in case no one else has noticed, the screen on the UX51VZ is closed - which blocks the fan exhaust. Especially with a thin-and-light, the notebook WILL throttle if it overheats, and with the fan exhaust blocked you can bet that the 650M or CPU is going to downclock.

    It's irrelevant though, since nothing shows frames per second, and the frame rate of the video itself could limit any visible fluidity anyway.
  • damianrobertjones - Thursday, January 10, 2013 - link

    "think 13-inch MacBook Pro with Retina Display, maybe 11-inch Ultrabook/MacBook Air?" - Why is it that the only two you mention are both Apple products?

    I was thinking more along the lines of tablets
  • Brainling - Thursday, January 10, 2013 - link

    Because Apple is the leading seller of ultrabooks in the world? Oh, also the leading seller of tablets.

    You may not like it, but Apple is a huge driver of Intel's business.
  • andrewaggb - Thursday, January 10, 2013 - link

    It's not Intel powering Apple's tablets. And if Charlie is right, Intel might not be powering Apple's ultrabooks in a couple of years either.

    If Apple can put their own CPUs in these machines instead of a Core i5 or i7, that's significant savings for Apple and possibly even thinner and more portable devices for the consumer. In a couple of years, when their ARM chips are fast enough for general computing (probably 2-3 more revisions), Apple should switch. And they probably will.

    Anyway, on topic, I'm excited about Haswell. Sure, they are presenting the best case... that's exactly what every company does. Of course some games won't work, others will run like crap, etc. And yes, with throttling, in a tablet or some ultrabooks you won't get the performance they are showing here. But considering how many tablets/ultrabooks I've seen with fast CPUs and good graphics... I think everybody needs to get their expectations in line. This is going to be an improvement, but 7-10W will give you less than 17W or 35W out of the same chip. It pretty much has to. The real question is whether 7-10W (or even 17W) will be fast enough to run what you want with consistent performance. I don't think this demo tells us anything in that regard. We know nothing :-)
  • SirPerro - Friday, January 11, 2013 - link

    2-3 ARM revisions to match the general computing performance of an Intel Core processor? That sentence is overoptimistic even under the assumption that Intel stops creating new products.

    If I had to bet, I'd bet on Intel stealing part of the ARM business starting this year with ultra-low-voltage Haswells, not the opposite.
  • wetwareinterface - Friday, January 11, 2013 - link

    "2-3 ARM revisions to match general computing performance of an intel core processor? That sentence is overoptimistic even with the assumption that Intel stops creating new products"

    The problem is he didn't say "match Intel performance"; he said the Apple ARM variant would reach an acceptable performance level for everyday desktop computing tasks.

    And now back to the topic.

    Haswell will have improved graphics performance. The problem is expecting it to match a 650M while sitting in an ultrabook.

    Intel has to hit a power and heat profile for the CPU/GPU combo to sit in an ultrabook. While Intel is very good at power optimization and process technology on the CPU side, that means squat on the GPU side. A GPU needs to run flat out and keep its many shader units loaded constantly to achieve good framerates - which means no throttling, and no optimizing for low power beyond restricting clock speeds, shader counts, etc.

    My point being: if the GPU runs, it has to run most of the time at max power draw, and that being said, no ultrabook will have anywhere near 650M performance on Intel's next-generation GPU - and especially nowhere near the performance of AMD's next-gen part in a full laptop either. The Intel demo is just a marketing-hype example, and honestly I'm not thrilled to see a post about it without a lot of text explaining that it was a pure marketing stunt and that the claims shouldn't be taken as anything but that.
  • Spunjji - Friday, January 11, 2013 - link

    Well said. There is no "race to sleep" with graphics performance. You give it your all, all of the time, or you might as well not show up to the fight.
  • CeriseCogburn - Friday, January 11, 2013 - link

    LOL - the last gasping breath of the losers crew -

    The cpu advantage, bus advantage, stability advantage, reputation advantage will all combine and no amount of whining will change that.

    Since so many of you were more than willing to put up with amd driver crap for YEARS because of your fanboy fanaticism, you're clearly in deep trouble, as this article demonstrates.

    It's big trouble - Haswell is coming.

    ROFL - Oh I love it.

    One word of sanity = ENDURO

    Finished ! It's over amd.
  • wsw1982 - Friday, January 11, 2013 - link

    The Clover Trail+ in the Lenovo K900 SMARTPHONE already crushes the dual-core A15 in the Nexus 10 TABLET...

    http://www.phonearena.com/news/Intel-Atom-powered-...
  • Krysto - Friday, January 11, 2013 - link

    Haha. What a fake benchmark. Are you telling me the S4 Pro gets 13,000 in AnTuTu but Clover Trail gets 25,000? Come on now....

    We've seen before how certain hacks or patches can make these benchmark scores much higher. Show me a stock device being tested by a relatively reputable source, and then I'll believe it.
  • nicolbolas - Sunday, January 13, 2013 - link

    As the subject of the post says, if Apple can make an ARM CPU and get an ARM GPU that is good enough for most people who use their products, they will do it.

    And if any company can, I would say Apple can... especially because they can also go for more cores and add code that lets the OS use all those cores :)

    Plus, by design, ARM cores are generally more power efficient than current x86 cores.
  • Brainling - Thursday, January 10, 2013 - link

    I suppose this may be great in some sectors, but it's not going to get me to change from discrete offerings.

    I buy Intel because their general-purpose CPU performance blows away anything AMD offers right now. The on-board GPU is completely irrelevant to me. I really wish Intel would offer "bare" versions of their CPUs that have no attached IGP. Shave off 50 bucks or something. I don't need, and will never use, Intel's integrated solution.
  • dj christian - Monday, January 14, 2013 - link

    Well, on the desktop side, just buy its sibling, the Xeon. It's exactly the same, just a tad lower in MHz, but with a bigger cache and no IGP, which in turn makes it draw a little less power.
  • Hector2 - Thursday, January 10, 2013 - link

    This is what we've been doing for over the last 40 years. Integration, integration, integration. SSI chips on the motherboard combined into a "chipset". The external L2 cache was integrated into the CPU, then the chipset and memory controller integrated into the CPU, now the graphics is combining into the "CPU". The silicon black hole sucking up all transistors around it into a single piece of silicon continues.

    After 22nm, then 14nm, then 10nm, then 7nm.
  • torp - Friday, January 11, 2013 - link

    I have a feeling that those of us running Linux will still need to get an Nvidia card, because they have the only full-featured drivers that run with no problems...
  • Medallish - Friday, January 11, 2013 - link

    Last I checked, AMD's binary drivers on Linux weren't that bad. They aren't perfect, but the main reason AMD is criticized on Linux is their open driver support, which, as bad as it is, is currently even worse with nVidia (the reason for Linus Torvalds' stunt a while back). I'm hoping this might change; Intel actually seems to have the best support when it comes to open source drivers.
  • torp - Saturday, January 12, 2013 - link

    I'm not interested in open source vs. closed source, just in drivers that run my 3D games in Wine. If you take a look at the application DB on winehq.org, most people complaining of artifacts, missing details, etc. are running AMD video cards, no matter the drivers. Nvidia has far fewer problems, while Intel usually doesn't even get a mention :)
  • CeriseCogburn - Monday, January 14, 2013 - link

    It's an AMD fanboy, so real-world facts do not matter.
    Save-the-world, open-source-everything-for-free-forever matters, so long as evil for-profit companies lose in the process.

    The dummy doesn't realize Torvalds did his birdie stunt because nVidia won't hand him the goods, not because nVidia drivers "suck".

    amd fanboys are stupid, biased, liars, incorrect, etc.

    Whatever they dream up - who cares
  • Hrel - Friday, January 11, 2013 - link

    I'm not going to be able to recommend a dedicated GPU to very many people starting in August. If Intel starts focusing on, and delivering, reliable drivers for the iGPU, then that number drops even lower. Honestly, one really good, reliable, stable driver release per year would do it.
  • CeriseCogburn - Friday, January 11, 2013 - link

    There ya go - there's a dose of coming reality. It's already occurring, and amd is left in the dust because their cpu sucks and their drivers suck.

    This spells the doom of amd, as the alternate recommendations will be Intel cpu and OPTIMUS / nVidia.

    That's why the amd fanboys are going nutso.

    Good luck AMD and you need more than just that, like reversal of the brain drain and a huge bailout from some oil sheiks, better name a few more upcoming products Abu Dhabi or the like....
  • Medallish - Friday, January 11, 2013 - link

    Please explain in detail what's wrong with AMD's CPUs and graphics drivers. My AMD systems work just fine; they're stable and they perform very well. Windows 8 has been a bigger cause of instability on my laptop, and even that hasn't happened often.

    I have an HTPC/file server using a Llano APU; it runs 24/7 and so far it hasn't crashed at all.

    What's a mystery to me is how you're not banned. This could have been a nice discussion about the implications of Intel improving their GPUs this much; instead it's a fanboy feces-fest, and you're the only one throwing.
  • nicolbolas - Sunday, January 13, 2013 - link

    ...but soon became engulfed by his desire to overwhelm anyone who liked AMD or made non-super-positive comments about Intel.

    Anyway, I doubt this will mean much for anyone unless Intel puts GT3 in more parts than I think they will (I think only i7s) and/or eDRAM in all GT3 parts, maybe even GT2 (once more, I think only SOME i7s).

    I think that will mean that only enthusiasts, or people buying from boutiques, will possibly get GT3 (+ eDRAM).

    I think this narrows it down to people who just spend a lot of money on whatever computer the OEM recommends.
  • dj christian - Monday, January 14, 2013 - link

    Don't listen to him! He's just a immature teenager by the way he writes.
  • jamyryals - Friday, January 11, 2013 - link

    I enjoy Intel's products, and they are masters of execution. However, this Cerise guy has ruined the comments on this post. No one can have any kind of discussion with him polluting the threads.

    Thanks to JavaScript, I will be auto-hiding his comments from the DOM.
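
    Something along these lines should do it (a minimal TypeScript sketch; the '.comment' and '.username' selectors are guesses at this page's markup, not the actual class names):

        // Hide every comment whose author matches the name we want to filter out.
        document.querySelectorAll<HTMLElement>('.comment').forEach((comment) => {
          const author = comment.querySelector('.username')?.textContent ?? '';
          if (author.trim() === 'CeriseCogburn') {
            comment.style.display = 'none'; // hide the whole comment node
          }
        });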
  • silverblue - Saturday, January 12, 2013 - link

    Oh, how I envy you. :(
  • dj christian - Monday, January 14, 2013 - link

    Just try to ignore him! He's just a immature teenager.
  • CeriseCogburn - Friday, February 1, 2013 - link

    That's "an".

    Being correct all the time and correcting the emotionally controlled blabbering idiots is AN adult's vocation.
  • 7beauties - Sunday, January 13, 2013 - link

    I couldn't tell the difference in the demo video, but I'm a firm believer in the superiority of discrete GPUs, which have unrivaled high-end solutions such as AMD's 7990 and Nvidia's GTX 590. Nonetheless, it's nice to know that integrated graphics has come a long way.
  • CeriseCogburn - Thursday, February 14, 2013 - link

    Nvidia's GTX 690....
    It's not your fault the AMD fanboys have hidden the fastest card in the world from you.
  • lmcd - Monday, January 14, 2013 - link

    Umm, an A10 mobile Trinity almost hits a GT 640M, which in turn, clocked up, constitutes a 650M. Meanwhile, GCN destroyed the VLIW4 solutions. Haswell's graphics aren't making as much ground as this video suggests, and the settings and/or game selection also suggest a possibly lower resolution to keep memory bandwidth from becoming a bottleneck.
  • silverblue - Tuesday, January 15, 2013 - link

    I haven't seen any evidence of the A10-4600M beating a GT630M setup. If you have a link or two, I'd be very happy to see it. :)

    Notebookcheck have a huge number of benchmarks, and although I cannot see what CPU is being paired with the GT630M, it's a very interesting source of information:

    http://www.notebookcheck.net/AMD-Radeon-HD-7660G.6...

    What I wouldn't mind seeing is how the A10-4600M handles a GT630M as its discrete card; does it limit performance or is that only something you're going to notice with faster cards?
  • tech.noob.fella - Tuesday, January 15, 2013 - link

    Hey guys, please help me; I need to know a few things...

    Is integrated graphics at all important in a computer that has discrete graphics? I'm asking because I want to know whether I should go for the Series 7 Chronos with a high-end discrete graphics card and average integrated graphics, or wait for Haswell with better integrated graphics and a mid-range discrete GPU.

    In short, will there be any significant difference between the current HD 4000 and the upcoming Haswell GPU?
  • peterfares - Tuesday, January 15, 2013 - link

    If you have high-end dedicated graphics, then obviously that will be used for anything graphically intensive, and the HD 4000 is more than sufficient for desktop usage. It may be worth waiting for other reasons, though.
  • tech.noob.fella - Tuesday, January 15, 2013 - link

    What I meant to ask was: will a Series 7 Ultra/Chronos running Haswell provide better graphics performance than the same machine running Ivy Bridge?

    I need to decide whether to wait for the Haswell versions or go for the recently announced ones... I need a laptop with good graphics performance.

    Thanks :)
  • redtruthseeker - Sunday, March 3, 2013 - link

    This is just a demo, not a real-world situation you control, and it could be an optimized game demo. It's a desktop demo, and it was compared against an Nvidia laptop GPU, the GT 650M.

    From what I can see in the video, GT3 as a desktop IGP is a great improvement - probably over 40%, as Intel claimed. GT3 won't beat the Radeon HD 7660D GPU in AMD's A10-5800K APU, and that can be concluded easily by watching this benchmark video: /watch?v=qvVBsB5Bvl4

    Dirt 3 benchmark, 1080p, 38-40+ fps, High, no AA - /watch?v=-pv-EeTn-GM

    Much as I respect this site, it's well known that Anandtech is crawling with die-hard biased Intel fanboys, and this is the "eagle's nest" for them... I know that people who prefer AMD are constantly discriminated against and insulted by the uncultured, uncivilized, primitive, uneducated rage of hardcore Intel fanboys.

    Intel was stomped by AMD and even Cyrix in the 1990s and before. Intel is now a powerhouse because AMD decided to buy ATI and made some bad moves, plus Intel decided to use dirty tricks like bribery and "special discounts" for shops that delayed shipments of AMD's CPUs... The 2-billion-dollar fine is far too small for the 10-20 billion dollars of damage Intel has done to AMD. -_-"
