78 Comments

  • DMisner - Tuesday, May 3, 2011 - link

    Part of me hopes BD gets delayed so AMD will release a Phenom II X4 @ 4GHz
  • tipoo - Tuesday, May 3, 2011 - link

    ...But why? It would just look good on paper, BD is where their real performance aspirations are.
  • DMisner - Tuesday, May 3, 2011 - link

    For the sheer novelty of it. That's all.
  • Belard - Tuesday, May 3, 2011 - link

    There is no novelty to these issues. It's business. Buy any AMD X4 965+ and OC to 4GHz... that's the novelty part.

    Having ALL your best products - even those costing almost $300 - slower than the competition's $200 lower-end CPUs is not fun.

    I have a #2 desktop that is rendering videos daily (converting my OLD VHS tapes) - and I'll need to upgrade its mobo/CPU to speed up the process. A NEW CPU will speed things up at least 4-6x. (It's an OLD AMD X2.)

    The Intels use less power, and there is that odd-ball combination that allows the Intel's GPU to be used to help render video faster.

    So yeah, when looking at a $150~200 CPU, it's performance that counts - not MHz.

    Still, for most people - any $75~100 CPU will DO just fine. Including gaming.
  • GullLars - Wednesday, May 4, 2011 - link

    With a $20-30 aftermarket cooler, you can hit 4GHz while undervolting an x50+ Phenom II. I haven't tried Athlon IIs, but I'll be upgrading my father's Athlon X2 7550 to an Athlon II X4 645 and donating my old 1066 CL5 DDR2 sticks to it, so I guess I'll try hitting 4GHz on that too, just for fun, with a Cooler Master Hyper 212 Plus.
    My 1090T runs F@H smoothly at 4GHz with stock voltage (Noctua NH-D14 cooler).
  • JonnyDough - Thursday, May 5, 2011 - link

    I agree.

    Considering that the AMD Phenom II X6 1100T is only $54 more and beats the new chip in most benchmarks while using less power under load (using max turbo @ 3.7GHz, the same clocks as the new 980 BE), I'd say that this new quad-core is a poor value by comparison.
  • JumpingJack - Sunday, September 21, 2014 - link

    Hindsight is 20/20, it didn't work out quite as people wanted....
  • StrangerGuy - Tuesday, May 3, 2011 - link

    I'm sure the PhII is already stuck at 3.8GHz at reasonable voltages and has been this way for a long time.

    I wonder how AMD feels when the mobile i7-2820QM is just as fast as their 4.2GHz OCed Phenom II X4. Bulldozer's single-threaded performance and power consumption have to be at least on par with Nehalem's to stand a fighting chance.
  • khimera2000 - Tuesday, May 3, 2011 - link

    probably the same way Intel feels about AMD's video cards.
  • Belard - Tuesday, May 3, 2011 - link

    You ARE kidding, right?!

    You can already OC to 4GHz. And it will STILL be slower than an i5-2400 ($190), which runs at 3.1GHz.

    Clock speed isn't the end-all. Remember the days of AMD64 vs. P4? Even at 3~4GHz, the $1000 P4 Extreme Editions were still SLOWER than AMD's 2.0~2.4GHz $200~300 CPUs.

    The performance wouldn't be so much of an issue *IF* that i5-2400 were selling for $400, but it's not. It's selling at the same price, with actual performance benefits.

    The i3-2100 (I hate these stupid Intel model numbers) is $125, which puts it on par with the AMD Phenom II 955 ($140) - an upper-end AMD part going against a bottom-end Intel Core CPU.

    Bulldozer needs to be OUT NOW. AMD makes great products, but they are late to the party.
  • Sivar - Tuesday, May 3, 2011 - link

    In "Gaming Performance", the last two charts show Core i5 frame rates which are a little on the low side.
    Thanks for saving the images in PNG format though. You'd be surprised how many technical authors save images which have large areas of flat color in JPG format. :)
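    As a rough illustration of the PNG-vs-JPG point (a sketch assuming the third-party Pillow library; the chart content and file names are made up):

    # Flat-colour, chart-like images compress far better - and losslessly - as PNG.
    import os
    from PIL import Image, ImageDraw

    img = Image.new("RGB", (800, 600), "white")
    draw = ImageDraw.Draw(img)
    draw.rectangle([60, 100, 620, 160], fill=(46, 117, 182))  # a flat-colour "bar"
    draw.rectangle([60, 200, 420, 260], fill=(197, 90, 17))   # another bar

    img.save("chart.png")              # lossless
    img.save("chart.jpg", quality=90)  # lossy; ringing artifacts around the edges
    print("PNG :", os.path.getsize("chart.png"), "bytes")
    print("JPEG:", os.path.getsize("chart.jpg"), "bytes")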
  • gevorg - Tuesday, May 3, 2011 - link

    How many times does AMD want to win the worst power consumption crown?
  • mattgmann - Tuesday, May 3, 2011 - link

    Amazing that AMD's Phenom II is still only on par, clock for clock, with the old Q6600. It might have a few more refinements and higher clock speeds, but its raw horsepower is still on par with a five-year-old chip.

    AMD is two generations behind Intel in performance; I don't see how BD can reasonably be expected to be in the same ballpark as Sandy Bridge. I'm sure they'll compete in budget scenarios, but the high end is a moon shot. With Intel's X58 replacement in the queue, I just don't see AMD even sniffing a piece of the high-end action.
  • abhaxus - Tuesday, May 3, 2011 - link

    You know, I am with you. But who believed that AMD would ever release the Athlon and beat Intel for as long as they did? I have hope, and we all should, for our own wallets' sake. AMD's inability to beat Intel is the reason my Q6600 is still a fast CPU (granted, overclocked to 3.2GHz, it isn't exactly stock).

    Nothing would make me happier than a fast AMD cpu to make me consider upgrading.
  • LancerVI - Tuesday, May 3, 2011 - link

    I admit that I'm an Intel man. But I've got to say. I hope AMD pulls BD out and comes out guns blazing. My wallet could use the help!!!

    Intel has been able to charge whatever it wants for quite some time now. It needs some furious competition to help drive prices down.

    ...at least I hope.
  • mattgmann - Tuesday, May 3, 2011 - link

    Don't get me wrong, I'd love it if AMD were neck and neck with intel. Competition helps my wallet and keeps high end gear in my beige box.

    It's just that at this point in the game, BD cannot logically advance far enough past their last generation to compete with Intel's lineup. I'm sure that BD will be priced competitively and give some trouble to the low-end Sandy Bridge chips. It won't be because they're more advanced, though; it'll be because the AMD chip will be a power-hogging quad+ core with unlocked multipliers up against a lean, locked-down Intel dual core.
  • DMisner - Tuesday, May 3, 2011 - link

    They COULD be the high end.. if they made the move to ARM.

    Still though, I have faith in AMD.
  • aegisofrime - Tuesday, May 3, 2011 - link

    Make the move to ARM, and abandon the x86 software market? Not a wise move.

    Unless they used some sort of dynamic translator a la Transmeta Crusoe.
  • Shadowmaster625 - Tuesday, May 3, 2011 - link

    All they need to do is add a couple of ARM cores to their existing Bobcat design. Then they need to get in bed with Microsoft and make sure that Windows 8 properly implements schedulers that can seamlessly integrate ARM and x86 code. If AMD and Microsoft both execute correctly, an 18-watt Brazos chip is all we'd need for a desktop OS once the Windows kernel is all ARM. Office would follow in a year. Antivirus and photo/video editing would go to the SIMDs. The x86 cores would only handle our legacy apps.
  • extide - Wednesday, May 4, 2011 - link

    Why would you EVER want to do that? Then you would have developers having to build apps that run on two architectures at the same time. If you want to do fixed-function stuff, do something like Intel's Quick Sync for video, and then leave an x86 chip as an x86 chip.
  • JKflipflop98 - Tuesday, May 3, 2011 - link

    ARM gets CRUSHED on the desktop by x86.
  • JimmiG - Tuesday, May 3, 2011 - link

    The Q6600 didn't come at 3.7 GHz or overclock to 4.2 with stock cooling. Performance clock for clock doesn't really mean anything.

    But I'm not disagreeing that the Phenom II is unimpressive. The Phenom II is essentially the same as the Phenom released in 2007, but with more L3 cache. The Phenom itself wasn't all that different from the K8 from 2003, which in itself was just an evolution of the original Athlon.

    You could trace current Intel CPUs back to the Pentium Pro in the same way, but they have gone through many more, radical changes over the years. Hopefully those radical changes will come to AMD's CPU architecture with the release of BD.
  • jabber - Tuesday, May 3, 2011 - link

    Look at it another way and you could say AMD have done an amazing job keeping what's essentially an 8-year-old design in the running.

    When you look at it that way, Intel's latest-gen chips giving you an extra 10fps isn't that amazing.
  • BSMonitor - Tuesday, May 3, 2011 - link

    "When you look at it that way Intel's latest gen chips giving you an extra 10fps isnt that amazing."

    FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance....

    "Look at it another way and you could say AMD have done an amazing job keeping whats essentialy an 8 year old design in the running."

    The Core i series can trace its origins to the original Core Duo processors that debuted with Apple's transition to x86 (5 years ago). Intel has had 4 extremely impressive architecture changes since, each improving performance from ~20% to ~100% in some cases... AMD has executed 2. The first Phenom was a HUGE disappointment and couldn't compete with Core 2 Duos, let alone Core 2 Quads... Phenom II, now gaining traction, is still barely competing against those same Core 2 Quads...
    The same ~10fps difference could be said of my old Core 2 E6600 against your Phenom II X6 in GPU-limited scenarios...

    AMD's entire success in surpassing Intel was placing the MCU on the CPU die. With that move, they blew their load. Thanks to the power-hungry beast that was the NetBurst processors, Intel worked on improving caching algorithms, multi-threading, parallel processing, etc. The end result is Intel with an extremely efficient CPU (born from its mistakes) and an integrated MCU; AMD is left with just a so-so CPU and an integrated MCU.
  • Action_Parsnip - Tuesday, May 3, 2011 - link

    "FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance...."

    This sentence has no meaning.

    "Intel has had 4 extremely impressive architecture changes since"

    I count 3 changes. Core 2 -> Nehalem wasn't #extremely# impressive, just impressive.

    "AMD's entire success in surpassing Intel was placing the MCU on the CPU die."

    Your a fool and do not know what your talking about.
  • MilwaukeeMike - Tuesday, May 3, 2011 - link

    "Your a fool and do not know what your talking about. "

    It's you're btw. "Better to remain silent and let others think you're a fool than open your mouth and remove all doubt" ;)
  • SlyNine1 - Tuesday, May 3, 2011 - link

    ""FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance...."

    This sentence has no meaning."

    Made perfect sense to me. The rendering of FPS (frames per second) is a best-case scenario for older/weaker processors, as the bottleneck is elsewhere (toy numbers below).

    I don't understand how that doesn't make sense; it makes perfect sense.

    And getting on someone for saying extremely impressive instead of just impressive, Facepalm!!
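    A toy model of the point being argued, with invented numbers (not measurements from this review), assuming per-frame time is roughly the max of the CPU and GPU work per frame:

    gpu_ms = 16.0               # GPU needs ~16 ms per frame (~62 fps ceiling)
    for cpu_ms in (14.0, 7.0):  # "older" CPU vs. one twice as fast
        frame_ms = max(cpu_ms, gpu_ms)  # CPU and GPU work overlap
        print(f"CPU work {cpu_ms:.0f} ms -> {1000 / frame_ms:.0f} fps")

    Both cases land on the same ~62 fps once the GPU is the limit, which is why GPU-bound game tests flatten CPU differences.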
  • Action_Parsnip - Tuesday, May 3, 2011 - link

    ""FPS are the best case scenario for older/weaker processors as they are ultimately limited by GPU performance...."

    So is he saying the game tests in this review do not show the same trends as the other tests? That is what he is saying, in effect. They look perfectly in line with expectations, afaik. The sentence on its own means very little, if anything. He is either saying this review is doing something wrong or that gaming tests are of little/no worth.

    "nd getting on someone for saying extremely impressive instead of just impressive, Facepalm!!"

    ZOMGWTFBBQ111!!!!!!! LOL!!!11

    Being a native English speaker, I know there is a difference between impressive and extremely impressive.

    Skipping a stone on a lake 20 times is impressive. Walking on the water there is extremely impressive.
  • extide - Wednesday, May 4, 2011 - link

    "Thanks to the power hungry beast that was the netburst processors, Intel worked on improving caching algorithms, multi-threading, parallel processing, etc The end result is Intel with an extremely efficient CPU(born from it's mistakes) and an integrated MCU, AMD is left with just a so-so CPU and an integrated MCU. "

    I have said this exact same thing many times before. At this point in time making the P4 helped Intel because they had to optimize the heck out of EVERYTHING in it to even be remotely competitive, and well, now all that work is done.
  • Ushio01 - Tuesday, May 3, 2011 - link

    Actually, with cores/threads/clocks being equal, the 65nm Core 2 will beat Phenom II in nearly all benchmarks; the 45nm Core 2 crushes it.
  • medi01 - Tuesday, May 3, 2011 - link

    Now try to make price of CPU + Motherboard equal and compare again.
  • BSMonitor - Tuesday, May 3, 2011 - link

    "Now try to make price of CPU + Motherboard equal and compare again."

    Price is a function of demand. AMD HAS to sell them this cheap to move them at all. They do not sell them as cheap as they do because they are more cost-effective to produce, so your comment in this context is completely irrelevant...
  • Action_Parsnip - Tuesday, May 3, 2011 - link

    You misunderstand his point.
  • kmmatney - Tuesday, May 3, 2011 - link

    Yep. I've bought at least 10 CPU+MB combos for $99 at Microcenter over the last year, for clients, friends, and family. Fast enough for most purposes, and you are GPU-limited in most games, so you're better off spending the cost savings on a better video card. For $300, you can get a much better performing AMD system versus Intel. Maybe not good for AMD, but good for me.
  • silverblue - Thursday, May 5, 2011 - link

    But it doesn't. There are plenty of benchmarks around to show that a comparable Phenom II X4 gets a few wins, especially in encoding. Aside from a few very strongly Intel-optimised situations, there's really very little in it, and anyway, if you ignore the very high price of the upper Core 2 Quads, you can get a faster Phenom II X4 or X6 for less.
  • Targon - Tuesday, May 3, 2011 - link

    AMD has a new platform on the way, and that is where the focus has been. For now, any new Athlon II or Phenom II processor will just be a bit more of the same with a higher clock rate, but still with the same exact design as previous chips. Bulldozer, on the other hand, is the big push to get back to being competitive.

    The big problem is, and remains, the lack of a 32nm fab process, which Intel has had for a long time now.
  • haplo602 - Tuesday, May 3, 2011 - link

    Thanks for the compilation benchmark. Finally something usable for the Linux folks :-) Linux kernel compilation (or OpenOffice) would be better, but this is not a bad start ;-)
  • macky_r - Tuesday, May 3, 2011 - link

    I knew from the beginning that AMD wanted to milk more performance out of its Deneb architecture. I wasn't surprised when I saw this item on this website, which I visit from time to time.

    I currently own one of the first Deneb CPUs (Phenom II 965 BE) w/ a clock of 3.4GHz, rated at 140W! I didn't even bother OCing this regretfully purchased CPU of mine. How I wish I had waited for the cooler versions, but I was in a hurry to build a PC that I could use at home for my networking classes.

    I will buy the 1100T soon. I will be OCing it to 3.7GHz w/ Turbo Core disabled.

    I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy-Bridge.
  • Griswold - Tuesday, May 3, 2011 - link

    Yea, you know for a fact...
  • BSMonitor - Tuesday, May 3, 2011 - link

    "I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy-Bridge"

    Exactly. Even if. Look at the turbo boost scenarios from the Intel SNB processors. There is so much headroom on these processors. Intel is already holding back performance because of lack of competition.

    Intel has silicon running stock air-cooled 4GHz and beyond in house, guaranteed.
  • Action_Parsnip - Tuesday, May 3, 2011 - link

    "I don't wanna be negative, but I know for a fact that Bulldozer will not beat Intel's upcoming LGA 2011 CPUs. Maybe Bulldozer is meant to compete with Sandy-Bridge"

    Unless you've seen the prototypes you do not know for a fact. PERIOD.

    "Intel has silicon running stock air-cooled 4GHz and beyond in house, guaranteed."

    Unless you've been 'in-house' you cannot guarantee that. PERIOD.
  • Orwell - Tuesday, May 3, 2011 - link

    What about overclocking the CPU-NB of the chip?

    It has been proved useful in one of the Phenom II X6 reviews (can't find it now though), where performance in HAWX shot right up, by about 20% I believe, when upping it from 2GHz to 3GHz.

    It's a shame that most if not all reviewers don't overclock their uncores. Or, well, at least they don't tell you, and they don't put a CPU-Z memory tab screenshot in the review showing the uncore frequency.

    I know pretty much all hope is lost for this aging design (Deneb), but as an owner of this furnace of a CPU called the Phenom II X4 C2 (yes, 140W at 3.4GHz), I'd like to know how much faster the Intels are compared to my 3.7GHz/2.4GHz OC.
  • Stas - Tuesday, May 3, 2011 - link

    I haven't had an Intel CPU since the P4 Willamette. I've been happy with AMD's bang-for-buck, as performance seemed sufficient and overclocking always covered any shortcomings. Nowadays, I see mid-level Intel CPUs beat AMD's top-end offerings every release. And honestly, I'm really bottlenecking in the CPU department, but I don't see AMD offering a solution (running a Phenom X3 @ X4 3.5GHz). I've been waiting for 2 years to upgrade the processor, and I'm getting tired of this. Don't make me cross the Sandy Bridge, AMD. Make BD happen. And it better be good.
  • jabber - Tuesday, May 3, 2011 - link

    I often see the reviews with Intel's chips running ahead.

    Then I think, "hang on though, the AMD chip gives me 60fps+ and costs half the price including the motherboard tax!"

    Then it's not so bad.
  • starfalcon - Monday, May 9, 2011 - link

    But then Ivy Bridge comes just a few months after BD, so what happens then?
  • raevoxx - Tuesday, May 3, 2011 - link

    Where I work, AMD still outsells Intel by a factor of.. well... over 20-to-1 if not more. What we always tell customers, is that if price is no object, the Intel platforms are higher performance. But best bang-for-the-buck is AMD, and it's not like we're comparing an i7 to a 486-SX. We try to explain it in best terms, but there's Intel processors in our display case that are actually gathering dust. Which frankly makes an AMD fan like me happy :) But I digress.

    We carry a full line of Intels, from the Celeron cheapies, all the way up to i7. And we finally closed out all of our 1156s and only sell 1155s.

    Like it or not, our customers are amazed that they can pair up a decent mobo, an Athlon II 250, and 4GB of 1333, for less than $160. Most spring for the 1075T for the price, too. Whether or not it's faster than a similarly priced i5, people like the ability to say they have six cores. When they can get 75% the performance of a comparable chip, for less than 50% of the cost... people bite.

    We're quite excited to start carrying Zambezi chips, when we're able to, since they'll be more competitive. But it's always going to be darkest before the dawn, and it's nice that AMD is throwing in a speed bump or two (1100T, etc) before the architecture change. Instead of letting their chips languish until BD.
  • jabber - Tuesday, May 3, 2011 - link

    Must admit I havent ever made an Intel box for a customer, its always AMD.

    I check out the Intel CPU range every now and then and check what price the bottom non Celeron Intel chip is going for, then the cheapest decent brand Intel motherboard and after seeing any profit just vaporise, I roll my eyes and go back to the AMD section.

    Intel isnt worth the extra cost for most ordinary folks. Intel are total overkill. The good old 3GHz dual core Athlon with a mATX MB and 4GB of 1333DDR3 works a charm everytime.

    If a customer came to me and said he wanted to do loads of transcoding and video editing and had £1200 to spend then lets go Intel. But as most come to me with a budget of £4-500 and I need to take my cut, its not going to happen.
  • MilwaukeeMike - Tuesday, May 3, 2011 - link

    This is exactly how I feel. I've always owned AMD because it's been fast enough and cheaper. If I had to build a PC today I'd choose a Sandy Bridge processor, but I'm not building one because my AMD 955 BE still does everything I need it to. I have tons of windows open on my 22" monitor and play my games on my 23" and have no issues.

    A lot of that Intel performance gain falls into the 'can't even tell' category for many users.
  • Peroxyde - Tuesday, May 3, 2011 - link

    The savings made by buying AMD - would you pay them back in electricity, say, after 2 years? Just want to see if the higher power consumption would translate somewhere. If any of you have done any comparison in this area, I would appreciate it very much if you could give some highlights.
  • jabber - Wednesday, May 4, 2011 - link

    It would take longer than the life of the Intel i3 box to make back, in power savings, the £65+ extra it cost me (rough numbers sketched at the end of this comment).

    If the difference were £10 then yes, but £65 ($104) is just too much to make back.

    AMD still wins for a standard system cost-wise. As these are PCs for Joe User and not overclockers, you can switch on the power-saving settings anyway.

    Plus they rarely run at 100% for very long.

    Intel still isn't competitive for the ever-growing low-end customer group.

    Most people don't need 4GHz+ leviathan CPUs anymore. If anything, Intel's future customers at the top end will become a smaller and smaller group.

    How many of us here still demand the top-end (or as close to it as possible) CPU we can buy? I bet many of us are now happy to make do with a mid-range or lesser CPU and spend the savings elsewhere.
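    A back-of-the-envelope version of that payback sum (every figure below is an assumption for illustration, not a measurement):

    price_premium_gbp = 65.0   # extra cost of the Intel box (from the comment above)
    extra_power_w = 30.0       # assumed extra average draw of the AMD box
    hours_per_day = 8.0        # assumed daily usage
    price_per_kwh_gbp = 0.13   # assumed electricity tariff

    kwh_per_year = extra_power_w / 1000.0 * hours_per_day * 365
    saving_per_year = kwh_per_year * price_per_kwh_gbp
    print(f"~£{saving_per_year:.0f}/year saved -> "
          f"~{price_premium_gbp / saving_per_year:.1f} years to break even")

    With those assumptions it comes out to roughly £11 a year, i.e. five-plus years to claw back £65 - which is the point being made.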
  • silverblue - Wednesday, May 4, 2011 - link

    I've undervolted my 710. Makes virtually no difference to performance or stability and seemed like a good idea at the time. More people should do it.
  • Shadowmaster625 - Tuesday, May 3, 2011 - link

    This is ridiculous. BD is one month from launch, and Llano has been shipping for weeks. Yet you guys don't have a single benchmark of either. Or maybe you do and are just too spineless to post them. It is sad that we had better sources of information 10 years ago. Now there is nothing because everyone seems to care too much about NDAs. Who cares about NDAs? If you know someone who has the hardware, you should post their review ASAP and not worry about NDAs. What good does it do to release a review the same exact day as two dozen other sites? If you want more eggs, sometimes you have to sacrifice a few chickens.
  • AssBall - Tuesday, May 3, 2011 - link

    Who cares about NDAs??? Did you really just ask that?

    How about reviewers that enjoy getting engineering samples, new products to review, and support from the manufacturers for free? How about any reputable, respectable reviewer? How about anyone who frowns upon breaking a contract?

    And your chicken analogy is full of fail.
  • heymrdj - Tuesday, May 3, 2011 - link

    I asked myself the same thing: did he really just ask that? Someone never took ethics courses, or has any ethics for that matter. Just because you want to drool like a baby over specs is no excuse to ask these reputable authors to break NDA. On the day of release, feel more than free to go view any hardware site you wish.
  • Shadowmaster625 - Tuesday, May 3, 2011 - link

    Yeah well you must be a spineless chicken too. There are plenty of ways to get chips. All those engineering samples out there, and no one can get their hands on one? Yeah right. If all NDAs were broken there would be no NDAs. Your brain is full of fail if you cannot understand that. Just a bunch of spineless cowards.
  • Makaveli - Tuesday, May 3, 2011 - link

    If it's so easy to get one, why don't you, and post one on your own review site... oh wait a min......
  • haplo602 - Tuesday, May 3, 2011 - link

    Are you a total idiot? It's called competitive advantage. If they broke the NDA, no engineering samples for them the next round. If everybody breaks the NDA, no launch-day reviews. That simple.
  • Shadowmaster625 - Wednesday, May 4, 2011 - link

    The engineering samples would go to different people, but chances are one of them would hook up with someone who has a reputation for having some freakin' cojones. Especially if money was involved. There IS money to be made this way, and it's not illegal. Again, it just takes cajones.
  • AssBall - Thursday, May 5, 2011 - link

    You go find a web review site with "cajones" then, since you are so fond of them, and please leave the rest of us in peace while we wait for the non-half-ass Anandtech review after the NDA lifts.
  • Makaveli - Tuesday, May 3, 2011 - link

    LMAO. When you understand what an NDA is and why you shouldn't break it, or at least turn 18, then come back and post.

    It's obvious common sense and logic ain't strong in your family.
  • rfle500 - Tuesday, May 3, 2011 - link

    I'm glad to see a compile test in the lineup. Can I ask - how did you choose the number of threads for the test? Was it the same for all (12), or based on the number of physical or hyperthreaded cores? In my experience, on AMD chips most compilations are faster with overloaded threads, say 2x the number of physical cores - did you test this possibility?
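    For anyone wanting to try the 1x vs. 2x thread experiment themselves, a minimal sketch (it assumes GNU make and an already-configured source tree in ./src; this is not AnandTech's actual methodology):

    import os
    import subprocess
    import time

    cores = os.cpu_count() or 4
    for factor in (1.0, 1.5, 2.0):
        jobs = max(1, int(cores * factor))
        # clean between runs so each build does the same amount of work
        subprocess.run(["make", "-C", "src", "clean"], check=True,
                       stdout=subprocess.DEVNULL)
        start = time.time()
        subprocess.run(["make", "-C", "src", f"-j{jobs}"], check=True,
                       stdout=subprocess.DEVNULL)
        print(f"-j{jobs}: {time.time() - start:.1f} s")  # wall time per job count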
  • silverblue - Thursday, May 5, 2011 - link

    Some programs don't work well with non-power-of-2 architectures either, which would harm performance on the X3 and X6 processors (Windows Media Encoder 9), or even worse, cripple things completely (DivX or XviD on VirtualDub uses just one of an X3's cores, so you need to set the affinity to two cores). I suppose, logically, overstressing a hyperthreaded CPU would mean that its execution units are fully utilised and, as such, logical cores won't actually make any difference, so it would be in these situations where the X6 could perhaps close the gap a little.
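    For what it's worth, that affinity workaround can be scripted rather than set by hand in Task Manager; a minimal sketch assuming the third-party psutil package ("vdub.exe" is only an example process name):

    import psutil

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == "vdub.exe":
            proc.cpu_affinity([0, 1])  # restrict the encoder to the first two cores
            print(f"Pinned PID {proc.pid} to cores 0-1")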
  • Shadowmaster625 - Tuesday, May 3, 2011 - link

    It also gives them an opportunity to remove millions of wasted CPU cycles.
  • krumme - Tuesday, May 3, 2011 - link

    Man, this is some boring news. I would prefer to get some more inside or background info from AMD, Intel, or ARM country, even if it takes years compared to this. But I guess this is better business :)
  • greenguy - Tuesday, May 3, 2011 - link

    I keep checking here looking for the inevitable Llano review, but it's not here yet. How long do we have to wait?
  • Action_Parsnip - Tuesday, May 3, 2011 - link

    It appears you would like to marry Francois Piednoel?
  • jabber - Wednesday, May 4, 2011 - link

    Who is Francois Pedofile?

    Never heard of him.
  • 529th - Tuesday, May 3, 2011 - link

    I don't see this mentioned in the review or the benchmarks - someone correct me if I am wrong, but when those benches are made, are they made with Turbo disabled? If not, I don't see it as a fair comparison if you are running stock speeds when comparing a 3.3GHz i5-2500K vs. a 3.33GHz i7-975. You have games that make use of all cores, and some that use only a few... so you may get a higher turbo on a Sandy Bridge chip. This is not exact science, but making the GHz speed an exact comparison with Turbo disabled gives a little more insight into the product...

    just sayin
  • PubFiction - Tuesday, May 3, 2011 - link

    This is just a final release to cap off Phenom II. They know they were beaten; they are just giving it a little speed boost for more value. If this were Bulldozer, then yeah, it would be bad, but we already know what Phenom II has to offer, so why is there so much discussion around a final-revision chip? They are just throwing out the most speed they can get from that core until BD arrives.
  • Casper42 - Wednesday, May 4, 2011 - link

    "AMD originally introduced the Phenom II architecture over two years ago to compete with Intel's Core 2 lineup. Intel has since been through one major microarchitecture revision (Sandy Bridge) and Phenom II is beginning to show its age."

    Am I missing something, or am I just not counting properly?

    Phenom II introduced to compete with Core 2
    Intel then Introduces Nehalem/Westmere
    Intel then Introduces Sandy Bridge

    So wouldn't Intel have been through 2 major architecture revisions?
  • silverblue - Wednesday, May 4, 2011 - link

    Phenom II came to market after Nehalem. Yes, it wasn't exactly meant to compete with it, and was more about sorting out what Phenom did wrong, but the unfortunate truth is that the i7 beat it to market.

    529th - I think the issue would be that you would be disabling a feature and thus not showing accurate performance; however, that in itself would show the difference given by having Turbo in the first place. Doing the same with Thuban would be interesting.

    Zosma would've made more sense than continuing on with Deneb; however, AMD quickly shelved that idea.
  • GullLars - Wednesday, May 4, 2011 - link

    Nah, it's just that Intel has a huge advantage in fabs. They also have a clock-for-clock advantage in most workloads, but AMD's engineers are by no means incompetent. Brazos is the only recent great product, though.
    If both Bulldozer and Llano fail spectacularly, I'll start leaning towards your side. In recent years Intel has alienated me with horrible business ethics and artificial restrictions on processors for market segmentation, with very high prices on the chips that were not gimped (i.e., didn't have functional units disabled).
    Hopefully power consumption will be better with GloFo.
  • silverblue - Wednesday, May 4, 2011 - link

    Weren't you banned? No? Oh.

    Intel actually DOES need to make its compilers more vendor-agnostic since the settlement. If a piece of software works rather poorly on an AMD CPU and much better on an Intel offering, this isn't always down to the Intel CPU being stronger; it might actually be the result of the AMD CPU working on an inefficient codepath. Of course, it may actually be the limit of the processor. If something is compiled using a neutral compiler, I'd be far more inclined to trust the results.

    I doubt Intel is forcing people to use its compiler, nor is it making them develop for Intel-only platforms; however, in the interests of fair competition, it's only right that the check on the CPU ID tag be removed, if that is what they're doing.
  • silverblue - Wednesday, May 4, 2011 - link

    Since you're such an expert, could you perhaps tell us, oh mighty one, what errata require fixing? What is so buggy about an AMD processor?

    If you're going to make such bold statements, it's about time you actually backed them up with FACTS.
  • silverblue - Thursday, May 5, 2011 - link

    Still waiting.
  • silverblue - Wednesday, May 4, 2011 - link

    The most popular CPUs are towards the lower end. Intel doesn't have a quad-core CPU below $100, whereas AMD has several. There's no reason why an Athlon II X4 setup shouldn't easily undercut an i3 setup, and occasionally, four true cores will be a benefit.
  • wh3resmycar - Wednesday, May 4, 2011 - link

    Why are you guys still sticking to 1680x1050? Don't give me the crap about correctly segmenting processor performance, hence a CPU-bound resolution for game X.

    People who actually are gamers, and who are looking for quad-core and up CPU solutions, are probably gaming at HD resolutions and upwards.

    At least add graphs that are done at higher resolutions. Your gaming performance benchmarks don't work in the real world.
  • stimudent - Sunday, May 8, 2011 - link

    This is great hardware, but it seems that most everyone I know who is an average user has left their tower systems and switched to smartphones, iPads, and gaming consoles for the most part. These items are now being discussed at family get-togethers instead of PCs, as was the case just a few years ago. Some advanced users are using the latest graphics cards for computational projects instead of gaming - the CPU is no longer seen as the best way to fold. It's the graphics card doing the grunt work while the quad-core processor sits mostly idle.
