There are 8 fully pipelined integer cores in there; they are just very weak. Some of it is the shared front-end/decoder, some of it is the integer execution units themselves. Weak SIMD/FPU performance isn't the only problem: the chip just does so much less per core. You don't have two pipelines with separate resources to achieve SMT/HT. They need wider execution here, and preferably they should drop the shared front end too. There's no point keeping it around; focus on making it faster, and dump all that cache, which does no good. Mobile/notebook chips can't really have 16MB of cache anyway, just a few MB.
And you base this on the first thing that came to mind to make you feel better about overpaying for your wimpy little 4-core Intel CPU with half the power of an FX-8320. LOL
Just kidding, I have an i5 myself. But guys, you really should stop being such fanboys. AMD has a great chip here with the FX-8320 and FX-8350. They are priced much lower than the top i5 CPUs, and they will perform just as well in gaming if not better. And who cares about it using 125 watts? That's a bit more than Intel's i5s use, but it can still be run just fine alongside a high-end GPU on a decent 600W mainstream PSU like a CX600.
Intel i5s can handle faster memory than any AMD chip. They have a stronger memory controller, plus they OC very well too. With the 1155 socket the AMD chips were barely keeping up. Since Haswell the speed champs are Intel CPUs, hands down, and with the Z97 boards and the new processors that will only work with the 97 boards, look out AMD! Better OC'ing and faster memory support than before!
You know, you look at this crap amd chip in the review, and then it hits you... hard
You can lock a Sandy Bridge 2500K at 4,500 MHz all day long on every motherboard out there, all 4 cores, with no voltage increase, with the crap stock Intel fan, for years on end, never a hiccup, and smoke the ever-living daylights out of everything AMD has.
I mean, I find it very interesting that with video cards we STILL hear about overclocking, but when it comes to CPUs, suddenly all the OC fanboys of AMD fall dead freaking silent on the matter.
Sorry, just cannot help noticing when the wannabe emperor is stark naked. Moments ago his glorious worshippers were pointing and admiring and swooning over just how fast the AMD jaybird can run when pushed, and then... suddenly, the MASTER OC CHIP OF ALL TIME, the Sandy Bridge, is rather ignored... and the naked-jaybird streaking thing becomes the dejected peasants' silence...
Oh look, it's the highly ignorant fool CeriseCogburn...
You make demonstrably false statements on every single AMD article and merely provide further proof of your ignorance. Honestly, if you're unable to find anything better to do than post idiotic comments on every AnandTech article, I have a suggestion: EDUCATE YOURSELF, LITTLE DIPSHIT.
Dude, seriously? Did you know the current world record for overclocking is held on an AMD processor? AMD is incredible for overclocking, especially with extreme cooling. Yes, it may be a bit slower than Intel, but damn, you can't beat AMD's price.
think about this... which one is better: eight unlocked cores, as AMD calls it, or 4 cores with Hyper-Threading technology?
4 cores with Hyper-Threading will act like eight cores, because each single core will act like two cores, as claimed by Intel; but bear in mind there is complex math going on behind Intel's Hyper-Threading. But WHAT if you unlock this thing so it runs freely as eight cores, as AMD claims?
THAT IS THE REASON WHY AMD HOLDS THE WORLD RECORD FOR OVERCLOCKING: BECAUSE THEY MANAGED TO UNLOCK THIS THING. IF INTEL PRODUCES AN 8-CORE CPU, THEN AMD WILL MAKE 16 CORES, AND THEN AMD WILL STILL BE THE WINNER OF THE WORLD RECORD.
ON THIS POINT, JUST BE OPEN-MINDED. DO NOT ACT LIKE YOU'RE REALLY SICK WHEN YOU'RE NOT, BECAUSE YOU WOULD MAKE YOURSELF TWICE THE MORON AND TWICE THE IDIOT. JUST TAKE THE ADVICE FROM G101: EDUCATE YOURSELF.
I USED TO BE AN INTEL USER, BUT NOW THAT I'M BEGINNING TO UNDERSTAND THIS TREND, I LOVE AMD, AS THEY LOVE THE CONSUMER'S POCKET (AGAIN, BE OPEN-MINDED; IT'S MY OPINION).
CAPS. The world overclocking record is meaningless. You can't compare clock speed across brands, and AMD's cores are half-cores. They still can't compete with Intel, and their process node is several years behind, which is why they consume so much power. AMD has given up in the enthusiast CPU market and is losing money.
Here comes the 2500K-loving Intel mercenary know-it-all-about-processors again, and yet he cannot sleep knowing that there are chips out there that can outperform his 2500K. SOB!!!!
I OC my 8320 to 4.7 GHz and only have to raise the voltage 0.0125 V from stock. Why is it that you think AMD is worthless? They are all for the consumer, and I am grateful for it. I'm really sick of your ignorance. Really sick of it.
What everyone will get "sick" of in a hurry is if AMD falls on its face with its "crap" CPU manufacturing... Intel will double their pricing on CPUs again, same as they were during the period before AMD released its "crap" Slot A Athlon 500. AMD also has a decent GPU business to fall back on since they bought ATI, which was a smart move.
*When AMD released the Slot A Athlon 500, Intel slashed its prices in half the same day to match.
*If ATI hadn't stepped up to the plate and succeeded when nVidia bought 3dfx, everyone would be crying about the prices of gaming cards.
*Sure AMD is lagging behind right now but never underestimate the importance of competition and the effect it has on affordable upgrades for everyone...
I love AMD for the pricing they have brought us to this day, with CPUs AND GPUs. I am fixing to build my FIRST Intel platform since AMD released the Slot A Athlon, simply because I want a Hackintosh; there is simply too much heartache building one on an AMD CPU/mobo combo, for the simple fact that Apple does not support them. I don't know if you would call me an AMD fanboy or not; that's your opinion. I have simply supported a company that brought affordability to everyone... I am either Mac (Intel, no choice really) or AMD for the Windows platform...
Anyone that cannot see the reasoning behind this, now I would say they are an Intel fanboy... Everyone has the right to choose.
Well, I can answer that question for you. It's because we don't need to. I hear Intel fanboys talking about overclocking all the time. Now why is that? Is it because you spent a house payment on your big fancy i7 alone, so now you just have to squeeze in those extra clock cycles? And then here I am, having spent 110 bucks on an 8320 in a Newegg sale, and what do you know, I don't *need* to overclock. Why would I do something I don't need to do? Well, I'm not a little bitch fanboy after all, so I'll just be happy with what I've got and not talk shit about something I have no experience with (take the hint).
You know what I am sick of? Hearing people like you bitch and moan about how they hate AMD, on an article *about* AMD. Is your butt hurting you hun? Don't you have some crippled kids you have to go beat? Shoo.
Do the math on electricity consumption and you find it is not priced decently at all. Let's say you run a system with an idle FX 8350 for 8 hours a day everyday for 5 years and electricity costs you $0.16 per kwh:
(( 8 hours * (74.2 - 57.5) * 365 days * 5 years ) / 1000) * 0.16 = $39
(74.2 - 57.5 is the difference in idle power consumption between an fx-8350 and i5-3570)
So add $40 to the price of the FX 8350 before you do a comparison, more if you run your system 24/7.
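The idle-power math above can be sketched in a few lines; all the figures (74.2 W vs 57.5 W idle, 8 hours a day, 5 years, $0.16/kWh) come from the comment itself.

```python
# Extra lifetime electricity cost of the higher-idle-power chip.
# Watts * hours gives Wh; divide by 1000 for kWh, then multiply by the rate.
def idle_power_cost(watts_delta, hours_per_day, years, dollars_per_kwh):
    kwh = watts_delta * hours_per_day * 365 * years / 1000
    return kwh * dollars_per_kwh

cost = idle_power_cost(74.2 - 57.5, 8, 5, 0.16)
print(round(cost))  # ~39 dollars over 5 years
```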
After 5 years, add that $40 to the AMD price and you will still get a mind-blowing processor compared to the Intel Core i5-3570 or any Intel processor in that price bracket...
Looks like Piledriver delivered. Granted, the bar was set pretty low with Bulldozer, but this at least has a use case: highly threaded applications. Considering this is a process node behind Intel, I'd say it's pretty good. If they can keep this pace up and hit IPC a bit harder, AMD could be back in a pretty good position.
At least now they can say they beat Intel in a lot of multithreaded situations. Losing to Intel AND using more power was unpalatable. I'd like to see an undervolted 8350, perhaps AMD's conservative side is rearing its ugly head again.
I'm a bit concerned that, even with hard-edge flops and the RCM, the clock speed difference is about 11% for the same power. I'd have thought that even the former would shave off a decent amount, unless RCM doesn't work so well at higher speeds. Still, there's one disadvantage to be had - overclocking won't work so well due to the flop change.
If AMD can beat Intel now in multithreading in most circumstances, Steamroller is just going to let them pull away. Single-threaded workloads are the worry, though. Still, at least they can say that they finally beat Nehalem even in single-threaded work. I did lament the lack of an appearance of Phenom II, but looking at the results, they've buried that particular ghost.
ROFL, thank you, the 3 stooges. I'd like to particularly thank silverblue, the little co-AMD fanboy who provided immense entertainment in that they lost, then moments later they won; deranged fantasy spew. Good to know Steamroller is going to "pull away"!
An official lower-TDP version of the 8-core CPU would be very nice: 95W or even lower, as Intel does with their -S SKUs.
At my workplace, the i7-3770S has been just plain outstanding for our small form-factor server/workstation appliance that travels to tradeshows with our sales guys. I'd happily trial an AMD 8-core equivalent.
Remember, the Phenom II's IPC is lower than the later-model Core 2's, and Piledriver still needs 700+ MHz to beat a Phenom II, so that puts it in perspective. Mind you, overclock the NB on a Phenom II and you can get some pretty interesting gains, in the range of 5-15% depending on the situation.
However, as AMD has done for the last several years, they are happy to throw more cores at the performance problem, which is great; we just wish those cores were a little beefier, or that software would become more heavily threaded.
The other flip-side is this will drop straight into some AM3 motherboards and all AM3+ motherboards, so it's a worthy upgrade if you're running something like an Athlon, plus it's cheap.
But the consensus is that if you're still running a Phenom II X6, and you don't need 8 threads and mostly play video games, it really is throwing money into the fire to upgrade to the FX line, Piledriver or not, unless you intend to overclock the chips to 4.8 GHz+, which the Phenom IIs can't reach on air.
Yes, we don't need 8 cores/threads for gaming today, but do you have a prognosis for the near future?
Why would you upgrade for no reason other than speculation? If an advantage arises in heavily threaded games in the future, upgrade at that time. You'll get more processing power / $ spent in the future than you will at present.
amd fanboys are penny-wise and pound-foolish, so buying the AMD crap now, and telling everyone it has the deranged AMD futureboy advantage, works for them!
I mean really, it sucks so freaking bad, they cannot help themselves; like a crack addict they must have it and promote it, so heck, the last hope of the loser is telling everyone how bright they are and how, in the imaginary years ahead, their current pileofcrap will "truly shine!"
It's safe to say all programs/games going forward will take advantage of four cores or more. Battlefield 3 released LAST year and basically requires 4 cores in order to be GPU-limited (as in, the game is CPU-limited with just about any video card unless you have 4 cores).
Prognosis for the near future is that having that many threads will still not be a whole lot of use for gaming. See Amdahl's law for why.
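Amdahl's law is easy to put in numbers: if only a fraction p of the work parallelizes, n cores give a speedup of 1 / ((1 - p) + p/n), capped at 1 / (1 - p) no matter how many cores you add. A quick sketch with an assumed 60%-parallel game workload (the 60% is purely illustrative):

```python
# Amdahl's law: speedup of an n-core run when a fraction p of the work
# is perfectly parallel and the remaining (1 - p) stays serial.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

print(round(amdahl_speedup(0.6, 4), 2))  # 4 cores:  1.82x
print(round(amdahl_speedup(0.6, 8), 2))  # 8 cores:  2.11x, small gain over 4
print(round(1.0 / (1.0 - 0.6), 2))       # infinite cores: capped at 2.5x
```

So doubling from 4 to 8 cores buys relatively little unless the parallel fraction is very high, which is the point being made about games.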
Amdahl's Law is not the reason. There is plenty of task parallelism to exploit. The real issue is ROI, and there are two aspects to that. One is that multi-threaded development is freakishly hard. Unlike single-threaded development, you cannot know exactly what each thread is doing at any given time. You need to synchronize to make certain actions deterministic, but even then you can end up with race conditions if you're not careful. The current synchronization methods are just very primitive. Intel aims to fix that with Haswell: the TSX extensions enable hardware lock elision and hardware transactional memory. Both will make the developer's life a lot easier and also make synchronization more efficient.
The second aspect isn't about the costs but about the gains. It has taken quite a while for more than two cores to become the norm. So it just wasn't worth it for developers to go through all the pain of scalable fine-grained multi-threaded development if the average CPU is still only a dual-core. Haswell's TSX technology will come right in time as quad-core becomes mainstream. Also, Haswell will have phenomenal Hyper-Threading performance thanks to two nearly symmetrical sets of two integer execution units.
AMD needs to implement TSX and AVX2 sooner rather than later to stay in the market.
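The synchronization pain described above can be shown with a minimal sketch: several threads incrementing a shared counter. The read-modify-write is a classic race without a lock; a coarse lock makes the result deterministic but serializes the hot path, which is exactly the cost that lock elision and transactional memory aim to reduce. (This is a generic illustration, not TSX itself.)

```python
import threading

counter = 0
lock = threading.Lock()

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:  # remove this and the read-modify-write can race
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000, deterministic only because the lock is held
```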
No, gaming won't need that many threads in the near future either. Nobody is going to make a game demand more than 4 threads because that's what common gamer systems support.
I disagree. Say we have a hypothetical game that supports 8 threads. The overhead of over-threading on a quad-core system is, frankly, not very much, while it may provide improvements for people with octo-core AMD processors or Intel processors with Hyper-Threading.
In fact, there are many games nowadays that split the workload into many threads: economic simulation, background AI planning during the user's turn, physics, audio, a graphics subthread, network management, preloading, and resource management. It is just that even with that parallelism, there are bound to be single-threaded bottlenecks, so an 8-core may not benefit at all compared to 4 cores.
So I disagree; it is not about people not spending resources on parallelism or not supporting it. It is the nature of the workload that is the determining factor.
What a one-dimensional computer enthusiast you are. You spend hundreds to play games on a computer when you could do the same on a console for less? I use my computer to gain knowledge, impart knowledge, do organizing work to liberate the working class from wage slavery, and write leaflets and documents. I occasionally play strategy games that are usually multi-threaded, like Galactic Civilizations II. There is no greater value on the planet than the FX processors for what I do. They save me time over the Intel processors in the $200 price class. Time and money are what's important; frame rates of 120 are useless except to the over-privileged who buy 120 Hz monitors for their gaming. What a waste of money and resources that could be used for the advancement of humankind.
"Value" is more than just perf per purchase dollar, running costs also need to be included.
E.g. a basic calculation based on the charts above the FX CPU I've saved $50 on would cost 2c extra per hour at full load in power. So 2500 hours at load would be my break even point. That's 7 hours a day at full load over a year, a heavy use scenario but quite possible.
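The break-even arithmetic above works out like this, taking the comment's own assumptions ($50 saved up front, 2 cents per hour extra at full load):

```python
# Break-even point: hours at load before the extra power cost
# eats the up-front purchase saving.
savings = 50.00          # dollars saved on the cheaper CPU
extra_per_hour = 0.02    # assumed extra running cost at full load

breakeven_hours = savings / extra_per_hour
print(breakeven_hours)                   # 2500.0 hours at load
print(round(breakeven_hours / 365, 1))   # ~6.8 hours/day over one year
```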
Multithreaded games are such a vast exception to the rule (that once you have "enough" CPU power you gain infinitesimal fps from more) that they are not worth even mentioning.
You know NOT what you speak. Battlefield 3 is multithreaded; look at the AMD FX-8350 in Battlefield 3, right up near the top, better than the i5-3570 and close to the i7-3770. You guys are ignoring the facts and ignoring the trends in software. The move to parallelism is unstoppable and will accelerate. Multithreading is a growing presence, and ONLY BAD programmers and software designers ignore it. The turning point will come when Steamroller ships in a year; it will compete nicely with Hasbeen. At 28nm it will be almost as efficient as Hasbeen, and performance-wise it will be as good.
LOL, why am I not surprised: massive AMD fanboy with chips on the shoulder and a fantasy brain. "organizing work to liberate the working class from wage slavery" LOL, perfect, just like the rest of the AMD fruitballs. Have fun at the OWS protests, though it would have been decent to join up with the Tea Party instead of coming in a year-plus late after all the complaining. (You brought up politics, fanboy.)
Anyway, back to your fanboy fantasy. As I said, you can look all day long at the pileofcrap AMD releases and tell yourself it's the greatest ball of cheese for you, but no one has to believe your BS. One big reason why: the SB 2500K OCs to 4500 like butter on stock everything, all 4 cores, all day and all night with zero hiccups, and blows the AMD crap away, period.
You actually have to be very stupid not to choose it. Very stupid. Be aware, fanboy: you're looking at a stock 2500K in all the charts, and a clear +50% increase is instantly available with it, FOR FREE.
There is no way any amd fanboy made the correct decision after 2500K was released. And it's not occurring now either. You're living a lie, so please stop sharing it with us, and by the way - I don't think it's your place to tell others WHAT they can use their computer systems for.
THEY OWN THEM, not you. They are theirs, not yours, and you shouldn't be tooting your virgin purity hippy love angel wing CRAP here, and then also have the obnoxious insolence to tell others they are wasting their computer power.
There are plenty of people who will tell you flat out you are wasting your life and wrecking the nation with the crap you are doing, no doubt about it, so keep it to yourself, won't you ?
Now let's hear how your crapdriver AMD can possibly match a 2500K in the real world... LOL, ain't happening, mister.
lol, that's funny, calling a spade a spade; look at yourself. I myself have your 2500K and have the Piledriver, and I don't see any difference between them in the real world. In fact, what's funny is I can run many programs in the background and still play Aion without any frame loss or any stuttering problems. Can't do that with my 2500K; it drops in frame rate and stutters like hell. So keep telling peeps how much you don't know about CPUs; we really like hearing from you.
Who gives a crap who has the better processor? Honestly... do you work for Intel? Then why care what other people like? I have an FX series processor, as well as several Intel machines. I like them both. Going online and getting into a pi$$ing contest over which company makes a better processor, and resorting to making fun of people (google "Internet tough guy" and you'll see what a majority of people think about that), is non-constructive, gains you nothing except negative attention, and makes you look less intelligent than you probably are. I could give a $hit what you like, or which processor you run. Neither AMD nor Intel pays me any money to give a d@mn, and whether I think you are wasting your money or spending it wisely doesn't impact me in the least bit. People, just buy what you personally like, and screw all the fanboyism that seems to be rampant ON BOTH SIDES.
The technical arguments have some merit; the political ones are pre-digested socialist propaganda. I almost threw up at the end of the post. Must be nice to be able to advance the cause of the class struggle from a cozy living room somewhere in a free-market country where your freedom of speech is protected by some freely elected capitalistic pig.
Except that it requires nearly double the power of an Ivy Bridge to squeak out a few wins in those multi-threaded apps... Only when a company is this close to obscurity can we call this a win. Especially in light of ARM's competition with x86... AMD continues with insanely power-hungry chips?? Not good.
At $200 it is still a tough sell. Double the power of the i5-3570K and 80W more than the i7-3770K. No way. The chip looks dated. Cough cough, Pentium 4 Prescott, anyone?
What market is AMD aiming at here?!? Intel produces 2 IVB dies per 1 of these. And IVB is an APU, of all things, while this thing is AMD's non-iGPU part. Imagine if Intel released a 6-8 core IVB without the iGPU, the same die size as the IVB APU.
Bleak does not even begin to describe AMD. The fact that AMD sits at $1.5B market cap and no one is talking about buying the company says a lot.
Thank you for the proficient monitoring, although I disagree with at least the characterization of calling it a win based upon AMD being on its way out, or whatever.
It gets called and referred to as a win, because honesty is now CRAP, and fanboy fruittard is "in". That's all.
When there is some bare win for AMD in some game, then of course it's a massive killing and total destruction, and sometimes when it's a tie or a loss it gets called and manipulated and talking-pointed and spun into a win.
Personally, I believe that's why AMD is a freaking failure. They coddled and produced a large raging fanboy base with their PR hits against nVidia and Intel, all of it lies that the fruiters totally believed and went on a continuous rampage with. That emotional battery allowed AMD to produce crap, not support their crap properly, feel good about the little warm and "not evil" hearts they pretended to "live by", and thus go down the frikkin tubes while bathing themselves in glory.
The very few times the massive collective of lockstep fanboy parrots broke out of their idiot mind chains and actually criticized AMD (and it only occurred several times, mind you, after much ignoring and glossing over), why then AMD, shocked and stunned, WOKE THE HECK UP, got off their coddled PR-fanboy-based BUTTS, and did something about their huge problem...
I must say the results those few times were extraordinary for AMD, and quite exemplary in any overall comparison across the board to other companies in the mix. A few examples of that should not be hard to bring to mind.
That's why I don't like the fanboy crap. I certainly don't believe it's good for amd, nor good for my bottom line, as I suffer under the constant coddling and lying, too. We all do.
Now it's likely too late, but I'm still hoping for a bailout for amd. Lots of oil sheiks out there.
Yeah! I'm really impressed how much better these are...the fact that they're beating Intel again in ANYTHING is awesome!
We need AMD for the competition, and with Intel pushing their worthless video so hard, it gives AMD a competitive advantage: either they can skip video and spend more transistors on CPU, OR they can put in a massively better GPU.
I wish they had an 8 core notebook part though for the mid range with no integrated GPU....it seems like that ought to be a solid enough choice for a system, combined with a high end Nvidia or AMD GPU.
Seriously thinking of making my next notebook AMD, both to support them, and to avoid switchable graphics... (well, still have AMD's switchable graphics, but hopefully since they make the whole thing they'll do better).
I used to be scared off by AMD as I got burnt twice on horrible 3rd party chipsets, but I bought a c50 based notebook last year for the kitchen, and it's been 100% rock solid stable and non-weird...like Intel's always been known for. Makes me feel a lot better about buying an A series notebook this year or an FX desktop.
FYI - Anandtech is suffering server issues at the time of this post...
What many reviewers and fanbois tend to miss over and over is that AMD is delivering the best performance for the dollar, and that ANY current-model desktop CPU will run ANY software just fine. Unless you have some enterprise-level software that brings a modern CPU to its knees, ANY of the currently available desktop CPUs will run Windoze- or Linux-based software just fine. In fact, Linux apps do even better in many cases than Windoze bloatware.
I have no idea if AMD will ever offer a discrete CPU to equal Intel's top-of-the-line, over-priced models, nor do I care. I buy what delivers the best performance for the price. I have yet to purchase any AMD desktop CPU that would not run ALL software as well as an Intel CPU, without any issues whatsoever.
If all you do is benchmark all day long and you have money to burn, blow it on an Intel CPU, unless of course you are opposed to evil, chronic, law-violating corporations looking to eliminate consumer choice. You could always vote your conscience, if you have one.
I am always amazed that people actually believe that AMD processors are somehow "inadequate". Even with tainted benches, AMD processors deliver all the performance and good value that most consumers desire. It's tough, however, getting people to look at the data objectively. Most people just think "more" is better, when in fact that's the sucker play if you look at performance vs. cost and actual needs.
Considering that Intel got a whopping ~5% performance gain from the 32nm-to-22nm node drop and tri-gate transistors with Ivy Bridge (along with overheating and poor overclockability...), AMD did quite well to deliver a ~10-15% improvement with Vishera. With AMD's pricing, Vishera should sell well because of its excellent performance and cost.
I'm sorry, but I can't take seriously anything where the writer uses "Windoze". Any such text is obviously written by a heavily biased individual and therefore any "analysis" in it is flawed.
It would be interesting to see some linux benchmarks considering this chip's only future may be running servers or bargain machines.
Some linux webserver and sql database benchmarks would be interesting. I didn't see any desktop use case for this processor at all. In every benchmarked case I'd rather have an intel chip. Even when Intel lost it wasn't by much. And the conclusion basically said the same thing, if you are 100% sure you are running heavily threaded code all the time, then this MIGHT be the chip for you if you don't mind a bigger power bill. That's just not great.
But as for the Windows remark, Windows is fine. Linux has some strong points, particularly with servers and its kernel-to-user-mode transitions, but everything is a trade-off. I use Linux for many of my servers and have for years, but I mostly agree with this http://linuxfonts.narod.ru/why.linux.is.not.ready.... as to the problems with Linux. If you've genuinely used Linux a lot, you'll know most of these things are true to one degree or another. Basically, once you get X, audio, and video involved, it's not awesome, and you'll appreciate Windows more :-)
If you prefer to analyze things scientifically and think independently, you would NOT use your computer primarily or exclusively for gaming; that makes you a one-dimensional human being. It is a multi-faceted tool that can do work, organize a revolution, and spread joy through its communications ability. It can help the oppressed get together to fight their exploiters, and it can be entertaining as well. Practicing being a paid mercenary like the SEALs does not intrigue me; it repulses me. There is nothing this CPU can't do either better, as well, or almost as well as an Intel chip in its price class. Single-threaded apps are dying out. More and more games are being programmed to take advantage of multiple cores, and AMD's superiority there is only going to grow. Dis it all you like; it shows your brain is not operating at high efficiency. It is irrational, just like those iPhone nuts who stand in line for a product that is being bought as a status symbol rather than as a superior tool (which it is not).
It doesn't matter if something is multithreaded or not. If it doesn't use more than 4 threads, Intel's single-threaded advantage still holds.
Only when you fully saturate the 8 threads does the AMD maybe pull ever so slightly ahead, and even there it sometimes falls way behind.
If the number of threads your software asks for is equal to the number of cores your Core i7/i5/i3 has, then Intel is spanking AMD. Only if the number of cores DOUBLES Intel's is AMD maybe winning a little bit.
I'm sorry, this is a bit off-topic from the processors (I can't say I'm partial to either; I'll be using an AMD because it works in the motherboard I want and performs well with the other parts I want, but I have used my friend's i5 build and it runs very nicely), but how can you say that Linux is only for servers and bargain builds? Besides the fact that I have seen $600 builds that blow away multi-thousand-dollar builds, Linux has become an extremely advanced OS in the last few years. The desktop environments available are all more intuitive than either Windows or Mac, given that they can be customized to the user's preferences down to where they interact directly with the kernel. Not to mention they offer a vast array of features that Windows and Mac don't, while using far fewer resources. I have Linux systems that idle around 1.2% CPU usage; Windows 7 idles around 5-6%. Linux manages network connections more efficiently, utilizes the resources it does use more effectively, and in general gives a much more immersive and intuitive user experience if you know what you're doing. I would really like to see more support for Linux: if the software and firmware available for Windows were available for Linux without WINE, I would use Linux exclusively, because it would be so much more efficient. At this point it has become so streamlined and beautiful that most Windows users who have seen me using it say they would switch for its ease of use and visual appeal, if only all the software they use on Windows were available. Oh, and putting servers and bargain builds in the same group really wasn't well thought out... most servers have high-end components to handle large amounts of traffic and heavy loads on resources.
Funny how the same type of thing could be said in the video card wars, but all those AMD fanboys won't say it there!
Isn't that strange, how the rules change, all for poor little crappy AMD the loser, in any and every direction possible, even in opposite directions, so long as it fits the current crap hand-up AMD needs to "get there" since it's never "beenthere". LOL
We've heard all of this before, and while much of what you say is true, and ignoring the idiotic "Windoze" comments not to mention the tirade on "evil Intel", Anand sums it up quite clearly:
Vishera performance isn't terrible but it's not great either. It can beat Intel in a few specific workloads (which very few people will ever run consistently), but in common workloads (lightly threaded) it falls behind by a large margin. All of this would be fine, were it not for the fact that Vishera basically sucks down a lot of power in comparison to Ivy Bridge and Sandy Bridge. Yes, that's right: even at 32nm with Sandy Bridge, Intel beats Vishera hands down.
If we assume Anand's AMD platform is a bit heavy on power use by 15W (which seems kind as it's probably more like 5-10W extra at most), then we have idle power slightly in Intel's favor but load power favors Intel by 80W. 80W in this case is 80% more power than the Intel platform, which means AMD is basically using a lot more energy just to keep up (and the Sandy Bridge i5-2500K uses about 70W less).
So go ahead and "save" all that money with your performance-for-dollar champion where you spend $200 on the CPU, $125 on the motherboard (because you still need a good motherboard, not some piece of crap), coming to $325 total for the core platform. Intel i5-3570K goes for $220 most of the time (e.g. Amazon), but you can snag it for just $190 (plus $10 shipping) from MicroCenter right now. As for motherboards, a decent Z77 motherboard will also set you back around $125.
So if we go with a higher class Intel motherboard, pay Newegg pricing on all parts, and go with a cheaper (lower class) AMD motherboard, we're basically talking $220 for the FX-8350 (price gouging by Newegg), $90 for a mediocre Biostar 970 chipset motherboard, and a total of $310. If we go Intel it's $230 for the i5-3570K, and let's go nuts and get the $150 Gigabyte board, bringing us to $380. You save $70 in that case (which is already seriously biased since we're talking high-end Gigabyte vs. mainstream Biostar).
Now, let's just go with power use of 60W Intel vs. 70W AMD. If you never push the CPUs, you would only spend about $8.75 extra per year leaving the systems on 24/7. Turn them off most of the day (8 hours of use per day) and we're at less than $3 difference in power costs per year. Okay, fine, but why get a $200+ CPU if you're going to be idle and powered off 2/3 of the day?
Let's say you're an enthusiast (which Beenthere obviously tries to be, even with the heavy AMD bias), so you're playing games, downloading files, and doing other complex stuff where your PC is on all the time. Hell, maybe you're even running Linux with a server on the system, so it's both loaded moderately to heavily and powered on 24/7! That's awesome, because now the AMD system draws 80W more around the clock, which comes out to about $70 in additional power costs per year. Oops. All of your "best performance-for-the-dollar" make-believe talk goes out the window.
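The arithmetic above can be sketched in a few lines. The $0.10/kWh electricity rate is an assumption (the $8.75 and $70 figures in the text imply roughly that rate); the wattage deltas are taken from the review discussion as given:

```python
# Sketch of the power-cost arithmetic, assuming a $0.10/kWh rate
# (the rate is an assumption, not from the review).

def annual_cost_usd(delta_watts, hours_per_day, rate_per_kwh=0.10):
    """Extra electricity cost per year for a constant extra draw."""
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000
    return kwh_per_year * rate_per_kwh

# Near-idle 24/7 case: ~10 W extra, always on.
print(round(annual_cost_usd(10, 24), 2))   # 8.76

# Loaded 24/7 case: ~80 W extra, always on.
print(round(annual_cost_usd(80, 24), 2))   # 70.08
```

The 8-hours-a-day idle case works out to about $2.92, matching the "less than $3" figure above.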
Even in the areas where AMD leads (e.g. x264), it does so by a small to moderate margin while using almost twice as much power. x264 is 26% faster on the FX-8350 compared to the i5-3570K, but if you keep your system for even two years you could buy the i7-3770K (the FX is only 3% faster in that case) and you'd come out ahead in terms of overall cost.
The only reason to get the AMD platform is if you run a specific workload where AMD is faster (e.g. x264), or if you're going budget, buying the FX-4300, and don't need performance. Or if you're a bleeding heart liberal with some missing brain cells who thinks that supporting one gigantic corporation (AMD) makes you a good person while supporting another even more gigantic corporation (Intel) makes you bad. Let's not use products from any of the largest corporations in the world in that case, because every one of them is "evil and law violating" to some extent. Personally, I'm going to continue shopping at Walmart and using Intel CPUs until/unless something clearly better comes along.
That talk suffers from the same inability to consider any other viewpoint but that of the hardware fetishist.
If you are fapping to benchmarks in your free time you are the 1%. The other 99% couldn't care less which company produced their CPU, GPU or whatever is working the "magic" inside their PC.
Hey idiot, he got everything correct except a unit slip (watts where he meant watt-hours), and suddenly you, the brilliant critic no doubt, discount everything else. Well guess what, genius: if detecting one error is all you've got, HE IS LARGELY CORRECT, AND EVEN CORRECT ON THE POINT behind the unit error you criticized. So who is the gigantic FOOL that completely ruined their own credibility by being such a moronic parrot that no one should pay attention to them? THAT WOULD BE YOU, DUMB DUMB!
Here's a news flash for all you scum-sucking doofuses: just because someone writes a minor grammatical or wording slip, THEY DON'T LOSE A DAMN THING, AND CERTAINLY NOT CREDIBILITY, WHEN YOU FRIKKIN TROLLS CANNOT PROVE A SINGLE ONE OF THE MANY POINTS MADE TO BE INCORRECT!
It really would be nice if you babbling idiots stopped doing it, but you can't resist: it's stupid, it's irritating, it's incorrect, you've seen a hundred other jerkoffs pull that crap, and that's all you've got, right?
It looks like the extra 10W in the idle test could be largely or solely due to the mainboard. There is no clear evidence of whether, or to what extent, the new AMD chip draws more power than Intel at idle.
A high-end CPU at low utilization (mostly idle time) is in fact a very useful and common case. For example, as a software developer, I spend most of my time reading and writing code (idle) or testing the software (utilization: 15-30% CPU, effectively two cores tops). However, in between, software needs to be compiled, and this is unproductive time which I'd like to keep as short as possible, so I am inclined to choose a high-end CPU. For the GCC compiler on Linux, the new AMD platform beats any i5 and a Sandy Bridge i7, but is a bit behind an Ivy Bridge i7.
Same with, say, a person who does video editing: they will have a lot of low-utilization time too, just because there's no batch job their system could perform most of the time. The CPU isn't gonna be the limiting factor while editing, but when doing a batch job (usually an h.264 export), they may also have an advantage from AMD.
In fact, for every task I can think of (3D production, image editing, sound and music production, etc.), I just cannot come up with one that has an average CPU utilization of more than 50%, so I think your figure of a constant 80W disadvantage for AMD is pretty much unobtainable.
And oh, no one in their right mind runs an internet-facing server on their desktop computer, for a variety of good reasons. So while Linux is easy to use as a server even at home, it ends up a limited-scope, local server, and again, the utilization will be very low. However, you are much less likely to be bothered by the services you're providing, thanks to the sheer number of integer cores. In case you're wondering: to saturate a well-managed Linux server built from up-to-date desktop components, no connection you can get at home will be sufficient, so it makes sense to co-locate your server at a datacenter or rent theirs. Datacenters go to great lengths not to be connected through a single point (which in your case is your ISP) but to have low-latency connections to many Internet nodes, so that the servers can be used efficiently.
As for people who don't need a high-end system, AMD offers a better on-die graphics accelerator, and at the lower end the power consumption difference isn't gonna be big in absolute terms.
And oh, "downloading files" doesn't count as "complex stuff"; it's a very low CPU utilization task, though I don't think this changes much with respect to the argument.
And I don't follow that you need a $125 mainboard for AMD; $60 boards work quite well. You generally get away with cheaper boards for AMD than for Intel, even when taking into account the somewhat higher power-handling capacity the board needs.
The power/thermal advantage of Intel of course extends to cooling noise, and it makes sense to pay extra to keep the computer's noise down. However, the CPU is just so rarely the culprit any longer, with the GPU of a high-end computer being noise-maker number one, vibrations induced by the hard disk number two, and only to a small extent the CPU and its thermal contribution.
Hardly anything of the above makes Piledriver the absolute first-choice CPU, but it's still not a bad choice.
Finally, the desktop market isn't so important; the margins are terrible. The most important bit for AMD right now is the server market. Obviously the big power-consumption disadvantage vs. Intel is there too, and power generally matters in the server market. However, with virtualization, AMD can avoid a sharp performance drop-off and allow deploying up to about a third more VMs per CPU package because of the higher number of integer cores, which can offset the higher power consumption per package per unit of performance. I think they're onto something there: they have a technology they use on mobile chips now which allows them to sacrifice top frequency but reduce die area and power consumption. If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is well within the realm of the possible and could very well be a GREAT winner in that market.
More speculation from Mr. GNU. This of course caps it all off: the utter AMD fanboy blazing in our faces, once again the FANTASY FUTURE is the big AMD win:
"If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is well within the realm of the possible and could very well be a GREAT winner in that market."
LOL. Perhaps you should be consulting for them, or be their next COO or CEO?
Except the "any CPU is fine" market isn't about $200 processors, Intel or AMD. That market is now south of $50 and making them pennies, with Celerons and Atoms competing against the AMD A4 series. You're not spending this kind of money on a CPU unless performance matters. Funny that you're dissing the overclockability of IVB while pushing a processor that burns 200W when overclocked; you honestly want THAT in your rig instead?
Honestly, while this at least puts them back in the ring, it can't be that great for AMD's finances. They still have the same die size and get to raise the price of their top-level processor from $183 to $199, yay. But I guess they have to do something to try to bring non-APU sales back up; Bulldozer cannot have sold well at all. And I still fear Haswell will knock AMD out of the ring again...
I agree. I would think they may do better with the 16-core socket G34 Opterons with 4 RAM channels, particularly if they can get down to 95W at 2.5 GHz. A 2-socket board gives 32 cores with lots of RAM per 2U server chassis. This should work nicely for high availability virtualized clusters. In this environment, it is better to have more cores in the same power envelope than faster per-core performance, because the virtual machines are independent from one another. I think Piledriver can compete in this environment much better than in the non-APU desktop/workstation market.
The thing is that people want balanced performance, balanced between single- and multithreaded that is. Piledriver does a lot better than Bulldozer here, but I think Intel still offers a better balance. As much as I would like to build a new AMD system, I think it will be Intel this time around.
What class of gaming are you looking at? If you're looking at even midrange gaming, your best bet is an A10 + a 6670 (runs $60-$70 average and $90 for low profile). Really a great gaming value option.
I just did that for our secondary machine and put in a 6850. Works quite well... aside from BIOS issues on a brand-new board chipset, that is. Considering prices on the 7750/70, I'd probably opt for one of those at $30 more than any of the 6x series. I'd also probably have picked up one of these new CPUs over an A10, given the opportunity.
LMAO at fanboy system failure... hahahaha. "Aside from bios issues"... and uhh, the "crashing"... and uhh, I'd buy not the 6850 but the 7770, and uh... not the A10 but one of these...
I frequently have two or three high-cpu apps running at a time, so would AMD be better in this case? Even though each app runs better on Core-i5 individually?
I shoot for a do-it-all system. I run video encode, get bored and start a game. I run malware scans on external drives and backup other drives into compressed images. Perhaps if you ran h.264 encodes while you ran another benchmark, like Skyrim, or the browser bench?
Oh, typo on page 6, I think "gian" where you meant gain.
I'm trying to find a good scenario for those desktop CPUs... Cheap 8-core virtualization hosts? Video encoding? Other than that, in this "mobile" world where every desktop PC seems out of date, I don't know what you can do with them. They are obviously not good for light loads or gaming...
The architecture makes more sense when fewer modules are used, i.e. the APU series. Look at how Trinity destroyed Llano, both desktop and mobile. And note that an A10 + 6670 is a perfect midrange gaming value.
What are you blabbing about? You should be banned from this forum.
While Intel's CPUs are clearly in a class of their own for high-end gaming rigs, AMD's GPUs are doing very well this generation, having captured the single-GPU performance crown, performance/$, and overclocking performance. The minute you said NV smacks AMD's GPUs around, you lost ALL credibility.
You may want to take a look at 90% of all the games that came out in 2012 - GTX680 loses to 7970 GE (or 680 OC vs. 7970 OC). Facts must not sit well with AMD haters.
Steamroller 15% is straight from the horse's (AMD's) mouth and 15% for Haswell is well within reason because it's a "tock" (new architecture). So, i think 15% for both works out fine for making speculative statements at this moment.
Except we can probably agree AMD will fail to meet their goal in most of these future cases, and be very late as well (it's the CPU side, after all, and they suck at being on time). So some sideways review, with all the possible advantages in testing and text skewed toward the AMD side (and they will be), will be needed to claim "AMD met its goal!" Let's face it: if AMD meets some bare-minimum "test" for "achieving that goal" in one single area, the review will claim they accomplished it. Like the recent time AMD released whatever it was, a CPU or a video card, and they had five or ten of the PROMISED units on the Egg's shelf the day of their declared release, and we were told (yes, here), "they did it!" LOL
On the other side of the "a flat 15% is an equal and fair projection of the future" equation, we have Intel, which likely won't be late (and if so, just barely), and has the cojones and the track record to probably exceed their stated goals.
To be honest, the best I can do for AMD is say I like their new CPU name, "Steamroller!" For some reason, and this is of course pure speculation, hopeful ESP, women's intuition, or perhaps a desperate attempt to give them a break... but... I think the name Steamroller indicates they will have a winner when that puppy comes out. Yes, I think it's crazy as well, but what if I'm right? LOL. I might be absolutely correct.
Especially the Sysmark and Compile ones, which project Steamroller being ahead of Ivy Bridge, and of course clear wins for the multi-threaded ones.
I wish Anand had done a similar current->projection graph for power consumption as well, that would have been very useful.
*Fingers crossed*, I think (wish!) the next-gen APUs, Steamroller + GCN + a new process node, will be a real winner for AMD, assuming the power envelope comes down to under 100W TDP across the board and they deliver in 2013.
Isn't it interesting how, when it comes to AMD, the fanboy will go to lengths never before seen, never before done for any other product from any other entity ever, and I mean ever, and spend their time in pure speculation about the future, graph it out, get their hopes going, take a look at the futureville landscape. LOL
Just proves, as it were, that there shouldn't be any AM3+ socket for AMD, as there are no high-end processors, and having 8-core chips on AM3+ doesn't even help general computing, workstation, or gaming performance. If you want i3-level performance and gaming, why would you buy anything over an FX-4300? And if you buy a chip like that, why couldn't you and AMD just have gone with FM2 instead? Why not launch 8-core FM2 chips if you really want them in consumers' hands? FM2 will get Steamroller; why not make sure it thrives as a platform instead of maintaining two desktop platforms? I don't think AM3+ justifies its existence; I don't really want it, and it doesn't really bring anything. It just reminds me of the AM2/+ and AM3 Phenom II days and looks dated. I understand that there is no HyperTransport in the FM2 platform, but let workstation users just buy Opteron server chips. AMD still needs to raise its single-threaded performance by about 50%.
It is not so easy to make it for FM2. First they would have to put the NB inside the CPU in the FX line for it to work on FM2. That would require more die space, power consumption, and heat.
The NB has been inside the CPU since K8 (Athlon 64 / Opteron, 2003). What Trinity adds is only an HT (or IO)-to-PCIe bridge and the integrated GPU, and they already have Piledriver in Trinity (the only CPU for FM2), so what the hell are you talking about? It performs just the same and is the more modern design, core for core. The NB is just a DRAM controller and some registers. I see it more like this: LGA1155 is good enough for everything nowadays, and so is FM2, despite not having 6- or 8-core CPUs; just having faster CPUs is enough. You can bake, bin, or lay out faster processors for FM2, even adding cores if you like. The AM3+ chipset doesn't have PCIe 3.0 in the 990 series and doesn't seem to be getting it any time soon, so why buy into that platform? It certainly isn't much more performant.
Enthusiasts can use the SR5670 and Opterons. It's not like the 990 series has USB3 support anyway. Now, FM2 doesn't have PCIe 3 support either, but might get it in an upgrade, which of course would require a new CPU and new motherboards. On AM3+ it would require new motherboards with a new chipset only. I don't think PCIe "2.1" is a hindrance, though, and a new CPU would benefit greatly either way. It's just another item on the "feels old" list.
So you want to block the upgrade path? If, as you say, the FX-4300 is all that people need, they should go ahead and buy Trinity. I'm sure that AMD will also release Athlon CPUs for FM2, like they did with FM1 and Llano. But for people who want a higher-end AMD CPU, perhaps to upgrade their old one, being able to use the same socket as the old one is helpful. A stable platform is a good thing, and AM3+ motherboards have been on the market for a while. I just don't see the rationale behind junking it in favour of the socket-du-jour.
Naw, I would say just go with Socket C32 and Opteron-derived products. It's an underutilized socket anyway. I don't want to take away choice, just have a cleaner and more sensible desktop line. AM3+ just seems like an orphaned platform created to put Bulldozer out there, so it has lost its relevance. They should release higher-performing Trinity chips too, is what I would argue for, of course. Having four sockets here doesn't make a lot of sense. Moving to C32 would enable enthusiast-class boards, and cheaper workstation boards with dual sockets too, i.e. two 8-core, 4-module Piledriver chips. It has dual-channel DDR3 like the desktop platforms. Maybe even two higher-clocked native 6-core variants would have been something. But they won't do it. Of course, Intel doesn't need to counter with an enthusiast platform either.
Mainstream platforms are where it's at. Maintaining upgrade paths for two desktop platforms, with different chipsets etc., doesn't make a lot of sense right now.
With 16GB of RAM and eight threads, why aren't we seeing realistic VM-driven virtualization benchmarks? Honestly, this is a huge application area that remains ignored by AT in their core architecture reviews. Something I always look for and never find.
I do keep some VMs running on my desktop but they are not generally loaded. I'm assuming, because of the power draw, these would not be a good choice for a dedicated VM server build?
Keep in mind that Vishera doesn't have an on-die GPU. OpenCL can run on the GPU or the CPU (with the appropriate ICDs), but we're almost always talking about GPU execution when we're talking about OpenCL.
Test after test by many reviewers using real apps, not synthetic benches that exaggerate RAM results, has shown that DDR3 running at 1333-1600 MHz presents no system bottleneck on a typical Intel- or AMD-powered desktop PC. Even when increasing the RAM frequency to 2600 MHz there were no tangible gains, because the available bandwidth at 1333 MHz is not saturated enough to cause a bottleneck. APUs do show some GPU benefit with up to 1866 MHz RAM.
It's absolutely ridiculous that even though AMD has pushed out quite a nice and competitive product (in that price range), Intel has gotten way too big in the past 6 years that AMD was sleeping, and I don't think they'll be pressured into any price cuts. So even though we still have so-called competition, Intel has a virtual monopoly, and I can't hope that the new AMD releases will help drive prices down any more.
An additional thought: I do believe that, power consumption apart, AMD has the more compelling processor overall in the 8350. Single-threaded performance has long since crossed the point where you could tell the difference in experience between AMD and Intel (the exception being gaming), and AMD is better in heavily threaded applications.
So IF ONLY they could fix the power problem, I wouldn't hesitate to recommend an AMD system for any purpose other than gaming. Just my 2 cents.
But really... even in games, where is the bottleneck with an FX? Remember that 99% of monitors have a 60Hz refresh rate, and you can't see more than 60 fps on such a screen without glitches, so what's the difference between 85 and 95 fps? I've got an [email protected] with an HD6950@6970, and really I can't find a single game that doesn't run smoothly at 1920x1080. Playing Skyrim with 4 cores allocated to the game while the other two pairs of cores do video processing on two anime episodes is pleasing :-)
Problem is, you take that attitude more than once, and then you're on a slowly degrading slope of lost performance. So why do we get these arguments from AMD fanboys? Obviously you purchased the 6950 and decided you needed every last drop of juice from it, and there you go, OC'ed to a 6970...
So on the one hand you've taken a safe and sufficient card and hammered the piss out of it for a few more frames you cannot notice (as you tell us yourself), and then you take that can't-notice-them argument and claim that's why your AMD CPU is so great... and Intel is not needed.
I see you did that without even noticing. You totally freaking contradicted yourself, completely.
Look, go ahead and buy AMD and be a fanboy, I say more power to you. Just DON'T type up crap talking points with a 100% absolute contradiction, blatantly slanting them in AMD's favor, and expect I'll just suck 'em down like freaking Kool-Aid. DO NOT.
Intel is desired for the very same reason, you the amd fanboy, bought the 6950 and have it OC'ed to 6970 speeds.
Sorry bub, nice try, till you ate both your own shoes.
I would also like to see virtualisation in benchmarks as well. Multi core processors should lend to some interesting results in these sort of benchmarks.
LOL, oh dats not much powa! All da AMD fanboys don't care about power or saving the earth; they never cared, they never worried, nor have they ever been worried about house fires... All that hate toward Jen-Hsun, and all those years of destroying-the-earth rage against nVidia, apparently happened on another planet in an alternate universe.
Anand, while enjoying your Vishera review, upon loading a page, the ad on your page took over my browser and navigated to a fishy looking "You are today's 100,000th visitor!" site. I know AT contracts out the ad part, but I thought you might be interested in knowing what happened. Here is a screenshot of what I experienced -- http://i45.tinypic.com/2aigegx.jpg
Thanks for the review and all of the time that went into it! It looks like I can finally start recommending something from AMD that isn't a Phenom II. Even though I personally love AMD and haven't used an Intel chip since the P3 Coppermine days, I couldn't recommend anything from AMD to anyone this past year with a clear conscience, at least if it wasn't a standalone CPU upgrade. But from the tests I could find comparing the FX-4300 to the i3-3220, it looks like it's mostly a wash between them, other than power usage and thermals that is, so that was looking great! Is there any chance of a standalone review comparing the two in the future? It looks like the 8350 performs about like a 2500K, which is also awesome; unfortunately it's at least a year late to the party :(. This does however give me great hope for Steamroller. The only issue I had with the article is the game selection (too few titles, and too Intel-biased (SC2 and WoW)); might there be a more in-depth look at that in the future as well? Thanks again for the review!
Wow hi there. Glad to meet you, an amd fanboy with the glimmers of a conscience.
Now, don't forget the i5-2500K OCs from 3200 right on up to 4500 (and beyond) on stock voltage and the crap stock fan, so it actually SMOKES the daylights out of this fishy Vishy.
Now what I'd like to see is how many of these benchmarks are compiled with the Intel compiler. In case you don't know yet, the Intel compiler disables a lot of optimizations if you are not running a Genuine Intel CPU, even if your CPU supports the required features and would benefit from those optimizations. In other words, anything compiled with the Intel compiler will run slower on AMD CPUs just because of the compiler.
Now you can argue that this is a reflection of real performance on Windows, as quite a few Windows DLLs are compiled with the Intel compiler as well.
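For readers unfamiliar with the issue, here is a minimal sketch (in Python for clarity, and emphatically not Intel's actual code) of the dispatch logic being described: the fast path is gated on the CPUID vendor string rather than on the feature flags alone, so a non-Intel CPU that reports the same features still lands on the slow path:

```python
# Illustrative sketch of a vendor-keyed dispatcher. The function name and
# string values are hypothetical; only the gating logic reflects the
# behavior described above.

def pick_code_path(vendor, has_sse2):
    # A purely feature-based dispatcher would check has_sse2 alone.
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2"       # vectorized fast path
    return "generic"        # scalar fallback

# An AMD CPU reporting SSE2 support still gets the slow path here.
print(pick_code_path("AuthenticAMD", True))   # generic
print(pick_code_path("GenuineIntel", True))   # sse2
```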
What I'd like to see is some more benchmarks for Linux operating system and/or professional software. Things like data base servers (including something non-Microsoft, like PostgreSQL or MySQL), java application servers, GCC compiler benchmarks, apache/PHP server, virtualization, python/perl/ruby, LibreOffice/OpenOffice productivity.
Now, back to Vishera. This looks like a nice CPU. I haven't been CPU-bound in my work for a while now, so performance-wise this would be sufficient for my needs. What I'd like to see, however, is lower power consumption. Unfortunately I don't see that coming until GlobalFoundries shrinks their process...
"As I mentioned earlier however, this particular test runs quicker on Vishera however the test would have to be much longer in order to really give AMD the overall efficiency advantage. "
If you think about it, efficiency is unrelated to length of test.
He was talking about electrical usage vs. work done: with AMD's higher per-second draw, it must complete the test MUCH faster than Intel in order to win on that metric. It completed faster, but not fast enough to use less energy.
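The point can be sketched with toy numbers (assumed for illustration, not the review's measurements): energy is power times time, so finishing faster only wins if the time saved outweighs the higher draw:

```python
# Energy = power x time. Wattage and timing figures below are illustrative.

def task_energy_wh(watts, seconds):
    """Energy in watt-hours to complete a task at a constant draw."""
    return watts * seconds / 3600

intel = task_energy_wh(110, 100)   # slower chip, lower draw
amd   = task_energy_wh(190, 80)    # ~26% faster, much higher draw

print(round(intel, 2), round(amd, 2))   # 3.06 4.22
# AMD finishes first but still burns more energy; at these draws it would
# need to finish in under 110/190 of Intel's time (~58%) to come out ahead.
print(amd > intel)   # True
```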
In price per performance. A 125W part beating a 77W part will cause Intel to keep the same TDP for 2014 and just offer a 35-40% performance increase. I can only hope.
I wish Anandtech would include some form of value scatter graph like TechReport does in its reviews. The graphs don't have to be an exact imitation of what TechReport does, and the benchmark(s) used to determine "overall performance" can be different. Perhaps we could even get performance-per-watt-per-dollar graphs.
Graphs like these make the whole exercise of comparing competing products so much more relevant to users, because most of us will buy the most performant processor per dollar.
This is, of course, considering the result without any attention to performance/watt. If you include power consumption in the calculations at all, Intel is an easy choice.
Difficult to see how AMD will cope with Haswell, even if they get another 15% boost next year. The gap in performance per watt only seems to be widening, with Intel taking a commanding lead.
Hey Mr. ChariseHogburn, why don't you take your 2500K with you and leave us all to our musings? You seem to know everything about processors; why don't you let others do what they want to do? You big piece of Intel mercenary shit! SOB!!!!
It does give a reason and an upgrade path to finally move up from my aging Phenom II 1090T. One of the main workloads I have when I use my PC heavily is indeed easy h.264 encoding for game footage and other types of video; it's always nice to be able to knock a video file down from 2.5GB to 200-500MB. As for the board used for the benchmarks: I've personally always used MSI or ASRock boards myself, with some Asus boards when I can catch the price right.
I noticed the overclocking numbers look decent. Some things I'm curious about: how do they take to undervolting? My luck with previous AMD generations has been pretty good on that front, at least when I felt like tinkering. I used to be able to run the old 9600BE and 9850BE considerably below stock voltages, for example, at stock speeds, and sometimes even with mild overclocks on the NB. I've noticed that AMD tends to be fairly conservative there.
And since they still appear to link the IMC/L3 speed to the north bridge HyperTransport speed: how does upping the actual speed of the NB IMC/L3 affect the performance and stability of the platform? I know back in the day of the 9600BE/9850BE I could generally get them close to the performance level of a Core 2 Quad at the same clock speeds through that kind of tweaking.
And on a final note, it's a nice performance increase overall, even in single-threaded apps, over the Bulldozer cores. But you'd think they would have implemented a way to gang the integer cores together and make them act as a single core for single-threaded work. That's all it would really take to pick up a bit of the slack, I think.
Why the heck are you starting your power consumption charts at 50W rather than at zero?
That's *extremely* misleading, wildly exaggerating AMD's disadvantage. AMD has roughly 2x the power consumption of IVB at load and 1.25x at idle, but by starting your chart at 50W you're exaggerating that into over 3x at load *and at idle*.
*Please* get yourself a copy of "The Visual Display of Quantitative Information" and read the section talking about the "lie factor" of a graph or chart.
They probably do. But that's not the point. A GRAPH is meant to show a series of figures as a drawing.
When a graph starts at anything but zero, it will not show a true picture.
Take two things to compare, where both lie in the area between, say, 90 and 91 of some value.
If you then make a graph that runs from 89 to 92 in tenths, you will get a curve that looks very uneven, going up and down all the time, with seemingly big differences in values.
But if it started at zero, like it's supposed to, you would see an almost straight line, reflecting the true picture: these two things are practically alike in this specific area.
IF you don't make graphs like that ALL THE TIME, there's no need to make a graph at all; you could just write the values out as figures.
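The distortion described above can even be quantified; Tufte calls it the "lie factor" (roughly, the ratio shown by the graphic divided by the true ratio in the data). A minimal sketch, using round hypothetical wattages of my own rather than the review's exact figures:

```python
def apparent_ratio(a_watts, b_watts, axis_start):
    """Ratio of bar heights when the y-axis starts at axis_start instead of 0."""
    return (a_watts - axis_start) / (b_watts - axis_start)

amd, intel = 150.0, 75.0                      # hypothetical load-power readings
true_ratio = amd / intel                      # 2.0x with a zero-based axis
shown_ratio = apparent_ratio(amd, intel, 50)  # (150-50)/(75-50) = 4.0x at a 50 W start
lie_factor = shown_ratio / true_ratio         # the chart doubles the apparent gap
print(true_ratio, shown_ratio, lie_factor)
```

With a 50 W axis start, a genuine 2x difference is drawn as a 4x difference, which is exactly the kind of exaggeration the complaint above is about.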
Anand's testing was the usual lazily designed testing with poor planning. Why run SYSmark, which everyone knows uses testing methods that tend to ignore multi-threading? Keep the tests on applications only and make sure your gaming apps are representative. I saw better testing done on several other websites, where the usual poorly designed and coded trash was balanced with other games that did employ some level of multi-threading; the FX-8350 did immensely better in that gaming selection. Most gamers are not shoot-em-up fascist gamers. There is no reason for Anand to stack the game selections in the single-threaded direction only. I believe Anand is a shill for Intel and chose the stupid SYSmark tests and the games in such a fashion as to downplay the vast performance improvements that are possible from the FX-8350 CPU. That is one reason I do NOT spend much time on this site any more. There is nothing I detest more than intellectual dishonesty. Check out Tom's Hardware; their review was done more scientifically and had a balanced selection of tests. The Vishera FX-8350 clearly bested the i5 3570 in most tests and was the best performance for the buck by far. A better, objectively designed test. No axes to grind. To hell with Anand, unofficial Intel shill and intellectually LAZY.
What they didn't show you is what happens when you OC the K parts from Intel. They end up at roughly the same TDP as AMD at stock, and just crush them on performance. So if you are going to use the same aftermarket cooler and will overclock either pick, the Intel parts can go further and win across the board. Compared to the cost of an entire system, the ~$100-150 extra on the CPU for a 3770K might end up being around a 10% increase in total build cost, while giving you anywhere from 10 to 50% better performance (or more in some corner cases).
I'm happy with my OC'd 3930K :D (compared to the rest of my equipment, the price was no problem). Probably upgrading my C2D 2.0GHz + 8600GTM laptop to a Haswell with only the IGP in 2013. With an SSD and RAM upgrade it has had surprisingly good longevity.
Exactly, all the amd cpu unlockers, all the dang Phenom 2 overclockers - totally silent.
The 2500K smokes its way up to 4500MHz like butter, but heck, when something is freaking GREAT and it's not in amd's favor, it's memory-hole time: ignore it completely after barely mentioning it, and just as often keep declaring it an unused and unliked proprietary failure that holds everyone back - whatever can be done to lie and skew and spin.
I had pretty low expectations, admittedly, and for good reason. This is a great step, because now I would have to think platform vs. platform where before I didn't even need to consider it. Intel has a huge thermal advantage, which means Intel could easily come out with a part at any time to take advantage of that headroom should AMD ever become slightly threatening, and AMD would sadly still be behind in performance per watt. I could reasonably consider AMD now for my desktop. Where AMD fails is at the platform level. They have underperformed in the motherboard department. If they wanted to sell more, they needed to offer serious advantages at the motherboard + CPU purchase level versus Intel. Instead they have been fine with a mediocre chipset. They failed when they locked out Nvidia from chipset development, same as Intel did. AMD's and Intel's greed was a big loss for the consumer and part of the reason why AMD doesn't compete well with Intel. Nvidia made better chipsets for AMD than AMD's ATI chipset development team did.
It is even worse on the server side for AMD. AMD needs to include 10GbE on the board to really challenge Intel, and this is doable. AMD cannot compete on process (22nm and 3D transistors) unless Intel closes shop for 18 months.
LOL - thank you so much. Yes, the hatred for nVidia (and Intel) cost the amd fanboys dearly, but they really, really, really enjoyed it when nVidia chipsets got shut down. nVidia was going to be destroyed shortly thereafter, according to them, and thus we heard a couple of years of it from their freaking pieholes - amd would drop the price and knife nVidia into oblivion... and bankruptcy... because amd had "a lot more room" to drop prices and still rake in profit... they said... LOL - all while they whined about nVidia being greedy and making so much money. Yes, they were, and are, nutso.
Oh the party days they had... now they have CRAP. LOL
Anand, I think you missed a diamond in the rough here for gaming. Obviously, if someone has the money for an i5, they should get an i5. But if they don't, the FX-4300 looks like a better gaming choice than an i3. There are cheap mobos available, so overall cost is less than an i3, and the FX-4300 is overclockable. OCed it may even approach low-end i5 performance - at a power cost, of course.
For only $10 more I'd say the FX-6300 is the better bet. Slightly lower clock speed but still overclockable, and I don't think it will be that long before six cores become really relevant for gaming.
The Projected Performance page doesn't appear to take AVX2 and TSX into account, not to mention improved Hyper-Threading performance by having an extra execution port. The 10-15% number is for single-threaded workloads only. Everything else will see a much bigger leap.
AMD is screwed unless it adds AVX2 and TSX support sooner rather than later. They haven't made a single mention of it yet...
From a price-performance perspective, I think that FX-6300 is the most interesting part here. For barely more than the i3-3220, you get essentially the same performance in games and 20-30% better performance in multithreaded applications. And, as time goes on and games become more multithreaded, the FX-6300 will pull ahead in games, too. At 95w, the power consumption is much higher than Intel, but it's manageable. Plus, it's overclockable, unlike the i3.
Add in the potential for ~10% savings under load with a bit of undervolting and it looks even better. One to watch for those who, like me, would rather buy AMD if it doesn't smell like shooting yourself in the foot.
Hello, I am a long-time reader of this site, pretty much since it first started. Heck, in the time this site has been running I have gone through probably 35 or more computers, and currently have 11 in my home that are used for different tasks. I also went through a wife, but that's another story lol.
Anyway, what I was wondering is this. In the past with AMD, I noticed a huge gain in CPU output when raising the bus speed as far as the hardware would go. What if you took an 8350, raised the bus as high as it would go, but kept the multiplier set so the CPU runs at its default speed, then ran some tests in both single-threaded and multi-threaded programs, as well as a few games where this CPU is a bit lacking, to see if the CPU itself is being held back by the bus? Then try both the bus at its max and the multiplier raised to the max CPU speed of 4.8GHz, and see what raising both the bus and CPU speed does.
I am hoping it has the same effect as it did in the older AMD CPUs and gives a nice boost. I think the Bulldozer and Piledriver cores might be held back by a lack of bandwidth to the rest of the system. If not, then at least it was a fun little side project for you guys. Maybe raise the memory speed as well to make sure that isn't the issue either. Just an idea that may open up some hidden performance in the CPU, hopefully. I would do it myself, but at the moment the only AMD system I have left is an older Athlon 64 X2 6400+ that my stepson uses to surf the web and play a few games on.
Oh yeah, how come the big speculation we hear all the time never gets solidified into the standard commentary and the "looking for validation in the tests" treatment when it concerns what's wrong with the amd cpu's?
I mean, we get that argument when it's time to attack their competitors in the articles here: the "well established theory" that "turns out to be wrong" "years later" is used as the "standing thesis" for why "xXxxxx" did so poorly in this test against AMD, "as we expected" says the reviewer, making sure to note that the "suspected weakness" has reared its ugly head... Yeah, another thing that is so bothersome.
I guess amd's architecture sucks so badly, nothing can explain its constant failures.
Steamroller is far more focused on increasing IPC; however, without a new process node it'll be difficult to demonstrate another gain in frequency like we see today with Vishera. I suspect the real chance for AMD to approach parity in many of these workloads will be with its 20nm architecture, perhaps based on Excavator in 2014.
Firstly, by 2014 Intel will already be on a 14 nm process. Add Intel's already superior (3D fin) transistor technology, coupled with massive R&D budgets on a slew of parallel projects to further refine the basic process tech, and the situation is not going to get any prettier for AMD any time soon.
Second, computing is increasingly going mobile (laptops, tablets, phones, phablets, etc.) The number one thing for mobile is power efficiency. AMD's CPUs absolutely suck at that; they are multiple generations behind Intel (and not all of that can be blamed on process lag.)
Third, AMD's trump card in the consumer space so far has been integrated graphics, but with Haswell and then Broadwell Intel's going to take that advantage away. So, by 2014 AMD won't have any feature set advantages left.
Fourth, AMD's other hopes have been in the HPC/server domain, but there again power efficiency is getting increasingly more important, and AMD is losing the war. Moreover, with its new MIC architecture ("Xeon Phi") now debuted (and it will be continually refined going forward) Intel's poised to capture even more of the HPC market, and AMD currently has no answer to that product line.
Seems to me that AMD is hosed on all fronts, unless they can pull not just one but a flock of fire-breathing dragons out of a hat, soon if not today.
The "process advantage" didn't do shit for Intel. Note how Ivy Bridge had horrible overclocking compared to what was anticipated, how Intel's new architectures shift their benefits toward mobile, and that Microsoft is finally helping AMD out with things like the scheduler in Win8, etc.
AMD is succeeding at power efficiency in mobile: the Trinity review correctly indicated AMD's success there in nearly closing the gap to Sandy Bridge (and Ivy? I forget now. It was close though).
Finally, IT IS A PROCESS NODE CHANGE. It's 28nm. BD/PD are 32nm. Besides, a GCN IGP is really attractive, and will still dominate versus Broadwell. VLIW5->4 was a tiny change and yet AMD managed to pull a pretty nice performance jump out of it; GCN was comparatively huge, designed for 28nm, and set to scale better as evidenced by the already low-power 7750 desktop.
The only worrying thing about Steamroller is whether the caches and memory controller speed up. If those don't, the platform is likely to be bottlenecked horribly.
The "process advantage" didn't do shit for Intel. Note how Ivy Bridge had horrible overclocking compared to what was anticipated...
That's partially due to the lower-quality TIM they used on Ivy Bridge, and mostly due to bringing the Sandy Bridge design to the 22 nm TriGate process in a verbatim manner. Despite it being a radically new process, they didn't change anything in the design to optimize for it. Haswell, on the other hand, is designed from the ground up for the TriGate process, so it will show the process advantage much more clearly.
Yes, and it's also partially perception, since SB was such a massively unbelievable overclocker that expectations were sky high.
I'd like the amd fanboy to give his quick amd-win comparison on overclocking Ivy vs. whatever amd crap he wants opposite it.
I more than suspect Ivy Bridge will whip the ever-loving crap out of the amd in the OC war there, so the amd fanboy can go suck some eggs, right.
I mean, that's the other thing these jerks do - they overlook an immense Intel success and claim 2500K OC power is not needed, then babble about "the next Intel disappointment" immediately afterward, compared not to their amd fanboy chip, but to Intel's greatest ever!
I mean, the sick and twisted and cheating nature of the amd fanboy comments is literally unbelievable.
He writes "Brazos had a mild update." Uhm, no, sorry. Brazos 2 hasn't been released to market, so we haven't seen any update there. Same for Trinity, although you can buy some Trinity laptops. Not much, and nothing on desktop.
Since AMD seems to be pushing multi-threaded performance at this time, it would have been interesting to see how this is borne out in a more multi-threaded game, such as the Battlefield series titles. I know that in benchmarking my six-core 960T (with unlocked cores), I could see some performance advantage running this CPU with 6 cores versus the default 4 cores in BFBC2. I'm not saying that this is where AMD will outshine Intel by any means, but it would have been an interesting test case for comparison's sake.
(I actually suspect that by the time you go from 6 cores to 8 cores, you will have run out of any significant advantage to being able to handle the extra threads.)
I think in Final Words Anand should add: "And the 4-year-old Intel i7-920 - what a chip that was/is!" It is startling to see AMD barely keep up with the venerable 920 a good four years on - and those first charts in this article are with the i7-920 at stock speed. The average enthusiast is running an i7-920 at 3.8GHz on air all day long and achieving performance on par with - or better than - many of today's CPUs! Of course, there is one big downside: power. This is Intel's big story to me: the performance of an OC'd i7-920 on one quarter (or less) of the power. Cool! Thanks for putting the old i7-920 in the mix - it shows just what a groundbreaking design it was... and in many ways, still is.
Indeed, the i7-920 is the most awesome CPU in those graphs considering its age and nice overclockability. If there were an overclocked version of it in there, the graphs would look pretty funny.
I've been using an i7-930 @ 4.1 for a long time now and just can't justify my itching urge to upgrade. More than that, it'll probably survive here for 2 more years until Haswell-EP, as plain Haswell looks handicapped in terms of compute power in favor of the iGPU and power draw. I do NOT need a power-restricted desktop CPU - with power-saving features it'll do fine at idle regardless of max TDP.
I don't understand why you keep saying that the 6300 fails to beat Intel at its price point in the multi-threaded tests. At $130-140, the 6300 is going up against the Core i3's, and the multi-threaded benchmarks show the 6300 beating the Core i3. Seriously: what am I missing here?
I can't see this as anything but a "win" for AMD, although there are certainly some sad feelings lingering about as I read this article regarding the 15% employee layoff that recently occurred. The promise AMD made was 10-15% improvements in IPC, and we certainly got that; not only that, but at a lower price than the first generation, AND with some very promising overclocking potential based on the scaling shown in this article.
However, I refuse to acknowledge these as "FX" chips. "FX" was the designation given to the very first CPU produced by AMD that outperformed Intel's best offerings by a significant margin, the Socket 940 FX-51 2.2ghz single-core CPU based on the Opteron version of the Athlon64 architecture. The reason for my petty "harrumph-ing" is that I own one of the very first FX-51 chips released (from the first batch of 1,000), purchased the day of release back in 2003 alongside an Asus SK8V motherboard with 2GB of Corsair XMS3200RE DDR-400 Reg/ECC Dual-Channel RAM, and which served admirably with its brother the X800XT-PE 256MB GDDR2, until its well-deserved retirement in 2009. That chip was, and to this day is, my favorite CPU of all time. It was a quirky chip: a server-backbone (and consequent unusual socket choice), ahead-of-its-time 64bit architecture, record-setting bandwidth at a "low" 2.2Ghz while Intel was trying their darnedest to hit 4.0Ghz with the P4, no real options in terms of a future upgrade path, and its champagne-tastes that could only be satiated by incredibly expensive Registered memory. However, it was FAST as all get-out, ran nice and cool with a Thermaltake Silent Tower Plus, and had a good amount of overclocking headroom for the time (an extra ~200-280Mhz was common). Oh, and it DEMOLISHED the Pentium IV Emergency Edition CPU's that came clocked 55% higher! Paired with the best video card the world had ever seen at the time, it was unstoppable, and I recall running 3dMark for the first time after the build was finished only to nearly poop myself, as this 2.2Ghz chip STOCK just out-performed every single Intel CPU on the charts outside of those OC'd with the help of LN2. I am working right now to rebuild the rig, as I feel it is time for it to come out of retirement and have some fun again, and I want to see just what it really is capable of with some better cooling (extreme-air or decent water). 
[SPOILER]I have already lapped the CPU and the block (I am amazed at how poorly the two mated before; the chip AND cooler were noticeably convex), and based on the flatness of each it will certainly be good for a few degrees; add in the magic of today's best TIM's (PK-1) compared to that of 2003, the wonders of modern computer fans via 2x 92mm 3800rpm 72cfm/6.5mmH2O fans doing push-pull, and an extra 3 intake fans feeding it fresh air.... It will be a fun way to bring a memory back to life :D Plus, the X800XT-PE has been thoroughly prepped for overclocking, with a 6-heatpipe heatsink and 92x20mm (61.3cfm/3.9mmH2O) swapped for the stock unit and mated with PK-1, EnzoTech Pure-Copper VGA RAM-Sinks attached to all of the card's modules with PK-1 and less than a needle-tip's worth of superglue at two of the four corners of each, and the same for the MOSFET/VRMs on the card. Combined with a pair of 120mm 69cfm fans blowing air across it (mounted on the inner side of the HDD cage opposite an intake fan), an 80x15mm 28.3cfm fan mounted to blow air directly on the back of the card, a PCI slot blower fan pulling hot air from the card and exhausting it out the back, as well as an 80x25mm 48cfm fan mounted where the lower PCI brackets used to be exhausting air... I think it'll do just fine ;) [/SPOILER]
However, I am not taking any sides in this "CPU WAR". The minute one company starts to seriously pull ahead, the competition is lost, and we ALL lose. Innovation will become scarce, people will become excited about 5% IPC improvements from generation to generation and fork out the money for the next "great thing" in the CPU world, not to mention the cost of the constantly changing socket interface. AMD has been in a bad way for some time now, pretty much since the Core processors from Intel began to overrun their Phenom lineup. Sure, they had some really amazing processors for the money, such as the Phenom II X4 965BE/980BE/960T and X6 1055/1090/1100T, but Intel was still the performance leader with their E8600, Q6600, and the many QX9xxxx processors that transitioned into the still-strong X58/1366 platform (with the 920/930/975X/990X standing out), and they have only gained traction since.
I am no fanboy, and I hate to get onto any enthusiast site and scroll through comments sections where pimply-faced, Cheetoh-encrusted, greasy-haired know-it-all losers frantically type away in a "Heated Battle of 'Nuh-Uh's' and 'Yuh-HUH!'" (that is called hyperbole).
Fortunately, at least for the most part, I don't see that here.
Perhaps we should all go out and buy one of these new chips, maybe for a build for a friend or family member, or a home-theater PC or whatever, but regardless of whether you "Bleed Red" or "Bleed Blue", both "sides" will win if AMD gets the money to truly devote enough resources to one-upping Intel, or more likely, coming close enough to scare them. When the competition is closest, only THEN do we see truly innovative and ground-breaking product launches; and at the current rate, we may be telling our grandchildren about how "once, a long time ago, there was a company.... a company named AMD".
For the record, I AM NOT in any way a Fanboy; I buy whatever gives me the best bang-for-my-buck. Fortunately, at my job I am the only "tech-y" person there so whenever there is an upgrade in someone's equipment, or even servers, I get the "old" stuff :D I have sold literally hundreds of CPU's off that I had no use for, but I kept the favorites or the highest-end in each category that I was able to get. However, many of them I purchased myself (Opteron/Xeon from work, the rest I bought 90% of). Here's a list of processors I currently have in possession, in my house, in the best reverse-chronological order I can remember: i7-3930K (24/7 4.6Ghz - Max 5.2Ghz), i7-3820QM, i7-2600K, 4x Xeon E7-8870 (got 8 for $2k from work, sold 4 @ $2k/ea and built a Bitcoin Miner that earned me ~6,700Mh/sec with 4x 5970's in CF-X; earned over 1200BC and cashed out when they peaked at ~$17/ea for a massive profit and eventually stopped mining), i5-2400, Xeon X5690, i5-2430M, Opteron 6180SE 12-core, Xeon W3690, Phenom II X6 1100T-BE, Phenom II X4 980-BE, Phenom II X4 960T-BE (built girlfriend a rig: best CPU for $$; unlocked to 6-core; hits 4.125Ghz 6C / 4.425Ghz 4C), Xeon X7460, Core2Quad Q9650, 4x Xeon X3380's, Opteron 8439SE, Xeon X5492, Core2Duo E8600 (from Optiplex 960, hits 4.5Ghz on air), Core2Duo T7400, Athlon II X4 640 (E0), Athlon II X4 650, Pentium Dual-Core T4400, Turion II N550 Athlon X2 7750BE, AMD FX-62 (3.25Ghz easy), Xeon X3230, Athlon X2 5200+, Opteron 890, Turion II Ultra M660, Athlon64 X2 6400+ BE, Opteron 185, Athlon64 X2 4800+, Opteron 856, Opteron 156, AMD FX-51 (24/7 2.45Ghz stock voltage), Opteron 144 (OC'd to ~2.6Ghz), Turion ML-44, Pentium 4-EE 3.46Ghz (could barely hit 3.5Ghz...junk), Pentium 4 3.2Ghz, Pentium 4 2.8Ghz (easily ran at 3.6Ghz on air 24/7 with +0.015V, awesome CPU!), Celeron Mobile 1.6Ghz, Pentium 4 2.4Ghz, Celeron 1.8Ghz, and plenty more.... 
***Have a set of 8x Xeon-EP E5-4650 8-core's coming when we upgrade again in January; they are upgrading the whole rack so I am getting, along with the chips: 3 total 4-CPU boards, 384GB of DDR3-1333 Reg/ECC, the entire cooling system, 16x LSI/Adaptec RAID Controller Cards (all PCI-e x8, support minimum 24x SAS 6Gbs drives, have between 1 and 4GB of Cache, and all have BBU's), 96x 150GB 15Krpm SAS6 + 48x 600GB 15Krpm SAS6 enterprise drives, and about two-dozen Nvidia "professional" cards (12x Tesla M2090's, 4x Tesla K10's that were used to evaluate platform, and 8 Quadro 6000's) all for $1900!!!!!!!! The supplier offered $2150 for "Trade-Up" but I am really good friends with the entire IT department (all 6 of them) and they offered them to me instead! FOLDING@HOME WILL BE SHOWN NO MERCY!
First of all, nice amd fanboy story. Before you go insane claiming over and over again it is not true, I want to point out to you your own words....
1. Intel has been dominating since Core 2.
2. Without amd competing there will be no innovation, tiny 5% gains will cost an arm and a leg, and the idiots will spend all their money buying them.
Okay, let's take those 2 statements and add in SANDY BRIDGE and its amazing architectural jump. Whoops! There goes your sick-as-heck fanboy theory.
Furthermore with your obviously supremely flawed BS above, you did your little amd fanboy promotion saying we should all go out and buy one of these amd chips for a family member or some upgrade - BLAH BLAH BLAH BLAH BLAH.
Let's add in your eccentric FX chip story, your declaration that it's your favorite cpu of all time, your bashing of Intel claiming only LN2 could bring Intel close, and then your holy of holies, the resurrection build...
OK? Forgive any and all of us who don't buy your "I am not an insane amd fanboy" lines.
Look in the mirror and face the dark side; let it flow through you, Luke. You are an amd fanboy, and Intel will innovate and make absolutely amazing cpu's like the SB even when amd is slapping itself in the face and choking and dying... feel the anger, amd fanboy - amd is NOT NEEDED... let it flow through you, amd fanboy; your journey to the dark side is nearly complete... When you kill your greatest FX cpu rebuild, you will have crossed over to the dark side!
I looked over the tests this character devised. Only a few were multithreaded. Tom's Hardware had a very thorough testing procedure, explaining each application and what it showed about the architecture of the various CPUs being compared. They were very balanced between single-threaded apps and multithreaded apps. They did NOT do a lot of synthetic benchmarks, because many of them are skewed in a prejudicial way. He also used WinZip, Photoshop CS5, video editing software, etc. The games were not all single-threaded shoot-em-ups; they were a collection of widely diverse games. The FX-8350 came out ahead of not only the i5 3450 but also the i5-3570. He had some criticisms, of course, but he said it was the best bang for the buck in the $200 price space. This review was shallow and meaningless, done by somebody who is either lazy or on a mission to discredit. By the way, the FX-8350 had the highest score on WinZip, bettering even the i7 3770. This reviewer owes us a well-designed retest and an apology for a bunch of misleading garbage.
Well, that's the beauty of product reviews - there are multiple for a specific product, and all with different tests. What you need to do is find the test that matters to you, and if it excels at it, you may buy it solely based on that (even ignoring bad points). If, on the other hand, it doesn't perform so well in the discipline of your choice, that is really making your mind up for you to go buy something else.
LOL - no that's the beauty of AMD's "we are not evil" LIE, and their "totally and completely proprietary build of the "open source!!!!!!!! not like nvidia physx!!!!" W I N Z I P
Now, all you freaking amd fanboy liars and losers have to be constantly reminded about your evil, sick, proprietary, "open source" AMD LIED AND COMPATIBILITY DIED - winzip BS !
LOL - let it dig into you, fanboy, let it sink in deeply. All those years amd played your wet brains like limp noodles get played, and you scowled and spit and hated and howled about nVidia and PhysX and open source and OpenCL and "amd is not evil" and "they aren't that kind of company," and then you went and had the stupid 3rd-grader amd gamers manifesto stapled to your foreheads.... LOL
You didn't find it in your evil fanboy manual to let your amd fanboy friend there know about the HACKING amd did on winzip?
Translation: I am either paid by AMD or a total fanboi, and these benchmarks did not say what I wanted them to say. So I am going to come on here and plug a different reviewer's website that is known to be AMD-biased, and tell everyone how unbiased they are and how their conclusions are the right ones, because they agree with my world view.
I suggest you stop using the x264 HD benchmark and look for another test case.
Let's look at what x264 HD benchmark does:
Source film: MPEG-2!!! 6931kbps on average, with a maximum bitrate of 12xxxkbps!!! You guys know that MPEG-2 is the DVD standard... DVD has a resolution of 480p (720x480 for widescreen), but FullHD is 1920x1080 - six times the pixels of DVD! And DVD has a ~5000kbps bitrate on average, so what quality of the source film could we expect??
And then let's look at its output: OMFG! 8000kbps!! h264!!!! I'd say for such a source, 2000kbps is fairly enough for an h264 output...
So do you guys think such a test can really measure a CPU's computational potential?
I suggest finding a TS/BD-ISO source and using proper options on x264 (basically you can directly use --preset xxx), then using that as the reference...
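For what it's worth, the "six times the pixels" claim above checks out; a quick back-of-envelope sketch of my own (plain arithmetic, nothing x264-specific):

```python
dvd_pixels = 720 * 480        # widescreen DVD: 345,600 pixels per frame
fullhd_pixels = 1920 * 1080   # FullHD: 2,073,600 pixels per frame

ratio = fullhd_pixels / dvd_pixels
print(ratio)  # 6.0 - FullHD carries exactly six times the pixels of widescreen DVD
```

So an MPEG-2 source at roughly DVD-class bitrates really is a very thin input for judging a FullHD-class encode.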
It's a 125W TDP part that gets consistently blown away by the 95W TDP Ivy Bridge, which has more transistors and a smaller, more modern process node... and at the high end, it's really not that much cheaper than an Ivy Bridge i5.
*sigh* Oh AMD...how the mighty have fallen. Can the real AMD, the one that gave us Thunderbird and Athlon64, please stand up?
To the Intel fanatics whose bottom line is "My car's better than your car, my car's better than yours": what infantile sensibilities. The computer is a tool, a multifaceted tool with 1001 purposes. The AMD technology meets the needs of 99.99% of computer users with a better bang for the buck. Only a one-dimensional person can say otherwise. Myopic gamers need to open their eyes and see there is a bigger world out there.
Here we go again: the activist on another preaching rampage, with his attack on Intel cpu owners... a nice little OWS protest against the rich Intel people...
You wouldn't mind then if I said I can't stand you cheap, broke, ghetto amd dirty little rascals who can't pay for themselves let alone the education they need to properly use a computer. Not to mention your ignorance in supporting a losing, technologically backwards second tier set of idiots wasting monetary resources that could be spent on something good for the world instead of on foolish amd misadventures that pay interest on amd's debt and not much else. You ought to support the company that pays a LIVING WAGE, instead of the one firing their employees, axing them over and over again.
Thanks for not being capable of properly acquiring and using a computer.
I've rooted for AMD against Intel before I built my first PC with the 700Mhz Athlon in 2000. AMD stole Intel's thunder to much acclaim. For a while AMD and Intel dueled for supremacy, exchanging leads, much like the tit for tat between Radeon and Geforce GPU's are engaged in. AMD's scrappy fight spurred Intel's clock to speed up its ticks and tocks, and the computing world benefited from this. It would be bad for all of us if AMD goes out of business. I root for the underdog, for David against Goliath, but David is lying on the ground and boasting of winning. It was embarrassing when the Phenom was so unphenomenal. Then AMD heralded the Bulldozer. Bulldoze what? The empty hype makes the truth more painful. Intel plans to integrate the South Bridge onto Haswell's die, and folks, AMD will lose teeth and get bloodied. I'm growing weary of being a sort of Cubs fan.
If power use is important to you, you should know that different reviews give different results for the power use vs competing intel chips.
A couple of sites even have equal or lower idle power draw for the 8350 vs i7 3770.
Trying to figure out why, one variable is the motherboard. Is the Crosshair V a power hog?
I also looked at the yearly cost in electrical use for my own usage.
The only thing I do that pegs multiple cores at 100% is chess analysis. In Deep Fritz the 8350 is close in performance to the i7 3770.
I do chess analysis about 1-5 hours a week on average, perhaps 200 hours per year.
The math is very simple. Power costs 16 cents per kilowatt-hour. Peak power usage would cost me roughly an extra $3/year vs. an Intel rig. Since I'd use a more power-efficient motherboard than the Asus Crosshair, idle power is reasonable. I also put the machine in standby a lot when I'm not using it.
An 8350 would cost me in the range of $4-$8 more per year in power bills vs. an i7-3770 (its competitor for chess analysis).
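The estimate above is easy to sanity-check. A minimal sketch, assuming roughly 95 W of extra draw under load (a figure several reviews showed for the 8350 vs. the i7-3770; the exact number varies by site):

```python
# Assumed numbers based on the comment, not measurements:
EXTRA_LOAD_WATTS = 95   # extra draw under full load; differs between reviews
HOURS_PER_YEAR = 200    # ~1-5 hours/week of all-cores-pegged chess analysis
RATE_PER_KWH = 0.16     # dollars per kilowatt-hour, as quoted

extra_kwh = EXTRA_LOAD_WATTS * HOURS_PER_YEAR / 1000  # 19.0 kWh/year
extra_cost = extra_kwh * RATE_PER_KWH
print(f"~${extra_cost:.2f}/year extra under load")    # ~$3.04/year
```

That lands right on the "extra $3/year" figure; the $4-$8 range then comes from adding the idle-power difference over many more hours.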
Starting the power consumption graphs at 50 watts instead of 0 watts is GROSSLY MISLEADING, and very unfair to AMD.
Lack of a performance-per-watt comparison is unfair to Intel. Yeah, AMD is finally able, at stock, to beat Intel on some benchmarks... but they consume significantly more power to do so (Intel could easily start selling higher-clocked parts too).
If I ever build a new machine...it looks like I'll swing towards my first ever Intel box...hrmmm the anticipation may make me do it just for fun even though my Phenom II X555BE Unlocked and OC'd to 3.5GHz serves me just fine.
I got curious about the idle power and visited 7 sites to look at reviews. No 2 sites had the same idle power difference between the 8350 vs the i7 3770. Values ranged from 9 watts AMD *lower* (lower! than intel) to 22 watts higher. The higher readings seemed to all be with the Asus Crosshair V, which logically must be a power hog.
You should consider the idle power numbers *not* representative. Unreliable.
LOL - seems like... hahahahhahahah in some imaginary future in a far off land, if and when and only if amd does xxxx and yyyyyy and blah blah blah blah,.... blew it.
More extreme ignorance from the idiot CeriseCogburn. Little boys who only game should seriously consider not commenting on things they aren't capable of comprehending.
Stupid little bitchboy CeriseCogburn...What a waste of oxygen.
If you live in an area that requires A/C most of the year like me, the true cost of owning an FX-8350 processor is about an additional $100/year vs. owning a 3570K.
FX-8350: +15 watts idle, +95 watts load vs. i5-3570K
50 hours/week light CPU usage = 75W
10 hours/week heavy CPU usage = 760w
Combined usage = 1025w @ $0.11/kWh = $1.12
A/C usage 75%-80% @ $0.11/kWh = $0.84
Extra electrical cost: $2/week, i.e. $100/year or $300 over 3 years
Maybe my math is wrong, but if you use A/C most of the year and pay for electricity an AMD cpu is a waste of money. Then again some people still use incandescent light bulbs instead of compact fluorescent lamps or LED bulbs.
LOL, my math was wrong in the above post. FX-8350: +15 watts idle, +95 watts load vs. i5-3570K:
68 hours/week light CPU usage = 1 kWh
100 hours/week heavy CPU usage = 9.5 kWh
Combined weekly usage = 10.5 kWh @ $0.11/kWh = $1.15
Average A/C overhead: 80% of $1.15 = $0.92
Extra electrical cost of roughly $2/week vs. owning a 3570K, i.e. $100/year or $300 over 3 years.
In this usage scenario the computer is heavily used for tasks like folding, gaming or video editing.
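The corrected numbers can be reproduced in a few lines; note the 80% A/C overhead is the poster's assumption, not a measured figure:

```python
def weekly_extra_cost(idle_w, load_w, light_hrs, heavy_hrs,
                      rate_per_kwh=0.11, ac_overhead=0.8):
    """Extra weekly cost of the higher-draw CPU, plus the assumed cost
    of the A/C pumping that extra heat back out of the room."""
    extra_kwh = (idle_w * light_hrs + load_w * heavy_hrs) / 1000
    direct = extra_kwh * rate_per_kwh   # electricity for the CPU itself
    return direct * (1 + ac_overhead)   # plus assumed A/C overhead

# +15 W idle for 68 h/week, +95 W load for 100 h/week, $0.11/kWh:
weekly = weekly_extra_cost(15, 95, 68, 100)
print(f"${weekly:.2f}/week, ~${weekly * 52:.0f}/year")  # $2.08/week, ~$108/year
```

That is close to the comment's "$2/week, $100/year" once its rounding is accounted for.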
121117: Intel is a semiconductor company first and a microprocessor company second ($13.5B revenue, ~$3.8B per quarter). It used to make a living making memory chips (DRAM) until they were commoditized by Japanese rivals in the 1980s and margins plunged; now volume-produced microprocessors happen to be the most profitable. Intel makes more usable chips per fab ($5B for 300mm, $7B for 450mm, with 40% to 120% extra chips per wafer and yields in the 90% range vs. 60% to 80% for rivals), is moving to 22nm geometries with 14nm by Q4 2013, and its chips take about 3 months to make, going through more than 300 separate processes. Meanwhile Intel's Atom faces the British ARM (lower power and customizable; 98% of cellphones since 2005 have had ARMs in them) in the touch-panel (Win8) mobile/cellphone/tablet market, with the i3 at a 55W ceiling...
The Merom mobile architecture (Q1 2006; 667-800 MT/s, 35W, BGA479, Sockets M and P; an updated 65nm Yonah core descended from the P6 of the 1995 Pentium Pro) marked Intel's acknowledgment that the Pentium 4 (NetBurst architecture) was not a viable long-term solution, because power efficiency is crucial to success. 65nm Conroe was the desktop part (Core 2 Quad, 800 MT/s, 65W, LGA775); Woodcrest (LGA771) ran a 1333 MT/s bus, scaled down to 1066 MT/s for workstations ([email protected], 80W at 3GHz), and served servers (Xeon, Socket 604 and LGA771). The parts differed in socket (M, P, T, FCBGA), bus speed and power consumption, but the identical Core microarchitecture was designed by Intel Israel's (IDC) team. Steppings represent incremental improvements but also different sets of features, like cache size and low-power modes. The cadence since: Nehalem 45nm (16-stage pipeline), Westmere 32nm; Sandy Bridge 32nm, Ivy Bridge 22nm; Haswell 22nm, Broadwell 14nm; Skylake 14nm, Skymont 10nm. Sandy Bridge was a 32nm integrated x86 microprocessor as an SoC (CPU/GPU + last-level cache + system I/O), up against AMD's Bulldozer; Intel's Haswell is a dual-threaded, out-of-order CPU architecture on 22nm FinFET (high-end tablets, low power) whose theoretical peak performance is over double that of Sandy Bridge (twice the FLOP/s, with cache bandwidth doubled), up against AMD's 2013 Steamroller core and Win8.
Why is it that in every review I see, no one uses the RAM that the CPU memory controller is rated for? Or, in overclocking, did you see what a RAM overclock added?
Just because your Intel chip may not perform any better with 1333, 1600, or 1866 doesn't mean you don't run your AMD chip with the 1866 it's rated for. I see this all over the review sites. You don't have to use the same RAM in both systems for a comparison of CPU performance. RAM choice is designed into the CPU, and has an effect.
Nice job AMD. I guess some of these benchmarks are skewed because Windows 8 uses the modules better than Windows 7, but still... the $166 8350 outperforms, or comes close to the performance of, a $200 i5-3570K in some cases.
This is great news for AMD. Keep up the good work!
The Intel fanboys simply do not understand market dynamics. If it were not for AMD, Intel processors would be three or four times as expensive. In addition, competition drives innovation. Intel CPUs were mediocre until AMD created its first Athlon 64-bit processor that kicked Intel's P4 butt in every measure of performance. Intel fanboys, you simply don't give credit where it is due. I am the first to admit that Intel's Sandy Bridge CPUs are, for the most part, better designed than competing CPUs from AMD. However, I feel obligated to do my part to keep AMD alive. It benefits the users of both AMD and Intel CPUs and it keeps prices down. Only an idiot would hope for Intel to drive AMD out of business. This is not football, basketball or NASCAR, you dimwits!
Exactly. I have been building AMD systems since my Athlon XP 2500+ OC'd to 3200+ back in 2004. My last Intel was a fast Celeron back in 2003. If AMD were to go out of business, Intel would get complacent and jack the prices up. Competition is good for driving innovation. Imagine if Intel had a total monopoly and AMD never existed; I doubt we would have anything near as good as the processors we have right now. Plus AMD has always offered good performance for the dollar spent. I never buy bleeding-edge technology because it costs twice as much for just a small performance advantage. I like to buy AMD and then use the money saved to put a better GPU into the system, since an FX-8350 is not going to bottleneck any modern GPU, and spend the bucks where it counts. I find all the talk about fanbois hilarious. I like to build powerful gaming systems on a budget, and when I am done with them they are sold on eBay and I build another. I wish AMD well, for if they were to fail then Intel would shoot their processor prices up into the stratosphere again. I felt Intel was surprised by AMD back in 2004 when they came out with the 939 dual-core chips that pummeled the fastest P4 systems. In fact I had an ASRock Dual-SATA 939 with a dual-core 939 3800+ that my son was using for his gaming rig until this Christmas, when I built him a new AM3+ system with an ASRock 990FX Extreme4 motherboard. Competition is good and I want to see it continue, so that's why I continue to buy AMD.
I thought I came on this site to read in depth reviews, not to see a bunch of fanboys fighting each other for who has the better processor, using grade school humor to do so. I get sick of it. People, just buy what YOU want; I could not care less what you spend your money on. If you are happy with it, then that is all that matters.
If you came on this site to read the in depth review then do so, you idiot.
Oh wait, instead of doing what you demand everyone else do, you go down as low as possible while holding yourself up to the angelic light and claim everyone needs to stop fanboying .... well guess what moron - if you came to read the article read it, THEN LEAVE, or stop whining in comments too, being a fanboy of sorts, the one who, doesn't FREAKING REALIZE the article itself is a BIG FAT FIGHT BETWEEN INTEL AND AMD, YOU FREAKING IDIOT.
Over 2x slower performance per watt than Intel, and 3x slower performance per watt when both are overclocked. It's crazy how far ahead Intel is, and the most worrying thing is that Haswell is where Intel will go all out to bring better performance per watt and also beat AMD's trump card, integrated graphics; Haswell will be amazing.
Yes, AMD costs less for most of their CPUs, but you pay for what you get. AMD is releasing CPUs that Intel had 3-4 years ago, with worse performance per watt. Also, you save over 2x the electricity and complete tasks much faster on Intel CPUs.
Put it this way: if AMD gets by 2016 or 2017 to where Intel will be with Haswell in 2013, AMD will have worked a miracle; they are that far behind. I reckon AMD is so far behind now that they will just target the lower-end market with their APUs for gaming.
I just bought an FX-8350 with a new mainboard and 7980 Radeon. I think for an 8-core chip at 4GHz it's a good price. Hope it outperforms my old Core 2 Q6600 @ 3GHz anyhow, or I'll feel like a complete sucker!
A lot of people posting comments in this article are nothing but tools. For those of you who can't see beyond benchmarks and who think a slight advantage in a benchmark = blowing the competition out of the water, let me give you a lesson.
There's a difference between, say, a 30% difference at 30 FPS and a 30% difference at 110 FPS. When both chips are performing at 60 FPS, there is no blowing the other chip out of the water. At that point, it's simply a stalemate. It's just a shame that Intel fanboys are too arrogant and also too ignorant to admit this. They're so fixated on "OMFG, my chip gets 11.71 more FPS in a benchmark than your chip does".
Everything AMD has out there this side of a Phenom II X4 (or hell, even a low end AMD FX 4100) will run anything on the market maxed out at a solid 60+ FPS, given you are supporting it with a video card that doesn't hold it back. With that said, most people play with v-sync enabled anyways due to massive screen tearing with most games. What does it matter that a Core i7 is pulling 147 FPS and a AMD FX 8350 is pulling 107 FPS when your frame rate is just going to be locked down to 60 FPS anyways?
I know a lot of people like to be future-proofed, and the more overhead you have over 60 FPS the more future-proof your system is, but future-proofing by an extra year != blowing the competition out of the water. Gaming requirements have pretty much hit a brick wall. System requirements have not really gone up much at all in the last 2-3 years. With a Phenom II X4 965 and a GeForce 650 Ti my system runs anything I throw at it at a solid 55-60 FPS on Ultra settings. If I put a 650 Ti Boost, or even better a 660 Ti or a 680, in my system, everything would run even better. My CPU still never really gets maxed out in most games.
Nowadays, where the difference lies is how fast the CPU can encode and how fast the CPU can do other things that are not gaming related. That's where Intel is focusing right now, but as far as gaming goes, we've hit a brick wall, and have been behind that brick wall for several years now.
With that being said, I'm very proud of my AMD Phenom II X4 965 coupled with my GeForce 650 Ti. In many games my friend and I play, I've compared this hardware against his Core i7-920 overclocked to over 4.0GHz running GeForce GTX 470s in SLI. In some games, I was slightly below his performance. In other games I was equal, and in a few games my system actually outperformed his. He has since upgraded his GeForce GTX 470s to a single GeForce GTX 680, and even against that card my system does very well in comparison. In DiRT Showdown, we were both over the 50 FPS average mark: I was at about 57, he was at about 70, on average. Now, that may sound like a lot, right?
Well, then you factor in the pricing. My motherboard, processor and RAM were less than $250. His motherboard alone was more expensive than everything I paid combined, and that's before another $250 for the CPU. That's $500: double what I paid for everything, outside of a PSU, case, monitor, etc., which I already had. The performance difference, however, definitely isn't double.
I mean, you can either go pay $600+ to build a system (motherboard, CPU, RAM since most people reuse other parts such as optical drive, sound card, network card, hard drive, PSU, etc for many years), or you can pay $250 to build a system that will get slightly less performance on benchmarks, but still be future proof.
It's your call. I don't know about other people, but I like knowing I'm getting the best bang for the buck, and while Intel definitely may offer slightly better performance in benchmarks, AMD definitely offers the best bang for the buck. How can you turn your head at a 4 module 8 thread CPU for $185 when it costs over $300 to get a decent Intel chip? They're both future proof and will run anything at over 60 FPS for years to come, so why blow the extra $100 on the CPU and an extra $100 on a motherboard? Oh, and good luck finding an Intel motherboard that compares to the AM3+ ASUS M5A97 with a UEFI BIOS for under $200.
Most ppl in the general public will be like me. I don't OC; I tried it but never got into it. I don't even game on my PC, and for what I use my PC for, stock vs. stock, Intel is where it's at. Sorry. I do lots of video encoding.
The general public sees this article and they will probably think the same thing. I also look at power consumption. Again, Intel is where it's at. I had my sights set on an i7-3770K for over a year. I can probably wait 2 more years, and it might still be a better buy vs. AMD.
I never noticed this until just now. I always heard about AnandTech's Intel bias but never noticed it until this article. They purposely set the resolutions lower, knowing that Piledriver fares much better against SB/IB at 1080p and 1440p.
Remember price vs. performance: AMD always wins against Intel. Well, Intel has the fastest processor in the world... but do you need all of Intel's potential power? I don't think so. Some people only use their computer for internet browsing even though they have an Intel Core i7.
I see most comments talking about how this card is shit.
Sigh.
This card works great for what I need it to do. I host 7 servers on my computer in VirtualBox VMs for my gaming community: Minecraft, TS, StarMade, Cube World, etc... I don't get paid for my services and I need it cheap. This card gives me the performance to host a lot of people on each core, from a low of 4 people on Cube World to 45+ on TeamSpeak.
Why would I buy Intel for these purposes, other than to spend money for the same performance in this scenario and flaunt my epeen?
Performance is not based on score or GHz. It's based on money. So before saying this card is shit, why not look at the multiple applications it can be used in?
As far as this scenario goes, any unused CPU is lost money.
You keep complaining about AMD fanboys, but you're obviously an Intel fanboy... You sound completely immature with your constant use of LOL in caps. No need to get too excited about CPUs... they are just part of a machine that you use, lol. Seriously, you should not post on this website if you are just going to be an immature Intel fanboy. You sound too young to be posting on this website anyway. "Obnoxious insolence"? You should re-read your post, as it is completely obnoxious. Your attitude stinks -.-
All those Intel and AMD fanboys... holy crap -_- I have an FX-8350 and had an i7 in the past. I went for the FX-8350 since it was so much cheaper than the i7-3770K while having almost identical performance. AMD just can't be beaten on price-performance; however, Intel's best CPUs will always be better than AMD's best, and that's a fact to be known. BUT that doesn't make AMD bad, like all these Intel "fanboys" seem to think. Intel and AMD are both bad and good in their own way. Get the fuck over it, both you AMD AND Intel fanboys.
If you live in a hot and humid climate, an FX processor would be a terrible value. Most people that live in central Florida use air conditioning 80% to 90% of the year. For my uses, the equivalent AMD CPUs have been tested and shown to consume about 65 watts more under full load and 11 watts more at idle than a comparable Intel CPU. Since I pay 100% of the electrical bill, buying an Intel CPU becomes a significant cost-saving purchase over 4 or 5 years. Well, 10 or 50 watts might not seem like it would cost much, but if you factor in daily usage, A/C cost, and 4 years of ownership, then the pennies start to add up.
1 W × 12 hrs/day × 365 days = 4.3 kWh/yr
4.3 kWh × 1.8 (A/C overhead) = 7.7 kWh × $0.11/kWh = $0.85/yr
$0.85 × 4 yr = $3.40 per watt; example: $3.40 × 30 W = $102
1 W × 24 hrs/day × 365 days = 8.6 kWh/yr
8.6 kWh × 1.8 (A/C overhead) = 15.4 kWh × $0.11/kWh = $1.70/yr
$1.70 × 4 yr = $6.80 per watt; example: $6.80 × 50 W = $340
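These per-watt figures generalize easily. A minimal sketch of the same calculation (the 1.8 A/C factor and the $0.11/kWh rate are the poster's assumptions):

```python
def four_year_cost_per_watt(hours_per_day, years=4, rate=0.11, ac_factor=1.8):
    """Cost over `years` of one extra watt of draw, with assumed A/C overhead."""
    kwh_per_year = hours_per_day * 365 / 1000  # kWh/year per extra watt
    return kwh_per_year * ac_factor * rate * years

# 12 h/day: ~$3.47/W over 4 years, so 30 W extra is ~$104
# 24 h/day: ~$6.94/W over 4 years, so 50 W extra is ~$347
for hours, watts in [(12, 30), (24, 50)]:
    per_watt = four_year_cost_per_watt(hours)
    print(f"{hours} h/day: ${per_watt:.2f}/W -> {watts} W extra = ${per_watt * watts:.0f}")
```

The small gap vs. the $102/$340 figures above is rounding: 12 h/day is really 4.38 kWh/yr per watt, not 4.3.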
If I were building someone a budget gaming desktop, I would recommend spending $130 on an Intel Core i3-3225 Ivy Bridge 3.3GHz LGA 1155 55W dual-core processor with Intel HD Graphics 4000 running at stock speeds instead of spending $120 on an AMD FX-6300 Vishera 3.5GHz (4.1GHz Turbo) Socket AM3+ 95W six-core processor, even overclocked to 4GHz.
Quote: "Single threaded performance is my biggest concern, and compared to Sandy Bridge there's a good 40-50% advantage the i5 2500K enjoys over the FX-8150." I know this is an old post, BUT I just bought its big brother, the FX-8350, for $129.99, and they are saying the same thing: poor single-thread performance. I say WHO CARES? I know most people with half a brain don't care!! Most people buy this CPU for gaming! What games can you name that are single-threaded??? Unless you go back to the stone age and are playing the very first Battlefield 1942? All I do is game and transcode videos, and dollar for dollar this thing kicks Intel's a s s! So case closed!!!! AMD all the way, FTW.
First time in ages pondering an upgrade; why, I don't know, as what I have does everything I need it to. I don't game, by the way. Sitting on a Phenom II Quad 965 Black Edition, gently OC'd to 3.7GHz, on an Asus M4A88TD-EVO with 16GB of memory. Not impressed with benchmarks in the least; I used to laugh at them with lenses when I did photography. They're mostly meaningless in the real world. Pondering this chip, currently about £143. Been building machines since 286 days and always use AMD; they've never let me down, and I have been let down by several Intel ones. If AMD went bust, leaving Intel with a monopoly, they would price themselves out of our reach. My thoughts anyway. Not really bothered which chip as long as it works; each to their own. I don't understand the squabbling. lol.
Did you look at the games benchmarks in this review? It's not that games only use a single thread, it's that plenty of games are not optimised well for multi-threaded CPUs. If we're talking about BF4, fine, the FX chips will perform well against Intel, but it's clear from these benchmarks that in plenty of titles from the past few years the poor per-core performance of the AMD chips does hamper their gaming potential.
And the fanboyism repeats itself again... AMD is slowly creeping up on Intel; eventually they will have to be equal. AMD does well at 3D modelling, video editing and gaming; Intel is suited more for video editing, rendering and graphics. AMD is low price and value. If you can OC it properly with good cooling, it beats Intel in SOME areas. Intel, on the other hand, can deliver a tad more power at lower clock speeds.
AMD has a lower framerate than Intel, but only a *slight* difference in performance. AMD is known to have weak cores and makes up for it by adding 8. Intel has slightly more power in each core, but mainly goes up to 6 cores.
My conclusion? AMD should stop fighting with Intel and slowly work their way up from there.
OK people, let's just agree that nothing is free of charge. We HAVE TO PAY to get these bad boys into our PCs, so you can stop bitching about "price don't matter". The AMD FX-8320 is around $160; let's check what the same price gets from Intel, right? The dual-core Core i3-4340 is what I found on the net after a quick search. If you tell me that i3 can best the FX-8320, then yep, Intel is the winner. But if you try to compare an i7 and this FX guy, that's wrong. I think AMD tries to sell CPUs to middle-class folks who just want the best performance on a weak salary, and Intel is trying to do the best of the best regardless of how much it will cost. If you have lots of money, buy an Intel and have fun with it. If you don't have that much money but still need some performance, go for AMD.
My opinion: pay half the money and have a tick less performance. I'll live.
Tech support since the first IBM 88/66. AMD has its hits and misses, and I could argue tech benchmarks as well. But I buy my machines for reliability and stability. We supported 65k users with only 12 techs, 24/7. Phenom II was beautiful, so much so that I just started field testing for a rollout of the 8320. So far, Intel will still be what it is (Apple, Intel, Samsung): a lot of money spent on hype. You wanna talk about what's real-world useful? Well my friend, the business world is all about documents of large size being transmitted, and AMD's multi-thread kicked the crap out of Intel in both time to completion and completion without error. Many, many of our users game. Our test group did admit they loved the 8320 on a 970G. So the conclusion is: if I wanna live in a box and run benchmarks, Intel. If I wanna get some tasks done, AMD. Period. The added bonus: my AMD vendors are so much easier to work with on the VERY rare occasion of a failure. Intel... not so much.
I'm at the moment looking for the best performance/price CPU, so after reading most of this I couldn't resist commenting. Clearly most of you here OC and are fkn benchmark freaks; while I play MMOs for 6-8 days, you jiggle your PC in benchmarks, licking it and cooling it off to get a higher score... but for what? It's like making a car that does 300-400 km/h and shitting on cars that barely do 200 km/h, while forgetting the fact that there aren't many places where you can go that fast on a daily basis, right?
So I'm looking at the game stats here, as that's the only thing that makes sense to me. I was looking to buy an 8350 as a little upgrade from my Phenom II X4 965, but I started to wonder why the hell I need such huge fps, if normally you won't see any real difference between 60-70 fps and 100. So I decided to buy a 6350 and OC it a bit, as I read only good OC reviews about it. And yeah, it's not a huge difference in money, but still, I'm looking for a good build that won't hurt my pocket much. And to those who start counting electricity bills: you are stupid. NO gamer will use the same system for 5 years, so saying to spend $100 more and buy Intel is damn stupid, as over those 5 years I'd rather save that $100 and put it toward whatever new AMD CPU they have by then.
I won't lie, I wanted to build an Intel PC, as Intel mobos look so much better in my price range, but when I saw the price of the CPUs my dream was crushed. Here in Latvia I can buy an FX-9370 4.4GHz for almost less than an entry-level i5 processor. If I compare only speed and reviews (as I'm only a gamer and don't know or care about some numbers in tests saying Intel has 5-10% better performance), then in the local store AMD processors beat Intel on price 2x. And for a simple gamer (not making shitloads of money), price/performance is all I need.
Of course these benchmark tests tell you a lot more than they tell me, but really, if you have a 200 km/h car for 2k euro and a 210 km/h one for 4k euro, then it all comes down to how much you can afford and whether you really need, and will use, those extra 10 km/h... right?
So please, stop bitching and talking total crap, as in the end most people who buy this stuff are gamers, not overclockers who just need a better number in a benchmark than the guy next to them. Price is everything, and even more so price/performance. I'd rather lose that 10% fps and still play the game at 90% than pay 2x more and actually see no difference on a daily basis.
That's all!
Sorry about my English, though; it's very bad, so don't even bother commenting on that, as I won't come back here. It just kinda made me sick, all those idiots measuring their dicks on the internet (for that, join the Chatroulette dick flashers).
I just remember the days back in 2004, when the Athlon destroyed Intel's reputation. Jump ahead a few years: after Intel failed twice with multicore architectures, they finally came through with Core Duo (their third attempt from scratch), after monopolizing an array of markets under the table (and finally losing in court years later, or too late, paying over 2 billion in damages to AMD). And now Intel, the monarch in CPUs, boasting of a 5-10% performance lead over the competition, being in that place after doing a lot of harm to their younger betters, still and always selling their products in a "milking" way, just makes me shake about our future. Buying AMD's reasonably priced products makes me feel I do my part in maintaining a much-needed competition that does everybody a lot of good. I think every AMD customer offers Intel customers faster progress and affordable prices. But not many willingly understand. Buying is a choice. And everybody needs AMD but Intel.
I work as an architect specialist for a local Philippine-based company. I decided to purchase a new computer, since I made the mistake of buying my Acer Aspire with an Intel 2330 without the benefit of dedicated VRAM. When it came time for me to learn Luminous 3.2, the laptop screen turned blue, hence I decided to buy a desktop PC.
I conferred with our resident IT, and he suggested I purchase a PC with an AMD processor, since Intel, even with their hyperthreading, would just put most of my money into the CPU, instead of opting for an AMD and slapping on a 2GB dedicated 128-bit VRAM card. Makes a lot of sense really, and sadly, with all the let-down statements about AMD, I can only imagine a life without AMD, where a whole lot of people won't be able to buy a simple-ass PC on account of Intel's exorbitant price range. Make no mistake though, Intel really runs AMD down like a raging bull. But that is crap in the bag; speed isn't the only real issue here, as the other components have to come into play: VRAM, RAM speed and RAM capacity, motherboard, power supply, CPU case, cooling system, software. Add them all up, and with an AMD it can all be within arm's reach, as not everyone can afford Intel.
The FX-6300 is a great chip for gaming if the game actually utilizes all 6 cores. Since hyperthreading only adds up to 50% more performance, the i3 (which is the only Intel chip in its price range) is really more like a 3-core processor. That's why the i5 beats it: it actually has a full 4 cores.
Since the Piledriver cores are more than half as fast as Intel's, that puts the FX-6300 above i3 performance in properly threaded games and within striking range of the i5. The FX-8350 ends up between the i5 and i7 in games that like 8 threads. At less than $150 that makes the 8-core Piledriver chips very competitive with the i5.
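The reasoning in these two comments amounts to a small back-of-the-envelope throughput model. As a sketch (the 0.5 SMT bonus and the ~0.6x Piledriver per-core speed are the comments' rough figures, not measurements):

```python
# Toy model: an SMT sibling thread counts as half a core, and each core is
# weighted by an assumed relative per-thread speed (1.0 = one Intel core).
def effective_cores(cores, smt_threads, per_core_speed):
    return (cores + 0.5 * smt_threads) * per_core_speed

i3     = effective_cores(2, 2, 1.0)   # 2C/4T with Hyper-Threading -> 3.0
i5     = effective_cores(4, 0, 1.0)   # 4C/4T                      -> 4.0
i7     = effective_cores(4, 4, 1.0)   # 4C/8T                      -> 6.0
fx6300 = effective_cores(6, 0, 0.6)   # assumed 0.6x per core      -> ~3.6
fx8350 = effective_cores(8, 0, 0.6)   # assumed 0.6x per core      -> ~4.8
```

Under those assumptions the ordering matches the comments' claims for well-threaded workloads: i3 < FX-6300 < i5 < FX-8350 < i7. The model ignores the shared front end and per-thread bottlenecks, which is exactly why it breaks down in lightly threaded games.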
If only games didn't emphasize the performance of the first two cores so much. AMD would have a serious winner with a 6-core APU.
You buy what you need for what you do; it's not that complicated. If I'm building a budget PC aimed at being a "console killer", why would I spend around $300 for an i5-4690K and board when I can buy an equally priced board and an FX-6300 for a little over $150 total, which performs great in gaming, multitasking, and some mild video production? You pay for what you get. (Prices were found on Newegg at the time of posting.)
I have an electricity flat rate, and I mean it seriously. (I have no idea; I only thought/think that AMD is being misleading by calling it an octo-core. It's like the i3-540 I bought in 2010: it had 2 cores but showed up as 4 x 3.06 GHz, and Intel stopped that around when I bought the system.) Why is power consumption so important for the average user, or is it something only for users who really know a lot? I thought max consumption was 125W and wondered about the ~180-190W readings, but that's "system power". I have an ASUS M5A97 R2.0 motherboard. I did not buy a new CPU yet because prices are crazy thanks to the now very weak euro against the dollar: Intel CPUs released in late 2014 at ~330 euros cost ~350 euros in late 2016! I have had a PC since I was 6-7 years old, and I can't remember a CPU ever being more expensive two years after release than at launch, with the "new" ones costing as much as a complete "low-budget gaming" system...
So I use it. GTA 5 runs quite fine with 2.6GB worth of settings on the 2GB of available VRAM (thanks to Nvidia GeForce Experience; those settings are really cool. Manually I can't set them, as the game tells me I don't have enough VRAM, but somehow the GeForce software is able to do it, and it's no problem). I'm a bit annoyed that I did not even try GTA 5 earlier on this 8320 / GTX 760 OC ASUS (1072MHz normal, OC'd to 1150MHz; stock is 980MHz with a 1033MHz boost I think; only 2GB VRAM @ 6008MHz, which I run at ~6200).
So it's OK to use the system like I did?! I sometimes pushed the AMD overclock a bit further; is it damaging my system??
Sorry for posting again. I mean, GTA 5 works really great. I did not try it earlier because GTA 4 is from 2008, the i3-540 is from 2010, and I had an HD 5770 which I overclocked from 850/1200 to 875-900/1250-1300 (GPU/VRAM); so even a CPU made two years after GTA 4, plus a GPU that was not great but not crap in 2010, was not able to play GTA 4 with everything maxed. Now GTA 5 runs really great, and I "lost" over a year because I did not try (same with the free Win 10 upgrade). The graphics are great, even with the 2.6GB settings. I'm amazed that the FX-8320 and the ASUS DirectCU-OC 760 work so well in GTA 5! Only a few things are "off" that could be handled by a 900-series card, I think.
So I wonder; maybe GTA 5 is simply one of the few games where it's like this, and the other thing is the "optimized settings" from GeForce Experience. I would have enabled other settings myself, so I now enjoy it... but then maybe GTA 5 is already an old game...
klatscho - Tuesday, October 23, 2012 - link
but at least priced decently.
leexgx - Tuesday, October 23, 2012 - link
still like how AMD think they have 8 full cores in there (some sites list the Modules, not FP cores, in their lists)
8x is 4 Modules (4M/8T)
6x is 3 Modules (3M/6T)
4x is 2 Modules (2M/4T)
they hardly outperform stock clocked matched cpus (that they listed)
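The module/thread naming above works out as a simple mapping; a trivial sketch (the two-integer-cores-per-module rule is from the comment, and the "8x/6x/4x" labels are shorthand, not official SKU names):

```python
# Bulldozer/Piledriver naming: each module holds two integer cores
# that share one frontend and FPU, so the marketing "core" count
# is modules * 2.

def modules_to_threads(modules: int) -> int:
    """Two integer cores (hardware threads) per module."""
    return modules * 2

# The lineup from the comment above:
for name, m in {"8x": 4, "6x": 3, "4x": 2}.items():
    print(f"{name}: {m} modules -> {modules_to_threads(m)} cores")
```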
leexgx - Tuesday, October 23, 2012 - link
Also, to add: if you own a Bulldozer (or Vishera) type of CPU, you should already have these patches installed:
http://support.microsoft.com/kb/2646060
http://support.microsoft.com/kb/2645594
Penti - Wednesday, October 24, 2012 - link
There are 8 fully pipelined integer cores in there; they are just very weak. Some of that is the shared frontend/decoder, some of it is the integer execution units themselves. Weak SIMD/FPU performance isn't the only problem it has; it just does so much less. You don't have two pipelines with separate resources to achieve SMT/HT here. They need wider execution, and preferably to drop the shared front end too; there's no point having it around. Focus on making it faster, and dump all that cache, which does no good. Mobile/notebook chips can't really have 16MB of cache anyway, just a few MB.
DDR4 - Wednesday, November 7, 2012 - link
It's just a marketing thing; the cores don't have that much power. AMD's just looking to do better than Intel in one area.
P39Airacobra - Wednesday, May 14, 2014 - link
And you base this on the first thing that came to your mind, to make you feel better about overpaying for your wimpy little 4-core Intel CPU with half the power of an FX-8320. LOL
P39Airacobra - Wednesday, May 14, 2014 - link
Just kidding, I have an i5 myself. But guys, you really should stop being such fanboys. AMD has a great chip here with the FX-8320 and FX-8350. They are priced much lower than the top i5 CPUs, and they will perform just as well in gaming, if not better. And who cares about it using 125 watts? 125 watts is a bit more than what Intel's i5s use, but it can still be run more than fine with a high-end GPU on just a decent 600 W mainstream PSU like a CX600.
spooky2th - Tuesday, June 3, 2014 - link
Intel i5s can handle faster memory than any AMD chip. They have a stronger memory controller, plus they OC very well too. In the socket-1155 days the AMD chips were barely keeping up. Since Haswell, the speed champs are Intel CPUs, hands down, and with the Z97 boards and the new processors that will only work with the 97 boards, look out AMD! Better OC'ing and faster memory handling than before!
DesiredUser - Friday, February 12, 2016 - link
Currently, the FX-8350 costs over $200. A used Xeon 5647 costs just $50 and beats the crap out of it.
Both support ECC. Go figure.
Homeles - Sunday, October 28, 2012 - link
Because performance = core count. Brilliant.
CeriseCogburn - Tuesday, October 30, 2012 - link
You know, you look at this crap AMD chip in the review, and then it hits you... hard.
You can lock a Sandy Bridge 2500K at 4,500 MHz all day long on every motherboard out there, all 4 cores, with no voltage increase, with the crap stock Intel fan, for years on end, never a hiccup, and smoke the everliving daylights out of everything AMD has.
I mean, I find it very interesting that with videocards we STILL hear about overclocking, but when it comes to CPUs, suddenly all the AMD OC fanboys fall dead freaking silent on the matter.
Sorry, I just cannot help noticing when the wannabe emperor is stark naked: moments ago his glorious worshippers were pointing and admiring and swooning over just how fast the AMD jaybird can run when pushed, and then... suddenly, the MASTER OC CHIP OF ALL TIME, the Sandy Bridge, is rather ignored... and the naked jaybird streaking thing becomes the dejected peasants' silence...
Really sick of it man. Really sick of it.
g101 - Wednesday, November 21, 2012 - link
Oh look, it's the highly ignorant fool CeriseCogburn... You make demonstrably false statements on every single AMD article and merely provide further proof of your ignorance. Honestly, if you're unable to find anything better to do than post idiotic comments on every AnandTech article, I have a suggestion: EDUCATE YOURSELF, LITTLE DIPSHIT.
Desolator2B - Wednesday, November 21, 2012 - link
Dude, seriously? Did you know the current world record for overclocking is held on an AMD processor? AMD is incredible for overclocking, especially with super-cooling. Yes, it may be a bit slower than Intel, but damn, you can't beat AMD's price.
You're a bit of an idiot, mate.
JrPgFaN83 - Wednesday, May 14, 2014 - link
I couldn't have said it better myself, mate. This response made the read worth it.
vench - Friday, November 23, 2012 - link
Think about this: which one is better, eight unlocked cores as AMD calls it,
or 4 cores with hyperthreading technology?
4 cores with hyperthreading will act like eight cores, because each single core acts like two cores, as claimed by Intel; but bear in mind there is something mathematically complex going on behind Intel's hyperthreading. But WHAT if you unlock this thing so it runs freely as eight cores, as AMD claims?
THAT IS THE REASON WHY AMD HOLDS THE WORLD RECORD FOR OVERCLOCKING: BECAUSE THEY MANAGED TO UNLOCK THIS THING.
IF INTEL PRODUCES AN 8-CORE CPU, THEN AMD WILL MAKE 16 CORES, AND AMD WILL STILL BE THE WINNER OF THE WORLD RECORD.
ON THIS POINT OF VIEW, JUST BE OPEN-MINDED.
DO NOT ACT LIKE YOU'RE REALLY SICK WHEN YOU'RE NOT, BECAUSE YOU WOULD MAKE YOURSELF LOOK TWICE THE MORON AND TWICE THE IDIOT.
JUST TAKE THE ADVICE FROM G101: EDUCATE YOURSELF.
I USED TO BE AN INTEL USER, BUT NOW THAT I'M BEGINNING TO UNDERSTAND THIS TREND, I LOVE AMD, AS THEY LOVE THE CONSUMER'S POCKET (AGAIN: BE OPEN-MINDED, IT'S MY OPINION).
sleekz - Thursday, May 2, 2013 - link
CAPS. The world overclocking record is meaningless. You can't compare clock speed across brands, and AMD's cores are half-cores. They still can't compete with Intel, and their circuit size is several years behind, which is why they consume so much power. AMD has given up on the enthusiast CPU market and is losing money.
JrPgFaN83 - Wednesday, May 14, 2014 - link
AMEN!
Mombasa69 - Thursday, March 14, 2013 - link
My AMD Vishera 8350 blows my i7 3770K out of the water and was HALF THE PRICE. Says it all, really.
Etnos - Saturday, April 20, 2013 - link
I hardly think you own Intel shares, so the fact that you care so much is pretty sad. Very sad.
Idiot10 - Tuesday, May 7, 2013 - link
Here comes the 2500K-loving Intel mercenary know-it-all-about-processors again, and yet he cannot sleep knowing that there are chips out there that can outperform his 2500K. SOB!!!!
Idiot10 - Tuesday, May 7, 2013 - link
Here comes Mr. ChariseHogburn again, the all-knowing Intel mercenary processor expert, who thinks nothing can beat his beloved 2500K. SOB!!!!
Got my first AMD - Thursday, May 8, 2014 - link
I OC my 8320 to 4.7 GHz and only have to raise the voltage 0.0125 V from stock. Why is it that you think AMD is worthless? They are all for the consumer, and I am grateful for it. I'm really sick of your ignorance. Really sick of it.
cbrown - Thursday, June 12, 2014 - link
What everyone will get "sick" of in a hurry is if AMD falls on its face with its "crap" CPU manufacturing... Intel will double their pricing on CPUs again, the same as they were in the period before AMD released its "crap" Slot A Athlon 500. AMD also has a decent GPU line to fall back on since they bought ATI, which was a smart move.
*When AMD released the Slot A Athlon 500, Intel slashed its prices in half the same day to match.
*If ATI hadn't stepped up to the plate and succeeded when nVidia bought 3dfx, everyone would be crying about the prices of gaming cards.
*Sure, AMD is lagging behind right now, but never underestimate the importance of competition and the effect it has on affordable upgrades for everyone...
I love AMD for the pricing they have brought us, this day and time, with CPUs AND GPUs.
I am fixing to build my FIRST Intel platform since AMD released the Slot A Athlon, simply because I want a Hackintosh; there is simply too much heartache building one on an AMD CPU/mobo combo, for the simple fact that Apple does not support them. I don't know if you would call me an AMD fanboy or not, your opinion; I simply have supported a company that brought affordability to everyone... I am either Mac (Intel, no choice really) or AMD for the Windows platform...
Anyone that cannot see the reasoning behind this, now I would say they are an Intel fanboy. Everyone has the right to choose...
saintalchemyst - Saturday, July 4, 2015 - link
Watch and realize the fanboy you are. Intel is great at what they do... SO IS AMD! https://www.youtube.com/watch?v=eu8Sekdb-IE
CaulkWarmer - Wednesday, August 5, 2015 - link
Well, I can answer that question for you. It's because we don't need to. I hear Intel fanboys talking about overclocking all the time. Now why is that? Is it because you spent a house payment on your big fancy i7 alone, so now you just have to squeeze in those extra clock cycles? And then here I am: I spent 110 bucks on an 8320 in a Newegg sale, and what do you know, I don't *need* to overclock. Why would I do something I don't need to do? Well, I'm not a little bitch fanboy after all, so I'll just be happy with what I got and not talk shit about something I have no experience with (take the hint).
You know what I am sick of? Hearing people like you bitch and moan about how they hate AMD, on an article *about* AMD. Is your butt hurting you, hun? Don't you have some crippled kids you have to go beat? Shoo.
sleekz - Thursday, May 2, 2013 - link
Priced cheaply, but they don't match up price/performance-wise. Price is not a valid argument against Intel.
Abdussamad - Monday, August 12, 2013 - link
Do the math on electricity consumption and you find it is not priced decently at all. Let's say you run a system with an idle FX-8350 for 8 hours a day, every day, for 5 years, and electricity costs you $0.16 per kWh:
(( 8 hours * (74.2 - 57.5) * 365 days * 5 years ) / 1000) * 0.16 = $39
(74.2 - 57.5 is the difference in idle power consumption between an fx-8350 and i5-3570)
So add $40 to the price of the FX 8350 before you do a comparison. More if you run your system 24/7.
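The arithmetic above checks out; a quick sketch using only the figures from the comment (74.2 W and 57.5 W idle, $0.16/kWh, 8 hours a day for 5 years):

```python
# Checking the idle-power cost arithmetic from the comment above.
hours_per_day = 8
idle_delta_watts = 74.2 - 57.5      # FX-8350 minus i5-3570 at idle
days = 365 * 5                      # five years
price_per_kwh = 0.16

kwh = hours_per_day * idle_delta_watts * days / 1000
cost = kwh * price_per_kwh
print(f"{kwh:.1f} kWh -> ${cost:.2f}")   # about $39 over five years
```

Note the watts/1000 conversion to kilowatt-hours is already done correctly in the original formula.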
lostinspacex - Wednesday, November 13, 2013 - link
Duuude, it's 40 dollars in 5 years; you spend more going to a restaurant once. And you are multiplying watts by kilowatts xD, basic math mistake.
chekk42 - Wednesday, November 13, 2013 - link
Look closer. The math is correct. However, yes, $40 over 5 years is not really a concern.
ninza228248 - Monday, April 14, 2014 - link
After 5 years, add that $40 to the AMD and you will still have a mind-blowing processor compared to the Intel Core i5-3570, or to any Intel processor in that price bracket after 5 years...
Operandi - Tuesday, October 23, 2012 - link
Looks like Piledriver delivered. Granted, the bar was set pretty low with Bulldozer, but this at least has a use case: highly threaded applications. And considering this is a process node behind Intel, I'd say it's pretty good. If they can keep this pace up and hit IPC a bit harder, AMD could be back in a pretty good position.
silverblue - Tuesday, October 23, 2012 - link
At least now they can say they beat Intel in a lot of multithreaded situations. Losing to Intel AND using more power was unpalatable. I'd like to see an undervolted 8350; perhaps AMD's conservative side is rearing its ugly head again.
I'm a bit concerned that, even with hard-edge flops and the RCM, the clock speed difference is about 11% for the same power. I'd have thought that even the former would shave off a decent amount, unless RCM doesn't work so well at higher speeds. Still, there's one disadvantage to be had - overclocking won't work so well due to the flop change.
If AMD can beat Intel now in multithreading in most circumstances, Steamroller is just going to let them pull away. Single-threaded workloads are the worry, though. Still, at least they can say that they finally beat Nehalem even in single-threaded work. I did lament the lack of an appearance of Phenom II, but looking at the results, they've buried that particular ghost.
Finally - Tuesday, October 23, 2012 - link
Undervolting, you said?
Here you go: http://www.computerbase.de/artikel/prozessoren/201...
Spunjji - Tuesday, October 23, 2012 - link
Thanks! +1 to that.
CeriseCogburn - Tuesday, October 30, 2012 - link
ROFL, thank you, the 3 stooges.
I'd like to particularly thank silverblue, the little co-AMD fanboy who provided immense entertainment with that "they lost, then moments later they won" deranged fantasy spew. Good to know Steamroller is going to "pull away"!
hahahahhahahahahha
One for all and amd all won !
LOL
rkrb79 - Friday, October 10, 2014 - link
FYI, I joined AnandTech just so I could tell you that you are a douchebag!!
Taft12 - Tuesday, October 23, 2012 - link
+2 in fact!
An official lower-TDP version of the 8-core CPU would be very nice; 95W or even lower, as Intel does with their -S SKUs.
At my workplace, the i7-3770S has been just plain outstanding for our small form-factor server/workstation appliance that travels to tradeshows with our sales guys. I'd happily trial an AMD 8-core equivalent.
silverblue - Wednesday, October 24, 2012 - link
Happy to +3 you on that. :)
CeriseCogburn - Tuesday, October 30, 2012 - link
-10 for the once again PATHETIC HACKING that is required for AMD to be acceptable. (That's minus ten!)
LOL - fan boy fan joy fan toy EPIC FAIL !
StevoLincolnite - Tuesday, October 23, 2012 - link
I wouldn't say it has delivered. Not yet, anyway.
Remember, the Phenom II's IPC is lower than the later-model Core 2's, and Piledriver still needs 700 MHz+ to beat a Phenom II, so that puts it in perspective.
Mind you, overclock the NB on a Phenom II and you can get some pretty interesting gains, in the range of 5-15% depending on the situation.
However, as AMD has done for the last several years, they are happy to throw more cores at the performance problem, which is great; we just wish those cores were a little beefier, or that software would become more heavily threaded.
The other flip-side is that this will drop straight into some AM3 motherboards and all AM3+ motherboards, so it's a worthy upgrade if you're running something like an Athlon, plus it's cheap.
But the consensus is that if you're still running a Phenom II X6, don't need 8 threads, and mostly play video games, it really is throwing money into the fire to upgrade to the FX line, Piledriver or not, unless you intend to overclock the chips to 4.8 GHz+, which the Phenom IIs can't reach on air.
wwwcd - Tuesday, October 23, 2012 - link
Yes, we don't need 8 cores/threads for gaming today, but do you have a prognosis for the near future?
Kisper - Tuesday, October 23, 2012 - link
Why would you upgrade for no reason other than speculation?
If an advantage arises in heavily threaded games in the future, upgrade at that time. You'll get more processing power / $ spent in the future than you will at present.
CeriseCogburn - Tuesday, October 30, 2012 - link
AMD fanboys are penny-wise and pound-foolish, so buying the AMD crap now, and telling everyone it has the deranged AMD futureboy advantage, works for them!
I mean really, it sucks so freaking bad, and they cannot help themselves; like a crack addict they must have it and promote it, so heck, the last hope of the loser is telling everyone how bright they are and how, in the imaginary years ahead, their current pileofcrap will "truly shine!"
LOL - oh man, funny but so true.
Spunjji - Tuesday, October 23, 2012 - link
Prognosis for the near future is that having that many threads will still not be a whole lot of use for gaming. See Amdahl's law for why.
Samus - Tuesday, October 23, 2012 - link
It's safe to say all programs/games going forward will take advantage of four cores or more. Battlefield 3 released LAST year and basically requires 4 cores in order to be GPU-limited (as in, the game is CPU-limited with just about any videocard unless you have 4 cores).
c0d1f1ed - Tuesday, October 23, 2012 - link
Amdahl's Law is not a reason. There is plenty of task parallelism to exploit. The real issue is ROI, and there are two aspects to that. One is that multi-threaded development is freakishly hard. Unlike single-threaded development, you cannot know exactly what each thread is doing at any given time. You need synchronization to make certain actions deterministic, but even then you can end up with race conditions if you're not careful. The current synchronization methods are just very primitive. Intel will fix that with Haswell. The TSX technology enables hardware lock elision and hardware transactional memory. Both will make the developer's life a lot easier, and also make synchronization more efficient.
The second aspect isn't about the costs but about the gains. It has taken quite a while for more than two cores to become the norm. So it just wasn't worth it for developers to go through all the pain of scalable fine-grained multi-threaded development if the average CPU is still only a dual-core. Haswell's TSX technology will come right in time as quad-core becomes mainstream. Also, Haswell will have phenomenal Hyper-Threading performance thanks to two nearly symmetrical sets of two integer execution units.
AMD needs to implement TSX and AVX2 sooner rather than later to stay in the market.
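The synchronization pain described above can be shown in miniature. A minimal Python sketch: an unsynchronized `counter += 1` is a read-modify-write that two threads can interleave, losing updates; a mutex makes the result deterministic. (TSX/lock elision is a hardware feature and cannot be demonstrated from Python; this is the conventional locking it is meant to accelerate.)

```python
import threading

counter = 0
lock = threading.Lock()

def safe_increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:          # critical section: serialize the read-modify-write
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)              # deterministically 40000 with the lock held
```

Drop the `with lock:` and the final count is no longer guaranteed, which is exactly the non-determinism the comment is complaining about.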
CeriseCogburn - Tuesday, October 30, 2012 - link
Nice post. Appreciate it.
And ouch for AMD once again.
surt - Tuesday, October 23, 2012 - link
No, gaming won't need that many threads in the near future either. Nobody is going to make a game demand more than 4 threads, because that's what common gamer systems support.
AnnihilatorX - Wednesday, October 24, 2012 - link
I disagree. Say we have a hypothetical game that supports 8 threads. The overhead of over-threading on a quad-core system is, frankly, not very much, while it may provide improvements for people with octo-cores or Intel processors with Hyper-Threading.
AnnihilatorX - Wednesday, October 24, 2012 - link
In fact, there are many games nowadays that split the workload into many threads: economic simulation, background AI planning during the user's phase, physics, audio, a graphics subthread, network management, preloading and resource management. It is just that, even with that parallelism, there are bound to be single-threaded bottlenecks, such that an 8-core may not benefit at all compared to 4 cores.
So I disagree; it is not about people not spending resources on parallelism or not supporting it. It is the nature of the workload that is the determining factor.
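The single-threaded bottleneck described above is what Amdahl's law quantifies: if a fraction p of the work parallelizes, speedup on n cores is 1 / ((1 - p) + p / n). A quick sketch (the p = 0.6 below is an illustrative assumption, not a measured figure for any real game):

```python
# Amdahl's law: diminishing returns from extra cores when part of
# the workload stays serial.

def amdahl_speedup(p: float, n: int) -> float:
    """Speedup on n cores when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# With a hypothetical 60% parallel fraction, 8 cores barely beat 4:
for n in (1, 2, 4, 8):
    print(f"{n} cores: {amdahl_speedup(0.6, n):.2f}x")
```

Doubling from 4 to 8 cores here adds only ~0.3x, which is why an 8-core can fail to pull ahead of a 4-core in such a workload.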
CeriseCogburn - Sunday, December 9, 2012 - link
LOL, AMD sucks, period. Did you look at the gaming page? These Visheras got literally stomped to death.
AMD fanboy = "the imaginary, non-existent, and never-to-exist future looks glorious for der Führer AMD!"
redwarrior - Wednesday, October 24, 2012 - link
What a one-dimensional computer enthusiast you are. You spend hundreds to play games on a computer when you could do the same on a console for less?? I use my computer to gain knowledge, impart knowledge, do organizing work to liberate the working class from wage slavery, and write leaflets and documents. I occasionally play strategy games that are usually multi-threaded, like Galactic Civilizations II. There is no greater value on the planet than the FX processors for what I do. They save me time over the Intel processors in the $200 price class, and time and money are what's important. Frame rates of 120 are useless except to the over-privileged who buy 120 Hz monitors for their gaming. What a waste of money and resources that could be used for the advancement of humankind.
bennyg - Thursday, October 25, 2012 - link
"Value" is more than just performance per purchase dollar; running costs also need to be included.
E.g., by a basic calculation based on the charts above, the FX CPU I've saved $50 on would cost 2c extra per hour in power at full load. So 2500 hours at load would be my break-even point. That's 7 hours a day at full load over a year: a heavy-use scenario, but quite possible.
Multithreaded games are such a vast exception to the rule (that once you have "enough" CPU power you gain infinitesimal fps from more) that they are not worth even mentioning.
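The break-even arithmetic above works out cleanly (both inputs are the commenter's own figures: a $50 purchase saving and roughly 2 cents per hour of extra power cost at full load):

```python
# Break-even point: purchase saving divided by extra running cost.
savings_cents = 5000            # $50 cheaper to buy
extra_cents_per_hour = 2        # ~2c/hour more in power at full load

breakeven_hours = savings_cents // extra_cents_per_hour
print(breakeven_hours)          # 2500 hours, i.e. ~7 h/day for a year
```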
redwarrior - Thursday, October 25, 2012 - link
You know NOT of what you speak. Battlefield 3 is multithreaded; look at the AMD FX-8350 on Battlefield 3, right up near the top, better than the i5-3570 and close to the i7-3770. You guys are ignoring the facts and ignoring the trends in software. The move to parallelism is unstoppable and will accelerate. Multithreading is a growing presence, and ONLY BAD programmers and software designers ignore it. The turning point will come when Steamroller ships in a year, and it will compete nicely with Hasbeen. At 28nm it will be almost as efficient as Hasbeen. Performance-wise it will be as good.
CeriseCogburn - Tuesday, October 30, 2012 - link
LOL, why am I not surprised: a massive AMD fanboy with chips on his shoulder, and a fantasy brain.
"do organizing work to liberate the working class from wage slavery"
LOL - perfect, just like the rest of the amd fruitballs. Have fun at the OWS protests, though it would have been decent to join up with Tea Party, instead of coming on a year plus late after all the complaining. (you brought up politics fanboy)
Anyway back to your fanboy fantasy. As I said, you can look all day long at the pileofcrap amd releases and tell yourself it's the greatest ball of cheese for you, but no one has to believe your bs. One big reason why.
SB 2500K oc's to 4500 like butter on stock everything, all 4 cores, all day and all night with zero hiccups, and blows the amd crap away period.
You actually have to be very stupid to not choose it. Very stupid.
Be aware, fanboy: you're looking at a stock 2500K in all the charts, and a clear +50% increase is instantly available with it, FOR FREE.
There is no way any amd fanboy made the correct decision after 2500K was released. And it's not occurring now either. You're living a lie, so please stop sharing it with us, and by the way - I don't think it's your place to tell others WHAT they can use their computer systems for.
THEY OWN THEM, not you. They are theirs, not yours, and you shouldn't be tooting your virgin purity hippy love angel wing CRAP here, and then also have the obnoxious insolence to tell others they are wasting their computer power.
There are plenty of people who will tell you flat out you are wasting your life and wrecking the nation with the crap you are doing, no doubt about it, so keep it to yourself, won't you ?
Now let's hear how your crapdriver AMD can possibly match a 2500K in the real world...
LOL
ain't happening mister
Evilwake - Saturday, November 17, 2012 - link
Lol, that's funny, calling a spade a spade; look at yourself. I myself have your 2500K and also the Piledriver, and I don't see any difference between them in the real world. In fact, what's funny is I can run many programs in the background and still play Aion without any frame loss or stuttering problems. I can't do that with my 2500K: it drops in frame rate and stutters like hell. So keep telling peeps how much you don't know about CPUs; we really like hearing from you.
CeriseCogburn - Sunday, December 9, 2012 - link
another liar, another amd fanboy, another evil person
iceman34572 - Wednesday, January 2, 2013 - link
Who gives a crap who has the better processor? Honestly... do you work for Intel? Then why care what other people like? I have an FX-series processor, as well as several Intel machines. I like them both. Going online and getting into a pi$$ing contest over which company makes a better processor, and resorting to making fun of people (google "Internet tough guy" and you'll see what a majority of people think about that), is non-constructive, gains you nothing except negative attention, and makes you look less intelligent than you probably are. I could give a $hit what you like, or which processor you run. Neither AMD nor Intel pays me any money to give a d@mn, and whether I think you are wasting your money or spending it wisely doesn't impact me in the least bit. People, just buy what you personally like, and screw all the fanboyism that seems to be rampant ON BOTH SIDES.
pmartin - Thursday, January 3, 2013 - link
You hope it performs as well as Hasbeen. My guess is it won't. If you want top-of-the-range performance, buy Intel, simple as that.
pl1n1 - Saturday, October 27, 2012 - link
The technical arguments have some merits; the political ones are pre-digested socialist propaganda. I almost threw up at the end of the post.
Must be nice to be able to advance the cause of the class struggle from a cozy living room somewhere in a free-market country where your freedom of speech is protected by some freely elected capitalistic pig.
Useful idiots from around the world unite!
pmartin - Thursday, January 3, 2013 - link
Please shut the hell up.
captg - Wednesday, October 24, 2012 - link
What about someone with an AMD Phenom II X4 940 Black Edition at stock speeds?
Wisenos - Wednesday, October 24, 2012 - link
I run my 965 @ 4 GHz... 1.48 V, 20x200 MHz
Origin64 - Wednesday, October 24, 2012 - link
4.8 GHz? My Phenom II doesn't even do 4, but I have an extremely shitty mobo. It vdroops like a downer after a sugar rush.
BSMonitor - Tuesday, October 23, 2012 - link
Except that it requires nearly double the power of an Ivy Bridge to squeak out a few wins in those multi-threaded apps... Only when a company is this close to obscurity can we call this a win. Especially in light of ARM's competition with x86... AMD continues with insanely power-hungry chips?? Not good.
At $200 it is still a tough sell: double the power of the i5-3570K, and 80 W more than the i7-3770K. No way. The chip looks dated. Cough, cough, Pentium 4 Prescott anyone?
What market is AMD aiming at here?!? Intel produces 2 IVB per 1 of these. And IVB is an APU, of all things; this thing is AMD's non-iGPU part. Imagine if Intel released a 6-8 core IVB without the iGPU. Same die size as the IVB APU.
Bleak does not even begin to describe AMD. The fact that AMD sits at $1.5B market cap and no one is talking about buying the company says a lot.
CeriseCogburn - Tuesday, October 30, 2012 - link
Thank you for the proficient monitoring, although I disagree with at least the characterization of calling it a win based upon AMD being on its way out, or whatever.
It gets called and referred to as a win because honesty is now CRAP, and fanboy fruittard is "in". That's all.
When there is some bare win for AMD in some game, then of course it's a massive killing and total destruction, and sometimes when it's a tie or a loss it gets called and manipulated and talking-pointed and spun into a win.
Personally, I believe that's why amd is a freaking failure. They coddled and produced a large raging fanboy base, with their PR hits against nVidia and Intel, all of it lies that the fruiters totally believed, and went on a continuous rampage with.
That emotional battery allowed AMD to produce crap, not support their crap properly, feel good about their little warm and "not evil" hearts they pretended to "live by", and thus go down the frikkin tubes while bathing themselves in glory.
The very few times the massive collective of lockstep fanboy parrots broke out of their idiot mind chains and actually criticized AMD, and it only occurred several times mind you, after much ignoring and glossing over, why then AMD, shocked and stunned - WOKE THE HECK UP... got off their coddled PR fanboy based BUTTS - and did something about their huge problem...
I must say the results those few times were extraordinary for AMD, and quite exemplary in any overall comparison across the board to other companies in the mix. A few examples of that should not be hard to bring to mind.
That's why I don't like the fanboy crap. I certainly don't believe it's good for amd, nor good for my bottom line, as I suffer under the constant coddling and lying, too.
We all do.
Now it's likely too late, but I'm still hoping for a bailout for amd. Lots of oil sheiks out there.
Yoda's apprentice - Wednesday, June 26, 2013 - link
It kind of bothers me how you ignore that you're exactly the same fanboy too.
Wolfpup - Tuesday, October 23, 2012 - link
Yeah! I'm really impressed by how much better these are... the fact that they're beating Intel again in ANYTHING is awesome!
We need AMD for the competition, and nowadays, with Intel pushing their worthless video so hard, it gives AMD a competitive advantage: they can either skip video and spend more transistors on CPU, OR they can put in a massively better GPU.
I wish they had an 8 core notebook part though for the mid range with no integrated GPU....it seems like that ought to be a solid enough choice for a system, combined with a high end Nvidia or AMD GPU.
Seriously thinking of making my next notebook AMD, both to support them, and to avoid switchable graphics... (well, still have AMD's switchable graphics, but hopefully since they make the whole thing they'll do better).
I used to be scared off by AMD, as I got burnt twice on horrible 3rd-party chipsets, but I bought a C-50-based notebook last year for the kitchen, and it's been 100% rock-solid stable and non-weird, like Intel's always been known for. It makes me feel a lot better about buying an A-series notebook this year or an FX desktop.
Beenthere - Tuesday, October 23, 2012 - link
FYI - AnandTech is suffering server issues at the time of this post...
What many reviewers and fanbois tend to miss over and over is that AMD is delivering the best performance for the dollar, and that ANY current-model desktop CPU will run ANY software just fine. Unless you have some enterprise-level software that brings a modern CPU to its knees, ANY of the currently available desktop CPUs will run Windoze or Linux-based software just fine. In fact, Linux apps do even better in many cases than Windoze bloatware.
I have no idea if AMD will ever offer a discrete CPU to equal Intel's top-of-the-line, over-priced models, nor do I care. I buy what delivers the best performance for the price. I have yet to purchase any AMD desktop CPU that would not run ALL software as well as an Intel CPU, without any issues whatsoever.
If all you do is benchmark all day long and you have money to burn, blow it on an Intel CPU; unless of course you are opposed to evil, chronic, law-violating corporations looking to eliminate consumer choice. You could always vote your conscience, if you have one.
I am always amazed that people actually falsely believe that AMD processors are somehow "inadequate". Even with tainted benches, AMD processors deliver all the performance and good value that most consumers desire. It's tough, however, getting people to look at the data objectively. All most people think is that "more" is better, when in fact that's the sucker play when you look at performance vs. cost and actual needs.
Considering that Intel got a whopping ~5% performance gain from the 32nm-to-22nm node drop and tri-gate transistors with Ivy Bridge (along with overheating and poor overclockability...), AMD did quite well to deliver a ~10-15% improvement with Vishera. With AMD's pricing, Vishera should sell well because of its excellent performance and cost.
zappb - Tuesday, October 23, 2012 - link
Hear, hear!
Good to see AMD back in the saddle again, and with stellar performance in multi-threaded stuff...
ET - Tuesday, October 23, 2012 - link
I'm sorry, but I can't take seriously anything where the writer uses "Windoze". Any such text is obviously written by a heavily biased individual, and therefore any "analysis" in it is flawed.
Finally - Tuesday, October 23, 2012 - link
I'm sorry, but I can't take seriously anything where the writer uses the screen name "ET"...
extide - Tuesday, October 23, 2012 - link
X2. This guy is one of those "Linux zealots". Pretty sad that there are still people out there who feel that way.
andrewaggb - Tuesday, October 23, 2012 - link
It would be interesting to see some Linux benchmarks, considering this chip's only future may be running servers or bargain machines.
Some Linux webserver and SQL database benchmarks would be interesting. I didn't see any desktop use case for this processor at all. In every benchmarked case I'd rather have an Intel chip; even when Intel lost, it wasn't by much. And the conclusion basically said the same thing: if you are 100% sure you are running heavily threaded code all the time, then this MIGHT be the chip for you, if you don't mind a bigger power bill. That's just not great.
But as for the Windows remark, Windows is fine. Linux has some strong points, particularly with servers and its kernel->user mode transitions, but everything is a trade-off. I use Linux for many of my servers and have for years, but I mostly agree with this http://linuxfonts.narod.ru/why.linux.is.not.ready.... as to the problems with Linux. If you've genuinely used Linux a lot, you'll know most of these things are true to one degree or another. Basically, once you get X, audio and video involved, it's not awesome and you'll appreciate Windows more :-)
redwarrior - Thursday, October 25, 2012 - link
If you prefer to analyze things scientifically and think independently, you would NOT use your computer primarily or exclusively for gaming; if you do, you are a one-dimensional human being. It is a multi-faceted tool that can do work, organize a revolution, and spread joy through its communications ability. It can help the oppressed get together to fight their exploiters. It can be entertaining as well. Practicing being a paid mercenary like the SEALs does not intrigue me; it repulses me. There is nothing this CPU can't do either better, as well, or almost as well as an Intel chip in its price class. Single-threaded apps are dying out. More and more games are being programmed to take advantage of multiple cores, and AMD's superiority there is only going to grow. Dis it all you like; it shows your brain is not operating at high efficiency. It is irrational, just like those iPhone nuts who stand in line for a product that is bought as a status symbol rather than as a superior tool (which it is not).
CeriseCogburn - Tuesday, October 30, 2012 - link
Thank you so much, mr revolution. By the way, obnoxious idiot, you have NO IDEA how the person you responded to uses his or her computer(s)!
We get it, you're a dyed-in-the-wool amd freak. Now explain how the 2500K at 4500MHz on stock everything doesn't smoke the spanking lies out of you?
LOL - see you at the FEMA CAMP, i'll be on the other side of the barbed wire, mr revolutionary.
SlyNine - Saturday, November 17, 2012 - link
Let me just say. Shut up.
als2we - Friday, August 16, 2013 - link
Just look at the facts: more performance for your $$ http://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-835... AMD has it; we need it there. If AMD goes away, Intel can charge whatever they want..........
SlyNine - Saturday, November 17, 2012 - link
It doesn't matter if something is multithreaded or not. If it doesn't use more than 4 threads, Intel's single-threaded advantage still holds. Only once you fully saturate the 8 threads does the AMD, maybe, pull ever so slightly ahead. Even there it sometimes falls way behind.
If the number of threads your software is asking for is equal to the number of cores your Core i7/i5/i3 has, then Intel is spanking AMD. Only if the number of cores DOUBLES Intel's is the AMD maybe winning a little bit.
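The point above can be sketched with a toy model (the per-core speeds are made-up illustrative ratios, not measured figures for any real chip):

```python
# Toy throughput model: total work/sec when `threads` runnable threads
# are spread over `cores` cores, each doing `per_core` work/sec.
# The speed ratios below are invented for illustration only.
def throughput(threads, cores, per_core):
    return min(threads, cores) * per_core

FAST = (4, 1.5)   # 4 cores, 1.5x per-core speed (the "fast-narrow" case)
WIDE = (8, 1.0)   # 8 cores, 1.0x per-core speed (the "slow-wide" case)

for t in (1, 2, 4, 8):
    fast = throughput(t, *FAST)
    wide = throughput(t, *WIDE)
    print(f"{t} threads: fast-narrow={fast:.1f}, slow-wide={wide:.1f}")
# Up to 4 threads the fast-narrow chip leads (1.5 vs 1.0 per thread);
# only at full 8-thread saturation does the wide chip pull ahead (8.0 vs 6.0).
```

In other words, the wide chip only wins once the thread count exceeds the narrow chip's core count by enough to make up the per-core deficit.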
apache1649 - Friday, November 29, 2013 - link
I'm sorry, this is a bit off-topic to the processors (although I can't say I'm partial to either; I'll be using an AMD because it works in the motherboard I want and performs well with all the other parts I want, but I have used my friend's i5 build and it runs very nicely), but how do you say that Linux is only for servers and bargain builds? Besides the fact that I have seen $600 builds that blow away multiple-thousand-dollar builds, Linux has become an extremely advanced OS in the last few years. The desktop environments available are all more intuitive than either Windows or Mac, given that they can be customized to the user's preferences down to where they interact directly with the kernel. Not to mention they offer a vast array of features that Windows and Mac don't, as well as using far fewer resources. I have Linux operating systems that will idle around 1.2% CPU usage; Windows 7 idles around 5%-6%. It manages network connections more efficiently, utilizes the resources it does use much more effectively, and in general just gives a much more immersive and intuitive user experience if you know what you're doing.
I would really like to see more support for Linux, because if the software and firmware available for Windows were available for Linux without the use of WINE, I would use Linux exclusively; it would be so much more efficient. At this point it has become so streamlined and beautiful that most Windows users who have seen me using it say they would switch, for its ease of use and visual appeal, if only all the software they use on Windows was available.
Oh, and putting servers and bargain builds in the same group really wasn't well thought out... Most servers have high-end components to be able to handle large amounts of traffic and heavy loads on resources.
apache1649 - Friday, November 29, 2013 - link
Also, X is not the only option. There are other, more functional, less bulky alternatives.
Taft12 - Tuesday, October 23, 2012 - link
Ad hominem fallacy. Address his arguments, not the slang.
Windows is inappropriate for many important purposes, it says so right there in the EULA.
jabber - Tuesday, October 23, 2012 - link
Indeed, reading a lot of comments over the past 18 months you would think AMD were still pushing their old K6-2 CPUs from the turn of the century. Build an AMD machine or an Intel one and the average Joe Customer isn't going to notice the difference.
If we're honest, most of us here probably wouldn't either.
CeriseCogburn - Tuesday, October 30, 2012 - link
Funny how the same type of thing could be said in the video card wars, but all those amd fanboys won't say it there!
Isn't that strange, how the rules change, all for poor little crappy amd the loser, in any and every direction possible, even in opposite directions, so long as it fits the current crap hand up amd needs to "get there" since it's never "beenthere". LOL
whatthehey - Tuesday, October 23, 2012 - link
We've heard all of this before, and while much of what you say is true, and ignoring the idiotic "Windoze" comments not to mention the tirade on "evil Intel", Anand sums it up quite clearly: Vishera performance isn't terrible but it's not great either. It can beat Intel in a few specific workloads (which very few people will ever run consistently), but in common workloads (lightly threaded) it falls behind by a large margin. All of this would be fine, were it not for the fact that Vishera basically sucks down a lot of power in comparison to Ivy Bridge and Sandy Bridge. Yes, that's right: even at 32nm with Sandy Bridge, Intel beats Vishera hands down.
If we assume Anand's AMD platform is a bit heavy on power use by 15W (which seems kind as it's probably more like 5-10W extra at most), then we have idle power slightly in Intel's favor but load power favors Intel by 80W. 80W in this case is 80% more power than the Intel platform, which means AMD is basically using a lot more energy just to keep up (and the Sandy Bridge i5-2500K uses about 70W less).
So go ahead and "save" all that money with your performance-for-dollar champion where you spend $200 on the CPU, $125 on the motherboard (because you still need a good motherboard, not some piece of crap), coming to $325 total for the core platform. Intel i5-3570K goes for $220 most of the time (e.g. Amazon), but you can snag it for just $190 (plus $10 shipping) from MicroCenter right now. As for motherboards, a decent Z77 motherboard will also set you back around $125.
So if we go with a higher class Intel motherboard, pay Newegg pricing on all parts, and go with a cheaper (lower class) AMD motherboard, we're basically talking $220 for the FX-8350 (price gouging by Newegg), $90 for a mediocre Biostar 970 chipset motherboard, and a total of $310. If we go Intel it's $230 for the i5-3570K, and let's go nuts and get the $150 Gigabyte board, bringing us to $380. You save $70 in that case (which is already seriously biased since we're talking high-end Gigabyte vs. mainstream Biostar).
Now, let's just go with power use of 60W Intel vs. 70W AMD: if you never push the CPUs, you would only spend about $8.75 extra per year leaving the systems on 24/7. Turn them off most of the day (8 hours per day of use) and we're at less than $3 difference in power costs per year. Okay, fine, but why get a $200+ CPU if you're going to be idle and powered off 2/3 of the day?
Let's say you're an enthusiast (which Beenthere obviously tries to be, even with the heavy AMD bias), so you're playing games, downloading files, and doing other complex stuff where your PC is on all the time. Hell, maybe you're even running Linux with a server on the system, so it's both loaded moderately to heavily and powered on 24/7! That's awesome, because now the AMD system uses 80W more power per day, which comes out to $70 in additional power costs per year. Oops. All of your "best performance-for-the-dollar" make believe talk goes out the window.
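For reference, those dollar figures follow from simple energy arithmetic, assuming roughly $0.10/kWh (an assumed round number; actual rates vary a lot by region):

```python
# Annual electricity cost of an extra power draw, at an assumed rate.
RATE_PER_KWH = 0.10  # $/kWh, assumed; check your own utility's rate

def annual_cost(extra_watts, hours_per_day):
    # watts * hours/day * days/year -> Wh/year; divide by 1000 for kWh
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * RATE_PER_KWH

print(round(annual_cost(10, 24), 2))  # 8.76  -> the ~$8.75/yr idle-gap figure
print(round(annual_cost(10, 8), 2))   # 2.92  -> "less than $3" at 8 h/day
print(round(annual_cost(80, 24), 2))  # 70.08 -> the ~$70/yr loaded-24/7 figure
```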
Even in the areas where AMD leads (e.g. x264), it does so by a small to moderate margin while using almost twice as much power. x264 is 26% faster on the FX-8350 compared to the i5-3570K, but if you keep your system for even two years you could buy the i7-3770K (the FX is only 3% faster in that case) and you'd come out ahead in terms of overall cost.
The only reason to get the AMD platform is if you run a specific workload where AMD is faster (e.g. x264), or if you're going budget and buying the FX-4300 and you don't need performance. Or if you're a bleeding heart liberal with some missing brain cells who thinks that supporting one gigantic corporation (AMD) makes you a good person while supporting an even more gigantic corporation (Intel) makes you bad. Let's not use products from any of the largest corporations in the world in that case, because every one of them is "evil and law violating" to some extent. Personally, I'm going to continue shopping at Walmart and using Intel CPUs until/unless something clearly better comes along.
DarkXale - Tuesday, October 23, 2012 - link
I would also add in the cost of a power supply at least 100W more powerful, the cost of better cooling (via better/more fans or a better case), and the 'cost' of having a system with a higher noise profile.
Finally - Tuesday, October 23, 2012 - link
That talk suffers from the same inability to consider any other viewpoint but that of the hardware fetishist. If you are fapping to benchmarks in your free time, you are the 1%.
The other 99% couldn't care less which company produced their CPU, GPU or whatever is working the "magic" inside their PC.
dananski - Tuesday, October 23, 2012 - link
I agree with you, but stopped reading at "uses 80W more power per day" because you have ruined your trustworthiness with unit fail.
CeriseCogburn - Tuesday, October 30, 2012 - link
Hey idiot, he got everything correct except saying 80W more every second of the day, and suddenly you, the brilliant critic, no doubt, discount everything else.
Well guess what, genius - if an error is all you can detect, and that's all you've got, HE IS LARGELY CORRECT, AND EVEN CORRECT ON THE POINT concerning the unit error you criticized.
So who the gigantic FOOL is that completely ruined their own credibility by being such a moronic freaking idiot parrot, that no one should pay attention to ?
THAT WOULD BE YOU, DUMB DUMB !
Here's a news flash for all you scum-sucking doofuses: just because someone gets some minor grammatical or phrasing issue written improperly, THEY DON'T LOSE A DAMN THING, AND CERTAINLY NOT CREDIBILITY, WHEN YOU FRIKKIN RETARDS CANNOT PROVE A SINGLE ONE OF THE MANY POINTS MADE INCORRECT!
It really would be nice if you babbling idiots stopped doing it. But you do it because it's stupid, it's irritating, it's incorrect, and you've seen a hundred other jerk-offs like yourself pull that crap, and you just cannot resist, because that's all you've got, right?
LOL - now you may complain about caps.
Siana - Thursday, October 25, 2012 - link
It looks like the extra 10W in the idle test could be largely or solely due to the mainboard. There is no clear evidence to what extent, or whether at all, the new AMD draws more power than Intel at idle.
A high-end CPU at low utilization (mostly idle time) is in fact a very common and useful case. For example, as a software developer, I spend most of my time reading and writing code (idle) or testing the software (utilization: 15-30% CPU, effectively two cores tops). However, in between, software needs to be compiled, and this is unproductive time which I'd like to keep as short as possible, so I am inclined to choose a high-end CPU. For the GCC compiler on Linux, the new AMD platform beats any i5 and a Sandy Bridge i7, but is a bit behind an Ivy Bridge i7.
Same with, say, a person who does video editing: they will have a lot of low-utilization time too, just because there's no batch job their system could perform most of the time. The CPU isn't gonna be the limiting factor while editing, but when doing a batch job (usually h264 export) they may also have an advantage from AMD.
In fact, of every task I can think of - 3D production, image editing, sound and music production, etc. - I just cannot think of one with an average CPU utilization of more than 50%, so I think your figure of an 80Wh/day disadvantage for AMD is pretty much unobtainable.
And oh, no one in their right mind runs an internet-facing server on their desktop computer, for a variety of good reasons, so while Linux is easy to use as a server even at home, it ends up a limited-scope, local server, and again the utilization will be very low. However, you are much less likely to be bothered by the services you're providing, due to the sheer number of integer cores. In case you're wondering, in order to saturate a well-managed Linux server built from up-to-date desktop components, no connection you can get at home will be sufficient, so it makes sense to co-locate your server at a datacenter or rent theirs. Datacenters go to great lengths to not be connected to a single point (which in your case is your ISP) but to have low-latency connections to many Internet nodes, in order to enable the servers to be used efficiently.
As for people who don't need a high-end system, AMD offers a better on-die graphics accelerator, and at the lower end the power consumption difference isn't gonna be big in absolute terms.
And oh, "downloading files" doesn't count as "complex stuff"; it's a very low CPU-utilization task, though I don't think this changes much apropos the argument.
And I don't follow that you need a $125 mainboard for AMD; $60 boards work quite well. You generally get away with cheaper boards for AMD than for Intel, even when taking into account the somewhat higher power-handling capacity the board needs.
Intel's power/thermal advantage of course extends to cooling noise, and it makes sense to pay extra to keep computer noise down. However, the CPU is just so rarely the culprit any longer, with the GPU of a high-end computer being noise-maker number one, vibrations induced by the hard disk number two, and only to a small extent the CPU and its thermal contribution.
Hardly anything of the above makes Piledriver the absolute first-choice CPU; however, it's still not a bad choice.
Finally, the desktop market isn't so important; the margins are terrible. The most important bit for now for AMD is the server market. Obviously the big power-consumption disadvantage vs. Intel is there too, and it generally matters in the server market, but with virtualization AMD can avoid a sharp performance drop-off and allow deploying up to about 1/3rd more VMs per CPU package because of the higher number of integer cores, which can offset the higher power consumption per package per unit of performance. I think they're onto something there: they have a technology they use on mobile chips now which allows them to sacrifice top frequency but reduce surface area and power consumption. If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within the realm of the possible and could very well be a GREAT winner in that market.
CeriseCogburn - Tuesday, October 30, 2012 - link
More speculation from mr gnu.
This of course caps it all off - the utter amd fanboy blazing in our faces; once again the FANTASY FUTURE is the big amd win:
" If they make a server chip based on that technology, with high performance-per-watt and 12 or more cores, that is very well within realms of possible and could very well be a GREAT winner in that market. "
LOL - why, perhaps you should be consulting, or be their next COO or CEO?
I'm telling you man, that is why, that is why.
Kjella - Tuesday, October 23, 2012 - link
Except the "any CPU is fine" market isn't about $200 processors, Intel or AMD. That market is now south of $50 and making them pennies, with Celerons and Atoms competing with the AMD A4 series. You're not spending this kind of money on a CPU unless performance matters. Funny that you're dissing the overclockability of IVB while pushing a processor that burns 200W when overclocked - you honestly want THAT in your rig instead?
Honestly, while this at least puts them back in the ring, it can't be that great for AMD's finances. They still have the same die size and get to raise the price of their top processor from $183 to $199, yay. But I guess they have to do something to try bringing non-APU sales back up; Bulldozer cannot have sold well at all. And I still fear Haswell will knock AMD out of the ring again...
Jaybus - Tuesday, October 23, 2012 - link
I agree. I would think they may do better with the 16-core Socket G34 Opterons with 4 RAM channels, particularly if they can get down to 95W at 2.5 GHz. A 2-socket board gives 32 cores with lots of RAM per 2U server chassis. This should work nicely for high-availability virtualized clusters. In this environment it is better to have more cores in the same power envelope than faster per-core performance, because the virtual machines are independent from one another. I think Piledriver can compete in this environment much better than in the non-APU desktop/workstation market.
Sufo - Tuesday, October 23, 2012 - link
"If all you do is benchmark all day long and you have money to burn, blow it on an Intel CPU"
Uh, I'd happily take one to play games on my "Windoze" machine.
Idiot.
cfaalm - Tuesday, October 23, 2012 - link
The thing is that people want balanced performance - balanced between single- and multithreaded, that is. Piledriver does a lot better than Bulldozer here, but I think Intel still offers a better balance. As much as I would like to build a new AMD system, I think it will be Intel this time around.
lmcd - Tuesday, October 23, 2012 - link
What class of gaming are you looking at? If you're looking at even midrange gaming, your best bet is an A10 + a 6670 (runs $60-$70 average and $90 for low profile). Really a great gaming value option.just4U - Tuesday, October 23, 2012 - link
lmcd, I just did that for our secondary machine and put in a 6850. Works quite well... aside from bios issues on a brand-new board chipset, that is. Considering prices on the 7750/70, I'd probably opt for one of those at $30 more than any of the 6x series. I'd also probably have picked up one of these new CPUs over an A10 given the opportunity.
CeriseCogburn - Tuesday, October 30, 2012 - link
LMAO at fanboy system frikk failure.... hahahahha "aside from bios issues" and uhh.. the "crashing".. and uhh, I'd buy not the 6850 but the 7770, and uh... not the A10 but one of these...
LOL - there is the life of the amd fanboy
'nar - Tuesday, October 23, 2012 - link
I frequently have two or three high-CPU apps running at a time, so would AMD be better in this case, even though each app runs better on a Core i5 individually?
I shoot for a do-it-all system. I run a video encode, get bored and start a game. I run malware scans on external drives and back up other drives into compressed images. Perhaps if you ran h.264 encodes while you ran another benchmark, like Skyrim, or the browser bench?
Oh, typo on page 6, I think "gian" where you meant gain.
MySchizoBuddy - Tuesday, October 23, 2012 - link
Correct; for your workload AMD is a better choice in speed and cost.
Blibbax - Tuesday, October 23, 2012 - link
Read the techreport review. Intel still comes out on top.
CeriseCogburn - Tuesday, October 30, 2012 - link
Don't worry, AMD is going to SteamRoll Intel soon!
CeriseCogburn - Tuesday, October 30, 2012 - link
NO, amd never does better. It does worse, often by a lot, and sad little cheapo SB's spank it sorry a lot of the time.
Mugur - Tuesday, October 23, 2012 - link
I'm trying to find a good scenario for these desktop CPUs... Cheap 8-core virtualization hosts? Video encoding? Other than that, in this "mobile" world where every desktop PC looks out of date, I don't know what you can do with them. They are obviously not good for light loads or gaming...
lmcd - Tuesday, October 23, 2012 - link
The architecture makes more sense when fewer modules are used, i.e. the APU series. Look at how Trinity destroyed Llano, both desktop and mobile. And note that an A10 + 6670 is a perfect midrange gaming value.
Fanboy much? Now we have again the amd perfection. LOL
SB smacks it down, as does nVidia. Sorry, fanboy, amd has nothing that is a perfect value, especially in gaming.
RussianSensation - Tuesday, November 6, 2012 - link
What are you blabbing about? You should be banned from this forum.
While Intel's CPUs are clearly in a class of their own for high-end CPU gaming rigs, AMD's GPUs are doing very well this generation, having captured the single-GPU performance crown, performance/$ and overclocking performance. The minute you said NV smacks AMD's GPUs around, you lost ALL credibility.
http://www.techpowerup.com/reviews/AMD/Catalyst_12...
You may want to take a look at 90% of all the games that came out in 2012 - GTX680 loses to 7970 GE (or 680 OC vs. 7970 OC). Facts must not sit well with AMD haters.
mayankleoboy1 - Tuesday, October 23, 2012 - link
Nice performance predictions for Haswell and Steamroller.
But IMHO, 15% increase for Haswell is too high and 15% for Steamroller is low.
IMHO, more realistic expectations would be:
Haswell: 10%, probably more like 8%.
Steamroller: 20%.
dishayu - Tuesday, October 23, 2012 - link
Steamroller's 15% is straight from the horse's (AMD's) mouth, and 15% for Haswell is well within reason because it's a "tock" (new architecture). So I think 15% for both works out fine for making speculative statements at this moment.
CeriseCogburn - Tuesday, October 30, 2012 - link
Except we can probably agree amd will fail to meet their goal in most of these future cases, and be very late as well (it's the cpu side after all, and they suck at being on time), so some sideways review with all the possible advantages in testing and text will need to be skewed toward the amd side (and it will be) to be able to claim "amd met its goal!"
Let's face it, if amd meets some bare-minimum "test" for "achieving that goal" in one single area, the review will claim "they accomplished it". Like the recent time amd released whatever it was, a cpu or a vid card, and they had a single instance or so of the 5 or 10 PROMISED on the egg shelf the day of their declared release, and we were told (yes, here) "they did it!"
LOL
On the other side of the "perfectly legitimate 15% is an equal and fair equation of the future", we have Intel, which likely won't be late (and if so, just barely), and which has the cojones and the record to probably exceed their stated goals.
To be honest, the best I can do for amd is say I like their new cpu name " Steamroller !"
For some reason, and this is of course pure speculation, hopeful ESP, wimmin's intuition, or perhaps a desperate attempt to give them a break...but...
I think the name Steamroller indicates they will have a winner when that puppy comes out.
Yes, I think it's crazy as well, but what if I'm right ?
LOL
I might be absolutely correct.
ac2 - Monday, October 22, 2012 - link
I really like those graphs... Especially the Sysmark and Compile ones, which project Steamroller being ahead of Ivy Bridge, and of course clear wins for the multi-threaded ones.
I wish Anand had done a similar current->projection graph for power consumption as well, that would have been very useful.
* Fingers crossed *, I think (wish!) the next-gen APUs - Steamroller + GCN + a new process node - will be a real winner for AMD, assuming the power envelope comes down to < 100W TDP across the board and delivery in 2013.
CeriseCogburn - Tuesday, October 30, 2012 - link
Isn't it interesting how, when it comes to AMD, the fanboy will go to great lengths never before seen, never before done for any other product from any other entity ever, and I mean ever, and spend their time in pure speculation about the future, graph it out, get their hopes going, take a look at the futureville landscape - LOL
It's AMAZING.
Penti - Tuesday, October 23, 2012 - link
Just proves, as it would, that there shouldn't be any AM3+ socket for AMD, as there are no high-end processors, and it doesn't even help general computing, workstation and gaming performance that AM3+ has 8-core chips. If you want i3 performance and gaming, why would you buy anything over the FX4300? If you buy a chip like that, why couldn't you and AMD just have gone with FM2 instead? Why not launch 8-core FM2 chips if you really want them in consumers' hands? FM2 will get Steamroller; why not make sure it will be thriving as a platform instead of having two desktop platforms? I don't think AM3+ justifies its existence; I don't really want it and it doesn't really bring anything. It just reminds me of the AM2/+ and AM3 Phenom II days and looks dated. I understand that there is no HyperTransport in the FM2 platform, but let workstation users just buy Opteron server chips. AMD still needs to up its single-threaded performance by about 50%.
Mysteoa - Tuesday, October 23, 2012 - link
It is not so easy to make it for FM2. First they have to put the NB inside the CPU in the FX line for it to work on FM2. That will require more space, power consumption and heat.
Penti - Tuesday, October 23, 2012 - link
The NB has been inside the CPU since K8 (Athlon 64 / Opteron, 2003). It's only a HT (or IO) to PCIe bridge plus an integrated GPU in Trinity, and they already have Piledriver in Trinity (the only CPU for FM2), so what the hell are you talking about? It performs just the same and is a more modern core for core. The NB is just a DRAM controller and some registers. I see it more like this: LGA1155 is good enough for everything nowadays, and so is FM2 despite not having 6- or 8-core CPUs; just having faster CPUs is enough. You can bake, bind or lay out faster processors for FM2, even adding to the number of cores if you like. The AM3+ chipset doesn't have PCIe 3.0 in the 990 series and doesn't seem to be getting it any time soon, so why buy into that platform? It certainly isn't much more performant. Enthusiasts can use SR5670 and Opterons. It's not like the 990 series has USB3 support anyway. Now, FM2 doesn't have PCIe 3 support either, but might gain it in an upgrade, which of course would require a new CPU and new motherboards; on AM3+ it would require new motherboards with a new chipset only. I don't think PCIe "2.1" is a hindrance though, and a new CPU would benefit greatly either way. It's just a list of things that adds to the "feels old" category.
ET - Tuesday, October 23, 2012 - link
So you want to block the upgrade path? If, as you say, the FX4300 is all that people need, they should go ahead and buy Trinity. I'm sure that AMD will also release Athlon CPUs for FM2 like they did with FM1 and Llano. But for people who want a higher-end AMD CPU, perhaps to upgrade their old one, being able to use the same socket as the old one is helpful. A stable platform is a good thing, and AM3+ motherboards have been on the market for a while. I just don't see the rationale behind junking it in favour of the socket-du-jour.
Penti - Tuesday, October 23, 2012 - link
Naw, I would say just go with Socket C32 and Opteron-derived products. It's an under-utilized socket anyway. I don't want to take away choice, just have a cleaner and more sensible line for desktop. AM3+ just seems like an orphaned platform to put Bulldozer out there, so it has lost its relevance. They should release higher-performing Trinity chips too, is what I would argue for, of course. Having four sockets here doesn't make a lot of sense. Moving to C32 would enable enthusiast-like boards, and cheaper workstation boards with dual sockets too, i.e. two 8-core / 4-module Piledriver chips. It has dual-channel DDR3 like the desktop platforms. Maybe even two higher-clocked native 6-core variants would have been something. But they won't do it. Of course Intel doesn't need to counter with an enthusiast platform either. Mainstream platforms are where it's at. Having an upgrade path for two desktop platforms, different chipsets etc. doesn't make a lot of sense right now.
SonicIce - Tuesday, October 23, 2012 - link
2nd paragraph: "Look beyond those specific applications however and Intel can pull away with a significantly lead."
Dadofamunky - Tuesday, October 23, 2012 - link
With 16GB of RAM and eight threads, why aren't we seeing realistic VM-driven virtualization benchmarks? Honestly, this is a huge application area that remains ignored by AT in their core architecture reviews. Something I always look for and never find.
frozen ox - Tuesday, October 23, 2012 - link
Yes please! This is the only reason I even read reviews about CPUs with more than 4 cores.
JohanAnandtech - Tuesday, October 23, 2012 - link
What kind of usage scenarios are you thinking of? Because virtualization benches are very prominent in our AT Opteron reviews. Virtualization on top of the desktop is rarely done to run heavy loads, AFAIK.
sep332 - Tuesday, October 23, 2012 - link
I do keep some VMs running on my desktop but they are not generally loaded. I'm assuming, because of the power draw, these would not be a good choice for a dedicated VM server build?
MySchizoBuddy - Tuesday, October 23, 2012 - link
Can they do OpenCL like the Intel counterpart?
Ryan Smith - Tuesday, October 23, 2012 - link
Keep in mind that Vishera doesn't have an on-die GPU. OpenCL can run on the GPU or the CPU (with the appropriate ICDs), but we're almost always talking about GPU execution when we're talking about OpenCL.
Beenthere - Tuesday, October 23, 2012 - link
Test after test by many reviewers using real apps, not synthetic benches which exaggerate RAM results, has shown that DDR3 running at 1333-1600 MHz shows no system bottleneck on a typical Intel or AMD powered desktop PC. Even when increasing the RAM frequency to 2600 MHz there were no tangible gains, because the existing bandwidth at 1333 MHz is not saturated enough to cause a bottleneck. APUs do show some GPU benefit with up to 1866 MHz RAM.
fredbloggs73 - Tuesday, October 23, 2012 - link
Hey Anand, great review! Can we please see some undervolting results for the FX-8350, like the i7-3770K undervolting article?
Thanks
dishayu - Tuesday, October 23, 2012 - link
It's absolutely ridiculous that even though AMD has pushed out quite a nice and competitive product (in that price range), Intel has gotten way too big in the past 6 years that AMD was sleeping, and I don't think they'll feel pressured to make any price cuts still. So even though we still have so-called competition, Intel has a virtual monopoly, and I can't hope that the new AMD releases will help drive prices down any more.
dishayu - Tuesday, October 23, 2012 - link
Additional thought: I do believe that, apart from the power consumption, AMD has the more compelling overall processor with the 8350. Single-threaded performance has long since crossed the point where you could tell the difference in experience between AMD and Intel (the exception to this is gaming). And AMD is better in heavily threaded applications. So, IF ONLY they could fix the power problem, I wouldn't hesitate to recommend an AMD system for any purpose other than gaming. Just my 2 cents.
figus77 - Tuesday, October 23, 2012 - link
But really... even in games, where is the bottleneck with an FX? Remember that 99% of monitors have a 60 Hz refresh rate, and you can't see more than 60 fps on such a screen without glitches, so what's the difference between 85 and 95 fps?
I've got an [email protected] with an HD6950 flashed to 6970. Really, I can't find a single game that doesn't run smoothly at 1920x1080, and playing Skyrim with 4 cores allocated to the game while two pairs of the other cores are doing video processing on 2 anime episodes is pleasing :-)
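The 60 Hz point above can be put in numbers. A rough sketch of the frame-time arithmetic only (it ignores frame pacing, tearing and input latency):

```python
# Frame time at a given fps vs. the ~16.7 ms budget of a 60 Hz display.
def frame_time_ms(fps):
    return 1000.0 / fps

REFRESH_BUDGET_MS = 1000.0 / 60  # one refresh every ~16.7 ms at 60 Hz

for fps in (60, 85, 95):
    ms = frame_time_ms(fps)
    # At or under the budget, every refresh has a fresh frame waiting.
    print(f"{fps:3d} fps -> {ms:5.2f} ms/frame (within 60 Hz budget: {ms <= REFRESH_BUDGET_MS})")
```

Both 85 and 95 fps deliver a new frame well inside every 16.7 ms refresh window, which is the sense in which the difference is invisible on a 60 Hz panel.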
CeriseCogburn - Tuesday, October 30, 2012 - link
Problem is, you have that attitude more than once, and then you're into a slowly degrading slope of lost performance. So, why do we get these arguments from amd fanboys?
Obviously you purchased the 6950 and decided you needed every last drop of juice from it, and there you go, oc'ed to 6970...
So on the one hand you've taken a safe and sufficient card and hammered the piss out of it for a few more frames you cannot notice, as you do tell us you cannot notice them, and then you take your cannot-notice-them argument and claim that's why your amd cpu is so great... and intel is not needed.
I see you did that without even noticing. You totally freaking contradicted yourself, completely.
Look, go ahead and buy amd and be a fanboy, I say more power to you, just DON'T type up crap talking points with 100% absolute contradiction, plussing them blatantly in amd's favor, and expect I'll just suck em down like freaking koolaid.
DO NOT.
Intel is desired for the very same reason, you the amd fanboy, bought the 6950 and have it OC'ed to 6970 speeds.
Sorry bub, nice try, till you ate both your own shoes.
Ellimist - Tuesday, October 23, 2012 - link
I would also like to see virtualisation benchmarks as well. Multi-core processors should lend themselves to some interesting results in this sort of benchmark. Good review btw. Keep up the good work.
silverblue - Tuesday, October 23, 2012 - link
I think that's Johan's domain; expect him to do so when he looks at the new server-based models.
HW_mee - Tuesday, October 23, 2012 - link
The 8320 is a 125W part
CeriseCogburn - Tuesday, October 30, 2012 - link
LOL - oh dats not much powa! All da amd fanboys don't care about power or saving the earth, they never cared, they never are worried, nor have they ever been worried about housefires... All that hate toward JenHsun and all those years of destroying-the-earth rage against nVidia, that actually happened on another planet in an alternate universe.
gamoniac - Tuesday, October 23, 2012 - link
Anand, while enjoying your Vishera review, upon loading a page the ad on your page took over my browser and navigated to a fishy-looking "You are today's 100,000th visitor!" site. I know AT contracts out the ad part, but I thought you might be interested in knowing what happened. Here is a screenshot of what I experienced -- http://i45.tinypic.com/2aigegx.jpg
Good reading, as always.
Ryan Smith - Tuesday, October 23, 2012 - link
Thank you for reporting that. I'll see if I can track down the source and get them removed.
artk2219 - Tuesday, October 23, 2012 - link
Thanks for the review and all of the time that went into it! It looks like I can finally start recommending something that isn't a Phenom II from AMD. Even though I personally love AMD and haven't used an Intel chip since the P3 Coppermine days, I couldn't recommend anything from AMD to anyone else this past year with a clear conscience, at least if it wasn't a stand-alone CPU upgrade. But from the tests that I could find comparing the FX-4300 to the i3-3220, it looks like it's mostly a wash between them, well, other than power usage and thermals that is, so that was looking great! Is there any chance of a stand-alone review in the future comparing the two? It looks like the 8350 performs about like a 2500K, which is also awesome; unfortunately it's at least a year late to the party :(. This does however give me great hope for Steamroller. The only issue I had with the article is the games selection (too few, and too Intel-biased in the titles (SC2 and WoW)); might there be a more in-depth review on that in the future as well? Thanks again for the review though!
CeriseCogburn - Tuesday, October 30, 2012 - link
Wow hi there. Glad to meet you, an amd fanboy with the glimmers of a conscience. Now, don't forget the i5 2500K OCs from 3200 right on up to 4500 (and beyond) on stock voltage and the crap stock fan, so it actually SMOKES the daylights out of this fishy vishy.
Just thought I'd mention that.
SilthDraeth - Monday, October 22, 2012 - link
That was weird; I was reading this article about Vishera and it got pulled in the middle of me reading it.
coder111 - Tuesday, October 23, 2012 - link
Now what I'd like to see is how many of these benchmarks are compiled with the Intel compiler. In case you don't know yet, the Intel compiler disables a lot of optimizations if you are not running a Genuine Intel CPU, even if your CPU supports the required features and would benefit from those optimizations. In other words, anything compiled with the Intel compiler will run slower on AMD CPUs just because of the Intel compiler. Now you can argue that this is a reflection of real performance on Windows, as in Windows quite a few DLLs are compiled with the Intel compiler as well.
What I'd like to see is some more benchmarks for Linux operating system and/or professional software. Things like data base servers (including something non-Microsoft, like PostgreSQL or MySQL), java application servers, GCC compiler benchmarks, apache/PHP server, virtualization, python/perl/ruby, LibreOffice/OpenOffice productivity.
Now, back to Vishera. This looks like a nice CPU. I haven't been CPU bound in my work for a while now, so performance-wise this would be sufficient for my needs. What I'd like to see however is lower power consumption. Unfortunately I don't see that coming until GlobalFoundries shrinks their process...
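The vendor-gated dispatch behaviour described above boils down to a pattern like the following. This is a simplified sketch in Python, not Intel's actual runtime code; the vendor strings are the real CPUID values, everything else is illustrative:

```python
# Simplified sketch of vendor-gated code-path dispatch: the optimized path
# is chosen on the CPUID vendor string, not on feature flags alone.
def pick_code_path(vendor, has_sse2):
    # A purely feature-based dispatcher would look only at has_sse2;
    # the behaviour described above also requires "GenuineIntel".
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2_optimized"
    return "generic_baseline"

# An AMD CPU that reports SSE2 support still lands on the baseline path:
print(pick_code_path("AuthenticAMD", has_sse2=True))   # generic_baseline
print(pick_code_path("GenuineIntel", has_sse2=True))   # sse2_optimized
```

That is why a binary built this way can run measurably slower on an AMD chip that is fully capable of executing the fast path.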
CeriseCogburn - Tuesday, October 30, 2012 - link
So amd has abandoned gamers on the cpu side. Good to know.
Blibbax - Tuesday, October 23, 2012 - link
The Dragon Age: Origins graph has the 8150 in blue and no i3.
Blibbax - Tuesday, October 23, 2012 - link
SC2 and WoW also.dishayu - Tuesday, October 23, 2012 - link
7zip as well
Blibbax - Tuesday, October 23, 2012 - link
"As I mentioned earlier however, this particular test runs quicker on Vishera however the test would have to be much longer in order to really give AMD the overall efficiency advantage."
If you think about it, efficiency is unrelated to the length of the test.
CeriseCogburn - Tuesday, October 30, 2012 - link
He was talking about electrical usage vs work done: with amd's higher per-second use of electricity, it must complete the test MUCH faster than Intel in order to win. It completed faster, but not fast enough to use less energy.
This lesson is over for amd.
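The energy argument here is just energy = power x time: a higher-power chip has to finish a fixed task proportionally faster to draw less total energy. A back-of-the-envelope sketch with purely illustrative numbers (not the review's measurements):

```python
# Energy for a fixed task = power x time; the higher-power chip must finish
# proportionally faster to come out ahead. All numbers are illustrative.
def energy_wh(watts, seconds):
    return watts * seconds / 3600.0

amd_wh   = energy_wh(195, 100)   # hypothetical: 195 W load, finishes in 100 s
intel_wh = energy_wh(120, 140)   # hypothetical: 120 W load, finishes in 140 s
print(f"AMD {amd_wh:.2f} Wh vs Intel {intel_wh:.2f} Wh")

# The 195 W part would need to finish in under 120*140/195 ~ 86 s to win.
break_even_s = 120 * 140 / 195
print(f"break-even time at 195 W: {break_even_s:.0f} s")
```

With these numbers the faster finish still loses on total energy, which is exactly the point being made about the test length.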
iTzSnypah - Tuesday, October 23, 2012 - link
In price per performance. A 125W part beating a 67W part (not sure about that figure) will cause Intel to keep the same TDP for 2014 and just have a 35-40% performance increase. I can only hope.
CeriseCogburn - Tuesday, October 30, 2012 - link
If you're used to running or servicing Intel CPUs, then you pick up the LEAD WEIGHT that is the modern AMD CPU, and all that HEAT SINKING comes to mind. I mean they are just honkers. You pick it up and it's like, what the heck!?
meloz - Tuesday, October 23, 2012 - link
I wish Anandtech would include some form of value scatter graphs like Techreport does in its reviews. The graphs do not have to be an exact imitation of what Techreport does, and the benchmark(s) used to determine the 'overall performance' can be different. Perhaps we could even get performance per watt per dollar graphs.
Graphs like these make the whole exercise of comparing competing products so much more relevant to users, because most of us will buy the most performant processor per dollar.
As example:
http://techreport.com/r.x/amd-fx-8350/value-scatte...
http://techreport.com/r.x/amd-fx-8350/gaming-scatt...
This is, of course, considering the result without any attention to performance/watt. If you include power consumption in the calculations at all, Intel is an easy choice.
Difficult to see how AMD will cope with Haswell, even if they get another 15% boost next year. The gap in performance / watt only seems to be diverging, Intel taking a commanding lead.
CeriseCogburn - Tuesday, October 30, 2012 - link
So did you buy the i5 3470, or the FX 6200? According to you and your 1st chart, that's what "most of us bought". Okay, since we know that's total BS, what you said is also total BS.
" because most of us will buy the most performant processor per dollar "
LOL - okay, so there's a big problem bub - OC the 2500K and it skyrockets off the top of your 1st chart straight up.
So, did you buy the 2500K, like "most of us did" if we "used your declared knowledge about us all" and added 2 watts of common sense into the mix ?
Why must you people torture us so ?
Idiot10 - Tuesday, May 7, 2013 - link
Hey Mr. ChariseHogburn, why don't you take your 2500K with you and leave us all to our musings? You seem to know everything about processors; why don't you let others do what they want to do? You big piece of Intel mercenary shit! SOB!!!!
Mathos - Tuesday, October 23, 2012 - link
It does give a reason and an upgrade path to finally move up from my aging Phenom II 1090T. One of the main workloads I do when I use my PC heavily is indeed easy H.264 encoding for game and other types of video. Always nice to be able to knock a video file down from 2.5GB to 200-500MB. As for the board used for the benchmarks, I've personally always used MSI or ASRock boards myself, with some Asus boards when I can catch the price right.
I noticed there are overclocking numbers that do look decent. Some things I'm curious about: how do they take to undervolting? My luck with previous AMD generations has been pretty good when it came to that, at least when I felt like tinkering. I used to be able to run the old 9600BE and 9850BE at considerably lower than stock voltages, for example, at stock speeds, and sometimes even with mild overclocks on the NB. I've noticed that AMD tends to be fairly conservative.
And since they appear to still have the IMC/L3 speed linked to the northbridge/HyperTransport speed, how does upping the actual speed of the NB IMC/L3 affect the performance and stability of the platform? I know back in the day of the 9600BE/9850BE I could generally get them close to the same performance level as a Core 2 Quad at the same clock speeds through that kind of tweaking.
And on a final note, it's a nice performance increase overall, even in single-threaded apps, over the Bulldozer cores. But you'd think they would have implemented a way to gang the integer cores and make them act as a single core for single-threaded performance. That's all it would really take to pick up a bit of the slack, I think.
jensend - Tuesday, October 23, 2012 - link
Why the heck are you starting your power consumption charts at 50W rather than at zero? That's *extremely* misleading, wildly exaggerating AMD's disadvantage. AMD has roughly 2x the power consumption of IVB at load and 1.25x the power consumption at idle, but by starting your chart at 50W you're exaggerating that into over 3x at load *and at idle*.
*Please* get yourself a copy of "The Visual Display of Quantitative Information" and read the section talking about the "lie factor" of a graph or chart.
Spunjji - Tuesday, October 23, 2012 - link
I think they are anticipating their readership noticing that the graph starts at 50W, just as you did.
kevith - Tuesday, October 23, 2012 - link
They probably do. But that's not the point. A GRAPH is meant to show a string of figures as a drawing. When a graph starts at anything but zero, it will not show a true picture.
Take two things to compare, where both lie in the area between, say, 90 and 91 of some kind of value. If you then make a graph that goes from 89 to 92 in tenths, you will get a curve that looks very uneven, going up and down all the time, with seemingly big differences in values. But if it started at zero, like it's supposed to, you would see an almost straight line, reflecting the true picture: these two things are practically alike in this specific area.
If you don't make graphs like that ALL THE TIME, there's no need to make a graph at all; you could just write the values as figures.
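The distortion from a truncated axis is easy to quantify: compare the ratio of the drawn bar heights with the ratio of the actual values. The idle figures below are illustrative, not the review's measurements:

```python
# Apparent vs. true bar-height ratio when a chart's axis starts above zero.
def bar_ratio(a, b, baseline=0.0):
    return (a - baseline) / (b - baseline)

amd_idle, intel_idle = 95.0, 75.0   # illustrative idle draws, watts

true_ratio     = bar_ratio(amd_idle, intel_idle)                 # ~1.27x
apparent_ratio = bar_ratio(amd_idle, intel_idle, baseline=50.0)  # 45/25 = 1.8x
print(f"true {true_ratio:.2f}x, drawn as {apparent_ratio:.2f}x with a 50 W floor")
```

A modest ~1.27x gap gets drawn as 1.8x, and the closer the smaller value sits to the baseline, the worse the exaggeration gets.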
CeriseCogburn - Tuesday, October 30, 2012 - link
No spooge, it's called amd fanboy advantage; that is what should always be anticipated, and is actually always provided.
Pythias - Tuesday, October 23, 2012 - link
Why was the i3 dropped from some of the charts?CeriseCogburn - Tuesday, October 30, 2012 - link
Because it kicked so much amd pileofcrap.
redwarrior - Tuesday, October 23, 2012 - link
Anand's testing was the usual lazily designed testing with poor planning. Why run Sysmark, which everyone knows uses testing methods that tend to ignore multi-threading? Keep the tests to applications only and make sure your gaming apps are representative. I saw better testing done on several other websites, where the usual poorly designed and coded trash was balanced with other games that did employ some level of multi-threading. The FX-8350 did immensely better in that gaming selection. Most gamers are not shoot-em-up fascist gamers. There is no reason for Anand to stack the game selections in the single-threaded direction only. I believe Anand is a shill for Intel and chose the stupid Sysmark tests and the games in such a fashion as to downplay the vast performance improvements that are possible from the FX-8350 CPU. That is one reason I do NOT spend much time on this site any more. There is nothing I detest more than intellectual dishonesty.
Check out Tom's Hardware; their review was done more scientifically and had a balanced selection of tests. The Vishera FX-8350 clearly bested the i5-3570 in most tests and was the best performance for the buck by far. A better, objectively designed test. No axes to grind. To hell with Anand, unofficial Intel shill and intellectually LAZY.
A5 - Tuesday, October 23, 2012 - link
The gaming benchmarks contain exactly 0 FPS games. You should really read the article before commenting.
kevith - Tuesday, October 23, 2012 - link
I could eat a dictionary and SHIT a better comment than that.
CeriseCogburn - Tuesday, October 30, 2012 - link
"The FX-8350 did immensely better in that gaming selection. Most gamers are not shoot-em-up fascist gamers."
LOL - wrong again, but your lefty whine was extremely amusing. Most gamers ARE shoot-em-up gamers, YOU LIAR.
GullLars - Tuesday, October 23, 2012 - link
What they didn't show you is what happens when you OC the K parts from Intel. They end up at roughly the same TDP as AMD at stock, and just crush them on performance, so if you are going to use the same aftermarket cooler and will overclock either pick, the Intel parts can go further and win across the board.
Compared to the cost of an entire system, ~$100-150 extra on the CPU for a 3770K might end up being around a 10% increase in total build cost, giving you anywhere from 10 to 50% better performance (or more in some corner cases).
I'm happy with my OC'd 3930K :D (compared to to the rest of my equipment, the price was no problem)
Probably upgrading my C2D 2.0GHz + 8600GTM laptop to a Haswell with only IGP in 2013. With an SSD and RAM upgrade it has had surprisingly good longevity.
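The build-cost argument above is simple arithmetic; a sketch with illustrative figures (the build total is an assumption, the CPU premium is the ~$100-150 mentioned):

```python
# Marginal cost of a pricier CPU as a share of the whole build.
def pct_increase(new, old):
    return (new - old) / old * 100.0

build_cost  = 1200.0   # hypothetical total build cost
cpu_premium = 130.0    # ~$100-150 extra for a 3770K over a cheaper pick

cost_up = pct_increase(build_cost + cpu_premium, build_cost)
print(f"build cost up {cost_up:.1f}% for a claimed 10-50% performance gain")
```

On these assumptions the whole build gets ~11% dearer, so even the low end of the claimed performance gain roughly pays for the premium.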
CeriseCogburn - Tuesday, October 30, 2012 - link
Exactly, all the amd cpu unlockers, all the dang Phenom II overclockers - totally silent. The 2500K smokes it up to 4500MHz like butter, but heck, when something is freaking GREAT and it's not to amd's advantage, it's memory-hole time: ignore it completely after barely mentioning it, and just as often continue to declare it an unused, unliked and proprietary failure that holds everyone back - whatever can be done to lie and skew and spin.
It's just amazing isn't it.
eanazag - Tuesday, October 23, 2012 - link
I had pretty low expectations, admittedly, and for good reason. This is a great step, because now I would have to think platform vs. platform where before I didn't even need to consider it. Intel has a huge thermal advantage, which indicates that Intel could easily come out with a proc at any time to take advantage of that headroom should AMD ever become slightly threatening, and AMD would sadly still be behind in performance per watt. I could reasonably consider AMD now for my desktop.
Where AMD fails is at the platform level. They have underperformed in the motherboard department. If they wanted to sell more, they need to give serious advantages at the motherboard + CPU purchase level versus Intel. They have been fine with a mediocre chipset. They failed when they locked out Nvidia from chipset development, same with Intel. AMD's and Intel's greed was a big loss for the consumer and part of the reason why AMD doesn't compete well with Intel. Nvidia made better chipsets for AMD than AMD's ATI chipset development team.
It is even worse on the server side for AMD. AMD needs to include 10GbE on the board to really question Intel, and this is doable. AMD cannot compete on process (22nm and 3D transistors) unless Intel closed shop for 18 months.
CeriseCogburn - Tuesday, October 30, 2012 - link
LOL - thank you so much. Yes, the hatred for nVidia (and Intel) cost the amd fanboys dearly, but they really, really, really enjoyed it when nVidia chipsets got shut down.
nVidia was going to be destroyed shortly thereafter, according to them, and thus we heard a couple years of it from their freaking pieholes - amd would drop prices and knife nVidia into oblivion... and bankruptcy... cause amd has "a lot more room" to drop prices and still rake in profit... they said... LOL - as they whined about nVidia being greedy and making so much money.
Yes, they were, and are, nutso.
Oh the party days they had... now they have CRAP. LOL
Ken g6 - Tuesday, October 23, 2012 - link
Anand, I think you missed a diamond in the rough here for gaming. Obviously, if someone has the money for an i5, they should get an i5. But if they don't, the FX-4300 looks like a better gaming choice than an i3. There are cheap mobos available, so overall cost is less than an i3, and the FX-4300 is overclockable. OCed it may even approach low-end i5 performance - at a power cost, of course.
tekphnx - Tuesday, October 23, 2012 - link
For only $10 more I'd say the FX-6300 is the better bet. Slightly lower clock speed but still overclockable, and I don't think it will be that long before six cores become really relevant for gaming.
lmcd - Tuesday, October 23, 2012 - link
Worst case, you can probably disable a core and OC more, right?
c0d1f1ed - Tuesday, October 23, 2012 - link
The Projected Performance page doesn't appear to take AVX2 and TSX into account, not to mention improved Hyper-Threading performance from having an extra execution port. The 10-15% number is for single-threaded workloads only. Everything else will see a much bigger leap.
AMD is screwed unless it adds AVX2 and TSX support sooner rather than later. They haven't made a single mention of it yet...
tekphnx - Tuesday, October 23, 2012 - link
From a price-performance perspective, I think the FX-6300 is the most interesting part here. For barely more than the i3-3220, you get essentially the same performance in games and 20-30% better performance in multithreaded applications. And as time goes on and games become more multithreaded, the FX-6300 will pull ahead in games, too. At 95W, the power consumption is much higher than Intel's, but it's manageable. Plus, it's overclockable, unlike the i3.
Spunjji - Tuesday, October 23, 2012 - link
Add in the potential for ~10% savings under load with a bit of undervolting and it looks even better. One to watch for those who, like me, would rather buy AMD if it doesn't smell like shooting yourself in the foot.
CeriseCogburn - Tuesday, October 30, 2012 - link
Got any toes left? lol
Didn't think so.
rocky12345 - Tuesday, October 23, 2012 - link
Hello! I am a long-time reader of this site, pretty much since it first started. Heck, in the time this site has been running I have gone through probably 35 or more computers & currently have 11 in my home that are used for different tasks. I also went through a wife, but that's another story to tell lol.
Anyways, what I was wondering is this. In the past with AMD I noticed a huge gain in CPU output when raising the bus speed up as far as the hardware would go. What I am wondering is: what if you took an 8350, raised the bus as high as it would go but kept the multiplier set so the CPU would run at default speed, & did some tests in both single-threaded & multi-threaded programs, as well as a few games where this CPU is a bit lacking, to see if the CPU itself is being held back by the bus? Then try both the bus at high & the multiplier raised to the max CPU speed of 4.8GHz & see what raising both the bus & CPU speed does.
I am hoping it has the same effect as it did in the older AMD CPUs & gives a nice boost. I think maybe the Bulldozer & Piledriver cores might be held back by a lack of bandwidth to the rest of the system resources. If not, then at least it was a fun little side project for you guys. Maybe raise the memory speed as well to make sure that is not the issue too. Just an idea that may open up some hidden performance in the CPU, hopefully. I would do it myself, but at the moment the only AMD system I have left is an older Athlon 64 X2 6400+ that the stepson uses to surf the web & play a few games on.
thanks
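The two knobs described above multiply together to give the core clock. A quick sketch, assuming the stock FX-8350 values of a 200 MHz reference ("bus") clock and a 20x multiplier:

```python
# Core clock = reference ("bus") clock x multiplier. On AMD platforms the
# memory, NB and HT clocks also derive from the reference clock, which is
# why raising it can lift the whole platform, not just the cores.
def core_clock_mhz(ref_mhz, multiplier):
    return ref_mhz * multiplier

print(core_clock_mhz(200, 20.0))   # stock FX-8350: 4000 MHz
print(core_clock_mhz(250, 16.0))   # same core clock via a higher reference clock
print(core_clock_mhz(250, 19.2))   # both raised: the 4.8 GHz target above
```

Comparing the second configuration against stock isolates the effect of the faster uncore at an unchanged core clock, which is exactly the experiment being proposed.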
lmcd - Tuesday, October 23, 2012 - link
If I remember right, with AM3+ there isn't really a bus to raise.
CeriseCogburn - Tuesday, October 30, 2012 - link
Oh yeah, how come we don't have the big speculation all the time, solidified into the standard commentary and "looking for the validation in the tests", concerning what's wrong with the amd cpus? I mean, we get that argument when it's time to attack their competitors in the articles here: the "well established theory" that "turns out to be wrong" "years later" is used as the "standing thesis" as to why "xXxxxx" did so poorly in this test against AMD; "as we expected", says the reviewer, making sure to note the "suspected weakness" has reared its ugly head...
Yeah, another thing that is so bothersome.
I guess amd's architecture sucks so badly, nothing can explain it's constant failures.
boeush - Tuesday, October 23, 2012 - link
Firstly, by 2014 Intel will already be on a 14 nm process. Add Intel's already superior (3D fin) transistor technology, coupled with massive R&D budgets on a slew of parallel projects to further refine the basic process tech, and the situation is not going to get any prettier for AMD any time soon.
Second, computing is increasingly going mobile (laptops, tablets, phones, phablets, etc.) The number one thing for mobile is power efficiency. AMD's CPUs absolutely suck at that; they are multiple generations behind Intel (and not all of that can be blamed on process lag.)
Third, AMD's trump card in the consumer space so far has been integrated graphics, but with Haswell and then Broadwell Intel's going to take that advantage away. So, by 2014 AMD won't have any feature-set advantages left.
Fourth, AMD's other hopes have been in the HPC/server domain, but there again power efficiency is getting increasingly more important, and AMD is losing the war. Moreover, with its new MIC architecture ("Xeon Phi") now debuted (and it will be continually refined going forward), Intel's poised to capture even more of the HPC market, and AMD currently has no answer to that product line.
Seems to me that AMD is hosed on all fronts, unless they can pull not just one but a flock of fire-breathing dragons out of a hat, soon if not today.
lmcd - Tuesday, October 23, 2012 - link
The "process advantage" didn't do shit for Intel. Note how Ivy Bridge had horrible overclocking compared to what was anticipated, how Intel's shift to mobile shapes its new architectures, and that Microsoft is finally looking at helping AMD out with things like the scheduler in Win8, etc.
AMD is succeeding at power efficiency in mobile: the Trinity review correctly indicated AMD's success there in nearly closing the gap to Sandy Bridge (and Ivy? I forget now. It was close though).
Finally, IT IS A PROCESS NODE CHANGE. It's 28nm. BD/PD are 32nm. Besides, a GCN IGP is really attractive, and will still dominate versus Broadwell. VLIW5->4 was a tiny change and yet AMD managed to pull a pretty nice performance jump out of it; GCN was comparatively huge, designed for 28nm, and set to scale better as evidenced by the already low-power 7750 desktop.
The only worrying thing about Steamroller is whether the caches and memory controller speed up. If those don't, the platform is likely to be bottlenecked horribly.
c0d1f1ed - Tuesday, October 23, 2012 - link
That's partially due to the lower quality TIM they used on Ivy Bridge, and mostly due to bringing the Sandy Bridge design to the 22 nm TriGate process in a verbatim manner. Despite it being a radically new process, they didn't change anything in the design to optimize for it. Haswell, on the other hand, is designed from the ground up for the TriGate process, so it will show the process advantage much more clearly.
CeriseCogburn - Tuesday, October 30, 2012 - link
Yes, it's also partially perception, since SB was such a massively unbelievable overclocker that expectations were sky high. I'd like the amd fanboy to give his quick amd-win comparison on overclocking Ivy vs whatever amd crap he wants opposite.
I more than suspect Ivy Bridge will whip the ever-loving crap out of the amd in the OC war there, so the amd fanboy can go suck some eggs, right.
I mean that's the other thing these jerks do - they overlook an immense Intel success and claim 2500K oc power is not needed, then babble about " the next Intel disappointment " immediately afterward, not compared to their amd fanboy chip, but to Intel's greatest ever !
I mean the sick and twisted and cheating nature of the amd fanboy comments are literally unbelievable.
zilexa - Tuesday, October 23, 2012 - link
He writes "Brazos had a mild update". Uhmm, noo, sorryy. Brazos 2 hasn't been released to market, so we haven't seen any update here. Same for Trinity, although you can buy some Trinity laptops. Not much. But nothing on desktop.
Spacecomber - Tuesday, October 23, 2012 - link
Since AMD seems to be pushing multi-threaded performance at this time, it would have been interesting to see how this is borne out in a more multi-threaded game, such as the Battlefield series titles. I know that in benchmarking my six-core 960T (with unlocked cores), I could see some performance advantage between running this CPU with 6 cores versus the default 4 cores playing BFBC2. I'm not saying that this is where AMD will outshine Intel by any means, but it would have been an interesting test case for comparison's sake.
(I actually suspect that by the time you go from 6 cores to 8 cores, you will have run out of any significant advantage to being able to handle the extra threads.)
BellFamily7 - Tuesday, October 23, 2012 - link
I think in Final Words Anand should add: "And the 4-year-old Intel i7-920 - what a chip that was/is!" It is startling to see AMD barely keep up with the venerable 920 a good four years on - and those first charts in this article are with the i7-920 at stock speed. The average enthusiast is running an i7-920 at 3.8 GHz on air all day long and achieving performance on par with - or better than - many of today's CPUs!
Of course, there is one big downside - power. This is Intel's big story to me: the speed and power of an OC'd i7-920 at one quarter (or less) the power. Cool!
Thanks for putting the old i7-920 in the mix - it shows just what a ground breaking design it was...and in many ways, still is.
Senti - Tuesday, October 23, 2012 - link
Indeed, the i7-920 is the most awesome CPU in those graphs considering its age and nice overclockability. If there were an overclocked version of it, the graphs would look pretty funny.
I've used an i7-930 @ 4.1 for a long time now and just can't justify my itching urge to upgrade. More than that, it'll probably survive here for 2 more years until Haswell-EP, as plain Haswell looks handicapped in terms of compute power in favor of the iGPU and power draw. I do NOT need a power-restricted desktop CPU - with power-saving features it'll do fine at idle whatever the max TDP.
ClagMaster - Tuesday, October 23, 2012 - link
With comprehensive ECC at the same price this would make a good server or workstation chip.
AMD needs to get a 22nm process going and start some serious architectural soul-searching.
bwcbwc - Tuesday, October 23, 2012 - link
I don't understand why you keep saying that the 6300 fails to beat Intel at its price point in the multi-threaded tests. At $130-140, the 6300 is going up against the Core i3s, and the multi-threaded benchmarks show the 6300 beating the Core i3. Seriously: what am I missing here?
Rhezuss - Tuesday, October 23, 2012 - link
I'd have loved to see the Phenom II X4 980 BE or any X6s in the comparisons...
nleksan - Tuesday, October 23, 2012 - link
I can't see this as anything but a "win" for AMD, although there are certainly some sad feelings lingering as I read this article, regarding the 15% employee layoff that recently occurred. The promise AMD made was a 10-15% improvement in IPC, and we certainly got that; not only that, but at a lower price than the first generation, AND with some very promising overclocking potential based on the scaling shown in this article.
However, I refuse to acknowledge these as "FX" chips. "FX" was the designation given to the very first CPU produced by AMD that outperformed Intel's best offerings by a significant margin, the Socket 940 FX-51 2.2GHz single-core CPU based on the Opteron version of the Athlon64 architecture. The reason for my petty "harrumph-ing" is that I own one of the very first FX-51 chips released (from the first batch of 1,000), purchased the day of release back in 2003 alongside an Asus SK8V motherboard with 2GB of Corsair XMS3200RE DDR-400 Reg/ECC dual-channel RAM, and it served admirably with its brother the X800XT-PE 256MB GDDR2 until its well-deserved retirement in 2009.
That chip was, and to this day is, my favorite CPU of all time. It was a quirky chip: a server backbone (and consequent unusual socket choice), an ahead-of-its-time 64-bit architecture, record-setting bandwidth at a "low" 2.2GHz while Intel was trying its darnedest to hit 4.0GHz with the P4, no real options in terms of a future upgrade path, and champagne tastes that could only be satiated by incredibly expensive registered memory. However, it was FAST as all get-out, ran nice and cool with a Thermaltake Silent Tower Plus, and had a good amount of overclocking headroom for the time (an extra ~200-280MHz was common).
Oh, and it DEMOLISHED the Pentium 4 "Emergency Edition" CPUs that came clocked 55% higher! Paired with the best video card the world had ever seen at the time, it was unstoppable, and I recall running 3DMark for the first time after the build was finished only to nearly poop myself, as this 2.2GHz chip at STOCK out-performed every single Intel CPU on the charts outside of those OC'd with the help of LN2.
I am working right now to rebuild the rig, as I feel it is time for it to come out of retirement and have some fun again, and I want to see just what it really is capable of with some better cooling (extreme-air or decent water).
[SPOILER]I have already lapped the CPU and the block (I am amazed at how poorly the two mated before; the chip AND cooler were noticeably convex), and based on the flatness of each it will certainly be good for a few degrees; add in the magic of today's best TIM's (PK-1) compared to that of 2003, the wonders of modern computer fans via 2x 92mm 3800rpm 72cfm/6.5mmH2O fans doing push-pull, and an extra 3 intake fans feeding it fresh air.... It will be a fun way to bring a memory back to life :D
Plus, the X800XT-PE has been thoroughly prepped for overclocking, with a 6-heatpipe heatsink and 92x20mm (61.3cfm/3.9mmH2O) swapped for the stock unit and mated with PK-1, EnzoTech Pure-Copper VGA RAM-Sinks attached to all of the card's modules with PK-1 and less than a needle-tip's worth of superglue at two of the four corners of each, and the same for the MOSFET/VRMs on the card. Combined with a pair of 120mm 69cfm fans blowing air across it (mounted on the inner side of the HDD cage opposite an intake fan), an 80x15mm 28.3cfm fan mounted to blow air directly on the back of the card, a PCI slot blower fan pulling hot air from the card and exhausting it out the back, as well as an 80x25mm 48cfm fan mounted where the lower PCI brackets used to be exhausting air... I think it'll do just fine ;)
[/SPOILER]
However, I am not taking any sides in this "CPU WAR". The minute one company starts to seriously pull ahead, the competition is lost, and we ALL lose. Innovation will become scarce, people will become excited about 5% IPC improvements from generation to generation and fork out the money for the next "great thing" in the CPU world, not to mention the cost for the constantly-changing socket interface.
AMD has been in a bad way for some time now, pretty much since the Core processors from Intel began to overrun their Phenom lineup. Sure, they had some really amazing processors for the money, such as the Phenom II X4 965BE/980BE/960T and X6 1055/1090/1100T, but Intel was still the performance leader with their E8600, Q6600, and the many QX9xxx processors that transitioned into the still-strong X58/LGA1366 platform (with the 920/930/975X/990X standing out), and they have only gained traction since.
I am no fanboy, and I hate to get onto an enthusiast site and scroll through comments sections where pimply-faced, Cheetoh-encrusted, greasy-haired know-it-all losers frantically type away in a "Heated Battle of 'Nuh-Uh's' and 'Yuh-HUH!'".
(that is called hyperbole)
Fortunately, at least for the most part, I don't see that here.
Perhaps we should all go out and buy one of these new chips, maybe for a build for a friend or family member, or a home-theater PC or whatever, but regardless of whether you "Bleed Red" or "Bleed Blue", both "sides" will win if AMD gets the money to truly devote enough resources to one-upping Intel, or more likely, coming close enough to scare them. When the competition is closest, only THEN do we see truly innovative and ground-breaking product launches; and at the current rate, we may be telling our grandchildren about how "once, a long time ago, there was a company.... a company named AMD".
For the record, I AM NOT in any way a Fanboy; I buy whatever gives me the best bang-for-my-buck. Fortunately, at my job I am the only "tech-y" person there so whenever there is an upgrade in someone's equipment, or even servers, I get the "old" stuff :D I have sold literally hundreds of CPU's off that I had no use for, but I kept the favorites or the highest-end in each category that I was able to get. However, many of them I purchased myself (Opteron/Xeon from work, the rest I bought 90% of).
Here's a list of processors I currently have in possession, in my house, in the best reverse-chronological order I can remember:
i7-3930K (24/7 4.6Ghz - Max 5.2Ghz), i7-3820QM, i7-2600K, 4x Xeon E7-8870 (got 8 for $2k from work, sold 4 @ $2k/ea and built a Bitcoin Miner that earned me ~6,700Mh/sec with 4x 5970's in CF-X; earned over 1200BC and cashed out when they peaked at ~$17/ea for a massive profit and eventually stopped mining), i5-2400, Xeon X5690, i5-2430M, Opteron 6180SE 12-core, Xeon W3690, Phenom II X6 1100T-BE, Phenom II X4 980-BE, Phenom II X4 960T-BE (built girlfriend a rig: best CPU for $$; unlocked to 6-core; hits 4.125Ghz 6C / 4.425Ghz 4C), Xeon X7460, Core2Quad Q9650, 4x Xeon X3380's, Opteron 8439SE, Xeon X5492, Core2Duo E8600 (from Optiplex 960, hits 4.5Ghz on air), Core2Duo T7400, Athlon II X4 640 (E0), Athlon II X4 650, Pentium Dual-Core T4400, Turion II N550 Athlon X2 7750BE, AMD FX-62 (3.25Ghz easy), Xeon X3230, Athlon X2 5200+, Opteron 890, Turion II Ultra M660, Athlon64 X2 6400+ BE, Opteron 185, Athlon64 X2 4800+, Opteron 856, Opteron 156, AMD FX-51 (24/7 2.45Ghz stock voltage), Opteron 144 (OC'd to ~2.6Ghz), Turion ML-44, Pentium 4-EE 3.46Ghz (could barely hit 3.5Ghz...junk), Pentium 4 3.2Ghz, Pentium 4 2.8Ghz (easily ran at 3.6Ghz on air 24/7 with +0.015V, awesome CPU!), Celeron Mobile 1.6Ghz, Pentium 4 2.4Ghz, Celeron 1.8Ghz, and plenty more....
***Have a set of 8x Xeon-EP E5-4650 8-core's coming when we upgrade again in January; they are upgrading the whole rack so I am getting, along with the chips: 3 total 4-CPU boards, 384GB of DDR3-1333 Reg/ECC, the entire cooling system, 16x LSI/Adaptec RAID Controller Cards (all PCI-e x8, support minimum 24x SAS 6Gbs drives, have between 1 and 4GB of Cache, and all have BBU's), 96x 150GB 15Krpm SAS6 + 48x 600GB 15Krpm SAS6 enterprise drives, and about two-dozen Nvidia "professional" cards (12x Tesla M2090's, 4x Tesla K10's that were used to evaluate platform, and 8 Quadro 6000's) all for $1900!!!!!!!! The supplier offered $2150 for "Trade-Up" but I am really good friends with the entire IT department (all 6 of them) and they offered them to me instead! FOLDING@HOME WILL BE SHOWN NO MERCY!
CeriseCogburn - Tuesday, October 30, 2012 - link
Ok idiot, I've had enough already.
First of all, nice amd fanboy story. Before you go insane claiming over and over again it is not true, I want to point out to you your own words....
1. Intel has been dominating since core 2
2. Without amd competing there will be no innovation, and tiny 5% gains will cost an arm and a leg, and the idiots will be spending all their money buying them
Okay, let's take those 2 statements, and add in SANDY BRIDGE and its amazing architectural jump.
Whoops !
There goes your sick-as-heck fanboy theory.
Furthermore with your obviously supremely flawed BS above, you did your little amd fanboy promotion saying we should all go out and buy one of these amd chips for a family member or some upgrade - BLAH BLAH BLAH BLAH BLAH.
Let's add in your eccentric Fx chip story, your declaration it's your favorite cpu of all time, your bashing of Intel claiming only Ln2 could bring Intel close, and then your holy of holies the resurrection build...
OK ? Forgive any and all of us who don't buy your "I am not an insane amd fanboy" lines.
Look in the mirror, and face the dark side, let it flow through you, Luke, you are an amd fanboy, and Intel will innovate and make absolutely amazing cpus like the SB even when amd is slapping itself in the face and choking and dying ... feel the anger, amd fanboy - amd is NOT NEEDED... let it flow through you, amd fanboy, your journey to the dark side is nearly complete...
When you kill your greatest FX cpu rebuild, you will have crossed over to the darkside !
Ukdude21 - Thursday, August 15, 2013 - link
Geez, I think this intel fanboy should stfu and stop talking a load of verbal shit lol.
redwarrior - Tuesday, October 23, 2012 - link
I looked over the tests this character devised. Only a few were multithreaded. Tom's Hardware had a very thorough testing procedure, explaining each application and what it showed about the architecture of the various cpus being compared. They were very balanced between single-threaded and multithreaded apps. They did NOT do a lot of synthetic benchmarks, because many of them are skewed in a prejudicial way. He also used WinZip, Photoshop CS5, video editing software, etc. Games were not all single-thread shoot-em-ups; they were a collection of widely diverse games. The FX-8350 came out ahead of not only the i5-3450 but also the i5-3570. He had some criticisms of course, but he said it was the best bang for the buck in the $200 price space. This review was shallow and meaningless, done by somebody who is either lazy or on a mission to discredit. By the way, the FX-8350 had the highest score on WinZip, bettering even the i7-3770. This reviewer owes us a well-designed retest and an apology for a bunch of misleading garbage.
silverblue - Wednesday, October 24, 2012 - link
Well, that's the beauty of product reviews - there are multiple reviews of any given product, all with different tests. What you need to do is find the test that matters to you, and if the product excels at it, you may buy it based solely on that (even ignoring bad points). If, on the other hand, it doesn't perform so well in the discipline of your choice, that really makes up your mind for you to go buy something else.
CeriseCogburn - Tuesday, October 30, 2012 - link
LOL - no, that's the beauty of AMD's "we are not evil" LIE, and their totally and completely proprietary build of the "open source!!!!!!!! not like nvidia physx!!!!" W I N Z I P
Now, all you freaking amd fanboy liars and losers have to be constantly reminded about your evil, sick, proprietary, "open source" AMD LIED AND COMPATIBILITY DIED - winzip BS !
LOL - let it dig into you fanboy, let it sink in deeply. All those years amd played your wet brains like limp noodles get played, and you scowled and spit and hated and howled nVidia and PhysX and open source and OpenCL and amd is not evil and they aren't that kind of company, and then you went and had the stupid 3rd grader amd gamers manifesto stapled to your foreheads....
LOL
You didn't find it in your evil fanboy manual to let your amd fanboy friend there know about the HACKING amd did on winzip ?
Tsk, tsk. For shame, for shame.
Brainling - Wednesday, October 24, 2012 - link
Translation: I am either paid by AMD, or a total fanboi, and these benchmarks did not say what I want them to say. So I am going to come on here and plug a different reviewer's website, one that is known to be AMD-biased, and tell everyone how unbiased they are and how their conclusions are the right ones, because they agree with my world view.
yumeyao - Wednesday, October 24, 2012 - link
I suggest you stop using the x264 HD benchmark and look for another test case. Let's look at what the x264 HD benchmark does:
Source film:
MPEG-2!!! 6931kbps on avg, with a maximum bitrate of 12xxxkbps!!!
You guys know that MPEG-2 is the DVD standard...... DVD has a resolution of 480p (720x480 for widescreen), but FullHD is 1920x1080, six times as many pixels as DVD has! And DVD has a ~5000kbps bitrate on average, so what quality of source film can we expect??
And then let's look at its output:
OMFG! 8000kbps!! h264!!!! I'd say for such a source, 2000kbps is more than enough for an h264 output....
So do you guys really think such a test can bring out a CPU's full computational potential?
I suggest finding any ts/BD-ISO source, and use proper options on x264 (basically you can directly use --preset xxx), then use it as a reference...
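A quick sanity check of the bitrate argument in the comment above (a sketch: the bitrates and resolutions are the ones cited in these comments, while the 24 fps frame rate is my own assumption for illustration):

```python
# Bits-per-pixel comparison behind the "what quality could we expect?" point.
# DVD: ~5000 kbps at 720x480. Benchmark source: ~6931 kbps MPEG-2 at 1920x1080.

def bits_per_pixel(kbps, width, height, fps=24.0):
    """Average encoded bits available per pixel per frame."""
    return kbps * 1000 / (width * height * fps)

print(1920 * 1080 / (720 * 480))                    # 6.0 -> FullHD has 6x DVD's pixels
print(round(bits_per_pixel(5000, 720, 480), 3))     # 0.603 bpp for a typical DVD
print(round(bits_per_pixel(6931, 1920, 1080), 3))   # 0.139 bpp for the benchmark source
```

So the benchmark's 1080p source gets under a quarter of the bits per pixel of an ordinary DVD, which is the commenter's complaint about source quality in numeric form.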
Brainling - Wednesday, October 24, 2012 - link
It's a 125W TDP part that gets consistently blown away by the 95W TDP Ivy Bridge, which has more transistors on a smaller, more modern process node... and at the high end, it's really not that much cheaper than an Ivy Bridge i5.
*sigh* Oh AMD... how the mighty have fallen. Can the real AMD, the one that gave us Thunderbird and Athlon64, please stand up?
redwarrior - Wednesday, October 24, 2012 - link
To the Intel fanatics whose bottom line is "My car's better than your car, my car's better than yours": what infantile sensibilities. The computer is a tool. A multifaceted tool that has 1001 purposes. The AMD technology meets the needs of 99.99% of computer users with a better bang for the buck. Only a one-dimensional person can say otherwise. Myopic gamers need to open their eyes and see there is a bigger world out there.
CeriseCogburn - Tuesday, October 30, 2012 - link
Here we go again, the activist on another preaching rampage, with his attack on Intel cpu owners.... a nice little OWS protest against the rich Intel people...
You wouldn't mind, then, if I said I can't stand you cheap, broke, ghetto amd dirty little rascals who can't pay for themselves, let alone the education they need to properly use a computer.
Not to mention your ignorance in supporting a losing, technologically backwards second tier set of idiots wasting monetary resources that could be spent on something good for the world instead of on foolish amd misadventures that pay interest on amd's debt and not much else.
You ought to support the company that pays a LIVING WAGE, instead of the one firing their employees, axing them over and over again.
Thanks for not being capable of properly acquiring and using a computer.
7beauties - Wednesday, October 24, 2012 - link
I've rooted for AMD against Intel since before I built my first PC with the 700MHz Athlon in 2000. AMD stole Intel's thunder to much acclaim. For a while AMD and Intel dueled for supremacy, exchanging leads, much like the tit-for-tat the Radeon and GeForce GPUs are engaged in. AMD's scrappy fight spurred Intel's clock to speed up its ticks and tocks, and the computing world benefited from this. It would be bad for all of us if AMD went out of business. I root for the underdog, for David against Goliath, but David is lying on the ground and boasting of winning. It was embarrassing when the Phenom was so unphenomenal. Then AMD heralded the Bulldozer. Bulldoze what? The empty hype makes the truth more painful. Intel plans to integrate the south bridge onto Haswell's die, and folks, AMD will lose teeth and get bloodied. I'm growing weary of being a sort of Cubs fan.
CeriseCogburn - Tuesday, October 30, 2012 - link
You simpletons all have the same hate-filled idiot theory - so let me ask you - since amd has competition, WHY DO THEY SUCK SO BADLY ?
Somehow you idiots claim that if amd wasn't around, intel would suck. "Amd has made intel great."
Well, wait a minute - Intel is around, it's great, AND AMD SUCKS.
Take a moment, look in the mirror, think about it.... then let me know how red you turned... if not at all, contraception from here on out is a must.
How are you people so stupid ? How is it possible ?
Ukdude21 - Thursday, August 15, 2013 - link
You're the biggest idiot on this website. I have read many comments on this website, but yours are the most idiotic intel-fanboy-stained comments ever.
halbhh2 - Thursday, October 25, 2012 - link
If power use is important to you, you should know that different reviews give different results for power use versus the competing intel chips.
A couple of sites even show equal or lower idle power draw for the 8350 vs the i7-3770.
Trying to figure out why, one variable is the motherboard. Is the Crosshair V a power hog?
I also looked at the yearly cost in electrical use for my own usage.
The only thing I do that pegs multiple cores at 100% is chess analysis. In Deep Fritz the 8350 is close in performance to the i7 3770.
I do chess analysis about 1-5 hours a week on average, perhaps 200 hours per year.
The math is very simple. Power costs 16 cents per kilowatt-hour. Peak power usage would cost me roughly an extra $3/year vs an intel rig. Since I'd use a more power-efficient motherboard than the Asus Crosshair, idle power is reasonable. I also use standby a lot when I'm not at the machine.
An 8350 would cost me in the range of $4-$8 more per year in power bills vs an i7-3770 (its competitor for chess analysis).
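The chess-analysis cost estimate above can be checked back-of-envelope (a sketch: the 200 hours/year and $0.16/kWh are the commenter's own figures, while the ~95 W all-core load-power gap between the FX-8350 and i7-3770 is an assumed round number, since reviews disagree on the exact difference):

```python
# Yearly extra electricity cost of the assumed load-power gap.
hours_per_year = 200        # commenter's chess-analysis hours
extra_load_watts = 95       # assumed FX-8350 vs i7-3770 load gap
price_per_kwh = 0.16        # commenter's electricity rate, $/kWh

extra_kwh = hours_per_year * extra_load_watts / 1000   # 19.0 kWh/year
print(round(extra_kwh * price_per_kwh, 2))             # ~3.04 dollars/year
```

That lands right on the "$3/year roughly" figure; idle-power differences would account for the rest of the quoted $4-$8 range.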
CeriseCogburn - Tuesday, October 30, 2012 - link
So go ahead and destroy the earth, see if any humans care.
Ukdude21 - Thursday, August 15, 2013 - link
If you are that worried about the earth, why don't you give your pc away? At least then we would not have to read your shit comments lol.
taltamir - Thursday, October 25, 2012 - link
Starting the power consumption graphs at 50 watts instead of 0 watts is GROSSLY MISLEADING! and very unfair to AMD.
The lack of a performance-per-watt comparison is unfair to Intel. Yeah, AMD is finally able, at stock, to beat intel on some benchmarks... but they consume significantly more power to do so (intel could easily start selling higher-clocked parts too).
pcfxer - Thursday, October 25, 2012 - link
If I ever build a new machine... it looks like I'll swing towards my first-ever Intel box... hrmmm, the anticipation may make me do it just for fun, even though my Phenom II X555BE, unlocked and OC'd to 3.5GHz, serves me just fine.
OCedHrt - Friday, October 26, 2012 - link
It would be nice if they were normalized to idle power usage, since we are comparing CPU power usage.
halbhh2 - Friday, October 26, 2012 - link
I got curious about the idle power and visited 7 sites to look at reviews. No two sites had the same idle power difference between the 8350 and the i7-3770. Values ranged from 9 watts *lower* for AMD (lower! than intel) to 22 watts higher. The higher readings all seemed to be with the Asus Crosshair V, which logically must be a power hog.
You should consider the idle power numbers *not* representative. Unreliable.
danrien - Monday, October 29, 2012 - link
Seems like its server Opteron cousin would be kick-ass.
CeriseCogburn - Tuesday, October 30, 2012 - link
LOL - seems like... hahahahhahahah, in some imaginary future in a far-off land, if and when and only if amd does xxxx and yyyyyy and blah blah blah blah... blew it.
g101 - Wednesday, November 21, 2012 - link
More extreme ignorance from the idiot CeriseCogburn. Little boys who only game should seriously consider not commenting on things they aren't capable of comprehending.
Stupid little bitchboy CeriseCogburn... What a waste of oxygen.
DDR4 - Wednesday, November 7, 2012 - link
Nice to see AMD make better procs and lower their prices.
andrewkoch - Friday, November 9, 2012 - link
If you live in an area that requires A/C most of the year like me, the true cost of owning an FX-8350 processor is about an additional $100/year vs. owning a 3570K.
FX-8350: +15 watts idle, +95 watts load vs. i5-3570K
50 hours week light cpu usage = 75W
10 hours week heavy cpu usage = 760w
Combined usage = 1025w @$0.11 Kw/h = $1.12
A/C usage 75%-80% @$0.11 Kw/h = $.84
Extra electrical cost $2/week
Extra electrical cost $100/yearly or $300/3 years
Maybe my math is wrong, but if you use A/C most of the year and pay for electricity, an AMD cpu is a waste of money. Then again, some people still use incandescent light bulbs instead of compact fluorescent lamps or LED bulbs.
andrewkoch - Saturday, November 10, 2012 - link
LoL, my math was wrong in the above post.
FX-8350: +15 watts idle, +95 watts load vs. i5-3570K
68 hours/week light cpu usage = ~1 kWh
100 hours/week heavy cpu usage = 9.5 kWh
Combined weekly usage = 10.5 kWh @ $0.11/kWh = $1.15
Average A/C overhead: 80% * $1.15 = $0.92
Extra electrical cost: ~$2/week vs. owning a 3570K
Extra electrical cost: ~$100/year, or ~$300 over 3 years, vs. owning a 3570K
In this usage scenario the computer is heavily used for tasks like folding, gaming, or video editing.
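The weekly-cost arithmetic in the post above can be sketched as follows (the +15 W / +95 W gaps, the hours, the $0.11/kWh rate, and the 80% A/C overhead factor are all the commenter's own assumptions, not measured values):

```python
# Weekly extra electricity cost of an FX-8350 vs an i5-3570K,
# per the commenter's assumed power gaps and usage pattern.
idle_gap_w, load_gap_w = 15, 95     # assumed extra watts at idle / at load
light_hours, heavy_hours = 68, 100  # hours per week in each state
rate = 0.11                         # dollars per kWh

weekly_kwh = (light_hours * idle_gap_w + heavy_hours * load_gap_w) / 1000
direct = weekly_kwh * rate
ac = direct * 0.80                  # assumed A/C cost to pump the extra heat out
print(round(weekly_kwh, 2), round(direct + ac, 2))  # 10.52 kWh, ~$2.08/week
```

At roughly $2.08/week that works out to about $108/year, so the "~$100/year" headline figure holds up under these (heavy, 100-hours-of-load-per-week) assumptions.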
criter - Saturday, November 17, 2012 - link
Intel is a semiconductor company first and a microprocessor company second ($13.5B revenue [$3.8B/quarter]);
used to make a living making memory chips (dram) until they became commoditized by Japanese rivals in the 1980s and margins plunged, now volume produced microprocessors happen to be the most profitable;
makes more useable chips per $5B-300mm & $7B-450mm/+40to120% extra chips wafer fabs, (90% yield range vs 60 to 80% rivals), moving to 22nm geometries, 14nm by 4q13;
Chips take about 3 months to make and they are put through more than 300 separate processes
intel atom vs brit arm (less pwr & customizable, 98% of cellphones since 2005 have had arms in them) for touch panel (w8) mobile/cell phone/tablet mkt
i3 55W ceiling...
amd Piledriver (x86 architecture) retail Oct 2012, 4c, 8c, L2&3 16MB, am3+, ddr3, dx11, 32nm; 1st gen Zambezi; (pdriver performs +15% than bulldozer;)
FX4300 8MB/95w, 3.8GHz, $131, 4c
FX6300 14MB/95w, 3.5GHz, $176, 6c
fx8320 16MB/125w, 3.5GHz, $242, 8c
fx8350 16MB/125w, 4GHz, $253, 8c
K10 4q07 (phenom x4/IIx3/IIx4, gen3 Opteron), K8 2q03 (Athlon 64/64x2, Sempron 64), k7 3q99 (Athlon/xp, Duron, Sempron)
Trinity APU (Accelerated Processing Unit, Piledriver) architecture (2nd-gen x86 cores), Oct 1; integrated gpu, 'fusion' FM2 form factor (previous Llano used the FM1 socket), dx11 (shader5), (Radeon A4-7480 to A10-7660 HD gpu)
Merom mobile arch (1q06, 667-800MT/s, 35W, bga479, socketM&P; updated 65nm [fab] Yonah core P6[pentium pro '95]) marked Intel’s acknowledging that the Pentium 4 (netburst arch) was not a viable long term solution because power efficiency is crucial to success;
65nm Conroe desktop (core2 quad, 800 MT/s, 65W, lga775); Woodcrest (lga771) 1333 scaled down to 1066MT/s workstation ([email protected], 80W 3GHz), server(xeon socket604&lga771);
differing socket(M,P,T,fcbga), bus speed and pwr consumption; the identical Core microarchitecture was designed by Israel's Intel Israel (IDC) team;
stepping represent incremental improvements but also different sets of features like cache size and low power modes;
Nehalem 45nm (16 pipeline), Westmere 32nm; Sandy Bridge 32nm, Ivy Bridge 22nm; Haswell 22nm, Broadwell 14nm; Skylake 14nm, Skymont 10nm;
sandy bridge 32nm integrated x86 microprocessor as SoC (cpu/gpu+last lvl cache+sys I/O) vs amd bulldozer
intel haswell dual-threaded, out-of-order cpu arch 22nm FinFET, (high end tablets, low pwr)
theoretical peak performance for Haswell is over double that of Sandy Bridge, twice the FLOP/s, cache bandwidth doubled, vs '13 amd Steamroller core & w8;
g101 - Wednesday, November 21, 2012 - link
What is this pile of copy/pasted shit?
Principle - Thursday, November 29, 2012 - link
Why is it that in every review I see, no one uses the RAM that the CPU memory controller is rated for? Or, in overclocking, did you see what a RAM overclock added?
Just because your Intel chip may not perform any better with 1333, 1600, or 1866 doesn't mean you don't run your AMD chip with the 1866 it's rated for. I see this all over the review sites. You don't have to use the same RAM in both systems for a comparison of CPU performance. RAM choice is designed into the CPU, and has an effect.
ThaSpacePope - Monday, December 3, 2012 - link
Nice job AMD. I guess some of these benchmarks are skewed because windows 8 is superior to windows 7 in that it uses the modules better, but still... the $166 8350 outperforms, or comes close to the performance of, a $200 i5-3570K in some cases.
This is great news for AMD. Keep up the good work!
jreyes2254 - Thursday, December 20, 2012 - link
I find it funny that all the Intel lovers are hating on so-called AMD "fanboys."
Doesn't defending Intel and hating on AMD in turn make you all Intel "fanboys"?
...Intel has not served me well. I haven't used one since my ol' P4... I gladly claim the title of AMD "fanboy".
I have this CPU. It's pretty epic.
angusfox - Monday, December 31, 2012 - link
The Intel fanboys simply do not understand market dynamics. If it were not for AMD, Intel processors would be three or four times as expensive. In addition, competition drives innovation. Intel CPUs were mediocre until AMD created its first Athlon 64-bit processor, which kicked Intel's P4's butt in every measure of performance. Intel fanboys, you simply don't give credit where it is due. I am the first to admit that Intel's Sandy Bridge CPUs are, for the most part, better designed than competing CPUs from AMD. However, I feel obligated to do my part to keep AMD alive. It benefits the users of both AMD and Intel CPUs and it keeps prices down. Only an idiot would hope for Intel to drive AMD out of business. This is not football, basketball or NASCAR, you dimwits!
GustoGuy - Thursday, January 17, 2013 - link
Exactly. I have been building AMD systems since my Athlon XP 2500+, OC'd to 3200+ speeds, back in 2004. My last Intel was a fast Celeron back in 2003. If AMD were to go out of business, Intel would get complacent and jack the prices up. Competition is good for driving innovation. Imagine if Intel had a total monopoly and AMD never existed; I doubt we would have anything near as good as the processors we have right now. Plus, AMD has always offered good performance for the dollar spent. I never buy bleeding-edge technology, because it costs twice as much for just a small performance advantage. I like to buy AMD and then use the money saved to put a better GPU into the system, since an FX-8350 is not going to bottleneck any modern GPU, and spend the bucks where it counts. I find all the talk about fanbois hilarious. I like to build powerful gaming systems on a budget, and when I am done with them they are sold on eBay and I build another. I wish AMD well, for if they were to fail, Intel would shoot their processor prices up into the stratosphere again. I felt Intel was surprised by AMD back in 2004 when they came out with the Socket 939 dual-core chips that pummeled the fastest P4 systems. In fact, I had an ASRock Dual-SATA 939 board with a dual-core 939 3800+ that my son was using for his gaming rig until this Christmas, when I built him a new AM3+ system with an ASRock 990FX Extreme4 motherboard. Competition is good and I want to see it continue, so that's why I continue to buy AMD.
iceman34572 - Wednesday, January 2, 2013 - link
I thought I came to this site to read in-depth reviews, not to see a bunch of fanboys fighting each other over who has the better processor, using grade-school humor to do so. I get sick of it. People, just buy what YOU want; I could not care less what you spend your money on. If you are happy with it, then that is all that matters.
CeriseCogburn - Saturday, February 2, 2013 - link
If you came to this site to read the in-depth review, then do so, you idiot.
Oh wait, instead of doing what you demand everyone else do, you go down as low as possible while holding yourself up to the angelic light and claim everyone needs to stop fanboying.... well guess what, moron - if you came to read the article, read it, THEN LEAVE, or stop whining in the comments too, being a fanboy of sorts yourself - the one who doesn't FREAKING REALIZE the article itself is a BIG FAT FIGHT BETWEEN INTEL AND AMD, YOU FREAKING IDIOT.
Have a nice day being a stupid lying fake angel.
Ukdude21 - Thursday, August 15, 2013 - link
Don't listen to him, he is just a silly bitch lol.
nissangtr786 - Thursday, January 17, 2013 - link
Over 2x worse performance per watt than intel, and 3x worse when both are overclocked. It's crazy how far ahead intel are, and the most worrying thing is that haswell is where intel will go all out to bring better performance per watt and also beat amd's trump card, integrated graphics; haswell's iGPU will be amazing.
Yes, amd cost less for most of their cpus, but you get what you pay for. amd are releasing cpus with the performance per watt intel had 3-4 years ago. On intel cpus you are also using less than half the electricity and completing tasks much faster.
Put it this way: for amd to get to where intel will be with haswell in 2013, even by 2016 or 2017, would be a miracle; they are that far behind. I reckon amd are now so far behind that they will just target the lower-end gaming market with their apus.
lordxyx - Tuesday, April 2, 2013 - link
I just bought an FX-8350 with a new mainboard and a 7980 radeon. I think for an 8-core chip at 4GHz it's a good price. Hope it outperforms my old Core 2 Q6600 @ 3GHz, anyhow, or I'll feel like a complete sucker!
ToastedJellyBowl - Friday, April 5, 2013 - link
A lot of people posting comments on this article are nothing but tools. For those of you who can't see beyond benchmarks, and who think a slight advantage in a benchmark = blowing the competition out of the water, let me give you a lesson.
There's a difference between, say, a 30% advantage at 30 FPS and a 30% advantage at 110 FPS. When both chips are performing at 60 FPS, there is no blowing the other chip away. At that point, it's simply a stalemate. It's just a shame that Intel fanboys are too arrogant, and also too ignorant, to admit this. They're so fixated on "OMFG, my chip gets an extra 11.71 FPS on a benchmark than your chip does".
Everything AMD has out there this side of a Phenom II X4 (or hell, even a low-end AMD FX 4100) will run anything on the market maxed out at a solid 60+ FPS, given you are supporting it with a video card that doesn't hold it back. With that said, most people play with v-sync enabled anyways due to massive screen tearing in most games. What does it matter that a Core i7 is pulling 147 FPS and an AMD FX 8350 is pulling 107 FPS when your frame rate is just going to be locked to 60 FPS anyways?
I know a lot of people like to be future-proofed, and the more overhead you have over 60 FPS the more future-proof your system is, but being future-proof by an extra year != blowing the competition out of the water. Gaming requirements have pretty much hit a brick wall. System requirements have not really gone up much at all in the last 2-3 years. With a Phenom II X4 965 and a GeForce 650 Ti my system runs anything I throw at it at a solid 55-60 FPS on Ultra settings. If I threw a 650 Ti Boost, or even better a 660 Ti or a 680, in my system, everything would run even better. My CPU still never really gets maxed out in most games.
These days the difference lies in how fast the CPU can encode and how fast it can do other things that are not gaming related. That's where Intel is focusing right now, but as far as gaming goes, we've hit a brick wall, and have been behind it for several years now.
With that being said, I'm very proud of my AMD Phenom II X4 965 coupled with my GeForce 650 Ti. In many games I play with my friend, this hardware compares well to his Core i7-920 overclocked to over 4.0GHz running GeForce GTX 470's in SLI. In some games I was slightly below his performance, in others I was equal, and in a few games my system actually outperformed his. He has since replaced his GeForce GTX 470's with a single GeForce GTX 680, and even against that card my system does very well in comparison. In DiRT Showdown we were both over the 50 FPS average mark: I was at about 57, he was at about 70, on average. Now, that may sound like a lot, right?
Well, then you factor in the pricing. My motherboard, processor and RAM were less than $250. His motherboard alone was more expensive than everything I paid combined; couple that with another $250 for the CPU and that's $500. That's double what I paid for everything, just for the motherboard and processor, outside of a PSU, case, monitor, etc. which I already had. The performance difference, however, definitely isn't double.
I mean, you can either go pay $600+ to build a system (motherboard, CPU, RAM since most people reuse other parts such as optical drive, sound card, network card, hard drive, PSU, etc for many years), or you can pay $250 to build a system that will get slightly less performance on benchmarks, but still be future proof.
It's your call. I don't know about other people, but I like knowing I'm getting the best bang for the buck, and while Intel definitely may offer slightly better performance in benchmarks, AMD definitely offers the best bang for the buck. How can you pass up a 4-module, 8-thread CPU for $185 when it costs over $300 to get a decent Intel chip? They're both future-proof and will run anything at over 60 FPS for years to come, so why blow the extra $100 on the CPU and an extra $100 on a motherboard? Oh, and good luck finding an Intel motherboard that compares to the AM3+ ASUS M5A97 with a UEFI BIOS for under $200.
jmcb - Monday, April 22, 2013 - link
Most people in the general public will be like me. I don't OC; I tried it but never got into it. I don't even game on my PC, and for what I use my PC for, stock vs stock, Intel is where it's at. Sorry. I do lots of video encoding. The general public see this article and will probably think the same thing. I also look at power consumption. Again, Intel is where it's at. I had my sights set on an i7 3770K for over a year. I can probably wait 2 more years, and it might still be a better buy vs AMD.
IntelBias - Sunday, June 2, 2013 - link
I never noticed this until just now. I always heard about Anandtech's Intel bias but never noticed it until this article. They purposely set the resolutions lower, knowing that Piledriver fares much better against SB/IB at 1080p and 1440p.
Granttamaid - Tuesday, June 11, 2013 - link
Remember price vs performance: AMD always win against Intel. Well, Intel has the fastest processor in the world... but do you need all of Intel's potential power? I don't think so. Some people only use their computer for internet browsing even though they have an Intel Core i7.
pantong - Monday, July 8, 2013 - link
I see most comments talking about how this chip is shit.
Sigh.
This chip works great for what I need it to do. I host 7 servers on my computer in virtual boxes for my gaming community: Minecraft, TeamSpeak, StarMade, Cube World, etc. I don't get paid for my services and I need it cheap. This chip gives me the performance to host a lot of people on each core, from a low of 4 people on Cube World to 45+ on TeamSpeak.
Why would I buy Intel for these purposes, other than to spend money for the same performance in this scenario and flaunt my e-peen?
Value is not based on score or GHz. It's based on money.
So before saying this chip is shit, why not look at the multiple applications this chip can be used in?
As far as this scenario goes, any unused CPU is lost money.
Ukdude21 - Thursday, August 15, 2013 - link
You keep complaining about AMD fanboys, but you're obviously an Intel fanboy... You sound completely immature with your constant use of LOL in caps. No need to get too excited about CPUs... they are just part of a machine that you use lol.
Seriously, you should not post on this website if you are just going to be an immature Intel fanboy.
You sound too young to be posting on this website anyways.
"Obnoxious insolence"? You should re-read your own post, as it is completely obnoxious.
Your attitude stinks -.-
Gholt - Thursday, August 29, 2013 - link
All those Intel & AMD fanboys... holy crap -_-
I have an FX-8350 and had an i7 in the past.
I went for the FX-8350 since it was so much cheaper than the i7 3770K while having almost identical performance. AMD just can't be beaten on price-performance; however, Intel's best CPUs will always be better than AMD's best, and that's a fact.
BUT that doesn't make AMD bad, like all these Intel "fanboys" seem to think. Intel and AMD are both bad and good in their own ways, so get the fuck over it, both you AMD AND Intel fanboys.
alcomenow - Friday, October 11, 2013 - link
If you live in a hot and humid climate, an FX processor would be a terrible value. Most people that live in central Florida use air-conditioning 80% to 90% of the year. For my uses, the equivalent AMD CPUs have been tested and shown to consume about 65 watts more under full load and 11 watts more at idle than a comparable Intel CPU. Since I pay 100% of the electrical bill, buying an Intel CPU becomes a significant cost saving over 4 or 5 years. Well, 10 or 50 watts might not seem like it would cost much, but if you factor in daily usage, a/c cost, and 4 years of ownership, then the pennies start to add up.
1 W x 12 hrs/day ≈ 4.3 kWh/yr
4.3 kWh x 1.8 (80% a/c overhead) ≈ 7.7 kWh x $0.11/kWh ≈ $0.85/yr
$0.85 x 4 yrs = $3.40 per watt
Example: $3.40 x 30 W ≈ $102
1 W x 24 hrs/day ≈ 8.6 kWh/yr
8.6 kWh x 1.8 (80% a/c overhead) ≈ 15.4 kWh x $0.11/kWh ≈ $1.70/yr
$1.70 x 4 yrs = $6.80 per watt
Example: $6.80 x 50 W = $340
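The arithmetic above can be wrapped in a small sketch. Note that the 80% a/c overhead, the $0.11/kWh rate, and the 4-year ownership span are this commenter's own assumptions, not universal figures:

```python
def cost_per_watt(hours_per_day, years=4, rate_per_kwh=0.11, ac_overhead=0.8):
    """Lifetime electricity cost of one extra watt of CPU draw."""
    kwh_per_year = hours_per_day * 365 / 1000       # 1 W running h hours/day
    kwh_with_ac = kwh_per_year * (1 + ac_overhead)  # a/c must pump the heat out too
    return kwh_with_ac * rate_per_kwh * years

# 12 h/day over 4 years works out to roughly $3.47 per extra watt, so a
# 30 W gap costs on the order of $104 (the comment's $102 comes from
# rounding 4.38 kWh/yr down to 4.3).
print(round(cost_per_watt(12), 2))
print(round(cost_per_watt(12) * 30))
```

Because the formula is linear in hours, the 24 h/day figure is exactly double the 12 h/day one, which matches the comment's $6.80 vs $3.40 per watt.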
If I was building someone a budget gaming desktop, I would recommend spending $130 on an Intel Core i3-3225 Ivy Bridge 3.3GHz LGA 1155 55W dual-core processor with Intel HD Graphics 4000, running at stock speeds, instead of spending $120 on an AMD FX-6300 Vishera 3.5GHz (4.1GHz Turbo) Socket AM3+ 95W six-core processor, even overclocked to 4GHz.
ToeringsNthong - Saturday, January 11, 2014 - link
Quote: "Single threaded performance is my biggest concern, and compared to Sandy Bridge there's a good 40-50% advantage the i5 2500K enjoys over the FX-8150." I know this is an old post, BUT I just bought its big brother, the FX 8350, for $129.99, and they are saying the same thing: poor single-thread performance. I say WHO CARES? I know most people that have 1/2 a brain don't care!! Most people buy this CPU for gaming! What games can you name that are single threaded? Unless you go back to the stone age and are playing the very first Battlefield 1942? All I do is game and transcode videos, and dollar for dollar this thing kicks Intel's a s s! So case closed!!!! AMD all the way FTW
mersastra - Thursday, January 16, 2014 - link
First time in ages pondering an upgrade. Why, I don't know, as what I have does everything I need; I don't game, by the way. Sitting on a Phenom II X4 965 Black Edition, gently OC'd to 3.7GHz, on an Asus M4A88TD-EVO with 16GB of memory. Not impressed with benchmarks in the least; I used to laugh at them with lenses when I did photography, they are mostly meaningless in the real world. Pondering this chip, currently about £143. Been building machines since 286 days and always use AMD, never let me down, though I have had several Intel ones. If AMD went bust, leaving Intel with a monopoly, they would price themselves out of our reach. My thoughts anyway. Not really bothered which chip as long as it works; each to their own, I don't understand the squabbling. lol.
LBJ - Saturday, January 18, 2014 - link
Did you look at the games benchmarks in this review? It's not that games only use a single thread, it's that plenty of games are not optimised well for multi-threaded CPUs. If we're talking about BF4, fine, the FX chips will perform well against Intel, but it's clear from these benchmarks that in plenty of titles from the past few years the poor per-core performance of the AMD chips does hamper their gaming potential.
Chromatose - Sunday, February 2, 2014 - link
And the fanboyism repeats itself again... AMD is slowly creeping up on Intel; eventually they would have to be equal. AMD does well at 3D modelling, video editing and gaming; Intel is suited more to video editing, rendering and graphics. AMD is low price and value. If you can OC it properly, with good cooling, it beats Intel in SOME areas. Intel, on the other hand, can deliver a tad more power at lower clock speeds. AMD gets lower framerates than Intel, but it's only a *slight* difference in performance. AMD is known to have weak cores and makes up for it by adding 8 of them. Intel has slightly more power in each core, but mainly goes up to 6 cores.
My conclusion? AMD should stop fighting with Intel and slowly work their way up from there.
Melcinitan - Thursday, June 19, 2014 - link
OK people, let's just agree that nothing is free of charge. We HAVE TO PAY to get these bad boys in our PCs, so you can stop bitching about "price don't matter". The AMD FX 8320 is around $160. Let's check what the same price gets you from Intel, right? A dual-core Core i3-4340 is what I found on the net after a quick search. If you tell me that i3 can beat the FX 8320, then yep, Intel is the winner. But if you try to compare an i7 with this FX, that's wrong.
I think AMD tries to sell CPUs to middle-class folks who only want the best performance on a weak salary, while Intel is trying to make the best of the best regardless of how much it will cost. If you have lots of money, buy an Intel and have fun with it. If you don't have that much money but still need some performance, go for AMD.
My choice is paying half the money and having a tick less performance. I'll live.
Melcinitan - Thursday, June 19, 2014 - link
Sorry for bothering you Chromatose, I felt like commenting while I was reading yours. My comment is not regarding yours.
D0ubl3Tap - Sunday, February 9, 2014 - link
Tech support since the first IBM 88/66. AMD has its hits and misses, and I could argue tech benchmarks as well. But I buy my machines for reliability and stability. We supported 65k users with only 12 techs, 24/7. Phenom II was beautiful, so much so that I just started field testing for a rollout of the 8320. So far Intel will still be what it is (Apple, Intel, Samsung): a lot of money spent on hype. You want to talk about what's real-world useful? Well my friend, the business world is all about documents of large size being transmitted, and AMD's multi-thread kicked the crap out of Intel in both time to completion and completion without error. Many, many of our users game. Our test group did admit they loved the 8320 on a 970G. So the conclusion is: if I want to live in a box and run benchmarks, Intel. If I want to get some tasks done, AMD. Period. The added bonus: my AMD vendors are so much easier to work with on the VERY rare occasion of a failure. Intel... not so much.
quickbot - Monday, September 15, 2014 - link
I'm at the moment looking for the best performance/price CPU, so after reading most of this I couldn't resist commenting. Clearly most of you here do OC and are fkn benchmark freaks; while I play MMOs for 6-8 days, you jiggle your PC in benchmarks, licking it and cooling it off to get a higher score... but for what? It's like making a car that does 300-400 km/h and shitting on cars that barely do 200 km/h, but forgetting the fact that there are not many places where you can go that fast on a daily basis, right? So I'm looking at the game stats here, as that's the only thing that makes sense to me. I was looking to buy an 8350 as a little upgrade from my Phenom II X4 965, but I started to wonder why the hell I need such huge FPS, if normally you won't see any real difference between 60-70 FPS and 100. So I decided to buy a 6350 and OC it a bit, as I read only good OC reviews about it. And yeah, it's not a huge difference in money, but still, I'm looking for a good build that won't hurt my pocket much. And to those who start counting electricity bills: you are stupid. NO gamer will use the same system for 5 years, so saying to spend $100 more and buy Intel is damn stupid, as over those 5 years I'd do better to save that $100 and put it towards whatever new AMD CPU they have by then.
I won't lie, I wanted to build an Intel PC, as Intel mobos look so much better in my price range, but when I saw the price of the CPUs my dream was crushed. Here in Latvia I can buy an FX-9370 4.4GHz for almost less than an entry-level i5. If I compare only speed and reviews (as I'm only a gamer and don't know or care about some numbers in tests saying Intel has 5-10% better performance), then in my local store AMD processors beat Intel on price 2x. And for a simple gamer (not making shitloads of money), price/performance is all I need.
Of course these benchmark tests tell you a lot more than they tell me, but really, if you have a 200 km/h car for 2k euro and a 210 km/h car for 4k euro, then it all comes down to how much you can afford and whether you really need, and will use, those extra 10 km/h... right?
So please, stop bitching and talking total crap, as in the end most people who buy this stuff are gamers, not overclockers who just need a better number in a benchmark than the guy next to them. Price is everything, and even more so price/performance. I'd rather lose those 10% FPS and still play the game at 90% than pay 2x more and see no difference on a daily basis.
That's all!
Sorry about my English though, it's very bad, so don't even bother commenting on that as I won't come back here. It just kinda made me sick, all those idiots measuring their dicks on the internet (for that, join the Chatroulette dick flashers).
analogbyte - Tuesday, October 28, 2014 - link
I just remember the days back in 2004, when the Athlon destroyed Intel's reputation. Jump ahead a few years: after Intel failed twice with multicore architectures, they finally came through with Core Duo (their third attempt from scratch), after monopolizing an array of markets under the table (and finally losing in court years later, or too late, paying over 2 billion in damages to AMD). And now Intel, the monarch in CPUs, boasting of their success over a 5-10% performance edge against the competition, being in that place after doing a lot of harm to their younger betters, still and always selling their products in a "milking" way, just makes me shudder about our future. Buying AMD's reasonably priced products makes me feel I am doing my part in maintaining a much-needed competition, which does everybody a lot of good. I think every AMD customer offers Intel customers faster progress and affordable prices. But not many willingly understand. Buying is a choice. And everybody needs AMD but Intel.
arkitek4 - Wednesday, March 18, 2015 - link
I work as an architect specialist for a local Philippine-based company. I decided to purchase a new computer, since my Acer Aspire with an Intel 2330 was a mistake to buy without the benefit of dedicated VRAM: when it came time for me to learn Luminous 3.2, the laptop screen turned blue, hence I decided to buy a desktop PC. I conferred with our resident IT, and he suggested I purchase a PC with an AMD processor, since with Intel, even with their hyperthreading, most of my money would go into the CPU, instead of opting for an AMD and pairing it with a 2GB dedicated-VRAM 128-bit card. Makes a lot of sense really, and sadly, with all the let-down statements about AMD, I can only imagine a life without AMD, where a whole lot of people wouldn't be able to buy a simple-ass PC on account of Intel's exorbitant price range. Make no mistake though, Intel really runs AMD down like a raging bull. But that is crap in the bag; speed isn't the only real issue here, as the other components have to come into play: VRAM, RAM speed and RAM capacity, motherboard, power supply, case, cooling, software. Add them all up, and with an AMD it can all be within arm's reach, as not everyone can afford Intel.
LikeClockwork64 - Thursday, December 24, 2015 - link
The FX 6300 is a great chip for gaming if the game actually utilizes all 6 cores. Since hyperthreading only adds up to 50% more performance, the i3 (which is the only Intel chip within its price range) is actually more like a 3-core Intel processor. That's why the i5 beats it: it actually has a full 4 cores. Since the Piledriver cores are more than half as fast as Intel's, that puts the FX 6300 above i3 performance in properly threaded games, and within striking range of the i5. The FX 8350 ends up between the i5 and i7 in games that like 8 threads. At less than $150, that makes the 8-core Piledriver chips very competitive with the i5.
If only games didn't emphasize the performance of the first two cores so much. AMD would have a serious winner with a 6-core APU.
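The core-count arithmetic in that comment can be made explicit with a back-of-the-envelope sketch. The hyperthreading bonus is the comment's own figure, and the Piledriver-to-Intel per-core ratio of 0.6 is an illustrative assumption ("more than half as fast"), not measured data:

```python
# Effective multithreaded throughput, in "Intel core" units,
# under the comment's rough assumptions.
HT_BONUS = 0.5   # hyperthreading adds up to 50% per physical core (comment's figure)
PD_RATIO = 0.6   # one Piledriver core at ~0.6 of an Intel core (illustrative)

i3      = 2 * (1 + HT_BONUS)  # 2C/4T -> about 3 "cores"
i5      = 4 * 1.0             # 4C/4T
fx_6300 = 6 * PD_RATIO        # about 3.6 "cores"
fx_8350 = 8 * PD_RATIO        # about 4.8 "cores"

# With these numbers the FX-6300 lands above the i3, and the FX-8350
# above the i5, in well-threaded workloads, matching the comment's claim.
print(i3, i5, fx_6300, fx_8350)
```

The estimate only holds for workloads that actually scale across all the cores; in games leaning on one or two threads, the per-core ratio dominates instead.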
CosmicTrek - Tuesday, May 31, 2016 - link
You buy what you need for what you do; it's not that complicated. If I'm building a budget PC aimed at being a "console killer", why would I spend around $300 for an i5 4690K and board when I can buy an equally priced board and an FX 6300 for a little over $150 total, which performs great in gaming, multitasking, and some mild video production? You pay for what you get. (Prices were found on Newegg at the time of posting.)
CosmicTrek - Tuesday, May 31, 2016 - link
The people that buy overkill hardware that they don't utilize 50% of are the same people spending over $1000 for a GTX 1080 REFERENCE at launch.
Kilon - Saturday, April 29, 2017 - link
I have an electricity flat rate, and I mean that seriously. (I had no idea; I only thought/think that AMD is being misleading by calling it an octo-core. It's like the i3-540 I bought in 2010: it had 2 cores but was sold as 4 x 3.06GHz. Intel stopped that around the time I bought that system.) Why is power consumption so important for the average user, or is it something only for users who really know their stuff? I thought max consumption was 125W and wondered about the ~180-190W figure, but that's "system power". I have an ASUS M5A97 R2.0 motherboard. I have not bought a new CPU yet because the prices are crazy thanks to the now very weak Euro against the USD: Intel CPUs released in late 2014 at ~330€ cost ~350€ in late 2016! I have had a PC since I was 6-7 years old, and I can't remember a CPU ever being more expensive 2 years after release than at launch, with the "new" ones costing as much as a complete "low budget gaming" system... So I use what I have. GTA 5 runs quite fine with 2.6GB worth of settings on 2GB of available VRAM (thanks to Nvidia GeForce Experience; these settings are really cool. Manually I can't apply them, the game tells me I don't have enough VRAM, but somehow the GeForce software is able to, and it's no problem). I'm a bit angry that I did not even try GTA 5 earlier on this 8320 / GTX 760 OC ASUS (1072MHz normal, OC'd to 1150MHz; stock is 980MHz with a 1033MHz boost I think; only 2GB VRAM @ 6008MHz, which I run at ~6200).
So it's OK to use the system like I did?! I sometimes pushed the AMD overclock a bit; is it damaging my system??
Kilon - Saturday, April 29, 2017 - link
Sorry for posting again, I mean GTA 5 works really great (I did not try because GTA 4 is from 2008, the i3-540 is from 2010 and I had a HD 5770 which I overclocked from 850/1200 to 875-900/1250-1300 (GPU/VRAM), so a 2 year after GTA 4 was made CPU and a GPU which is not great but in 2010 was not crap was not able to play GTA 4 at everything max, now the GTA 5 runs really great and I "lost" over 1 year because I did not try (same with the free win 10 upgrade), the graphic is great, even with the 2,6GB settings, i wonder that the FX-8320 and the ASUS DirectCU-OC 760 works soo great at GTA 5! Only a few things are "off" which could be done by a 900-series I think,So I wonder, but maybe GTA 5 is simply one of the few games where it is like this and the other thing is the "optimized settings" from Geforce Experience, I would enable/do other settings, so I now enjoy it... but maybe GTA 5 already is a old game....