
  • mostlyharmless - Tuesday, January 13, 2015 - link

    "These processors are similar to their non-E counterparts, the FX-8370 and FX-8320, but with a lower base frequency but the same turbo frequency. This means, in theory, they should be as quick and responsive for most day-to-day tasks as their 125W brethren, but a bit behind when it comes to the hardcore processor mechanics. "

    Given what's said in the first sentence, isn't the second sentence just the opposite of the logical conclusion?
  • evilspoons - Tuesday, January 13, 2015 - link

    No. It bursts to the same frequency when it has otherwise been idle, because power draw over time (heat output) has been low, so 'day-to-day tasks' feel the same. But when you push it hard and hit the power limit, the fact that it's capped at 95W instead of 125W means it will throttle back sooner and therefore be slower.
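
    As a toy model (not AMD's actual boost algorithm - the 140W load figure below is invented; only the TDPs and clocks come from the spec sheets):

        # Toy model of turbo under a TDP cap: both chips reach the same peak
        # clock, but the 95W part runs out of power budget sooner under a
        # sustained load and falls back toward its (lower) base clock.
        def sustained_clock(tdp_watts, load_watts_at_turbo, turbo_ghz, base_ghz):
            """Clock a chip can hold under a long, steady all-core load."""
            if load_watts_at_turbo <= tdp_watts:
                return turbo_ghz  # enough budget to hold turbo indefinitely
            # Scale the clock down roughly in proportion to the power deficit,
            # but never below the guaranteed base clock.
            return max(base_ghz, turbo_ghz * tdp_watts / load_watts_at_turbo)

        # FX-8320 (125W, 3.5 GHz base) vs FX-8320E (95W, 3.2 GHz base), both
        # with a 4.0 GHz turbo, at a hypothetical 140W full-turbo draw.
        print(sustained_clock(125, 140, 4.0, 3.5))  # ~3.57 GHz sustained
        print(sustained_clock(95, 140, 4.0, 3.2))   # 3.2 GHz (base) sustained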
  • mostlyharmless - Tuesday, January 13, 2015 - link

    Thanks! A much clearer explanation.
  • OrphanageExplosion - Tuesday, January 13, 2015 - link

    You guys really need to find a series of gaming benchmarks that are actually CPU-heavy - none of those titles are (BF4 is, but not in the area tested). How about AC Unity or Crysis 3 for starters? Total War? Some more games that people are actually playing, perhaps?
  • postem - Tuesday, January 13, 2015 - link

    This is actually much easier for them, as they already have a load of test results for these games from the review. They just run the same games with this CPU and add it to the graphs.
    BTW, if you buy this CPU - or any AMD CPU, even the 220W monster - expecting stellar gaming performance, you will be suffering.
    I was using an i7 950 @ 4.2 GHz. Until I upgraded to a 4790K, I didn't realize how much of a bottleneck I had. Not only that, but the stutterfest was gone.

    Meanwhile, for a budget office machine I would go with an i3 or a new Pentium. The price difference isn't that much and you get more performance per buck.
    Getting the 220W processor is a completely insane bid; unless you are a fanatical AMD supporter, you can clearly get a better deal in terms of cooling and performance with an i5.
  • StevoLincolnite - Tuesday, January 13, 2015 - link

    My upgrade from the Phenom II X6 1090T to my Core i7 3930K says otherwise.
    Same with the jump from my 3930K to a 5930K.

    Granted... I also run Eyefinity, so I am always GPU limited.
  • yannigr2 - Wednesday, January 14, 2015 - link

    Unless you are a fanatical Intel supporter, for an office machine you will go with a quad-core FM2/FM2+ processor.
  • OrphanageExplosion - Wednesday, January 14, 2015 - link

    Unless you want some kind of viable CPU upgrade path.
  • jabber - Wednesday, January 14, 2015 - link

    Office machines getting upgrades? Hahahahahaaaaaaaa

    Meanwhile back in the real world!
  • phoenix_rizzen - Thursday, January 15, 2015 - link

    We've successfully upgraded our AMD-based systems from Sempron to Athlon-II X2 to Athlon-II X3 to Athlon-II X4 without changing motherboards.

    Over the years we've moved to new motherboards for newly purchased systems, but even those have had their CPUs upgraded.

    In our server systems, we've gone from AMD Opteron 6100 to 6200 to 6300 (8-core to 16-core) without changing motherboards.

    It's one of the main reasons we've standardised on AMD for just about everything: you can upgrade CPUs without changing motherboards. (Up to a point, of course; when we need more features from the chipset or faster versions of RAM, then we'll change motherboards.)

    The other nice thing about AMD CPUs is that every CPU supports all the same features (mainly the virtualisation-related features) across all models (even Sempron CPUs support SVM). Trying to decipher the Intel CPU feature matrix and model numbers is a nightmare! And they change CPU sockets on an almost yearly basis.
  • jabber - Friday, January 16, 2015 - link

    And with the labour charges etc. all included, you could well have just bought a job lot of new Dell Dimensions.

    Plus decent AM3 chips are très expensive now.
  • phoenix_rizzen - Friday, January 16, 2015 - link

    Actually, our hardware costs are decreasing slightly each year while the hardware that money buys is improving. Our current desktops are around $150 CDN including a motherboard with onboard Radeon graphics, an Athlon-II X4 CPU, and 2 GB of RAM. We run diskless Linux, so no harddrive, no floppy drive, no optical drive; the only moving parts are the CPU fan, PSU fan, and case fans (and sometimes we even remove the case fan).

    Our original build with onboard nVidia 6100 graphics, 512 MB of RAM, and a Sempron CPU was over $200 CDN 7-odd years ago.

    Buying the CPUs in bulk for upgrades was less than half the cost of a new system. Buying RAM upgrades was much less than half the cost of a new system. And a single tech working for a full day could upgrade an entire lab of 30 stations with some time to spare for testing ... for less than the cost of a single new system.

    We've been doing this for just over 12 years now. We know which is less expensive for us, and it's not buying name-brand computers with Intel CPUs and chipsets. Every time we put a bid out for systems, the Intel systems are more expensive without being a whole lot more powerful, and they require discrete GPUs, whereas the AMD systems include graphics support on the motherboard (Intel 3D has improved over the years, but still doesn't hold a candle to nVidia or AMD).
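
    The back-of-the-envelope math behind that (every dollar figure below except the $150 system price is a hypothetical stand-in, just to show the shape of the argument):

        # Refreshing a 30-station lab: drop-in CPU upgrades vs. all-new boxes.
        stations = 30
        new_system = 150       # rough whole-box cost (CDN$), per the post above
        cpu_upgrade = 65       # hypothetical bulk CPU price, "less than half"
        tech_day = 300         # hypothetical cost of one tech for one day

        print("all-new systems: ", stations * new_system)               # 4500
        print("in-place upgrade:", stations * cpu_upgrade + tech_day)   # 2250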
  • ddriver - Wednesday, January 14, 2015 - link

    Why would you go for an AMD build? I am not a big fan of Intel's past practices, and as much as I sympathize with AMD, their products are simply way too weak. Their performance-per-watt ratio is so low that the lower hardware price doesn't really matter; you still end up paying more for it once you account for the electricity bills.

    Besides, for an office machine, a 5W ARM board costing $35 suffices.

    The only reason I can think of for buying AMD is if you want to burn some money to keep AMD afloat for the sake of not leaving Intel without competition. Not that AMD is much of a competitor anyway... more like a perpetually crippled "competitor" existing solely for the purpose of keeping Intel from running unopposed.
  • phoenix_rizzen - Thursday, January 15, 2015 - link

    See my post just above yours (posted after yours in time).
  • ddriver - Saturday, January 17, 2015 - link

    So the people who benefit from AMD are those who don't have enough money, so they can end up spending more for less over time? Doesn't sound like a good deal...
  • phoenix_rizzen - Monday, January 19, 2015 - link

    And I guess black is white and down is up in your world?
  • Jinx50 - Sunday, January 25, 2015 - link

    Ironically, in contrast to all the misinformation spewed "above and below", I still play Crysis 3 on ultra with an overclocked 1090T @ 3.8GHz and an HD 6970.

    I'm still waiting for the unplayable game to arrive "to give me a reason to upgrade". Meanwhile I have to ask: how many Intel chips and boards have you all burned cash on in the last 5 years?

    I'm not hating on Intel, just stating facts "in my instance, in regards to the bang-for-the-buck factor", and I will probably snag an FX 8xxx when this rig finally hits the medium-settings wall.

    AMD is not a processor for those who don't have the money - "on the contrary", it's the processor for those who want to KEEP THEIR MONEY. ROFL. I could dump money on an Intel, but do I want to ride that roller coaster? NOPE.
  • Oxford Guy - Thursday, April 2, 2015 - link

    The minimum frame rates in that Bioshock Infinite chart are worrisome.
  • stefstef - Wednesday, January 21, 2015 - link

    Nope. Intel just has the better processor portfolio. This is not because AMD CPUs are so bad, but because Intel has the advantage of a much smaller production process (22nm instead of 32nm). They are technically ahead in every sector: design, process and manufacturing. Nonetheless, AMD makes sense, as Intel charges quite good money for the premium. The usual jobs can be done by an AMD just as well as by an Intel.
  • TheinsanegamerN - Wednesday, January 14, 2015 - link

    I wouldn't go so far as to say that AMD won't give you a good experience. On a bet, I traded my motherboard and CPU with my PC gaming friend, and went from an i5 3570K to an FX-6300. Know how much of a difference there is? None. Both get 60fps in everything at 1200p with my 770. I will say, if I have multiple game servers running in the background, the FX does not slow down nearly as much as the i5 ever did; even though the i5 was hypothetically more powerful, it couldn't multitask as well.
    And with the new consoles both coming with 8-core CPUs, I think AMD chips will still work well, at least for the foreseeable future.
  • OrphanageExplosion - Wednesday, January 14, 2015 - link

    I played Crysis 3 on an i3-4130 and an i5-4690K. On the jungle stages, there's a night-and-day difference. I suspect you would notice the same going from a 3570K down to an FX-6300. Most titles are GPU-bound, but the CPU can cause frame-rate drops too.
  • Cryio - Wednesday, January 14, 2015 - link

    For office use, an AMD A6 APU is more than enough.
  • aphroken - Wednesday, January 14, 2015 - link

    for office, a calculator and a typewriter suffice
  • barleyguy - Thursday, January 15, 2015 - link

    Lame comment, seriously.

    My work laptop is a high-end i7 (Dell 4800 mobile workstation), and my home office machine is an AMD FX-6300. The AMD machine feels every bit as fast.

    For typical office applications, there will be no noticeable difference between an AMD and an Intel CPU. Storage is generally the bottleneck, so a fast SSD is a more worthwhile upgrade than a faster processor.

    Even for development (which is why I have a mobile workstation), it's a barely noticeable difference. The longest part of a build is the unit tests, and they are mostly network-bound.
  • AnandTechLies - Wednesday, January 21, 2015 - link

    Some guy trying to tell people to waste money on an i7. If you buy an i7 it takes up most of the money (unless you have a shit ton of money) and the rest of the desktop suffers, and AMD processors are actually GOOD. This INFORMATIVE website lies and has been caught faking its review benchmarks to make people buy Intel over AMD. Read my username and LISTEN. Also do not trust Tomshardware, it makes up bullshit to
  • D. Lister - Friday, January 23, 2015 - link

    Do you think regular aluminium foil is as good as tin foil for making hats? Or can the invisible Martian invaders read our thoughts through it?
  • happycamperjack - Tuesday, January 13, 2015 - link

    I agree! I've complained about this before. These gaming benchmarks are some of the worst games to use for both CPU and GPU testing, besides maybe Tomb Raider. The consumer enthusiast CPU market is pretty much dominated by gamers. If you are running irrelevant benchmarks, what's the point?

    Suggestions for games to benchmark:

    Dragon Age: Inquisition: Incredibly well threaded and scaled. Hyperthreading actually gives a big boost in this game. A good indicator benchmark for upcoming Frostbite 3 engine games.

    Crysis 3 or Ryse: Son of Rome: Very well threaded and scaled games. Good benchmarks for upcoming Crytek games.

    Maybes:
    Far Cry 3: I'd suggest Far Cry 4 when it's fixed. Another very well threaded game that's well representative of open-world games.

    Metal Gear Solid V: Ground Zeroes: A good benchmark for upcoming FOX engine games such as the popular Pro Evolution series.

    I would also suggest AC Unity if it weren't so broken for multi-GPU setups. Yeah, SLI and Crossfire are still broken at the moment.
  • MapRef41N93W - Wednesday, January 14, 2015 - link

    Tomb Raider is horrible for CPU comparisons. One of the absolute worst GPU-intensive games on the market for testing CPUs. A Pentium G3258 can almost match a 4770K in that game.
  • OrphanageExplosion - Wednesday, January 14, 2015 - link

    Tomb Raider is useful for GPU testing, as it's very heavy on the GPU when ramped up to the max, but you're right that its CPU usage is minimal.
  • OrphanageExplosion - Wednesday, January 14, 2015 - link

    I actually played Tomb Raider on a G3258 and a 4790K. With a 4.5GHz OC in place (and v-sync active), the Pentium was a total match for the i7. However, at stock speeds, the Pentium saw clear lag in CPU-heavy areas - physics etc.

    However, the benchmarking sequence doesn't have any physics-heavy elements.

    Reviewers need to start actually playing games and finding the bottleneck areas rather than just running canned benchmarks that prove very little. If that means fewer datapoints, so be it. I'd rather have a smaller amount of useful data than a larger amount of meaningless data.
  • happycamperjack - Wednesday, January 14, 2015 - link

    That's why I put "maybe". It's definitely not an ideal game to use for benchmarking, that's for sure.
  • Zap - Wednesday, January 14, 2015 - link

    Games that people are playing? You mean those super-demanding ones like League of Legends and World of Warcraft, right? Because those two games have the lion's share of active gamers and actual game time right now. Since WoW's latest expansion came out, the two combined (using Raptr numbers) account for 35% of the actual time spent in game by all PC gamers, trailed distantly by DotA 2 at around 5% in 3rd place. "Demanding" games like BF4 can't even break the top 10, and barely exceed 1%.
  • jabber - Thursday, January 15, 2015 - link

    I would add The Sims to that very small list. The games I see installed on customers' machines are actually pretty rare. In fact, if you want to cover 90% of PC users, just include a Solitaire benchmark.
  • dr_psy - Tuesday, January 13, 2015 - link

    When are you people going to stop with that crappy "Power Consumption Delta" and go back to raw values?

    This alone is reason enough to look to other sites for reviews. Even more so these days, when power consumption values are so important. :(
  • hojnikb - Tuesday, January 13, 2015 - link

    C'mon AMD, update your damn chipsets. Having almost 6-year-old chipsets (the 900 series is really nothing more than a rebadge of the 800 series) with no entry-level option (like the H81 with Intel) is just sad for a platform that is supposedly aimed at the budget segment.
  • Acreo Aeneas - Tuesday, January 13, 2015 - link

    You do realize this is an article about an AMD CPU, right? AMD CPUs are not compatible with Intel chipsets. Wish people would read the article before coming up with random commentary.
  • hojnikb - Tuesday, January 13, 2015 - link

    Who was talking about compatibility with Intel chipsets?

    I'm just pointing out that AMD has no budget chipset option (and no, I'm not gonna count the 760G, since that stuff is literally ancient).
  • hojnikb - Tuesday, January 13, 2015 - link

    >Wish people would read the article before coming up with random commentary.

    Same could be said for you.
  • silverblue - Tuesday, January 13, 2015 - link

    That's not what (s)he said. The 970 isn't as budget as we'd like - it can be about twice the price of an H81 board - and it's still on 65nm fabrication, which means slightly higher system power consumption. A refreshed chipset for FX may not make sense, but if they were to offer more USB 3.0 ports, PCIe 3.0 and - heaven forbid - a 32nm southbridge, and cut the price down a little, an FX system would make a little more sense.
  • hojnikb - Tuesday, January 13, 2015 - link

    Exactly my point. You can grab a cheap H81 mobo and the cheapest i5 for around the same money as an 8320 and a 970 mobo. And with Intel you get a superior platform (even the cheapest chipset offers native USB3 and PCIe 3), better power consumption and better single-thread performance.

    These AMDs make little sense outside of very specific workloads where many Bulldozer cores come in handy.
  • cobrax5 - Wednesday, January 14, 2015 - link

    I don't think they were saying Intel chipsets work with AMD; they are saying Intel has a greater variety of options, from value to performance.

    I'm actually a fan of AMD keeping their chipsets around as long as possible, as it allows people to upgrade just their old CPU. However, 6 years is a bit long. They really do need to completely revamp it. I guess they are just buying time until SkyBridge or whatever the ARM/x86 project is called...
  • eanazag - Tuesday, January 13, 2015 - link

    The budget chipset is the 970. Real budget means a different AMD CPU, like the AM1 socket or the FM2 APU sockets.

    That is just how AMD does it, which is different from Intel. Intel offers 2 sockets. AMD has 3.
  • hojnikb - Tuesday, January 13, 2015 - link

    Not really. The 970 is more like a midrange chipset, like the H97 with Intel. You can't find a cheap board with the 970 like you can with Intel.

    And having a different socket for low/midrange also kills it. Basically you can't combine a cheap board with a fast CPU like you can with Intel.

    AMD should just stick to one socket: AM4 for APUs and normal CPUs like FX. That way you have a much better selection for a given platform and a better upgrade path (unlike FM2+).
    And ditch AM1; nobody really needs a socketed option at the lowest end. Just save a bit of money and solder that puppy in there.
  • abhaxus - Tuesday, January 13, 2015 - link

    I picked up an ASRock 970 Extreme4 on Newegg for $69... how much cheaper do you need?
  • hojnikb - Wednesday, January 14, 2015 - link

    You can grab an H81 mobo for as little as $44.99. That's a ~$25 difference, which could be spent elsewhere.
  • jabber - Wednesday, January 14, 2015 - link

    Does it have serial and parallel ports on the back? ;-)
  • hojnikb - Wednesday, January 14, 2015 - link

    It has, actually (looking at the H81M-D Plus).
    I don't see an issue here.
  • Cryio - Wednesday, January 14, 2015 - link

    Their newest platform, FM2+, is for home office, HTPC and low-end gaming. So if you want the latest in mobo tech from AMD, you need to get either an APU or an Athlon.
  • LarsBars - Tuesday, January 13, 2015 - link

    I snagged two FX-8320Es for $119 each to use in two Hyper-V lab machines to leave powered on around the clock. I wanted the extra threads, high memory capacity, and low cost (and, as much as possible for an AMD machine, low power consumption.)

    They seem to be working out great. I am really glad to see AT still reviewing AMD even though it seems people have given up on them. Thanks, Ian.
  • Samus - Tuesday, January 13, 2015 - link

    Wow, AMD hasn't had a chipset update in 4 years? I wish Intel would take a note from them.
  • ExarKun333 - Tuesday, January 13, 2015 - link

    Yeah, who needs PCIe 3.0, M.2 drives, etc? LOL
  • royalcrown - Tuesday, January 13, 2015 - link

    M.2 is kind of dumb; stuff should just hurry up and migrate to PCIe
  • YukaKun - Tuesday, January 13, 2015 - link

    You're not serious, right?

    The only real thing the 990FX chipset needs is more USB3 ports, and that's about it. I don't remember if the NB needs "DDR4 support" of some kind. Maybe extra interconnect logic.

    PCIe 3.0 is not relevant right now since PCIe 2.0 is still enough; not even with 3 or 4 cards in tandem is PCIe 3.0 justifiable as a "required feature". M.2 drives... I don't even know why that's something you'd want in a desktop PC.

    Cheers!
  • postem - Tuesday, January 13, 2015 - link

    All while the Z97 chipset forces you to scale down PCIe lanes for every device you plug in.
    Intel forces a chipset/socket change every 2 years to make you buy new motherboards; it's part of their game with their partners.
    I'd really like to go back to AMD, but at the moment there really isn't a reasonable performance CPU in their lineup, and I doubt there will be in the foreseeable future.
  • Samus - Tuesday, January 13, 2015 - link

    Intel actually launched the 90-series chipset 9 months after the 80-series. Granted, all rev C0 and newer 80-series boards run any CPU the 90-series does, but a lot of early adopters of the H81/H87/Z87 got screwed into no upgrade path.

    I agree, M.2 in the desktop makes zero sense, and PCIe 3.0 is useless and will be for years. Most GPUs don't even utilize the full bandwidth of PCIe 2.0 x8.

    As I said, Intel could think a little more forward (they're capable of doing so), but I think marketing and bureaucratic politics are making the engineers push chipsets like there's no tomorrow. Intel actually makes as much selling chipsets as they do selling entry-level CPUs.
  • hojnikb - Tuesday, January 13, 2015 - link

    >The only real thing the 990FX chipset needs is more USB3 ports, and that's about it. I don't remember if the NB needs "DDR4 support" of some kind. Maybe extra interconnect logic.

    Not just more - the 990FX doesn't have any kind of native USB3 at all. All you see is third-party support via ASMedia or similar chips.

    So yeah, AMD needs native USB3 and PCIe 3.0 on its chipsets.
  • silverblue - Tuesday, January 13, 2015 - link

    Assuming there's demand, AMD should look at producing the 970 on 32nm (GF must have some spare capacity, surely?). Throw in native USB3 and halve chipset power consumption all at the same time. A 970A, if you will.
  • hojnikb - Tuesday, January 13, 2015 - link

    They're never gonna do that.
  • III-V - Tuesday, January 13, 2015 - link

    >PCIe 3.0 is not relevant right now since PCIe 2.0 is still enough; not even with 3 or 4 cards in tandem is PCIe 3.0 justifiable as a "required feature". M.2 drives... I don't even know why that's something you'd want in a desktop PC.

    Lol, no matter how many times this argument gets defeated, it still pops up.

    GPUs are not the only devices that utilize PCIe. Case in point: SSDs.
  • Cryio - Wednesday, January 14, 2015 - link

    PCIe 2.0 still isn't getting maxed out. You are losing... 3% of performance by not going with PCIe 3.0. A lot of people don't even know what M.2 drives are.

    We only need a proper USB 3.0 implementation. Current 970/990 boards have USB 3.0 ports, but they're not native and not as fast.
  • Kevin G - Tuesday, January 13, 2015 - link

    Updating the chipset is necessary to get new IO onto a platform. Things like USB 3.0 can then be integrated into the chipset so that a 3rd-party controller is no longer needed.

    Rather, the nice thing is that AMD hasn't changed socket in 5 years with AM3+. Intel, on the other hand, has had three sockets for chips with dual-channel DDR3, 16 PCIe lanes and DMI to the chipset. Sure, a few things changed between sockets 1156, 1155 and 1150, but it would have been nice if Intel had been forward-thinking and maintained compatibility. The quad-core Lynnfield chips are still respectable in terms of CPU performance today.
  • Penti - Tuesday, January 13, 2015 - link

    They have basically frozen their whole platform since the failed Bulldozer release. Their chipsets haven't changed much since the AM2+/AM3 days; the NB is basically the same as the 2007 790FX. The only thing the 990FX added over the 790FX is probably IOMMU support (plus updating the PCIe lanes used by the SB to PCIe 2.0, also called A-Link Express), which was first found in the identical 890FX; IOMMU was also found in server chipsets built on the 800 series back in '10. The southbridge on AM3+ hasn't changed since launch and is basically the same as the 2010 SB850.

    It's planned as a 4-year-or-so gap. They canceled a lot of designs and plans, including a new server socket and server chipset. They also scrapped the plans for new BD chips for AM3+/a successor socket, which is why you have no Steamroller or Excavator, and why you have no 10/20-core 2nd-gen Bulldozer/Piledriver for servers. So the chipset is really 5 years old, based on the same tech as the 7-year-old NB, built on the same process, which had HT3 support back then too. The NB is really 2007-era with minor changes, and the SB has no USB3 support.

    Intel did basically have a platform which didn't get a chipset update for ~3 years - the X79. That wasn't such a good deal thanks to no native USB3 support, few SATA 6G ports and so on. Lots of bugs too.
  • stefantalpalaru - Tuesday, January 13, 2015 - link

    Here is a series of benchmarks I did on the same (rather modest) motherboard with a Phenom II X6 at 3.9 GHz and the FX-8320E at stock frequency and at a 4.5 GHz overclock: http://openbenchmarking.org/result/1412036-KH-MERG...
  • mikato - Wednesday, January 14, 2015 - link

    Very cool! Thanks for this. If I interpret the colors correctly, it looks like the X6 mostly wins against the stock FX-8320E, but the overclocked FX-8320E mostly wins.
  • LarsBars - Saturday, January 17, 2015 - link

    I used some of the sorting tools and it looks like your summary is correct.

    But keep in mind that the Phenom II X6 is overclocked... so it's kind of hard to draw conclusions from it. I guess it would have made sense if it was stock vs stock (which we sort of already have in the AT article) or OC vs OC (since AT didn't OC an X6).

    $0.02
  • SpaceRanger - Tuesday, January 13, 2015 - link

    The only graph that I wanted to see was the power consumption graph, and it's not included... :(
    I really wanted to see just how much they chopped off the power consumption this go-around.
  • silverblue - Tuesday, January 13, 2015 - link

    Toms benched the 8370E and appeared to get some very interesting power readings. Though this doesn't necessarily ring true for the 8320E, it may be helpful nonetheless:

    http://www.tomshardware.com/reviews/amd-fx-8370e-c...
  • sonicmerlin - Tuesday, January 13, 2015 - link

    The most painful aspect of AMD's single threaded performance woes is that Intel hasn't even bothered with increasing IPC since Sandy Bridge. AMD needs their new architecture to be a smash hit if they want to avoid bankruptcy.
  • silverblue - Tuesday, January 13, 2015 - link

    And that's what bothers me about Carrizo not coming to the desktop. We don't know if that 30% IPC boost is across the board or mainly a result of FPU gains, but at the very least, Carrizo should bury Phenom II and previous Bulldozer designs, and at least equal Lynnfield/Nehalem at the same clocks, but for far lower power consumption. Still, performance-wise, it's not exactly a lofty goal - you'd need more than four cores for that.
  • xenol - Tuesday, January 13, 2015 - link

    It's not that software most people use didn't take advantage of Bulldozer because it wasn't multi-threaded. It's that software most people use doesn't take advantage of CPU performance, period. Most of the programs in your task manager sit idle.

    Modern software is multi-threaded, as in, it has multiple threads, and all the major OSes (Windows, Linux, Mac OS X) schedule at the thread level on any available resource. If programs don't take advantage of multiple cores, it's not because they are "single threaded"; it's because what they do isn't very taxing for a CPU.
  • xenol - Tuesday, January 13, 2015 - link

    Since I can't seem to edit (or it's not made obvious)...

    I wanted to point out that if a program is "single-threaded", it's really using a synchronous threading model, wherein threads wait in line to be run. I'm pretty certain a lot of modern programs are asynchronous.
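
    A contrived sketch of the distinction (Python, with a hypothetical do_work task; note that CPython's GIL serializes CPU-bound threads anyway, which is a separate wrinkle):

        import threading

        lock = threading.Lock()

        def do_work(task_id):
            print("task", task_id, "done")

        def serialized(task_id):
            # "Waiting in line": every thread queues behind one lock, so extra
            # cores add nothing - the program is effectively single-threaded
            # no matter how many threads it spawns.
            with lock:
                do_work(task_id)

        def concurrent(task_id):
            # Independent work: the OS scheduler is free to spread these
            # threads across whatever cores are available.
            do_work(task_id)

        threads = [threading.Thread(target=concurrent, args=(i,)) for i in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()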
  • eanazag - Tuesday, January 13, 2015 - link

    With an updated chipset and a new manufacturing node, this CPU could still play in the market.

    At release, the 900 series chipset was better than Intel's then-current solution, which had no USB 3 and no SATA 3. It took Intel a generation too long to catch up on SATA 3 (Sandy Bridge). With Ivy Bridge, Intel passed AMD in every aspect.

    We're talking Broadwell now, and it is plain sad at this point. The Haswell Core i3s beat this processor much of the time when it's not overclocked. The value proposition is not in AMD's favor unless someone just wants to overclock something.
  • hojnikb - Tuesday, January 13, 2015 - link

    >At release, the 900 series chipset was better than Intel's then-current solution, which had no USB 3 and no SATA 3

    Not really. The 900 didn't have native USB3 (so pretty much the same deal as Intel - they had to use a 3rd-party solution). And in 2009, SATA 6G wasn't really a thing yet (no fast SSDs back then).
  • hojnikb - Tuesday, January 13, 2015 - link

    Correction: the 900 series was released in 2011, when SATA 6G SSDs were starting to pop up...
  • Laststop311 - Tuesday, January 13, 2015 - link

    You just can't defend these FX-line CPUs. You are just a million times better off coughing up an extra 200 dollars and getting a nice Z97 board + a Devil's Canyon i5 and overclocking it to an easy 4.4GHz. Even in software that can make use of all 8 cores on BD, a 4.4GHz Devil's Canyon i5 has a good chance of out-muscling it or coming very close, and in anything else the i5 just blows BD away.

    If you are going to buy an AMD FX based system, just stop yourself, put 50 dollars away from each of your next 4 paychecks, and get an i5-4690K + Z97 system instead; you will be much happier. Even if for some reason I had to have 8 cores in my next system, I'd spend the extra 1400 or whatever and get an i7-5960X + X99 system. It's disgusting that a 4.0+GHz overclocked Phenom II X6 1100T can outperform the new FX chips at MANY tasks. I would actually recommend an AMD buyer just get an 1100T with quality water cooling over a current FX chip: you can find super-cheap 1100Ts, which makes the 1100T + good water cooling the same price as a new FX + a decent air cooler, and you can get better performance with the massively OC'd 1100T.
  • III-V - Tuesday, January 13, 2015 - link

    >You just can't defend these FX-line CPUs.

    Oh yes you can. There are thousands of people in the AMD Defense Force that do just this every day.
  • silverblue - Wednesday, January 14, 2015 - link

    Haha I like this one. :)
  • jabber - Thursday, January 15, 2015 - link

    To me the difference is like going into a restaurant and ordering the AMD pizza, which comes in at 15" for $20, or the Intel pizza, which is 18" and costs $5 more. Most people probably can't manage to eat the AMD, let alone the Intel.

    It's not like we are back in the days of AMD stuck at 20 FPS average and Intel at 30 FPS average.
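
    For what it's worth, the per-square-inch math on those made-up pizzas actually favors the bigger pie - the point being that either one is already more food than most people need:

        # Price per square inch of the two hypothetical pizzas.
        import math
        for name, diameter_in, price in [("AMD", 15, 20), ("Intel", 18, 25)]:
            area = math.pi * (diameter_in / 2) ** 2
            print(name, round(price / area, 3), "$/sq in")  # 0.113 vs 0.098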
  • Oxford Guy - Thursday, January 15, 2015 - link

    That's a really awful analogy. If your primary taxing use for your computer is gaming then AMD processors are a poor option.
  • jabber - Friday, January 16, 2015 - link

    Not really.

    "Oh noooo, my $100 AMD chip is only giving me 86FPS!" How terrible!

    How did we ever manage back in the 1990s?
  • LeptonX - Tuesday, January 13, 2015 - link

    How long do you test with OCCT, and at what data size? I ask because I often get crashes after 3-6 hours, and if I want total stability that lowers my OC by as much as 200MHz.
  • Oxford Guy - Sunday, April 19, 2015 - link

    If the chip even lasts that long at 1.550 volts and what is most likely a temperature well above the rated maximum.
  • bsim500 - Wednesday, January 14, 2015 - link

    "Idle to Delta Power Consumption":-

    "95w" FX-8320E = 86w
    "95w" FX-8370E = 127w
    "125w" FX-8350 = 163w
    "220w" FX-9590 = 272w

    And this is precisely why people want the actual idle & load figures, not worthless "delta" scores, which still don't reveal whether, e.g., one platform is idling 20W higher than another. It doesn't involve any extra work, since you need to acquire both numbers anyway to calculate the delta!
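
    The arithmetic that makes deltas misleading (the idle and load numbers below are invented; only the 86W delta matches the chart):

        # Delta = load - idle, so two platforms with very different idle
        # draws can post an identical delta.
        idle_a, load_a = 60, 146   # hypothetical platform A
        idle_b, load_b = 80, 166   # hypothetical platform B: idles 20W higher
        print(load_a - idle_a)     # 86
        print(load_b - idle_b)     # 86 -> the same "delta" hides a 20W gap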
  • yannigr2 - Wednesday, January 14, 2015 - link

    Every time I see charts like these in the article, I bang my head against the wall, shouting
    "32nm Thuban. 32nm Thuban, you morons!".
  • Cryio - Wednesday, January 14, 2015 - link

    AnandTech guys, please, sometime in 2015, update the charts with some newer games that are also CPU-dependent.

    • Tomb Raider is exclusively GPU-dependent.
    • F1 2013 is old; 2014 was released some time ago. You should probably benchmark GRID: Autosport instead, since it's the newer game with the better-optimized engine. F1 games have always done poorly with AMD CPUs, for whatever reason.
    • Battlefield 4 should be tested only in MP. And if you really want to emphasize what CPUs can do in that game, get an HD 7000/R 200 series AMD GPU and run the game on Mantle to see what the CPU can do.
    • Get an RTS game on that list.
    • Get Far Cry 4, Watch_Dogs, Crysis 3 or Metro: Last Light Redux on this list. These are properly CPU-hungry games.
  • Paddockrj - Wednesday, January 14, 2015 - link

    Dear members of this channel: you swallow a bull, yet a fly makes you all bleed! AMD created these processors because a lot of people with 95W AM3+ mobos need them! If you have an 8120... 8150... 8320... 8350... 8370 or an FX-6xxx, you don't need the 8320E or 8370E. AMD created these processors as a good upgrade option ONLY.

    In gaming, an FX-8320E with a GTX 770 or R9 280X gets the same fps as any Intel Core i - some fps more, some fps less, but the same level. The difference happens when we try to play an Intel-optimized game, or a game that uses 2 or 4 cores better... MUCH BETTER. One Intel core is better than one AMD core, but we must see that in multicore, FX is excellent. Only a 2nd-gen or better i7 can face an FX-8350, for example, in multicore.

    But right now, the most important performance is single-core, and there the 4th-gen Core i3 and i5 perform better. However, it is not the end for FX. I think within two years or less, when 4K becomes a reality here in Brazil, whoever has an FX-6xxx or better will thank God for it, because 4K uses multicore heavily and, I tell you, FX is excellent on that track.

    We don't need to be fanboys; we get nothing from it. I like AMD because my Phenom II X6 + R9 270 runs games like an i5 and I paid at least R$ 1200,00 less - more or less US$ 500,00. That means I pay less and do the same thing. Sure, an i5 can convert a film in 30s where my old X6 takes 45s, but when I try to convert 2... 3... 4 films at the same time, the X6 does it in 3 minutes and the i5 takes 5 minutes. I lose in single-core, but in multicore I win. That's what matters to me.
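
    As a toy model of that batch-conversion arithmetic (the per-film times come from the post; how many encodes each chip runs at once is my own assumption, and the exact minutes quoted above won't fall out of a model this simple):

        # A mostly single-threaded encoder: the i5 is faster per film, but
        # the six-core chip runs more encodes side by side.
        import math

        def batch_minutes(films, secs_per_film, simultaneous):
            waves = math.ceil(films / simultaneous)   # rounds of encoding
            return waves * secs_per_film / 60

        print(batch_minutes(4, 30, 2))  # i5: 2 waves of 2 encodes -> 1.0 min
        print(batch_minutes(4, 45, 4))  # X6: 1 wave of 4 encodes -> 0.75 min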

    Every processor has its reason to exist! When people ask me: OK, multicore is very good, but for gaming? AMD? Why?

    I can answer with this example. A lot of people say: I will buy an i5 4430 and a GTX 750... or an R7 250, or a 7750... or a 7730... MAN! Why? Oh God, why?

    You must see that NOTHING... read it well: NO-THING... is more important than the graphics card in gaming. If you are going to use a 7870, R9 270 or GTX 760 and want to use that build over and over and over and over... why don't you buy an FX-6300 + a mobo with USB3 and 1600MHz memory support = R$650? But no! Intel is VERY VERY VERY HIGH MUCH ULTRA MAXED MAJOR... that's how they think. Thinking that way, they buy an i3 4150 + a B85 mobo = R$707...

    OK... "tell me the differences, man!"

    The i3 will beat the FX-6300 in single-core tests...
    The i3 will convert a film faster if the program uses 2 cores (rare!)
    The i3 will get 5 more fps in some cases
    The i3 will lose in multicore tests
    The i3 will suffer in 4K
    The i3 will cost more

    And now I ask you: why pay R$60 more to do the same thing, with the same quality?

    I can't find a reasonable answer...
  • JumpingJack - Thursday, January 15, 2015 - link

    So what you are saying is that AMD's higher end desktop processors are essentially competitive with Intel low end desktop processors. No wonder they are cheaper.
  • BrokenCrayons - Thursday, January 15, 2015 - link

    I consulted my fortune cookie and zodiac to find out that delta charts for power are still pointlessly useless.
  • corsa - Friday, January 16, 2015 - link

    PCMark8 v2 OpenCL
    The only benchmark with no scores for the Blue Team - were they too embarrassing to publish, Ian?
  • Oscarcharliezulu - Sunday, January 18, 2015 - link

    What's up with Agisoft? Your link http://www.agisoft.ru shows an info box page saying they haven't paid their hosting bills.
  • Oxford Guy - Sunday, April 19, 2015 - link

    "an alarming 1.550"

    Ridiculous chip-killing 1.550V you mean, right?

    I don't see any temperature data...
  • shirleymarquez - Friday, May 6, 2016 - link

    Looking at this over a year later, there is still a modest niche for buying the FX-8320E. Micro Center is currently selling them for $90 and you can get a suitable motherboard for $20 ($10 after rebate) if you buy it bundled with the processor. The performance is midrange at best, but it's more bang for the buck than you can get from any Intel CPU-motherboard combo for $100.

    It's not the way to go for an always-on computer like an HTPC though. For that you want an ultralow power processor.
