The AMD Radeon R9 295X2 Review

by Ryan Smith on 4/8/2014 8:00 AM EST

  • eotheod - Tuesday, April 8, 2014 - link

    Same performance as crossfire 290X? Might be time to do a Mini-ITX build. Half the price of Titan Z also makes it a winner.
  • Torrijos - Tuesday, April 8, 2014 - link

    A lot of the compute benchmarks see no improvement over a single 290X...
    What is happening?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Most of these compute benchmarks do not scale with multiple GPUs. We include them for completeness, if only to not so subtly point out that not everything scales well.
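    (For intuition, a minimal Amdahl's-law sketch, assuming the only variable is the fraction of the workload that can actually be split across GPUs:)

        # Speedup from a second GPU when only part of a benchmark parallelizes
        def multi_gpu_speedup(parallel_fraction: float, gpus: int = 2) -> float:
            serial = 1.0 - parallel_fraction
            return 1.0 / (serial + parallel_fraction / gpus)

        for frac in (0.0, 0.5, 0.95):
            print(f"{frac:.0%} scalable -> {multi_gpu_speedup(frac):.2f}x on 2 GPUs")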
  • CiccioB - Tuesday, April 8, 2014 - link

    Why not add more real-life computing tests like iRay, which runs on both CUDA and OpenCL?
    Synthetic tests are really meaningless, as they depend more on the particular instructions used to do... ermm... nothing?
  • fourzeronine - Tuesday, April 8, 2014 - link

    iRay runs on CUDA only. LuxRender should be used for GPU raytrace benchmarking. http://www.luxrender.net/wiki/LuxMark

    Although the best renderers that support OpenCL are hybrid systems that only solve some of the problems on the GPU, so a card like this would never be fully utilized.

    The best OpenCL benchmark to have would be Agisoft PhotoScan dense point cloud generation.
  • Musaab - Wednesday, April 9, 2014 - link

    I have one question: why didn't you use two R9 290Xs with water cooling, or two GTX 780 Tis with water cooling? I hate this marketing mumbo jumbo. If I want to pay this money I will choose two of the cards above with water cooling, and with some OC work they will leave this card in the dust. And for the same money I can buy two R9 290s or two GTX 780s.
  • Musaab - Wednesday, April 9, 2014 - link

    Sorry, I meant three R9 290s or three GTX 780s.
  • spartaman64 - Sunday, June 1, 2014 - link

    I doubt you can afford three of them and water-cool them, and three of them would have a very high TDP. Many people would also run into space constraints, and the R9 295X2 outperforms two 780 Tis in SLI.
  • krutou - Tuesday, April 22, 2014 - link

    Because water blocks and radiators don't grow on trees. Reviewers only test what they're given, all of which are stock.
  • patrickjp93 - Friday, May 2, 2014 - link

    They pretty much do grow on trees. You can get even a moderately good liquid cooling loop for 80 bucks.
  • Gizmosis350k - Sunday, May 4, 2014 - link

    A CPU block you mean?
  • C4$hFlo - Monday, May 12, 2014 - link

    Emphasis on "moderately good". Years of operation will require maintenance on any LC solution. Moderately priced solutions such as Thermaltake's don't hold up as well: the liquid gets discolored, the tubing cooks and becomes brittle, pumps fail, etc. Liquid cooling isn't something to do on the cheap.
  • C4$hFlo - Monday, May 12, 2014 - link

    A pump, a reservoir, two waterblocks, a fan/pump controller, and a plethora of connectors and tubing will probably end up costing the difference between an R9 295X2 and two water-cooled R9 290Xs. Remember, with the R9 295X2 you get a free closed loop that requires no maintenance. After successfully running dual HD 6970s in a Koolance loop for 3 years, I can say a free closed loop is a great selling point for me.
  • RoboJ1M - Friday, May 2, 2014 - link

    It'll be interesting when a card like this has not 2x GPU + 2x 4GB, but 2x GPU + a single 8GB pool.
    Eventually I guess we'll see the results of the HSA push trickle down (up?) to performance parts like this.
    Hey, maybe we'll see a 4x GPU + 16GB...
  • AMDisDead - Tuesday, April 8, 2014 - link

    Almost faster than 780 Ti SLI, and it draws hundreds of watts more. AMD lunatics in full glory.
  • willis936 - Tuesday, April 8, 2014 - link

    A dual GPU is arguably worth the extra 100W compared to the drawbacks of SLI in pretty much every use case. The deciding factor is upfront cost.
  • Guspaz - Tuesday, April 8, 2014 - link

    Dual GPU (from either company) has all the same limitations/drawbacks of SLI/Crossfire. It's literally the same thing.
  • lehtv - Wednesday, April 9, 2014 - link

    No, they don't have the same limitations. Since you don't have to populate two PCIe slots, you can install the 295X2 onto a microATX or ITX motherboard and case, or just use it in a regular ATX board that doesn't support x8/x8. For some, it could also matter to not have to populate a second PCIe slot and use additional PSU cables, even if their motherboard and PSU fully supported crossfire 290X. The ability to use only one PCIe slot has always been the primary selling point of any dual GPU card, ever.
  • Musaab - Wednesday, April 9, 2014 - link

    If you want to pay $1500 for a video card, $2000+ for a 4K monitor, and $200+ for a power supply, then you are the one who pays $1500 for a Core i7-4960X with an ROG X79 board. Why would you worry about x8/x8 motherboard lanes? Don't forget 32GB of 2400+ DDR3 and two extra-fast SSDs. Friend, you don't need a case, you need a cabinet.
  • Dupl3xxx - Wednesday, April 9, 2014 - link

    $2k+ for a 4K screen? Where are you wasting your money? In Norway you can get a 4K screen for just about 5,000 NOK, or about $850 USD, including tax! Also, why would you need a $1500 CPU when the 4930K is 200MHz slower for half the price?

    Also, WHY would you want 32GB of 2400MHz RAM?!?! There is next to no improvement over 1600MHz!

    As far as SSDs go, a single Samsung 250/500GB should be plenty; you've got 32GB of RAM to use as a buffer!

    And if you want a "tight" system with insane performance, the 295X2 is the best choice ATM. Double the 290X performance, "half" the size.
  • lehtv - Wednesday, April 9, 2014 - link

    Another difference is the way this card handles heat compared to any 290X CF setup apart from custom water cooling. The CLLC combines the benefits of reference GPUs - the ability to exhaust hot air externally rather than into the case - with the benefits of third party cooling - the ability to keep temperatures and noise levels lower than those of reference blower cards. A 290X crossfire setup using reference cooling is not even worth considering for anyone who cares about noise output, while third party 290X crossfire is restricted to cases with enough cooling capacity to handle the heat.
  • Supersonic494 - Friday, April 11, 2014 - link

    You are right, but keep in mind one big limitation with normal Crossfire/SLI is the space taken up by two big dual-slot GPUs; with this it is only one card. Other than that, you might as well get two 290Xs.
  • bj_murphy - Friday, April 11, 2014 - link

    Dual GPU doesn't have the requirement for two PCIe slots; you can't do SLI/Crossfire in a Mini-ITX system, for example.
  • HalloweenJack - Tuesday, April 8, 2014 - link

    Muppet - 20W more in FurMark, and 160W in games - not hundreds more. Keep drinking the AnandTech kool-aid.
  • WaltC - Tuesday, April 8, 2014 - link

    Interesting. [H] seems to have done some pretty thorough testing, and the AMD card blows by 780Ti SLI in every single case. Of course, [H] is testing @ 4k resolutions/3-way Eyefinity exclusively--but that's where anyone who shells out this kind of money is going to be. 1080P? Don't make me laugh...;)
  • WaltC - Tuesday, April 8, 2014 - link

    Can't edit, so I'll just say I don't know where "1080P" came from...;)
  • lwooood - Tuesday, April 8, 2014 - link

    Apologies for going slightly OT. Is there any indication of when AMD will fill in the middle of their product stack with GCN 1.1 parts?
  • sascha - Tuesday, April 8, 2014 - link

    I'd like to know that, too!
  • MrSpadge - Tuesday, April 8, 2014 - link

    I would say that indication is 20nm chips, at the end of the year at the earliest.
  • HalloweenJack - Tuesday, April 8, 2014 - link

    A cheaper set of 780 Tis? Two of them is $1300-$1400, and the 295X2 isn't even at retail yet....

    Is AnandTech going to slate the Titan Z as much? Or are the pay cheques worth too much? Shame to see the bias; AnandTech used to be a good site before it sold out.
  • GreenOrbs - Tuesday, April 8, 2014 - link

    Not seeing the bias; AnandTech is usually pretty fair. I think you have overlooked the fact that AMD is a sponsor, not NVIDIA. If anything, "slating" the Titan Z would be more consistent with your theory of "selling out."
  • nathanddrews - Tuesday, April 8, 2014 - link

    What bias?

    http://www.anandtech.com/bench/product/1187?vs=107...
    Two 780ti cards are cheaper than the 295x2, that's a fact.
    Two 780ti cards consume much less power than the 295x2, that's a fact.
    Two 780ti cards have better frame latency than the 295x2, that's a fact.
    Two 780ti cards have nearly identical performance to the 295x2, that's a fact.

    If someone were trying to decide between them, I'd recommend dual 780ti cards to save money and get similar performance. However, if that person only had space for a single dual-slot card, it would be the 295x2 hands-down.

    The Titan Z isn't really any competition here - the 790 (790ti?) will be the 295x2's real competition. The real question is will NVIDIA price it less than or more than the 295x2?
  • PEJUman - Tuesday, April 8, 2014 - link

    I don't think the target market for this stuff (295X2 or Titan Z) is single-GPU slots. As Ryan briefly mentioned, most people who are quite poor (myself included) will go with 780 Ti x2 or 290X x2. These cards are aimed at quads.

    AMD has priced it appropriately: roughly equal performance potential for $3k of dual 295X2s vs. $6k for dual Titan Zs. Unfortunately, 4GB may not be enough for quads...

    I've ventured into multi-GPU in the past; I find these setups rely too much on driver updates (see how poorly the 7990 runs nowadays, and AMD will be concentrating their resources on the 295X2). Never again.
  • Earballs - Wednesday, April 9, 2014 - link

    With respect, any decision on what to buy should be made based on what your application is. Paper facts are worthless when they don't hold up to (your version of) real-world tasks. Personally I've been searching for a good single card to make up for Titanfall's flaws with CF/SLI. Point is, be careful with your recommendations if they're based on facts. ;)

    Sidenote: I managed to pick up a used 290x for MSRP with the intention of adding another one once CF is fixed with Titanfall. That price:performance, which can be had today, skews the results of this round-up quite a bit IMO.
  • MisterIt - Tuesday, April 8, 2014 - link

    By drawing that much power through the PCIe slot, won't it be a fire hazard? I've read multiple posts on bitcoin/scryptcoin mining forums about motherboards catching fire from running too many GPUs without powered risers to reduce the amount of power delivered through the PCIe slot.

    Would Anandtech be willing to test AMD's claim by running the GPU at full load for a longer period of time in a fire-controlled environment?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The extra power is designed to be drawn off of the external power sockets, not the PCIe slot itself. It's roughly 215W + 215W + 75W, keeping the PCIe slot below its 75W limit.
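    A minimal sketch of that power split in Python (the ~500W total and the 75W slot / 150W connector limits are assumptions taken from this review and the PCIe specs, not AMD's exact figures):

        # Rough power budget for the R9 295X2 (approximate figures)
        total_board_power = 500   # W, approximate card power target
        pcie_slot_limit = 75      # W, PCIe slot specification
        eight_pin_spec = 150      # W, PCI-SIG rating per 8-pin connector

        slot_draw = pcie_slot_limit                    # kept at the slot limit
        per_connector = (total_board_power - slot_draw) / 2
        print(f"~{per_connector:.0f} W per 8-pin connector, "
              f"~{per_connector - eight_pin_spec:.0f} W over the 150 W rating")

    Which is why it's the 8-pin connectors, not the slot, that end up beyond their spec, as the next comments discuss.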
  • MisterIt - Tuesday, April 8, 2014 - link

    Hmm, alright, thanks for the reply.
    Still rather skeptical, but I guess there should be plenty of user reviews by the time I'm considering upgrading my own GPU anyway.
  • CiccioB - Tuesday, April 8, 2014 - link

    Don't the 8-pin Molex connector specifications indicate a 150W max power draw? 215W is well beyond that limit.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Yes, but it's a bit more complex than that: http://www.anandtech.com/show/4209/amds-radeon-hd-...
  • CiccioB - Tuesday, April 8, 2014 - link

    Well, no, not exactly. Not being PCIe compliant is one thing, and that I can understand. Going beyond the connectors' electrical power specifications is another. If they had put on 3 connectors I would not have had any problem. But as it is, they are pushing past component specifications, not just guidelines on maximum size and power draw.
  • meowmanjack - Tuesday, April 8, 2014 - link

    If you look at the datasheet for the power connector (I'm guessing on the part number, but the Molex part linked below should at least be similar enough), each pin is rated for 23 A and the housing can support a full load on each pin. Even if only 3 pairs are passing current, the connector can deliver over 800W at 12V.

    The limiting factor for how much power can be drawn from that connector is going to be the copper width and thickness on the PCB. If AMD designed the board to carry ~20 A off each connector (which they presumably have), it won't cause a problem.
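    For what it's worth, the 800W figure follows directly from those numbers (a sketch assuming the 23 A per-pin rating from the Molex datasheet below):

        # Electrical headroom of an 8-pin connector at the quoted pin rating
        pin_rating_amps = 23   # A per pin, per the Molex datasheet
        live_pairs = 3         # +12V pins actually passing current
        voltage = 12           # V

        headroom_watts = pin_rating_amps * live_pairs * voltage
        print(f"Connector headroom: {headroom_watts} W")   # 828 W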
  • meowmanjack - Tuesday, April 8, 2014 - link

    Oops, forgot the datasheet
    http://www.molex.com/molex/products/datasheet.jsp?...
  • behrouz - Tuesday, April 8, 2014 - link

    Thanks for the link. My doubts were finally resolved.
  • Ian Cutress - Tuesday, April 8, 2014 - link

    Most of the power will be coming from the PCIe power connectors, not the slot itself. If you have 5/6/7 cards in a single system, then yes, you might start to see issues without the appropriate motherboard power connectors.
  • dishayu - Tuesday, April 8, 2014 - link

    I have yet to read the review, but FIVE HUNDRED WATTS? WOW!
  • Pbryanw - Tuesday, April 8, 2014 - link

    I'd be more impressed if it drew 1.21 Jigawatts!! :)
  • krazyfrog - Tuesday, April 8, 2014 - link

    On the second-to-last page, the second-to-last chart shows load GPU temperature when it should show load noise levels.
  • piroroadkill - Tuesday, April 8, 2014 - link

    Reasonable load noise and temps, high performance. Nice.

    You'll want to get the most efficient PSU you can get your mitts on, though.

    Also, I would seriously consider a system that is kicking out 600 Watts of heat to be something you wouldn't want in the same room as you. Your AC will work overtime, or you'll be sweating your ass off.

    A GPU for Siberia! But then, that's not really a downside as such, just a side effect of having a ridiculous amount of power pushing at the edges of this process node.
  • Mondozai - Tuesday, April 8, 2014 - link

    "Reasonable noise and temps"? It is shockingly quiet during load for a dual GPU card. And it has incredibly low GPU temps, too.

    As for heat, not really, only if you have a badly ventilated room in general or live in a warm climate.
  • Smartgent - Tuesday, April 8, 2014 - link

    The card is water-cooled!! Not air-cooled like NVIDIA chose to do with their 500W Titan Z. It should run very quiet and should not affect your internal temps much, as long as you mount the radiator to exhaust out of your case.
  • Ian Cutress - Tuesday, April 8, 2014 - link

    It's a shame they're not making a version with a larger liquid cooler. I'd like to see it with a 2x120 CLC and an overclock.
  • jtd871 - Tuesday, April 8, 2014 - link

    This card should have just been released with a full-cover block, letting enthusiasts/3rd parties use whatever custom cooling they like.
  • Rambon3 - Tuesday, April 8, 2014 - link

    Great article. I wish I had a spare grand and a half to replace my 7970 CF setup. BTW, it looks like you have an extra GPU load temp chart on page 17 where the load noise chart should be positioned.
  • randomhkkid - Tuesday, April 8, 2014 - link

    I may have missed it in the article, but I don't think you mentioned whether or not it would be possible to add an additional fan to that Asetek cooler. This would surely bring down stock temperatures (albeit increasing the noise) if one were thinking about overclocking further.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Yes, it's possible. You would need to come up with a matching fan and the screws to mount it, but there's nothing from a hardware perspective keeping you from mounting a second fan for push-pull. I don't know if it's easily visible in our pictures, but the fan power connector is exposed mid-way along the cable run, so you can split it there to get a second fan power header.
  • mpdugas - Tuesday, April 8, 2014 - link

    push-pull, perhaps?
  • Gunbuster - Tuesday, April 8, 2014 - link

    The perfect card for high resolution multi-monitor rigs. Oh wait, frame pacing is still broken. Don't worry, just send in your $1500 and they'll fix it sometime in 2015 (maybe).
  • JDG1980 - Tuesday, April 8, 2014 - link

    Pay closer attention to the article. Frame pacing is still imperfect *on the old 7990*, not on the R9 295X2. It works fine on GCN 1.1 cards (290/290X/295X2) due to the new XDMA engine.

    Admittedly, it was odd for them to throw in some 7990 bashing in the review of a new card, so I can understand the confusion.
  • Mondozai - Tuesday, April 8, 2014 - link

    Why are you asking people to pay attention to articles they are commenting on?

    This is the internet.
  • Kevin G - Tuesday, April 8, 2014 - link

    Typo:
    "The PCB itself is 14 layers, making it an especially intricate PCB, but one necessary to carry 500W while also routing 1028 GDDR5 lines and 48 PCIe 3 lanes."

    I presume 1028 actually refers to the combined 1024-bit wide memory bus.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Noted and fixed. Thank you.
  • CiccioB - Tuesday, April 8, 2014 - link

    Isn't 500W from 2x 8-pin + PCIe slot outside the standards and connector specifications, and somewhat dangerous?
  • ShieTar - Tuesday, April 8, 2014 - link

    It's out of spec, as also discussed in the article, but very unlikely to be dangerous. 215W at 12V over 3 wires will draw about 6A per wire, still significantly below what the ATX standard considers safe:
    "Under normal or overload conditions, no output shall continuously provide more than 240 VA".

    The interesting question will be whether PSUs are up to the task of filtering the rapid switching between almost 0W and up to 425W on the rail which feeds the two connectors. But most modern, high-power PSUs feed even more connectors on a single rail, so they should have little problem. And whoever mates a $1500 GPU with a cheap PSU has nobody to blame but himself.
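    The arithmetic behind that ~6 A figure, as a quick sketch:

        # Current per +12V wire when one 8-pin connector carries ~215 W
        connector_watts = 215
        voltage = 12
        wires = 3              # +12V conductors in an 8-pin PCIe connector

        amps_per_wire = connector_watts / voltage / wires
        print(f"~{amps_per_wire:.1f} A per +12V wire")   # ~6.0 A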
  • LoccOtHaN - Tuesday, April 8, 2014 - link

    Really fine card for 4K gaming. Nice job AMD/ATI, and $1498 is not expensive for enthusiast-level hardware. NICE
  • CiccioB - Tuesday, April 8, 2014 - link

    Broken frame pacing does not really make this card the best choice for gaming, IMHO.
  • LasseGrr - Tuesday, April 8, 2014 - link

    @CiccioB Maybe try reading the article again...
  • CiccioB - Tuesday, April 8, 2014 - link

    You may want to look at those frames-per-second graphs more closely. Where provided, those graphs do not show good frame pacing, and where not provided, look at other sites' reviews.
    But then, I'm sure whoever spends $1500 on a gaming card has every reason to convince himself that the card is good nonetheless.
  • extide - Tuesday, April 8, 2014 - link

    What are you talking about?? First of all, an FPS graph shows NOTHING about frame timing variance.

    Second of all, they DO specifically show the frame timing, in different graphs, and it is fine, very similar to NVIDIA's results!

    Are you not able to read or something?!
  • CiccioB - Wednesday, April 9, 2014 - link

    Sorry, but what kind of graph are you looking at?
    First, not all of the games reviewed here have a frame pacing graph.
    Second, look at the BF4, Crysis 3, and Thief graphs, for example. Where do you see this card being better than 780 Ti SLI?
    The FPS graphs are a mess for this 295X2. Ideally those graphs should be a thin line, not an area where the frames per second continuously oscillate.

    But, well, as I said, once you spend (or want to spend) $1500 you have to convince yourself there are no problems. The same was true for 7990 buyers who glorified that card. See now what a crappy card it is. And that was also true for all those who denied that AMD Crossfire configurations had problems relative to NVIDIA ones, before AMD tried to correct the problem with new drivers (which now somewhat work for DX11 games but not for DX9 and DX10 ones, and that's maybe the reason there are no frame pacing graphs for the older games...).
  • ruggia - Tuesday, April 8, 2014 - link

    I'm looking at results from PCPer and Tom's too, and I see nothing "broken". In most cases the variances are better than 780 SLI or low enough not to be an issue.
  • magnusmundus - Tuesday, April 8, 2014 - link

    With a closed loop cooler for both GPU and CPU, you might as well go for a full custom loop and get better cooling and nicer aesthetics.
  • kyuu - Tuesday, April 8, 2014 - link

    Er... no? Two CLCs are still quite a bit different from setting up a custom loop.
  • cknobman - Tuesday, April 8, 2014 - link

    Your gaming test suite kinda sucks, please update it.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The gaming test suite is a constant work in progress, so we're always looking for new games to add to it.

    Do you have anything in particular you'd like to see? (Keeping in mind that it needs to be practical to benchmark it)
  • Earballs - Thursday, April 10, 2014 - link

    Titanfall at or above 1440p would be most lovely.
  • jkhoward - Thursday, April 10, 2014 - link

    I still think WoW should be included in these benchmarks.
  • devione - Tuesday, April 8, 2014 - link

    Is it really impossible to cool this card without using an AIO cooler, like the Titan Z does?
  • mickulty - Tuesday, April 8, 2014 - link

    That would require either huge amounts of binning that drive the price right up (like the Titan Z), and/or significant reductions in clock speed to accommodate reduced voltage (almost certainly like the Titan Z), resulting in a card that's both overpriced and underpowered (like...). Of course, it's not really fair to compare a card that's with reviewers now and on the shelves in 2 weeks to a card that has only ever been seen as a mockup on one of NVIDIA's slides =).
  • devione - Tuesday, April 8, 2014 - link

    Fair points.

    I just have an irrational dislike for AIO coolers. I would hope to see custom air-cooled 3rd-party variants, but for a variety of practical reasons I doubt that is going to happen.
  • mickulty - Wednesday, April 9, 2014 - link

    Well, Arctic's 6990 cooler wasn't far off. The Arctic Mono is good for 300W, and it should be possible to fit two such heatsinks on one card. So it's possible. The resulting card would be absolutely huge, though, and wouldn't be nearly as popular with gaming PC boutiques (i.e. the target market).

    Oh, VRM cooling might be an issue too. I guess a Thermaltake-style heatpipe arrangement would fix that.
  • SunLord - Tuesday, April 8, 2014 - link

    Huh, looking at that board and the layout of the cooling setup, you could swap in two independent closed-loop coolers pretty easily and try to overclock it if you want. And since you're rich if you buy this, it's totally viable for any owner.
  • nsiboro - Tuesday, April 8, 2014 - link

    Ryan, thank you for a wonderfully written and informative review. Much appreciated.
  • behrouz - Tuesday, April 8, 2014 - link

    Ryan Smith, please confirm this:

    The new NVIDIA driver overclocks the GTX 780 Ti from 928 to 1019MHz. If so, temperatures should be increased.
  • behrouz - Tuesday, April 8, 2014 - link

    ...and also power consumption.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Overclock GTX 780 Ti? No. I did not see any changes in clockspeeds or temperatures that I can recall.
  • PWRuser - Tuesday, April 8, 2014 - link

    I have an Antec Signature 850W sitting in the closet. Is the 295X2 too much for it?
    It's this one: http://www.jonnyguru.com/modules.php?name=NDReview...
  • Dustin Sklavos - Tuesday, April 8, 2014 - link

    Word of warning: do not use daisy-chained PCIe power connectors (i.e. one connection to the power supply and two 8-pins to the graphics card). If AMD wasn't going over the per-connector power spec it wouldn't be an issue, but they are, which means you can melt the connector at the power supply end. Those daisy-chained PCIe connectors are meant for 300W max, not 425W.

    We've been hearing about this from a bunch of partners and I believe end users should be warned.
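    To put rough numbers on that warning (a sketch; the 150W-per-connector rating and the ~425W external draw come from the spec discussion above):

        # Why one daisy-chained PCIe cable is out of spec for this card
        spec_per_8pin = 150                      # W, PCI-SIG rating per 8-pin connector
        daisy_chain_rating = 2 * spec_per_8pin   # 300 W for the chained pair
        card_external_draw = 425                 # W across both 8-pin connectors

        overload = card_external_draw - daisy_chain_rating
        print(f"A single cable runs ~{overload} W past its 300 W rating")   # 125 W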
  • PWRuser - Tuesday, April 8, 2014 - link

    Thank you. According to the specs my PSU could handle these GPUs separately; I guess utilizing two PCIe slots via two separate cards alleviates the strain.
  • extide - Tuesday, April 8, 2014 - link

    No, it has nothing to do with how many cards or slots. It's how many CABLES from the PSU.

    Sometimes you have a single cable with two PCIe connectors on the end, one daisy-chained off the other. What he is saying is: don't use connectors like that; use two individual cables instead.

    Although, unless the PSU you are using has really crappy (thin) power cables, it should be OK even with a single cable. But yeah, it's definitely a good idea to use two!
  • Dustin Sklavos - Tuesday, April 8, 2014 - link

    A single cable is beyond spec for the connector. We've been hearing about connectors actually melting. "Crappy" isn't really relevant here; this is the *only* card on the market that causes these kinds of problems.
  • Anders CT - Tuesday, April 8, 2014 - link

    500 watts of power consumption is insane. It should come with an on-board diesel generator.
  • Blitzninjasensei - Saturday, July 12, 2014 - link

    The thought of this made my day. Thanks for the joke, needed it.
  • therfman - Tuesday, April 8, 2014 - link

    This is all very nice, but unless case space is at a premium, I fail to see the advantage of this card over two 290X cards with good coolers. The PowerColor PCS+ version of the 290X runs at 1050MHz, is much quieter than the reference boards (40-42 dBA under load at 75cm), and is available for under $600. Is having a single-card solution worth $300 extra? Not unless you really want to have everything in a small form factor case.
  • Peeping Tom - Tuesday, April 8, 2014 - link

    Is that a giveaway I hear coming? ;-)
  • silverblue - Tuesday, April 8, 2014 - link

    Please, don't... I don't think I could stand to see a card of this calibre being offered only to those in the States... :|
  • JBVertexx - Tuesday, April 8, 2014 - link

    Is there any way to tell the temperatures of each of the two GPUs? Where does the temperature reading for the testing come from - is it an average of the 2, the hotter, or the cooler one?

    The reason I'm asking is that I was skeptical a 120mm rad could effectively cool two of these GPUs. Given that they are connected in series, one is bound to be measurably hotter than the other.

    Otherwise, this looks to be a winner. I was considering upgrading my uATX rig so I could do SLI. But with this card, I could keep the compact form factor.
  • JBVertexx - Tuesday, April 8, 2014 - link

    After some additional research on the web, it looks like the difference in temps between the two GPUs is only about 2 degrees under load, so I'm pleasantly surprised with how well the 120mm radiator handles the cooling.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The temperature readings come from MSI Afterburner, which is capable of reading the temperatures via AMD's driver API. And unless otherwise noted, the temperature is always the hottest temperature.
  • srsbsns - Tuesday, April 8, 2014 - link

    The point of this driver was improvements to the HD 7000 series and their rebrands... AnandTech missed this by benching an already-optimized 290X dual card?
  • extide - Tuesday, April 8, 2014 - link

    Did you misread the article? They are simply comparing the frame pacing on the old stuff to the new stuff. Unfortunately, most people are too stupid to properly comprehend English, which is pretty damn sad if you ask me. Thus, a lot of people are either mistakenly thinking that this card has bad frame pacing, or that this review had anything to do with the frame pacing updates for GCN 1.0. NEITHER of those things is the case!
  • JDG1980 - Tuesday, April 8, 2014 - link

    These two different things really shouldn't have been in the same article. It's confusing, unfocused, and comes off as taking cheap shots at AMD over an old product. Let's be honest, there weren't many 7990s sold in the first place, and anyone who bought one for gaming and was disappointed with it could have resold it during the mining craze and at least broken even, if not actually turning a profit. A review of a new product isn't the best place to say "Old product X is still not perfect".
  • srsbsns - Tuesday, April 8, 2014 - link

    Are the Battlefield 4 benchmarks using Mantle or DirectX?
  • Ryan Smith - Tuesday, April 8, 2014 - link

    Direct3D.
  • Blitzninjasensei - Saturday, July 12, 2014 - link

    Ryan, would you be able to do a comparison with Mantle as well as D3D? I would like to see how much the benefit is.
  • iamkyle - Tuesday, April 8, 2014 - link

    So...they're taking the Prescott approach to performance?

    "Bigger!!! Faster!!! Hotter!!!"

    Sounds like some Core 2 Duo-type innovation is needed from AMD to get temps and power down to a reasonable level.
  • Mondozai - Tuesday, April 8, 2014 - link

    Temperatures are out of control?

    Can you even read a basic chart or is that too much for your tiny little head to handle?
  • Da W - Tuesday, April 8, 2014 - link

    Great EVGA GTX 780 Superclocked for sale!
    Reason: bought too soon; I want this dual-GPU bitch!
  • TheinsanegamerN - Friday, April 11, 2014 - link

    I'll buy it. My 550 Ti is getting a little long in the tooth.
  • Mondozai - Tuesday, April 8, 2014 - link

    Basically, for a few hundred dollars extra you are paying a premium for better noise and GPU temperatures compared to two R9 290Xs in Crossfire.

    While this card has a massive frame pacing improvement compared to the 7990, it still trails 780 Ti SLI, although the 780 Ti is painfully gimped at 4K resolutions due to VRAM bottlenecks.

    Maxwell's high-end cards in SLI are going to be beastly, since NVIDIA is finally going to resolve the VRAM issue.
  • Samus - Tuesday, April 8, 2014 - link

    Very pretty cooler.
  • slickr - Tuesday, April 8, 2014 - link

    Is it me, or is pricing on graphics cards INSANE? $1500 for this and then $3000 for the Titan Z? I mean, give me a break, I'd buy a freaking car with that money.
  • stefpats - Tuesday, April 8, 2014 - link

    My question is: is there going to be a GTX 790, or is that plan gone? I would like to compare similar things and then decide what I'm replacing my 690 with. I've waited long enough; it's time for an upgrade.
  • TheinsanegamerN - Friday, April 11, 2014 - link

    My guess: no. They will have the 890, skipping over the 790 completely. 20nm and Maxwell will make for a much more interesting dual-GPU card.
  • ekagori - Tuesday, April 8, 2014 - link

    I like what AMD has done here; paying more attention to the high-end packaging is good to see. Hopefully this trickles down to the next generation on 20nm. Considering how far AMD has been behind NVIDIA in terms of power consumption, seeing this card average under 500W at full load is pretty good. That's under 250W per chip, pretty close to what a 780 Ti does. You pay a slight premium for the better packaging and CLC, and it's still better than what NVIDIA wants for the Z.
  • HammerStrike - Tuesday, April 8, 2014 - link

    Shame there is still no HDMI 2.0 support on any consumer GPUs, including this one. Given that this is around $250 more than two custom-cooled 290Xs with similar noise profiles, it doesn't make much sense (at least to me) in standard or full-size cases, but it would be an interesting choice for an HTPC. The card is overkill for 1080p, but with new 4K TVs coming out that support HDMI 2.0 this would have made sense for that. As it is, while I can appreciate the design and performance, I don't see the value prop vs. a couple of individual 290Xs in CF.
  • TinHat - Friday, April 11, 2014 - link

    Can you not just use DisplayPort like everyone else? I believe it's got far superior resolution support, and because it's royalty-free to produce, it's cheaper too!
  • Camel51 - Wednesday, April 9, 2014 - link

    What is the static pressure of the 120mm fan? Is it easily replaceable? I'm wondering if it would make sense to replace it with an SP120 HP edition, a Noctua, or any other high static pressure/low noise fan (even if it would require modding it in by cutting wires).

    The review was excellent. Very informative and interesting. Took me a whole hour to read. Haha. I just wish the card was short enough to fit in the Obsidian 250D. Now for the Ares III!
  • Ryan Smith - Wednesday, April 9, 2014 - link

    I don't have any information on the static pressure, but the fan is easily replaceable. It's just a standard 120mm fan, so any other 2-pin/3-pin 120mm fan should work.
  • henrikfm - Wednesday, April 9, 2014 - link

    Does it make coffee?
  • mpdugas - Wednesday, April 9, 2014 - link

    Time for two power supplies in this kind of build...
  • rikm - Wednesday, April 9, 2014 - link

    Huh? No giveaway? Why do I read this stuff?
    OK, seriously, love these reviews, but the thing I never understand is when they say Titan is better, yet the charts seem to say the opposite, at least for compute.
  • lanskywalker - Wednesday, April 9, 2014 - link

    That card is a sexy beast.
  • jimjamjamie - Thursday, April 10, 2014 - link

    Great effort from AMD. I wish they would focus on efficiency, though - I feel that, with the changing computing climate and the shift to mobile, power-hungry components should be niche, not the norm.

    Basically, a dual-750 Ti card would be lovely :)
  • IUU - Saturday, April 12, 2014 - link

    The sad thing about all this is that the lowest resolution for these cards is considered to be 2560x1440 (for those who understand).
    A bigger disappointment yet is that, after so many years of high expectations, the GPU still stands as a huge piece of silicon inside the PC that's firmly chained by the IT industry to serving gamers only.
    Whatever the reason for the lack of such consumer applications, this is a crime, mildly put.
  • RoboJ1M - Thursday, May 1, 2014 - link

    The 4870 stories that were written here by Anand were my most memorable and favourite.
    That and the SSD saga.

    Everybody loves a good Giant Killer story.

    But the "Small Die Strategy" has long since ended?
    When did that end?
    Why did that end? I mean, it worked so well, didn't it?
  • patrickjp93 - Friday, May 2, 2014 - link

    People should be warned: the performance of this card is nowhere close to what the benchmarks or limited tests suggest. Even on the newest ASRock motherboard, the PCIe 3.0 lanes bottleneck this by about 40%. If you're just going to sequentially transform the same data once it's on the card, yes, you get this performance, which is impressive for the base cost, though entirely lousy for the FLOPS/watt. But if you're going to attempt to move 8GB of data to and from the CPU and GPU continuously, this card performs only marginally better than the 290. The busses are bridge chips are going to need to get much faster for these cards to be really useful for anything outside purely graphical applications in the future. It's pretty much a waste for GPGPU computing.
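    The back-of-the-envelope transfer math behind that concern (a sketch; ~15.75 GB/s is the theoretical PCIe 3.0 x16 bandwidth in one direction, not a measured figure for this card):

        # Time to shuttle a working set over PCIe 3.0 x16
        pcie3_x16_gb_per_s = 15.75   # GB/s, theoretical one-direction bandwidth
        working_set_gb = 8           # GB moved between CPU and GPU per pass

        seconds_per_pass = working_set_gb / pcie3_x16_gb_per_s
        print(f"~{seconds_per_pass:.2f} s per 8 GB transfer")   # ~0.51 s

    If the kernels themselves finish in a fraction of that, the bus, not the GPUs, sets the pace.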
  • patrickjp93 - Friday, May 2, 2014 - link

    *The busses AND bridge chips...* Seriously, what chat forum doesn't let you edit your comments?
  • Gizmosis350k - Sunday, May 4, 2014 - link

    I wonder if Quad CF works with these
  • Blitzninjasensei - Saturday, July 12, 2014 - link

    I'm trying to imagine what kind of person would have four of these and why - maybe Eyefinity with 4K? Even then your CPU would bottleneck way before that; you would need some kind of motherboard with dual CPU sockets and a game that can take advantage of it.
  • Saifur - Monday, December 8, 2014 - link

    Hello, can someone please advise? I have a 4930K OC'd to 4.3GHz and 16GB of RAM, also OC'd (slightly), and I am planning on getting the R9 295X2. Will the power supply I have be sufficient for this card? This is my PSU: Cooler Master V850 - 850W. Thanks.
