The AMD Radeon R9 290X Review

by Ryan Smith on 10/24/2013 12:01 AM EST

  • itchyartist - Thursday, October 24, 2013 - link

    Incredible performance and value from AMD!

    The fastest single chip video card in the world. Overall it is faster than the nvidia Titan and only $549! Almost half the price!

    Truly great to see the best performance around at a cost that is not bending you over. Battlefield 4 with AMD Mantle is just around the corner. These new 290X GPUs are going to be uncontested Kings of the Hill for Battlefield 4. Free Battlefield game with the 290X too. Must buy.

    Incredible!
  • Berzerker7 - Thursday, October 24, 2013 - link

    ...really? The card is $600. You reek of AMD PR.
  • Novulux - Thursday, October 24, 2013 - link

    It says $549 in this very review?
  • Berzerker7 - Thursday, October 24, 2013 - link

    It does indeed. His post still smells like a pre-written script.
  • siliconwizard - Thursday, October 24, 2013 - link

    Like all the reviews state, the GTX Titan is now irrelevant. The 290X took the crown and saved the wallet.
  • siliconwizard - Thursday, October 24, 2013 - link

    Thinking that sphere toucher's comment is accurate. Bit of salt here over AMD taking over the high-end slot and ridiculing the Titan card. Only going to get worse once the Mantle-enabled games are released. Nvidia is finished for Battlefield 4. Crushed by AMD, the 290X, and Mantle.
  • MousE007 - Thursday, October 24, 2013 - link

    Mantle..... lol, nvidia G-Sync just killed AMD
  • ninjaquick - Thursday, October 24, 2013 - link

    lol? A G-Sync-type solution is a good candidate for integration into a VESA standard, made part of the display information that is exchanged through DP/HDMI/DVI, so all AMD would need to do is make sure their drivers are aware that they can send frames to the screen as soon as they are finished. The best part would be that, with the whole Mantle deal, AMD would probably expose this to the developer, allowing them to determine when frames are 'G-Sync'd' and when they are not.
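
    In driver terms, the idea might look something like the following purely hypothetical C sketch (none of these functions or the capability bit are a real API; it only illustrates the concept of scanning out a frame the moment it finishes when the display advertises variable refresh):

        #include <stdio.h>

        typedef struct {
            /* hypothetical capability bit read from the display over DP/HDMI/DVI */
            int supports_variable_refresh;
        } display_caps_t;

        /* Stubs standing in for driver internals. */
        static void wait_for_vblank(void) { puts("waiting for vblank"); }
        static void flip_now(void)        { puts("scanning out frame"); }

        static void present_frame(const display_caps_t *caps) {
            if (caps->supports_variable_refresh) {
                flip_now();               /* send the frame the moment it's finished */
            } else {
                wait_for_vblank();        /* fixed refresh: hold until the next interval */
                flip_now();
            }
        }

        int main(void) {
            display_caps_t caps = { 1 };  /* pretend the display advertised support */
            present_frame(&caps);
            return 0;
        }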
  • MousE007 - Thursday, October 24, 2013 - link

    No, there is a "handshake" between the GPU and the monitor or TV; it will not be supported by any other brand.
  • inighthawki - Thursday, October 24, 2013 - link

    You do realize that it can still be put into the VESA standard, right? Then only GPUs supporting the standard can take advantage of it. Also ANYONE who believes that GSync OR Mantle is going to "kill the other" is just an idiot.
  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell..
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet at getting this out there and making the most money from it is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be one for the antitrust office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail--$500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer, because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another Proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
  • TheJian - Friday, October 25, 2013 - link

    Incorrect. Part of the point of gsync is that when you can do 200fps in a particular part of the game, they can crank up the detail and USE the power you have completely, rather than building the whole game for, say, 60fps. Then, when all kinds of crap is happening on screen (50 guys shooting each other, etc.), they can drop the graphics detail down some to keep things smooth. Gsync isn't JUST frame rate. You apparently didn't read the anandtech live blog, eh? You get your cake and can eat it too (stutter-free, no tearing, smooth, and the extra power used when you have it available).
  • MADDER1 - Friday, October 25, 2013 - link

    If Gsync drops the detail to maintain fps like you said, then you're really not getting the detail you thought you set. How is that having your cake and eating it too?
  • Cellar Door - Friday, October 25, 2013 - link

    How so? If Mantle gets GTX 760 performance in BF4 from a 260X... will you switch then?
  • Animalosity - Sunday, October 27, 2013 - link

    No. You are sadly mistaken sir.
  • Antiflash - Thursday, October 24, 2013 - link

    I've usually preferred Nvidia cards, but they had it well deserved when they decided to price GK110 into the stratosphere just "because they can", having no competition. That's a poor way to treat your customers and take advantage of fanboys. Full implementations of Tesla and Fermi were always priced around $500. Pricing Kepler GK110 at $650+ was stupid. It's silicon after all; you should get more performance for the same price each year, not more performance at a premium price, as Nvidia tried to do this generation. AMD is not doing anything extraordinary here. They are just not following nvidia's price-gouging practices, and $550 is at historical market prices for their flagship GPU. We would not have been having this discussion if Nvidia had done the same with GK110.
  • inighthawki - Thursday, October 24, 2013 - link

    " It's silicon after all, you should get more performance for the same price each year"

    So R&D costs come from where, exactly? Not sure why people always forget that there is actual R&D that goes into these types of products; it's not just $5 of plastic and silicon plus some labor and manufacturing costs. Likewise, when they break down phones and tablets and calculate costs, they never account for this. As if their engineers basically just select components on Newegg and plug them together.
  • jecastejon - Thursday, October 24, 2013 - link

    R&D. Is R&D tied only to a specific Nvidia card? AMD, like others, also invests a lot in R&D with every product generation, even when they are not successful. Nvidia will have to do a reality check with their pricing and be loyal to their fans and the market. Today's advantages don't last for too long.
  • Antiflash - Thursday, October 24, 2013 - link

    NVIDIA's logic. Kepler refresh: 30% more performance => 100% increase in price.
    AMD's logic. GCN refresh: 30% more performance => 0% increase in price.
    I can't see how this is justified by R&D; it's just a greedy company squeezing its most loyal customer base.
  • Antiflash - Thursday, October 24, 2013 - link

    Just for clarification: the price comparison is between cards at their introduction, NVIDIA's 680 vs. Titan and AMD's 7970 vs. 290X.
  • TheJian - Friday, October 25, 2013 - link

    AMD way = ZERO PROFITS, company going broke, debt high, $6 bil in losses over 10 yrs.
    NV way = $500-800 mil in profits per year, so you can keep your drivers working.

    Your love of AMD's pricing is dumb. They are broke. They have sold nearly everything they have or had - fabs, land - and all that is left is the company itself and its IP.

    AMD should have priced this card at $650, period. Also note, NV hasn't made as much money as in 2007 for 6 years. They are not gouging you, or they would be making MORE than they did in 2007, right? Intel, Apple, and MS are gouging you, as they all make more now than then (2007 was pretty much the high for a lot of companies, down since). People like you make me want to vomit, as you are just KILLING AMD, which in turn will eventually cost me dearly when buying NV cards, as they will be the only ones with real power in the next few years. AMD already gave up the cpu race. How much longer do you think they can fund the gpu race with no profits? $200 mil is owed to GF on Dec 31, so the meager profit they made last Q, and any they might make next Q, is GONE. They won't make $200 mil in profit next Q... LOL. Thanks to people like you asking for LOW pricing and free games.

    You don't even understand that you are ANTI-AMD... LOL. Your crap logic is killing them (and leaving NV with 300 mil less profit than in 2007). The war is hurting them both. I'd rather have AMD making $500 mil and NV making $1B than what we get now: AMD at ZERO and NV at $550 mil.
  • Blamcore - Friday, October 25, 2013 - link

    Wow, I was just remarking yesterday that NV fanbois had sunk to the level of Apple fanbois, when I was seeing the argument "you just like AMD because you can't afford NV" on a few boards. Now here is the Apple fanbois' famous argument, "my company is better because they have a higher profit margin." Gratz, your unreasonable bias just went up a level!
    I know, you aren't a fanboy, you are really a business expert here to recommend that a company gain market share by releasing a card roughly equal to what its competitor has had out for months and pricing it the same as they do! Maybe they could have asked $650 if they had released it last January.
  • puppies - Saturday, October 26, 2013 - link

    R+D costs come out of the sale price of the card. Are you trying to claim a $300 GPU costs $300 in materials? R+D costs are also offset by the fact that shrinking the process lets the manufacturer get more chips per wafer each time.

    Look at Intel and AMD: their chips don't go up in price each time they get faster, they stay at the same price point. The last 2 cards I have bought have been Nvidia, but the next one will be AMD at this rate. I expect a 660 Ti to be faster and more energy efficient than a 560 Ti, and at the same price point, WHEN IT IS RELEASED, and I think a lot of people are in the same boat. Nvidia is trying to push people into spending more each time they release a new model lineup, and it stinks.

    I don't care if a 660 is faster than a 560 Ti; forcing people to move down the GPU lineup just smacks of NVIDIA price gouging.
  • Samus - Thursday, October 24, 2013 - link

    I have to disagree with you, Berzerker. Although his post clearly "overpromotes" the 290, it is an incredible value when you consider it is faster and cheaper (by hundreds of dollars) than the Titan.

    -Geforce 660TI owner
  • Laststop311 - Thursday, October 24, 2013 - link

    For people that value a quiet computer, this card is trash
  • Spunjji - Friday, October 25, 2013 - link

    For people that value a quiet computer, all stock coolers are useless.

    People that value a truly quiet computer won't even be playing at this end of the GPU market.
  • Samus - Friday, October 25, 2013 - link

    This card is a great candidate for water cooling since the back of the PCB is essentially empty. Water cooling the face side is cheaper/easier, and this card can clearly use it.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He didn't say "silent." He said "quiet." I'd argue the Titan/780/690 coolers were all "quiet," but not "silent."

    Since he said quiet, I don't think it's unreasonable to expect a certain level of "quiet" in the $500+ range of discrete cards.
  • Nenad - Friday, October 25, 2013 - link

    A 780 with the stock cooler is not useless, and it IS quiet (it is not 'silent').
    BTW, going by the posted numbers, it seems the 290x will be TWICE as noisy as the GTX780?
  • ballfeeler - Thursday, October 24, 2013 - link

    Methinks Berzerker7 is just salty and perhaps partial to nvidia. Nothing itchy wrote is inaccurate, including the $550 price that salty Berzerker7 claimed was $600.

    - Fastest card? Yup.

    - Free game? Yup.

    - Pooped all over Titan? Yup.

    Do not be salty, mr. Berzerker7. AMD just roundhouse kicked nvidia square in the gonads with performance above Titan for half the price.
  • Shark321 - Thursday, October 24, 2013 - link

    At 1080p it's actually slower than Titan if you average all the reviews across the sites, and in some reviews even slightly slower than the 780. After 30 minutes of playing it's also the loudest card ever produced (9.6 sone in Battlefield 3, according to PCGamesExtreme). With this noise it's not acceptable, and there will be no other coolers for the time being.
  • heflys - Thursday, October 24, 2013 - link

    Slightly slower than a 780 overall? Even in Uber mode? Can you link me to these reviews if possible?
  • Shark321 - Thursday, October 24, 2013 - link

    I didn't say that. It's slightly slower than Titan overall in Uber Mode at 1080p (across all sites combined). In Quiet Mode it's usually slightly faster than the 780, slightly slower in a minority of the reviews.
  • heflys - Thursday, October 24, 2013 - link

    I see now.
  • Jumangi - Thursday, October 24, 2013 - link

    What noob would buy a 290x or a Titan and run it in 1080p? A pointless resolution for these cards.
  • inighthawki - Thursday, October 24, 2013 - link

    What an arrogant post. There are tons of people who game on 1080p displays and buy a 290X or a Titan. Having a high framerate (i.e. consistently greater than 60fps) reduces the likelihood of stuttering while playing, while also making your system a bit more future-proof as new games come out. Not everyone cares about pixel count; some care more about quality per pixel. As you start seeing titles ship on Unreal 4, post-Crysis 3, etc., I will be laughing at you when I can still run my games at native resolution without the need to upscale or reduce quality.
  • puppies - Saturday, October 26, 2013 - link

    Anyone who buys a Titan to play games at 1080p is insane; seriously, they need locking up. You claiming otherwise changes nothing.
  • Samus - Thursday, October 24, 2013 - link

    agreed. this card is for 2560x1440+ or multimonitor
  • TheJian - Friday, October 25, 2013 - link

    ROFL... Only if you don't believe in MAXING your games, which nobody can do with a single card. Many sites comment on this and show mins; even here, with lower settings, they hit below 30fps in a few of their games at 2560. In multiplayer you'd get crushed by guys hitting much higher fps at 1080p in many games. Maybe you'll be right at 20nm, but today there are certainly far too many games you have to jockey settings around on for that to be true.
  • reddev - Thursday, October 24, 2013 - link

    Both LinusTechTips and OC3D, two reviewers I trust quite a bit, have it below the 780.
  • randomhkkid - Friday, October 25, 2013 - link

    Linus tech tips overclocks all the cards it tests; since the 780 has much more headroom (on the stock cooler), its gains were larger than the 290x's, so at stock the 290x is faster.
  • SolMiester - Monday, October 28, 2013 - link

    So you can OC a 780 on the stock cooler, but not the 290x enough to sustain the OC, which means the 780 wins!, especially after the price drop to $500! Oh dear, the AMD 290x just went from hero to zero...
  • TheJian - Friday, October 25, 2013 - link

    I gave links and named the games previously... see my post. At 1080p the 780 trades blows with it depending on the game. Considering 98.75% of us run 1920x1200 or less, that is important, and you get 3 AAA games with the 780, on top of the fact that it's using far less watts, with less noise and less heat. A simple price drop of $50-100 and the 780 seems like a no-brainer to me (disregarding the 780 Ti, which should keep the same price as now, I'd guess). Granted, Titan needs a dunk in price now too, which I'm sure will come, or they'll just replace it with a full-SMX, up-clocked Titan to keep that price. I'm guessing the old Titan just died, as the 780 Ti will likely beat it in nearly everything if the rumored clock speed and extra SMX are true. They will have to release a new Titan ULTRA or something, with another SMX or the mhz upped to 1GHz or so, or hopefully BOTH.

    I'm guessing it's easier to just add the extra 100mhz or push it to 1GHz, as surely manufacturing has improved to the point where all chips will do this now, more easily than having all SMXs defect-free. Then again, if you have a bad SMX, just turn a few more off and it's a 780 Ti anyway. They've had 8 months to either pile up cherry-picked chips or just improve overall, so more can do this easily. Clearly the 780 Ti was already waiting in the wings. They were just waiting to see the 290x's performance and estimates.
  • eddieveenstra - Sunday, October 27, 2013 - link

    Titan died when 780gtx entered the room at 600 Euro. I'm betting Nvidia only brings a 780gtx ti and that's it. Titan goes EOL.
  • anubis44 - Thursday, October 24, 2013 - link

    This is the reference card. It's not loud unless you set it to 'Uber' mode, and even then, HardOCP thought the max fan speed should be set to 100% rather than 55%. Imagine how quiet an Asus DirectCU II or Gigabyte Windforce or Sapphire Toxic custom-cooled R9 290X will be.

    Crossfire and frame pacing are all working, and the R9 290X crushes Titan in 4K gaming (read the 4K section of HardOCP's review), all while costing $100 less than the GTX 780; and the R9 280X (7970) is priced at $299, the R9 270X (7870) is now going for $180, and now the Mantle API could be the next 3dfx Glide and boost all 7000-series cards and higher dramatically, for free...

    It's like AMD just pulled out a light sabre and cut nVidia right in half while Jen-Hsun just stares dumbly at them in disbelief. He should have merged nVidia with AMD when he had the chance. Could be too late now.
  • Shark321 - Thursday, October 24, 2013 - link

    There will be no custom cooling solutions for the time being. It's the loudest card ever released, twice as loud as the 780/Titan in BF3 after 10 minutes of playing. Also, Nvidia will bring out the 780 Ti in 3 weeks, a faster card at a comparable price, but quiet. AMD is releasing the 290x one year after NVidia's launch, 2 years after NVidia's tapeout. Nvidia will be able to counter this with ease.
  • just4U - Thursday, October 24, 2013 - link

    Shark Writes: "It's the loudest card ever released."

    Guess you weren't around for the Geforce5...
  • HisDivineOrder - Thursday, October 24, 2013 - link

    The FX5800 is never really dead. Not if we remember the shrill sound of its fans...

    ...or if the sound burned itself into our brains for all time.
  • Samus - Friday, October 25, 2013 - link

    I think the 65nm GeForce 280 takes the cake for loudest card ever made. It was the first card with a blower.
  • ninjaquick - Thursday, October 24, 2013 - link

    lol, the Ti can only do so much; there is no smaller node for either company to jump to, and not enough shipments for retail stock until March. The 290X just proves AMD's GCN design is a keeper. It is getting massively throttled by heat and still manages to pull a slight lead over the Titan, at sometimes 15% lower clocks than reference. AMD needed a brand for this release season, and they have it.

    Both Nvidia and AMD are jumping to the next node in 2014. Nvidia will not release Maxwell on the current node, and there is no other node they would invest in going to.
  • HisDivineOrder - Thursday, October 24, 2013 - link

    The Ti could theoretically open up all the disabled parts of the current GK110 part. Doing that, who knows what might happen? We've yet to see a fully enabled GK110. I suspect that might eat away some of the Titan's efficiency advantage, though.
  • 46andtool - Thursday, October 24, 2013 - link

    I don't know where you're getting your information, but you're obviously nvidia-biased, because it's all wrong. AMD is known for using poor reference coolers; once manufacturers like Sapphire and HIS roll out their cards in a couple of weeks, I'm sure the noise and heat won't be a problem. And the 780 Ti is poised to sit between a 780 gtx and a Titan; it will not be faster than a 290x, sorry. We already have the 780 Ti's specs. What Nvidia needs to focus on is dropping its insane pricing.
  • SolMiester - Monday, October 28, 2013 - link

    Sorry bud, but the Ti will be much faster than Titan, otherwise there is no point; hell, even the 780 OC is enough to edge out the Titan. Why are people going on about Titan? It's a once-in-a-blue-moon product to fill a void that AMD left open, with CUDA dev for prosumers... Full monty with perhaps 7ghz memory, wahey!
  • Samus - Friday, October 25, 2013 - link

    What in the world makes you think the 780Ti will be faster than Titan? That's ridiculous. What's next, a statement that the 760Ti will be faster than the 770?
  • TheJian - Friday, October 25, 2013 - link

    http://www.techradar.com/us/news/computing-compone...
    Another shader and more mhz.
    http://news.softpedia.com/news/NVIDIA-GeForce-GTX-...
    If the specs are true, quite a few sites think it will be faster than Titan.
    http://hexus.net/tech/news/graphics/61445-nvidia-g...
    Check the table; the 780 Ti would win in gflops if the leak is true. The extra 80mhz + 1 SMX means it should either tie or barely beat it in nearly everything.

    Even a tie at $650 would be quite awesome, at possibly less watts/heat/noise. Of course it will be beaten a week later by a fully unlocked Titan Ultra, or one with more mhz, or mhz + fully unlocked. NV won't just drop Titan; they will easily make a better one. It's not like NV just sat on their butts for the last 8 months. It's comic that anyone thinks AMD has won. OK, for a few weeks tops (and not really even now, other than on price, looking at 1080p and the games I mentioned previously).
  • ShieTar - Thursday, October 24, 2013 - link

    It doesn't cost less than a GTX780, it only has a lower MSRP. The actual price at which you can buy a GTX780 is already below $549 today, so as usual you pay the same price for the same performance with both companies.

    And testing 4K gaming is important right now, but it should be another 3-5 years before 4K performance actually impacts sales figures in any relevant way.

    And about Titan? Keep in mind that it is 8 months old, still has one SMX disabled (unlike the Quadro K6000), and still uses less power in games than the 290X. So I wouldn't be surprised to see a Titan+ come out soon, with 15 SMX and higher base clocks, and, as Ryan puts it in this article, "building a refined GPU against a much more mature 28nm process". That should be enough to gain 10%-15% performance in anything restricted by computing power, thus showing a much clearer lead over the 290X.

    The only games that the 290X will clearly win are those that are restricted by memory bandwidth. But nVidia has proven with the 770 that they can operate memory at 7GHz as well, so they can increase Titan's bandwidth by 16% through clocks alone.
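
    (For reference, the back-of-the-envelope arithmetic behind that 16%, assuming Titan's stock ~6GHz effective memory clock on its 384-bit bus:)

        \[
        \frac{7.0\,\mathrm{GHz} \times 384/8\ \mathrm{bytes}}{6.0\,\mathrm{GHz} \times 384/8\ \mathrm{bytes}}
        = \frac{336\ \mathrm{GB/s}}{288\ \mathrm{GB/s}} \approx 1.17
        \]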

    Don't get me wrong, the 290X looks like a very nice card, with a very good price to it. I just don't think nVidia has any reason to worry, this is just competition as usual, AMD have made their move, nVidia will follow.
  • Drumsticks - Thursday, October 24, 2013 - link

    http://www.newegg.com/Product/ProductList.aspx?Sub...

    Searched on Newegg, sorted by lowest price; the lowest one was, surprise, $650. I don't think Newegg is over $100 off from competitors' pricing.
  • 46andtool - Thursday, October 24, 2013 - link

    http://www.newegg.com/Product/Product.aspx?Item=N8...

    you're clearly horrible at searching
  • TheJian - Friday, October 25, 2013 - link

    $580 isn't $550 though, right? And it's out of stock. I wonder how many of these they can actually make, seeing how it's already pegged at 94C in every review. Nobody was able to OC it past 1125mhz. They're clearly pushing this thing a lot already.
  • ShieTar - Friday, October 25, 2013 - link

    Well, color me surprised. I admittedly didn't check the US market, because for more than a decade now electronics have been sold in the Euro region with a price-conversion assumption of 1€ = $1, so everything was about 35% more expensive over here (but including 19% VAT, of course).

    So for this discussion I used our German comparison engines. Both the GTX780 and the R9 290X are sold for the same price of just over 500€ over here, which is basically $560 + 19% VAT. I figured the same price policies would apply in the US; they basically always do.

    Well, as international shipping is rarely more than $15, it would seem your cheapest way to get a 780 right now is to actually import it from Germany. It's always been the other way around with electronics; interesting to see it reversed for once.
  • 46andtool - Thursday, October 24, 2013 - link

    the price of a 780gtx is not below $649 unless you are talking about a refurbished or open box card.
  • TheJian - Friday, October 25, 2013 - link

    Wrong, Zotac price in cart $624. :) Personally I'd buy an OC card for $650 but that's just me.
    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • 46andtool - Thursday, October 24, 2013 - link

    Your comment makes no sense; all I see are excuses and misinformation in your post. "It doesn't cost less than a GTX780, it only has a lower MSRP" is just stupid; Battlefield 4 edition 290Xs are already on Newegg for $579, and the only cheap 780 gtxs you will find will be used ones.
  • chrnochime - Thursday, October 24, 2013 - link

    What 549? Every 780 on NE goes for 649. I want some of the kool-aid you're drinking.
  • HisDivineOrder - Friday, October 25, 2013 - link

    It IS loud. HardOCP have a tendency to be so "hard" that they ignore the volume of the card. They aren't the most reliable of sites on the acoustics of a card. Not in the past and not today.
  • JDG1980 - Thursday, October 24, 2013 - link

    Regarding 1080p performance, so what? You don't need a $500+ video card to get acceptable frame rates at that resolution. A $200-$300 card will do just fine. $500+ video cards are for multi-monitor setups or high resolution (1440p+) displays.
    Regarding the noise, that's a problem - AMD clearly stretched things as far as they could go with GCN to reach the current performance level. I know that EK has already announced a 290X waterblock for those enthusiasts who use custom loops. I wouldn't be surprised to see someone come out with a self-contained closed-loop watercooler for the 290X, similar to those that have been available for CPUs for a couple years now. That might help fix the noise issues, especially if it used a dual 120mm/140mm radiator.
  • 46andtool - Thursday, October 24, 2013 - link

    We are just now breaking 60fps at 1080p in demanding games at max details, and even more demanding games are just around the corner, so you're telling people what, exactly? And everybody knows AMD makes retarded reference coolers, so another moot point. Let's-try-and-discredit-AMD's-stellar-new-product-any-way-we-can-but-the-only-way-we-know-how-is-by-grasping-at-straws.
  • inighthawki - Thursday, October 24, 2013 - link

    BS, there's absolutely nothing wrong with a high-end card on a 1080p display. Just look at the benchmarks: in Crysis 3 at 1080p on high, a 7970GE barely hits 60fps, and no doubt it will drop below 60 on many occasions (it's just an average). On top of that, not all games are nearly as well optimized as Crytek's, or they are just far more complex. In Total War: Rome 2, even the 290X barely hits 60fps on extreme with MEDIUM shadows. Or look at Company of Heroes 2, where even the 290X hits a min fps of 37 on extreme.

    On top of all of that, high resolution IPS panels are super expensive, not everyone cares enough about that to spend the money. The difference between a quality 1080p and a quality 1440p panel can be almost as much as the video card itself.
  • patrioteagle07 - Thursday, October 24, 2013 - link

    Not really... You can find refurbed ZR30s for under $600.
    If you are going to spend $1k on gfx, it's rather short-sighted to keep your TN panels...
  • inighthawki - Thursday, October 24, 2013 - link

    That's at LEAST several hundred dollars more than the majority of people are willing to spend on a monitor. 1080p TN panels are fine for most people, including most gamers. What people care about is not monitor count, pixel count, or color accuracy. They want high-quality shaded pixels and a good framerate. This is where high-end video cards on smaller monitors come into play. There are plenty of reasons to do it. Do not confuse your own values with what everyone else wants.
  • ShieTar - Friday, October 25, 2013 - link

    Also, an increasing number of players consider 120 FPS, not 60 FPS, to be the acceptable framerate.
  • Sandcat - Friday, October 25, 2013 - link

    That depends on what you define as 'acceptable frame rates'. Yeah, you do need a $500 card if you have a high-refresh-rate monitor and use it for 3D games, or just for improved smoothness in non-3D games. A single 780 with my brother's 144hz Asus monitor is required to get ~90 fps (i7-930 @ 4.0) in BF3 on Ultra with MSAA.

    The 290x almost requires liquid... the noise is offensive. Kudos to those with the equipment, but really, AMD cheaped out on the cooler in order to hit the price point. Good move, imho, but too loud for me.
  • hoboville - Thursday, October 24, 2013 - link

    Yup, and it's hot. It will be worth buying once the manufacturers can add their own coolers and heat pipes.

    AMD has always been slower at lower res, but better in the 3x1080p to 6x1080p arena. They have always aimed for high-bandwidth memory, which always performs better at high res. This is good for you as a buyer because it means you'll get better scaling at high res. It's essentially forward-looking tech, which is good for those who will be upgrading monitors in the next few years, when 1440p IPS starts to be more affordable. At low res the bottleneck isn't RAM but compute power. Regardless, buying a Titan / 780 / 290X for anything less than 1440p is silly; you'll be way past the 60-70 fps human eye limit anyway.
  • eddieveenstra - Sunday, October 27, 2013 - link

    Maybe 60-70fps is the limit, but at 120Hz, 60FPS will give noticeable lag. 75 is about the minimum. That or I have eagle eyes. The 780gtx still dips into low framerates at 120Hz (1920x1080). So the whole debate about the Titan or 780 being overkill @1080p is just nonsense. (780gtx 120Hz gamer here)
  • hoboville - Sunday, October 27, 2013 - link

    That really depends a lot on your monitor. When they talked about Gsync and frame lag and smoothness, they mentioned that when FPS doesn't exactly match the refresh rate, you get latency and bad frame timing. That you have this problem with a 120 Hz monitor is no surprise, as at anything less than 120 FPS you'll see some form of stuttering. When FPS > refresh rate, you won't notice this. At home I use a 2048x1152 @ 60 Hz, and beyond 60 FPS all the extra frames are dropped, whereas in your case some frames will "hang" when you are getting less than 120 FPS, because those frames have to "sit" on the screen for an extra interval until the next one is displayed. This appears to be stuttering, and you need to get a higher FPS from the game in order for the frame delivery to appear smoother. This is because apparent delay decreases with the ratio of [delivered frames (FPS) / monitor refresh rate]; once the ratio is close enough to 1, you can no longer detect the delay. In essence 120 Hz was a bad idea, unless you get Gsync (which means a new monitor).

    Get a good 1440p IPS at 60 Hz and you won't have that problem, and the image fidelity will make you wonder why you ever bought a monitor with 56% of 1440p's pixels in the first place...
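
    The vsync arithmetic, in a tiny C sketch (numbers purely illustrative; assumes plain vsync with steady render times and no triple buffering):

        #include <stdio.h>
        #include <math.h>

        int main(void) {
            const double refresh_hz = 120.0;            /* monitor refresh rate */
            const double T = 1000.0 / refresh_hz;       /* refresh interval: ~8.3 ms */
            const double fps[] = { 120.0, 75.0, 60.0 }; /* game delivery rates */

            for (int i = 0; i < 3; i++) {
                double render_ms = 1000.0 / fps[i];
                /* With vsync, a finished frame is held on screen for a whole
                   number of refresh intervals, never a fraction of one. */
                double held_ms = ceil(render_ms / T) * T;
                printf("%3.0f fps: rendered every %4.1f ms, held for %4.1f ms (%1.0f intervals)\n",
                       fps[i], render_ms, held_ms, held_ms / T);
            }
            return 0;
        }

    At 75 FPS on a 120 Hz panel, a frame is rendered every ~13.3 ms but held for two 8.3 ms intervals (16.7 ms), which is the "hang" described above; at 120 FPS the ratio hits 1 and every frame is held for exactly one interval.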
  • eddieveenstra - Sunday, October 27, 2013 - link

    To be honnest. I would never think about going back to 60Hz. I love 120Hz but don't know a thing about IPS monitors. Thanks for the response....

    Just checked it, and that sounds good. When they become more affordable I will start thinking about that. Seems like IPS monitors are better with colors and have less blur @60Hz than TN. link:http://en.wikipedia.org/wiki/IPS_panel
  • Spunjji - Friday, October 25, 2013 - link

    Step 1) Take data irrespective of different collection methods.

    Step 2) Perform average of data.

    Step 3) Completely useless results!

    Congratulations, sir; you have broken Science.
  • nutingut - Saturday, October 26, 2013 - link

    But who cares if you can play at 90 vs 100 fps?
  • MousE007 - Thursday, October 24, 2013 - link

    Very true, but remember, the only reason nvidia prices their cards where they are is because they can (e.g. Intel CPUs vs. AMD). Having said that, I truly welcome the competition, as it makes things better for all of us, regardless of which side of the fence you sit on.
  • valkyrie743 - Thursday, October 24, 2013 - link

    the card runs at 95C and sucks power like no tomorrow. it only beats the 780 by a very little, and it does not overclock well.

    http://www.youtube.com/watch?v=-lZ3Z6Niir4
    and
    http://www.youtube.com/watch?v=3OHKWMgBhvA

    http://www.overclock3d.net/reviews/gpu_displays/am...

    i like his review. it's purely honest and shows the facts. i'm not an nvidia fanboy, nor am i an amd fanboy, but i'll take nvidia right now over amd.

    i do like how this card is priced and the performance for the price. it makes the titan not worth 1000 bucks (or the 850 bucks it goes for used on forums). but as for the 780: if you get a non-reference 780, it will be faster than the 290x and put out LESS heat and LESS noise, as well as use less power.

    plus the gtx 780 TI is coming out in mid November, which will probably cut the cost of the current 780 to 550, and that card would probably be around 600 and beat this card even more.
  • jljaynes - Friday, October 25, 2013 - link

    you say the review sticks with the facts - he starts off talking about how ugly the card is, so it needs to beat a titan. and then in the next sentence he says the R9-290X will cost $699.

    he sure seems to stick with the facts.
  • jljaynes - Friday, October 25, 2013 - link

    to be fair, he says it's expected to be - he doesn't call out price explicitly.

    and i am not making this up - i skipped ahead in the video because he was annoying me - and he was still talking about the looks of the card. to me the review seems more like an nvidia commercial. i clicked around the entire video - he spends all of the tests talking about specs and thermals.
  • looncraz - Friday, October 25, 2013 - link

    I've read a few reviews and noticed a trend you can verify:

    While gaming, the 290x draws only about the same amount of power as the 780, while putting out 10% or better average performance. It is only when you REALLY push the 290x that it draws its highest power - and doing that requires special tweaks from the reviewers, negating reality.

    The noise is a problem, the heat is a problem; the performance and power draw really are not. And overclocking a video card is about the dumbest thing ever... yeah, let's risk damaging a $500+ part for an extra 5% higher frame rate... It isn't like a $200 CPU where you go from 3.2GHz to 5GHz...

    No, we're talking about going from 1GHz to 1.1GHz... and spending a premium for better cooling on top of it all...
  • siliconwizard - Thursday, October 24, 2013 - link

    Sure does and an amazing price that is. RIP Titan
  • chizow - Thursday, October 24, 2013 - link

    Article chart says $550, Newegg has them in stock now for $580 which may just be BF4 bundle premium: http://www.newegg.com/Product/Product.aspx?Item=N8...
  • Noble07 - Thursday, October 24, 2013 - link

    Yup. The bundled version will cost $580. If you look at the Newegg page, you'll see the manufacturer has two products up, one with BF4 and one without.
  • patrioteagle07 - Thursday, October 24, 2013 - link

    Newegg normally charges $20+ over msrp launch week... MSRP is $549 ...
  • PCboy - Thursday, October 24, 2013 - link

    And the Titan is $1000. Just face the facts, Nvidia got rolled.
  • dragonsqrrl - Thursday, October 24, 2013 - link

    Rolled? Price drops sir. 8 months on, price drops.
  • tuklap - Thursday, October 24, 2013 - link

    but will they drop to the same level as r9 290x? seems to me that 290x is a great buy. take note. that is just the reference performer. What more for the AIB partners ^_^ PRICE DROPS PLEASE!!
  • Shark321 - Thursday, October 24, 2013 - link

    Titan is a compute card. In 3 weeks there will be 780Ti for $599, about 5-10% faster than 290X.
  • ninjaquick - Thursday, October 24, 2013 - link

    so 4-5% faster than Titan?
  • Drumsticks - Thursday, October 24, 2013 - link

    If the 780Ti is $599, then that means the 780 should see at least a $150 (nearly 25%!) price drop, which is good with me.
  • DMCalloway - Thursday, October 24, 2013 - link

    So, what you are telling me is Nvidia is going to stop laughing-all-the-way-to-the-bank and price the 780ti for less than current 780 prices? Current 780 owners are going to get HOT and flood the market with used 780s.
  • dragonsqrrl - Thursday, October 24, 2013 - link

    Why is it that this is only ever the case when Nvidia performs a massive price drop? Nvidia price drop = early adopters getting screwed (even though 780 has been out for ~6 months now). AMD price drop = great value for enthusiasts, go AMD! ... lolz.
  • Minion4Hire - Thursday, October 24, 2013 - link

    Titan is a COMPUTE card. A poor man's (relatively speaking) proper compute solution. The fact that it is also a great gaming card is almost incidental. No one needs a 6GB frame buffer for gaming right now. The Titan comparisons are nearly meaningless.

    The "nearly" part is the unknown 780 Ti. Nvidia could enable the remaining SMXes on the 780 to at least give the Ti comparable performance to Titan. But who cares that Titan is $1000? It isn't really relevant.
  • ddriver - Thursday, October 24, 2013 - link

    Even much cheaper Radeons completely destroy the Titan, as well as every other nvidia gpu, in compute. Do not be fooled by a single, poorly implemented test; the nvidia architecture plainly sucks in double-precision performance.
  • ShieTar - Thursday, October 24, 2013 - link

    Since "much cheaper" Radeons tend to deliver 1/16th DP performance, you seem to not really know what you are talking about. Go read up on a relevant benchmark suite on professional and compute cards, e.g. http://www.tomshardware.com/reviews/best-workstati... The only tasks where AMD cards shine are those implemented in OpenCL.
  • ddriver - Thursday, October 24, 2013 - link

    "Much cheaper" relative to the price of the titan, not entry level radeons... You clutched onto a straw and drowned...

    OpenCL is THE open and portable industry standard for parallel computing, did you expect radeons to shine at .. CUDA workloads LOL, I'd say OpenCL performance is all I really need, it has been a while since I played or cared about games.
  • Pontius - Tuesday, October 29, 2013 - link

    I'm in the same boat as you ddriver, all I care about is OpenCL in these articles. I go straight to that section usually =)
  • TheJian - Friday, October 25, 2013 - link

    You're neglecting the fact that everything you can do professionally in OpenCL you can already do faster in CUDA. CUDA is taught in 600+ universities for a reason. It is in over 200 pro apps and has been funded for 7+ years, unlike OpenCL, which is funded by a broke company hoping people will catch on one day :) Anandtech refuses to show CUDA (gee, they do have an AMD portal after all... LOL), but it exists and is ultra fast. You really can't name a pro app that doesn't have direct support, or support via plugin, for CUDA. And if you're buying NV and running OpenCL instead of CUDA (like anand shows, calling it compute crap), you're an idiot. Why don't they run Premiere instead of Sony crap for video editing? Because CUDA has worked great in it for years. Same with Photoshop, etc...

    You didn't look at the folding@home DP benchmark here in this review either, I guess. 2.5x faster than the 290x. As you can see, it depends on what you do and the app you use. I consider F@H a stupid use of electricity, but that's just me... LOL. Find anything where OpenCL (or any AMD stuff, directx, opengl) beats CUDA. Compute doesn't just mean OpenCL; it means CUDA too! Dumb sites just push OpenCL because it's OPEN... LOL. People making money use CUDA and generally buy Quadro or Tesla (they own 90% of the market for a reason, or people would just buy Radeons, right?).
    http://www.anandtech.com/show/7457/the-radeon-r9-2...
    DP in F@H here. Titan sort of wins, right? 2.5x or so over the 290x :) It's comic that both here and at Tom's they use a bunch of junk synthetic crap (bitmining - ASICs do that now - basemark junk, F@H, etc.) to show how good AMD is, but forget you can do real work with CUDA (heck, even bitmining can be done with CUDA).

    When you say compute, I think CUDA, not OpenCL on NV. As soon as you toss in CUDA, the compute story changes completely. Unfortunately even Tom's refuses to pit OpenCL vs. CUDA, just like here at anandtech (but that's because both love OpenCL and hate proprietary stuff). But at least they show you in ShieTar's link (which craps out - remove the . at the end) that Titan kills even the top Quadro cards (it's a Tesla, remember, for $1500 off). It's 2x+ faster than the Quadros in almost everything they tested. So yeah, Titan is very worth it for people who do PRO stuff AND game.
    http://www.tomshardware.com/reviews/best-workstati...
    For the lazy, fixed ShieTar's link.

    All these sites need to do is fire up 3dsmax, cinema4d, Blender, or adobe (pick your app: After Effects, Premiere, Photoshop) and pit CUDA vs. OpenCL. Just pick an OpenCL plugin for AMD (luxrender) and Octane/furryball etc. for NV, then run the tests. Does AMD pay all these sites NOT to do this? I comment and ask on every workstation/vid-card article at Tom's; they never respond... LOL. They run pure CUDA, then pure OpenCL, but act like the two never meet. They run crap like basemark photo/video-editing OpenCL junk (you can't make money on that) instead of running adobe and choosing OpenCL (or directx/opengl) for AMD and CUDA for NV. Anandtech runs Sony Vegas, which a quick google shows has tons of problems with NV. Heck, pit Sony/AMD vs. Adobe/NV. You can run the same video tests in both, though it would be better to just use adobe for both, but they won't do that until AMD gets done optimizing for the next rev... ROFL. Can't show AMD in a bad light here... LOL. OpenCL sucks compared to CUDA (proprietary or not... just the truth).
  • Pontius - Tuesday, October 29, 2013 - link

    Some good points, Jian; I would like to see side-by-side comparisons as well. However, I've seen some studies that implement the same algorithm in both OpenCL and CUDA, and the results are mostly the same if properly implemented. I've been doing GPU computing development in my spare time over the last year, and OpenCL does have one advantage over CUDA that is the reason I use it: run-time compilation. If at run time you are working with various data sets that involve checking many conditionals, you can compile a kernel with the conditionals stripped out and get a good performance increase, since GPUs aren't as good at conditionals as CPUs are. But in the end, I agree, more apples-to-apples comparisons are needed.
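
    For anyone curious what that trick looks like, here is a minimal sketch of it in C host code (hypothetical and heavily trimmed: error checking is elided, and the kernel, the APPLY_OFFSET define, and the use_offset flag are made up for illustration). Because OpenCL compiles kernels from source at run time, a branch that is constant for a given data set can be stripped with a -D build option instead of being evaluated by every work-item:

        #include <CL/cl.h>
        #include <stdio.h>

        int main(void) {
            const char *src =
                "__kernel void scale(__global float *d) {\n"
                "#ifdef APPLY_OFFSET\n"
                "  d[get_global_id(0)] = d[get_global_id(0)] * 2.0f + 1.0f;\n"
                "#else\n"
                "  d[get_global_id(0)] = d[get_global_id(0)] * 2.0f;\n"
                "#endif\n"
                "}\n";
            int use_offset = 1;  /* decided at run time, per data set */

            cl_platform_id plat; cl_device_id dev; cl_int err;
            clGetPlatformIDs(1, &plat, NULL);
            clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
            cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);

            /* Run-time compilation: the chosen -D flag strips the branch
               before the kernel ever reaches the GPU. */
            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
            err = clBuildProgram(prog, 1, &dev,
                                 use_offset ? "-DAPPLY_OFFSET" : "", NULL, NULL);
            printf("build %s\n", err == CL_SUCCESS ? "ok" : "failed");
            return 0;
        }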
  • azixtgo - Thursday, October 24, 2013 - link

    the titan is irrelevant. I can't figure out why the hell people think a $1000 GPU is even worth mentioning. It's not for sane people to buy, and it's definitely not a genuine effort by nvidia. They saw an opportunity and went for it.
  • Bloodcalibur - Thursday, October 24, 2013 - link

    It's $350 more because of its compute performance, ugh. It benchmarks 5-6x higher than the 780 on DGEMM. That is why the card is priced a whopping $350 more than their own 780, which is only a few FPS behind in most games and setups. The only people who should've bought a Titan were people who both GAME and do a bit of computing.

    Comparing it to the 290x is what retarded, ignorant people are doing. Compare it to the 780, which it does beat out. Now we have to wait for nvidia's response.
  • Cellar Door - Thursday, October 24, 2013 - link

    Read the review before trolling. It's $549
  • azixtgo - Thursday, October 24, 2013 - link

    technically it's a good value. I think. I despise the higher prices as well but who really knows the value of the product. Comparing a GPU to a lot of things (like a ps4 that has a million other components or a complete PC ), maybe not. but comparing this to nvidia... well...
  • Pounds - Thursday, October 24, 2013 - link

    Did the nvidia fanboy get his feelings hurt?
  • superflex - Thursday, October 24, 2013 - link

    Yes, and his wallet got shredded.
    Validation is a bitch.
  • piroroadkill - Thursday, October 24, 2013 - link

    Huh? You can go over to Newegg right now and buy one for $580.
  • Wreckage - Thursday, October 24, 2013 - link

    He did not say "Mantle" 7 times, so he might not be from their PR department.

    Either way, the 290 is hot, loud, power hungry, and nothing new in the performance department. It's cheap, but that won't last. Looks like we will have to wait for Maxwell for something truly new.
  • chrnochime - Thursday, October 24, 2013 - link

    You OTOH look like you can't RTFA.
  • xres625e - Thursday, October 24, 2013 - link

    drop dead..
  • eddieveenstra - Sunday, October 27, 2013 - link

    tough guy, aren't you.... bah.
  • TrantaLocked - Sunday, October 27, 2013 - link

    Hm I wonder why I can find the 290X for $550 on newegg?
  • Dal Makhani - Thursday, October 24, 2013 - link

    lol, AMD fanboy. This card is alright, nothing "uber". It brings some proper pricing sense back into the green team's head, which is needed; the gtx 780 will be dropped to R9 290X pricing, and the 780 Ti will be Nvidia's new 650-dollar card. But I don't know if the 780 Ti can show big enough gains to justify a 100-dollar price difference. That will be interesting to see next month.
  • tuklap - Thursday, October 24, 2013 - link

    I doubt that the nvidia gtx 780 will drop in price to par with the r9 290x..
  • just4U - Thursday, October 24, 2013 - link

    I doubt it as well.. They didn't ever really drop prices on the 560Ti until it went EOL..
  • mfergus - Thursday, October 24, 2013 - link

    Well, it's uber in the sense that it brings much-needed competition to Nvidia's very high-priced high-end cards. Nobody should have thought this card was going to be revolutionary, though; it's on the same 28nm as all the other cards, and it has the same architecture as the 7790, which had very minor compute changes compared to GCN 1.0.
  • dragonsqrrl - Thursday, October 24, 2013 - link

    Dropping the 780 by $100 is the very least Nvidia would have to do to remain competitive, and personally I don't think that would be nearly enough. The 290X is performance competitive with Titan, and despite the fact that Titan is cooler, quieter, consumes less power, has a much better shroud, and superior DP performance, it should come down to the same price as the 290X to remain competitive due to the slightly higher performance of the 290X. A ~$550 Titan or ~$400 780 would be amazing.
  • mfergus - Thursday, October 24, 2013 - link

    I don't expect Titan's price to change much but I could be wrong. I never really thought of it as a standard gaming card, it's a total halo product with lots of memory and the only non Quadro/Tesla with full DP performance. The 780ti will be cheaper than Titan though and faster in gaming performance.
  • Bloodcalibur - Thursday, October 24, 2013 - link

    You truly are an idiot if you think the Titan should be compared to a 290X and that the sentence ends there. The Titan itself performs only a small bit above nvidia's own 780, but the $350 price difference is there for its compute performance. It's basically a budget workstation card with higher-than-780 gaming abilities, for those who game AND do a bit of 3D computing. Derp.

    This retard actually suggested that a gaming/workstation hybrid priced at $1000 should compete downward with a $550 gaming card.
  • TheJian - Friday, October 25, 2013 - link

    It's comic that these people forget it's a $2500 card when supported as a pro card (Tesla, with all the driver support). You are practically stealing it for $1000 already. It's not meant for GAMERS only. It's really meant for people who GAME and also like to make MONEY from their gpu with REAL apps... That concept always seems lost on the AMD lover (and even some NV people who apparently just don't understand the product or its pricing).
  • Sandcat - Thursday, October 24, 2013 - link

    AMD plant.
  • looncraz - Thursday, October 24, 2013 - link

    Anything like an nVidia shroom?
  • Homeles - Thursday, October 24, 2013 - link

    I'd love to trip on some of those.
  • jasonelmore - Thursday, October 24, 2013 - link

    i'm reading the review, and tbh the 290x performance is around 5% lower than the GTX 780. Now if you go "uber mode", yes, it does beat the 780 in several benchmarks and not in some, but uber mode is nothing more than a 15% overclock.. Stock for stock, the 780 is still winning.
  • jordanclock - Thursday, October 24, 2013 - link

    Uber mode IS stock. Just like CPUs will boost up speed bins when they have the thermal headroom, so will the 290X. Excluding Uber mode is just trying to avoid the fact that the 290X tops the 780 at the highest of settings, and it sounds disingenuous.
  • looncraz - Thursday, October 24, 2013 - link

    As jordanclock stated, uber mode is just a simple thermal-mode setting.

    Just imagine what will happen with a better cooler, when the card can run at full tilt non-stop... With its clock often reduced by 10-15%, we could very well see some jumps where it currently doesn't win outright - and crossfire configurations should benefit greatly. The power draw is unfortunate, but the reality is that few will really worry about it beyond their power supply limits...

    If you leave the 290x in quiet mode and install better cooling, you will have the same performance as in uber mode (actually, probably better, considering some are reporting bugs in the uber mode profile). Add to that the standard 5% or so gained over a few months of driver revisions, and the 780Ti will need to be 5-10% faster than Titan to match the 290x in its non-reference clothing.
  • Steelytuba - Thursday, October 24, 2013 - link

    Are you reading the same review I just read? The 780 is only slightly faster in a small number of the 1080p benchmarks against the 290x running in quiet mode. If you run any resolution higher than 1080p (which is really the only reason you would need a card in this category) - and even if you do run 1080p - the 290x is the better performer, for $100 less.
  • Rontalk - Thursday, October 24, 2013 - link

    Freakin' Nvidia, give me back my $1000!!!
  • rituraj - Thursday, October 24, 2013 - link

    Burn their office and then sue them
  • Bloodcalibur - Thursday, October 24, 2013 - link

    Your dumb ass paid $350 extra for computing performance when all you do is game LOL!
  • eddieveenstra - Sunday, October 27, 2013 - link

    +10
  • blau808 - Thursday, October 24, 2013 - link

    The 780ti will easily beat the 290 for price/performance
  • Black Obsidian - Thursday, October 24, 2013 - link

    So you're anticipating greater than Titan performance for less than 290X pricing?

    If so, you might be interested in this bridge I have for sale. It's in Brooklyn. In good shape, available real cheap.
  • Bloodcalibur - Thursday, October 24, 2013 - link

    Titan is a gaming/workstation hybrid. That's why it costs $350 more than 780 with only a small gaming performance increase. You sounded really ignorant there lolol.
  • erple2 - Thursday, October 24, 2013 - link

    Too easy. Too easy. You need to actually read the post above you before commenting on the price comparison.
  • rituraj - Thursday, October 24, 2013 - link

    No.
  • Naxirian - Thursday, October 24, 2013 - link

    Lol.... except it loses to the 780 in a bunch of benchmarks, a card that has already been on the market for 6 months and costs exactly the same as the R9 290X does. Not to mention the 780 Ti is due out next month, and from the looks of these benchmarks, will probably murder the 290X whilst being in the same price range. AMD are late to the party yet again. Wonder what they'll do when Nvidia launch the 800 series next year lol. Probably wait 9 months and then launch another out-dated gen like this time.
  • extide - Thursday, October 24, 2013 - link

    What planet are you on? The 290x is $100 cheaper, NOT the same price. AMD also had the 7970 out before nVidia had any 28nm cards out, so they weren't behind. Right now the 290x is the best bang/buck card, period, and if you game in 4K it is THE best card. Facts are facts, man....
  • QuantumPion - Thursday, October 24, 2013 - link

    According to the benchmarks it is only faster than the Titan at 4K resolution (probably due to its 4GB of RAM and higher memory bandwidth). At normal resolutions it is about on par with the 780, depending on the game. So it is about on par with the 780 and ~$50 cheaper - though we've yet to see what the 780 Ti can do. The article claiming the 290X is faster than the Titan and only $550 makes it seem like it was written before the actual benchmarks were performed.
  • blitzninja - Saturday, October 26, 2013 - link

    OMG, why won't you people get it? The Titan is a COMPUTE-GAMING HYBRID card; it's for professionals who run PRO apps (i.e. the Adobe media product line, 3D modeling, CAD, etc.) but are also gamers and don't want to have SLI setups for gaming + compute, or who can't afford to.

    A Quadro card is $2500; this card has 1 less SMX unit and no PRO customer driver support, but it is $1000 and does both gaming AND compute. As far as low-level professionals are concerned, this thing is the very definition of a steal. Heck, SLI two of these things and you're still up $500 versus a K6000.

    What usually happens is that the company they work at will have Quadro workstations, and at home the employee has a Titan. Sure, it's not as good, but it gets the job done until you get back to work.

    Please check your shit. Everyone saying the R9 290X destroys the Titan - and yes, for gaming I agree it's got some real good price/performance - is ignorant and needs to do some good long research into:
    A. How well the Titan sold
    B. The size of the compute market and the MISSING PRICE POINTS in said market
    C. The number of people doing compute who are also avid gamers
  • chimaxi83 - Thursday, October 24, 2013 - link

    Impressive. This card beats Nvidia on EVERY level! Price, performance, features, power..... every level. Nvidia paid the price for gouging its customers; they are going to lose a ton of marketshare. I doubt they have anything to match this for at least a year.
  • Berzerker7 - Thursday, October 24, 2013 - link

    Sounds like a bot. The card is worse than a Titan on every point except high resolution (read: 4K), including power, temperature and noise.
  • testbug00 - Thursday, October 24, 2013 - link

    Er, the Titan beats it on being higher priced, looking nicer, having a better cooler and using less power.

    Even at 1080p a 290X roughly ties the Titan (slightly ahead, by 4%, according to TechPowerUp).

    Well, a $550 card can tie a $1000 card at a resolution that neither card should really be bought for (seriously, if you are playing at 1200p or less there is no reason to buy any GPU over $400 unless you plan to upgrade screens soon).
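
    For what it's worth, you can put the value argument in rough numbers. A quick sketch (the relative-performance figures below are illustrative assumptions, not benchmark results):

        # Rough perf-per-dollar comparison; relative performance values
        # are placeholders for illustration, not measured results.
        cards = {
            "R9 290X":   (1.00, 549),  # (relative perf, price in USD)
            "GTX Titan": (1.00, 999),
            "GTX 780":   (0.92, 649),
        }
        for name, (perf, price) in cards.items():
            print(f"{name}: {perf / price * 1000:.2f} perf units per $1000")

    On those assumptions, the 290X delivers roughly 80% more performance per dollar than the Titan.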
  • Sancus - Thursday, October 24, 2013 - link

    The Titan was a $1000 card when it was released.... 8 months ago. So for 8 months nvidia has had the fastest card and been able to sell it at a ridiculous price premium(even at $1000, supply of Titans was quite limited, so it's not like they would have somehow benefited from setting the price lower... in fact Titan would probably have made more money for Nvidia at an even HIGHER price).

    The fact that ATI is just barely matching Nvidia at regular resolutions and slightly beating them at 4k, 8 months later, is a baseline EXPECTATION. It's hardly an achievement. If they had released anything less than the 290X they would have completely embarrassed themselves.

    And I should point out that they're heavily marketing 4k resolution for this card and yet frame pacing in Crossfire even with their 'fixes' is still pretty terrible, and if you are seriously planning to game at 4k you need Crossfire to be actually usable, which it has never really been.
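
    (For context: frame pacing problems barely show up in average FPS - they show up as big swings in frame-to-frame delivery times. A minimal sketch of how FCAT-style frame-time logs are typically summarized; the numbers are invented for illustration:)

        # Summarize a list of frame delivery times in milliseconds.
        # The data is made up purely to illustrate the metrics.
        frame_times = [16.7, 16.9, 33.1, 16.5, 17.0, 34.2, 16.8, 16.6]

        avg_fps = 1000 * len(frame_times) / sum(frame_times)
        p99 = sorted(frame_times)[int(0.99 * (len(frame_times) - 1))]
        worst_swing = max(abs(b - a) for a, b in zip(frame_times, frame_times[1:]))

        print(f"avg: {avg_fps:.0f} fps, 99th percentile frame time: {p99:.1f} ms, "
              f"worst frame-to-frame swing: {worst_swing:.1f} ms")

    A smooth setup and a stuttering one can report the same average FPS while the swing numbers differ wildly, which is exactly what FCAT data is meant to expose.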
  • anubis44 - Thursday, October 24, 2013 - link

    The margin of victory for the R9 290X over the Titan at 4K resolutions is not 'slight', it's substantial. HardOCP says it's 10-15% faster on average. That's a $550 card that's 10-15% faster than a $1000 card.

    What was that about AMD being embarrassed?
  • Sancus - Thursday, October 24, 2013 - link

    By the time more than 1% of the people buying this card even have 4K monitors, 20nm cards will have been on sale for months. Not only that, but you would basically go deaf next to a Crossfire 290X setup, which is what you need for 4K. And anyway, the 290X is faster only because it's been monstrously overclocked beyond the ability of its heatsink to cool it properly. The 780/Titan are still far more viable 2/3/4-GPU cards because of their superior noise and power consumption.

    All 780s overclock to be considerably faster than this card at ALL resolutions, so the GTX 780 Ti is probably just an OCed 780, and it will outperform the 290X while still being 10dB quieter.
  • DMCalloway - Thursday, October 24, 2013 - link

    You mention monstrously OC'ing the 290X, yet have no problem OC'ing the 780 in order to create a 780 Ti. Everyone knows that aftermarket coolers will keep the noise and temps in check once released. Let's deal with the here and now, not speculate on future cards. Face it: AMD at least matches or beats a card costing $100 more, which will force Nvidia to launch the 780 Ti at less than current 780 prices.
  • Sancus - Thursday, October 24, 2013 - link

    You don't understand how pricing works. AMD is 8 months late to the game. They've released a card that is basically the GTX Titan, except it uses more than 50W more power and has a bargain basement heatsink. That's why it's $100 cheaper. Because AMD is the one who are far behind and the only way for them to compete is on price. They demonstrably can't compete purely based on performance, if the 290X was WAY better than the GTX Titan, AMD would have priced it higher because guess what, AMD needs to make a profit too -- and they consistently have lost money for years now.

    The company that completely owned the market to the point they could charge $1000 for a video card are the winners here, not the one that arrived out of breath at the finish line 8 months later.

    I would love for AMD to be competitive *at a competitive time* so that we didn't have to pay $650 for a GTX 780, but the fact of the matter is that they're simply not.
  • DMCalloway - Thursday, October 24, 2013 - link

    Once again, against the Titan it's $450 cheaper, not $100. Against the GTX 780 it is a wash on performance at a cheaper price point. Eight months late to the game I'll agree on; however, it took time to get in bed with Sony and Micro$oft, which was needed if they (AMD) ever hope to get to the point of being able to release 'at a competitive time'. I'm amazed that they are still viable after the financial losses they suffered from Intel paying OEMs not to ship systems with AMD's current CPU gen. Sure, AMD won the lawsuit, but the losses in market share were in the billions; Intel jumped ahead a gen and the damage was done. Realistically, I believe AMD chose wisely to focus on the console market, because the 7970 GHz pushed hard wasn't really that far behind a stock GTX 780.
  • Bloodcalibur - Thursday, October 24, 2013 - link

    Ever wonder why the Titan costs $350 more than their own GTX 780 while offering only a small margin of improvement?

    Oh, right, compute performance.
  • anubis44 - Thursday, October 24, 2013 - link

    and in some cases the R9 290X is as much as 23% faster at 4K resolution than the Titan, or in the words of HardOCP: "at Ultra HD 4K it (R9 290X) just owns the GeForce GTX TITAN."
  • Bloodcalibur - Thursday, October 24, 2013 - link

    Once again, Titan is a gaming/workstation hybrid, that's why it costs $350 more than their own GTX 780 with only a small FPS improvement in gaming.
  • TheJian - Friday, October 25, 2013 - link

    Depends on the games chosen. For instance All 4K:
    Guru3d:
    Tomb Raider: tied at 4K, 40fps (they consider this BARELY playable, though they advise 60fps)
    MOH Warfighter: Titan wins by 7%
    Bioshock Infinite: Titan wins by 10% (33fps to 30, but again not going to be playable with minimums in the teens)
    BF3: TIE (32fps - again, that's an average, so not playable)
    The only victory at 4K here is Hitman: Absolution. So clearly it depends on what your settings are and what games you play. Also note the fps at 4K at HardOCP: they can't max settings, and every game sacrifices some stuff (or a lot). Even at 2560, Kyle notes all were unplayable with everything on, with averages at 22fps and minimums of 12fps for all 3 basically...ROFL. How useful is it to win (or even lose) at a res you can't play at?

    http://www.techpowerup.com/reviews/AMD/R9_290X/24....
    TechPowerUp tests all the way up to 5760x1080 - I'm quoting that res unless they didn't test it. Here we go again...LOL
    World of Warcraft: domination for the 780 & Titan (Titan over 20% faster at 5760!)
    Skyrim: both Titan and 780 win at 5760
    StarCraft 2: only went to 2560, but again a clean sweep for the 780/Titan, both over 10%
    Splinter Cell: Blacklist: clean sweep at 2560 & 5760 for Titan AND the 780 (>20% for Titan at both res)
    Far Cry 3: Titan and 780 win at 5760, but at 22fps who cares...LOL - still 10% faster than the 290X
    Black Ops 2: only went to 2560, but Titan wins at all res
    Metro: TIE (26fps, again neither playable)
    Crysis 3: Titan wins by over 10% (25fps vs. 22, but neither playable...LOL)

    At HardOCP, Metro, Tomb Raider, BF3, and Crysis 3 were all UNDER 25fps minimum on both cards, with most coming in at 22fps or so on both. I wish they would benchmark at what they find PLAYABLE, but even then I'm against 4K if I have to turn all kinds of stuff off in the first place. Only Far Cry 3 was tested at above 30fps...LOL. You need TWO cards for 4K gaming. PERIOD. If you have the money to buy a 4K monitor or two monitors, you probably have the cash to do it right and buy 2 cards. The Steam survey shows this: most people above 1920x1200 have 2 cards! Bragging about 4K gaming on this card (or even the Titan) is ridiculous, as it just ends up being an exercise in turning off stuff the devs wanted me to SEE. I wouldn't care if the 290X were 50% faster than the Titan - if you're running 22fps, who cares? Neither is playable. You've proven NOTHING. If we jump off a 100-story building I'll beat you to the bottom... Yeah, but umm... we're both still dead, right? So what's the point, no matter who wins that game?

    Fun fact: techspot.com's Tomb Raider comment (2560 and 1080p both tested, 4xSSAA+16xAF):
    "We expected AMD to do better in Tomb Raider since they supported the title's development, but the R9 290X was 12% slower than the GTX Titan and 3% slower than the GTX 780"
    LOL. I hope they do better with the BF4 AMD enhancements. Resident Evil 6 shows a Titan win also.
    http://www.techspot.com/review/727-radeon-r9-290x/...

    Tomshardware 4K quote:
    "In Gaming At 3840x2160: Is Your PC Ready For A 4K Display?, I concluded that you’d want at least two GeForce GTX 780s for 4K. And although the R9 290X is faster than even the $1000 Titan, I maintain that you need a pair in order to crank your settings up to where they should be."
    That was their ARMA quote... but it applies to all of 4K... TWO CARDS. But their benchmarks are really low compared to everyone else's for Titan in the same games. It's like they took 10-15% off Titan's scores. E.g., Bioshock Infinite at Guru3D shows Titan winning by 10%, but at Tom's losing by 20% - same game, same res... WTF? That's odd, right? Skyrim shows NV domination at 4K (780 also): almost 20% faster for Titan & 780 (they tied) over Uber. Of course, they turned off ALL AA modes to get it playable. Again, you can't just judge 4K by one site's games. Clearly you can find the exact opposite at 4K, and coming back down to reality (a res you can actually play at above 30fps), Titan is smacking them in a ton of games (far more wins than losses). I could find a ton more if needed, but you should get the point. Titan isn't OWNED at 4K, and usually when it is, as Tom's says of Metro, "the win is largely symbolic" - yeah, at 30fps avg it is pointless even turned down!
  • bronopoly - Thursday, October 24, 2013 - link

    Why shouldn't one of the cards you mentioned be bought for 1080p? I don't know about you, but I prefer to get 120 FPS in games so it matches my monitor w/ lightboost enabled.
  • Bloodcalibur - Thursday, October 24, 2013 - link

    Except the Titan is a gaming/workstation hybrid due to its computing ability. Anyone who bought a Titan just for gaming paid $350 more than they would have on a 780, which is foolish. The Titan shouldn't be compared to the 290X for gaming. It's a good card for those who do both gaming and a little bit of computing.
  • looncraz - Thursday, October 24, 2013 - link

    Install a new cooler and the last two of those problems vanish... and you've saved hundreds... you could afford to build a stand-alone water-cooling loop just for the 290x and still have money to spare for a nice dinner.
  • teiglin - Thursday, October 24, 2013 - link

    I haven't finished reading the article yet, but isn't that more than a little hyperbolic? It just means NVIDIA will have to cut back on the amount it gouges for GK110. The fact that it was able to leave the price high for so long is nearly all good for them--it's just a matter of how quickly they adjust their pricing to match.

    It will be nice to have a fair fight again at the high-end for a single card.
  • bill5 - Thursday, October 24, 2013 - link

    Heh, I'm the biggest AMD fanboy around, but these top two comments almost smell like marketing.

    It's a great card, and the Titan was deffo highly overpriced, but Nvidia can just make some adjustments on price and compete. That 780 Ti they showed will surely be something in that vein.
  • HisDivineOrder - Thursday, October 24, 2013 - link

    It also beats it in highest temperature, which is ostensibly not the best thing to "win" at.
  • Ryan Smith - Thursday, October 24, 2013 - link

    As a heads up, a number of sections are still a work in progress. We got our second 290X much later than anticipated, so we worked on that so that we could bring you guys the Crossfire numbers and FCAT data right away.

    The rest of our writeup will be here by morning (we know you're all here for the charts anyhow).
  • Drumsticks - Thursday, October 24, 2013 - link

    Thanks for the review! Pretty impressive to say the least.

    Before people (inevitably) whine about this being an "unfinished" review, I like how this is. It isn't exactly "polished" right now, but 90% of people are going to see the finished state. This also allows them to hit the review timing like AMD surely wants them to, while making sure the review is up to the standards we're used to.
  • banvetor - Thursday, October 24, 2013 - link

    90% of Americans, you mean. Elsewhere in the world the day has already started for quite some time now...
  • bill5 - Thursday, October 24, 2013 - link

    Kinda disappointed with Anand lately. You didn't even review most of the new AMD lineup, and your bench suite is dated. Now we get a "work in progress" review?

    You're still in the upper echelon of sites with ease, but I think the site has slipped.
  • The Von Matrices - Thursday, October 24, 2013 - link

    Couldn't you have just done a pipeline type review "First impressions" with the charts and then released a second article with analysis later? That would have made more sense to me than putting "work in progress."
  • banvetor - Thursday, October 24, 2013 - link

    I also strongly disagree with this strategy of publishing half-baked reviews (actually, in this case, it seems more like quarter-baked or something) and only filling them in later.

    It would indeed make much more sense to publish a quick Pipeline piece with some key graphs, and later (less than 24 hrs from Ryan's post above) publish the full review. If nothing else, it's more honest.
  • mfenn - Thursday, October 24, 2013 - link

    Agree 100%. Situations like this are why the Pipeline exists. I don't come to Anandtech because you are the first. I come to Anandtech because you are the best. Take an extra few days to do a good, reference-quality review.
  • superflex - Thursday, October 24, 2013 - link

    Take an extra few days to do a good, reference-quality review which gives nVidia Titan the edge.
    There, fixed that for ya fanboy.
    The butthurt is strong in this one.
  • pattycake0147 - Thursday, October 24, 2013 - link

    I concur that this belongs in the pipeline. I waited to make a comment because of Ryan saying it would be finished in the morning, but morning is passed now and it is still a "work in progress". This is not what I have come to expect from Anandtech.
  • Deaks2 - Thursday, October 24, 2013 - link

    Hi Ryan,

    While I can appreciate the hard work put into the review, just putting up the charts without the explainer pieces is confusing. I had to read HardOCP's review to know what Uber was; your section on PowerTune was a work in progress. Also, the temperature information was shocking to read until, again, I read the relevant section of the HardOCP review and learned that the cards will operate at 95 deg C in order to reduce fan noise.

    Also, a comparison to the 7990, the current single-card performance king, would have been useful, since the R9 290X and 7990 are currently priced similarly. Thankfully, TechSpot included the 7990 and various 7970 and 7950 CF configurations in their review.

    As usual, I came to this site first to read the review, but had to go elsewhere to get the context to the charts that you presented.

    Thanks!
  • BryanC - Thursday, October 24, 2013 - link

    Actually, I value your commentary more than the charts, which to be honest are similar to the other charts out there. =)
  • anubis44 - Thursday, October 24, 2013 - link

    Oh, I'm here for the crafty journalism and the witty banter in the discussion thread :)

    Seriously, thanks for going the extra mile with the crossfire and FCAT data, Ryan. Much appreciated. Any word on custom cards and the R9 290 (as opposed to the 290X) would also be greatly appreciated. The R9 290 may be my next card.
  • jeremynsl - Thursday, October 24, 2013 - link

    I don't mean to rag on you guys (up late working and all), but it is unprofessional to post unfinished reviews like this. Full stop.

    I know you have made commitments to hit embargo dates, but is it really worth compromising article quality to this degree? I mean, even if only 10% of readers see it in the unfinished state it's pretty bad. I would not be ok with this, if it was my site and/or my writing.
  • ZeDestructor - Thursday, October 24, 2013 - link

    Like quite a few other people, I would rather read the technical breakdown rather than the benchmark results. If I only cared about benchmark results, I would go around compiling results from several sites to account for configuration differences.

    It really boils down to preferring a complete review over a performance review. For performance there's the Bench tool already, which is far more useful since I can filter out all the irrelevant results.
  • Ryan Smith - Monday, October 28, 2013 - link

    Hi guys;

    Your comments have been heard, so please don't think they're falling on deaf ears.

    Frankly we're no more happy about this than the rest of you are, which is why we try to avoid something like this if at all possible. But in this case we can't do a meaningful write-up without the benchmark data, so there's really no getting around it in this case.

    The final article ended up being a hair over 22K words. That would normally be a week-long writing job, never mind the benchmarking (new GPU + CF). So I hope if nothing else the belatedly complete article is up to your standards as our readers.
  • Black Obsidian - Tuesday, October 29, 2013 - link

    I would definitely agree that the complete article is up to the standards I've come to expect from Anandtech.

    But I would also much prefer to wait an extra day or two for the complete article, rather than get the "fill in the blanks" that the 290X review started out as. I come to Anandtech for the in-depth analysis; if that's not available when I click on the article in the first place, I'm less inclined to even bother.
  • Stuka87 - Thursday, October 24, 2013 - link

    Incredible bang for the buck card. $550 is a chunk of money yes, but compared to the competition its a steal!
  • tential - Thursday, October 24, 2013 - link

    Jumped to Final Words and it's a work in Progress...
    That's my favorite part and it's the one I read first!!!!
    nooooo.
  • Elixer - Thursday, October 24, 2013 - link

    Looks like nice card...
    No wonder the green team got their panties in a bind.
  • AtwaterFS - Thursday, October 24, 2013 - link

    Nice, Nvidia has two choices, drop prices substantially, or drop trou.
    I certainly hope it's the former, as I have no interest in seeing Jen-Hsun's pig in a blanket...
  • Wreckage - Thursday, October 24, 2013 - link

    Not really the Titan slayer we were hoping for.

    After all this wait and hype it's not surpassing last year's GK110. I guess we don't look for big gains in new chips anymore.
  • The Von Matrices - Thursday, October 24, 2013 - link

    This is on the same process node as Titan. Big advances don't happen without process node shrinks, regardless of manufacturer.
  • ninjaquick - Thursday, October 24, 2013 - link

    A million times this... GCN is superior to Kepler, and probably Maxwell as well. It doesn't waste tons of die space on large caches/special decoders/legacy bullshit. Game creators want massive arrays of simple math units that can voraciously crunch numbers, and that is what GCN delivers.
  • EJS1980 - Thursday, October 24, 2013 - link

    "GCN is superior to Kepler, and probably Maxwell as well"...HEHEH..... THIS IS PRETTY FUNNY, MAN......THANKS.....I NEEDED THAT! :P
  • Will Robinson - Thursday, October 24, 2013 - link

    Yeah right Wreckage...only slayed Titan by about $500 LOL
    Idiot.
  • chizow - Thursday, October 24, 2013 - link

    Great price and performance from AMD without a doubt, it's not quite a clean sweep but it is certainly a convincing victory for R9 290X, especially at higher resolutions. 780Ti may come close or even exceed 290X, but the damage to Nvidia's product stack has been done.

    The symmetrical irony here is that this generation's price escalation, which began at $550 with AMD's first 28nm part, Tahiti, has come full circle and been brought back into balance with AMD's last 28nm part, Hawaii, again at that $550 price point.

    I'm happy to see some balance back in the GPU market though. I stated months ago, at the 690 and Titan launches, that this $1K pricing model was an unsustainable business strategy for Nvidia. Now it's going to bite them squarely in the ass. They'll reap what they sowed and have to deal with the angst and anger from all of their most loyal fans, who will undoubtedly feel as if they were cheated by the 690, Titan and even the 780.
  • chizow - Thursday, October 24, 2013 - link

    I guess it couldn't all be good news though; it looks like the leaked thermals, power consumption and acoustics weren't exaggerated. Definitely loud and hot, with high operating temps. I guess AMD really pushed the GPU to the max; I doubt there will be much more OC headroom, at least with the reference cooler.

    If Nvidia wasn't so greedy with 690/Titan/780 pricing 290X might've been the next Fermi with mixed reactions, but I'm sure the price and performance will overshadow heat/power concerns.
  • t41nt3d - Thursday, October 24, 2013 - link

    NVIDIA won't be bitten in the ass at all by the prices of the 690s and Titan. A god-awful number of people have paid $1000 (including me) for one or more of these cards, and it's taken /8 months/ for AMD to release a card that goes back and forth with the Titan - it doesn't outright beat it in some games; it can go either way.

    NVIDIA have made a crapload of money from their two overpriced cards and have been reaping the rewards for a long time now; all they need to do is lower prices and no money is lost. It's AMD who's been letting them get away with those prices for so long.

    It seems a perfectly viable business strategy for them to have done what they've done and to be so successful at it.
  • chizow - Thursday, October 24, 2013 - link

    I guess we will see; their job just got that much harder trying to sustain this $1000 pricing model now that a $550 card matches and sometimes outperforms their $1000 card. What kind of performance do you think they're going to need next time to justify a $1000 price tag?

    Again, Nvidia went down the path of an unsustainable business model predicated on the fact they would be an entire ASIC ahead of AMD. That lead was clearly short-lived due to Kepler's excellent overall performance, but Hawaii brought that dream back down to reality. Sure a bit late, but still in time to crash Nvidia's $1K parade.
  • Sandcat - Thursday, October 24, 2013 - link

    Perhaps they knew it was unsustainable from the beginning, but short-term gains are generally what motivate managers when they develop pricing strategies, because bonus. Make hay whilst the sun shines, or while AMD is 8 months late.
  • chizow - Saturday, October 26, 2013 - link

    Possibly, but now they have to deal with the damaged goodwill of some of their most enthusiastic, spendy customers. I can't count how many times I've seen it, someone saying they swore off company X or company Y because they felt they got burned/screwed/fleeced by a single transaction. That is what Nvidia will be dealing with going forward with Titan early adopters.
  • Sancus - Thursday, October 24, 2013 - link

    AMD really needs to do better than a response 8 months later to crash anyone's parade. And honestly, I would love to see them put up a fight with Maxwell at a reasonable time period so they have incentive to keep prices lower. Otherwise, expect Nvidia to "overprice" things next generation as well.

    When they have no competition for 8 months it's not unsustainable to price as high as the market will bear, and there's no real evidence that Titan was economically overpriced because it's not like there was a supply glut of Titans sitting around anywhere, in fact they were often out of stock. So really, Nvidia is just pricing according to the market -- no competition from AMD for 8 months, fastest card with limited supply, why WOULD they price it at anything below $1000?
  • chizow - Saturday, October 26, 2013 - link

    My reply would be that they've never had to price it at $1000 before, and we have certainly seen this level of advancement from one generation to the next in the past (7900GTX to 8800GTX, 8800GTX to GTX 280, GTX 280 to GTX 480, etc.), so it's not a completely ground-breaking performance increase, even though Kepler overall outperformed historical improvements by ~20%, imo.

    Also, the concern with Titan isn't just the fact it was priced at ungodly premiums this time around, it's the fact it held its crown for such a relatively short period of time. Sure, Nvidia had no competition in the $500+ range for 8 months, but that was also the full length of Titan's reign at the top. In the past, a flagship in that $500-600+ range would generally reign for the entire generation, especially one launched halfway through that generation's life cycle. Now Nvidia has already announced a reply with the 780 Ti, which will mean not one but TWO cards will surpass Titan at a fraction of its price before the generation goes EOL.

    Nvidia was clearly blind-sided by Hawaii and ultimately it will cost them customer loyalty, imo.
  • ZeDestructor - Thursday, October 24, 2013 - link

    $1000 cards are fine, since the Titan is a cheap compute unit compared to the Quadro K6000, and the 690 is a dual-GPU card (dual-GPU has always been in the $800+ range).

    What we should see is the 780 (Ti?) go down in price to match the R9 290X, much to the rejoicing of all!

    Nvidia got away with $650-750 on the 780 because they could, and THAT is why competition is important, and why I pay attention to AMD even if I have no reason to buy from them over Nvidia (driver support on Linux is a joke). Now they have to match. Much the same happens in the CPU segment.
  • chizow - Saturday, October 26, 2013 - link

    For those that actually bought the Titan as a cheap compute card, sure Titan may have been a good buy, but I doubt most Titan buyers were buying it for compute. It was marketed as a gaming card with supercomputer guts and at the time, there was still much uncertainty whether or not Nvidia would release a GTX gaming card based on GK110.

    I think Nvidia preyed on these fears and took the opportunity to launch a $1K part, but I knew it was an unsustainable business model for them because it was predicated on Nvidia being an entire ASIC ahead of AMD and able to match AMD's fastest ASIC (Tahiti) with their 2nd fastest (GK104). Clearly Hawaii has turned that idea on its head, and Nvidia's premium product stack is crashing down in flames.

    Now we will see at least 4 cards (290/290X, 780/780 Ti) that all come close to or exceed Titan performance at a fraction of the price, only 8 months after its launch. Short reign indeed.
  • TheJian - Friday, October 25, 2013 - link

    The market dictates pricing. As they said, they sell every Titan immediately, so they could probably charge more. But that's because it has more value than you seem to understand. It is a PRO CARD at its core. Are you unaware of what a Tesla is, for $2500? It's the same freaking card with 1 more SMX and driver support. $1000 is GENEROUS whether you like it or not. Gamers with PRO intentions laughed when they saw the $1000 price and have been buying them like mad ever since. No parade has been crashed. They will continue this pricing model for the foreseeable future, as they have proven there is a market for high-end gamers with a PRO APP desire on top. The first run was 100,000 units and sold out in days. By contrast, the Asus ROG Ares 2 had a 1000-unit first run and didn't sell out like that. At $1500 it really was a ripoff, with no PRO side.

    I think they'll merely need another SMX turned on and 50-100MHz for the next $1000 version, which likely comes before Xmas :) The PRO performance is what is valued here over a regular card. Your short-lived statement makes no sense. It's been 8 months, a rather long life in GPUs, when you haven't beaten the 8-month-old card in much (I debunked the 4K crap already, and pointed to a dozen other games where Titan wins at every res). You won't fire up Blender, Premiere, PS CS, etc. and smoke a Titan with a 290X either...LOL. You'll find out what the other $450 is for at that point.
  • chizow - Saturday, October 26, 2013 - link

    Yes and as soon as they released the 780, the market corrected itself and Titans were no longer sold out anywhere, clearly a shift indicating the price of the 780 was really what the market was willing to bear.

    Also, there are more differences with their Tesla counterparts than just 1 SMX, Titan lacks ECC support which makes it an unlikely candidate for serious compute projects. Titan is good for hobby compute, anything serious business or research related is going to spend the extra for Tesla and ECC.

    And no, 8 months is not a long time at the top; look at the reigns of previous high-end parts and you will see it is generally longer than this. Even the 580 that preceded it held sway for 14 months before Tahiti took over its spot. Time at the top is just one part though; the amount by which Titan devalued is the bigger concern. When the 780 launched 3 months after Titan, you could maybe sell a Titan for $800. Now that Hawaii has launched, you could maybe sell it for $700? It's only going to keep going down - what do you think it will sell for once the 780 Ti beats it outright for $650 or less?
  • Sandcat - Thursday, October 24, 2013 - link

    I noticed your comments on the Tahiti pricing fiasco 2 years ago and generally skip through the comment section to find yours because they're top notch. Exactly what I was thinking with the $550 price point, finally a top-tier card at the right price for 28nm. Long live sanity.
  • chizow - Saturday, October 26, 2013 - link

    Thanks! Glad you appreciated the comments, I figured this business model and pricing for Nvidia would be unsustainable, but I thought it wouldn't fall apart until we saw 20nm Maxwell/Pirate Islands parts in 2014. Hawaii definitely accelerated the downfall of Titan and Nvidia's $1K eagle's nest.
  • TheJian - Friday, October 25, 2013 - link

    LOL. Tell that to both their bottom lines. I see AMD making nothing while NV profits. People who bought Titan got a $2500 Tesla for $1000. You don't buy a Titan just to game (pretty dumb if you did), as it's for pro apps too (the compute part of the deal). It's a steal for gamers who make money on the card too. Saving $1500 is a great deal. So since you're hating on NV pricing, how do you feel about the 7990 at $1000? Nice to leave that out of your comment, fanboy ;) Will AMD now reap what they sow and have to deal with all the angry people who bought those? ROFL. Is the $1K business model unsustainable for AMD too? Even the 6990 came in at $700 a ways back. Dual or single chip, the $1000 price is alive and well from either side for those who want it.

    I'd bet money a Titan Ultra will be $1000 again shortly, if they even bother, as it's not a pure gamer card but much more already. If you fire up pro apps with CUDA you'll smoke that 290X daily (and that covers just about all pro apps). Let me know when AMD makes money in a quarter that NVDA loses money. Then you can say NV pricing is biting them in the a$$. Until then, your comment is ridiculous. Don't forget, even as Ryan points out in this article (and he doesn't love NV...LOL), AMD still has driver problems (and has for ages), but he believes in AMD much like fools still do in Obama...LOL. For me, even as a 5850 owner, they have to PROVE themselves before I ponder another card from them at 20nm. The 290X is hot, noisy and uses far more watts, and currently isn't coming with 3 AAA games either. NV isn't shaking in their boots. I'll be shocked if the 780 Ti isn't $600 or above, as it should match Titan, which the 290X doesn't do even with the heat, noise and watts.

    And you're correct no OC room. Nobody has hit above 1125.

    If NV was greedy, wouldn't they be making MORE money than in 2007? They haven't cracked 850mil in 5 years. Meanwhile, AMD's pricing, which you seem to love, has caused their entire business to basically fail (no land, no fabs, gave up the CPU race, 8 months to catch up with a hot noisy chip, etc.). They have lost over $6B in the last 10 yrs. AMD has idiots managing their company and they are destroying what used to be a GREAT company with GREAT products. They should have priced this card $100 higher and all other rebadged cards $50 higher. They might actually make some money every quarter then, right? Single-digit margins on console chips (probably until the 20nm shrink) won't get you rich either. Who made that deal? FIRE THAT GUY. That margin is why NV said it wasn't worth it.
  • chizow - Saturday, October 26, 2013 - link

    AMD's non-profitability goes far beyond their GPU business; it's more due to their CPU business. People who got Titan didn't get a Tesla for $1000, they got a Tesla without ECC. Compute apps without ECC mean second-guessing every result, because you're unsure whether a value was stored/retrieved from memory correctly. Regarding 7990 pricing, you can surely look it up before pulling the fanboy card, just as you can look up my comments on 7970 launch pricing. And yes, AMD will absolutely have to deal with that backlash, given that card dropped even more precipitously than Titan, going from $1K to $600 in only 4-5 months.

    I don't think Nvidia will make the same mistake with a Titan Ultra at $1K. I also don't think Nvidia fans who bought Titan only for gaming will fall for the same mistake twice. If Maxwell comes out and Nvidia holds back the big ASIC, I doubt anyone disinterested in compute will fall for the same trick if Nvidia launches a Titan 2 at $1K using a compute gimmick to justify the price. They will just point to Titan and say "wait 3 months and they'll release something that's 95% of its performance at 65% of its price." As they say: fool me once, shame on you; fool me twice, shame on me.

    And no, greed and profit don't go hand in hand. In 2007-2008, Nvidia posted record profits and revenue for multiple consecutive quarters, as you noted, on the back of a cheap $230-$270 8800GT. With Titan, they reversed course by setting record margins, but on reduced revenue and profits. They basically covet Intel's huge profit margins, but they clearly lack the revenue to grow their bottom line. Selling $1K GPUs certainly isn't going to get them there any faster.
  • FragKrag - Thursday, October 24, 2013 - link

    great performance, but I'll wait until I see some better thermals/noise from aftermarket coolers :p
  • Shark321 - Thursday, October 24, 2013 - link

    As with the Titan at the beginning, no alternate coolers will be available for the time being (according to ComputerBase). This means that even if the price is great, you will be stuck with a very noisy and hot card. The 780 Ti will outperform the 290X in 3 weeks; it remains to be seen how it will be priced (I guess $599).
  • The Von Matrices - Thursday, October 24, 2013 - link

    This is the next GTX 480 or HD 2900 XT. It provides great performance for the price, that is if you can put up with the heat and noise.
  • KaosFaction - Thursday, October 24, 2013 - link

    Work in Progress!!! Whhhhaaaattttt I want answers now!!
  • masterpine - Thursday, October 24, 2013 - link

    Good to see something from AMD challenging the GK110s; I still find it fairly remarkable that in the fast-moving world of GPUs it's taken 12 months for AMD to build something to compete. Hopefully this puts a swift end to the above-$600 prices at the single-GPU high end.

    More than a little concerned at the 95C target temp of these things. 80C is toasty enough already for the GTX 780; I actually had to point a small fan at the DVI cables coming out the back of my 780 SLI surround setup because the heat coming off them was causing dramas. Not sure I could cope with the noise of a 290X either.

    Anyhow, this is great for consumers. Hope to see some aftermarket coolers rein these things in a bit. If the end result is both AMD and Nvidia playing hardball at the $500 mark in a few weeks' time, we all win.
  • valkyrie743 - Thursday, October 24, 2013 - link

    HOLY TEMPS, BATMAN. It's the new GTX 480 in the temps department.
  • kallogan - Thursday, October 24, 2013 - link

    No overclocking headroom with stock cooler. That's for sure.
  • FuriousPop - Thursday, October 24, 2013 - link

    can we please see 2x in CF mode with eyefinity!? or am i asking for too much?

    also, Nvidia will always be better for the ~30% of you running at most 1080p. For the rest of us at 1440p, 1600p and beyond (Eyefinity), AMD will be, as stated in previous comments in this thread, "King of the Hill"....

    but none the less, some more testing in the CF+3x monitor department would be great to see how far this puppy really goes...

    i mean, seriously, what's the point of putting an 80-year-old man behind the wheel of the world's fastest car?!? please push the specs on gaming benchmarks (e.g. higher res)
  • Bob Todd - Thursday, October 24, 2013 - link

    You honestly think only ~30% of the gaming market is at 1080p and ~70% is at >= 1440p?
  • FuriousPop - Thursday, October 24, 2013 - link

    look at what these cards are capable of.... if you want to spend $500+ when you're running 1080p, then I'm sorry, but that is clearly overkill and you have too much money in your wallet!

    tbh I just guesstimated from all the people I have read on numerous forums. Could be more or less, really! But seriously, would you really want AMD for 1080p? Me personally, no, and I have CF 7970s - but then again I'm running 1600p or Eyefinity. So I'm in the very small minority...
  • TheJian - Friday, October 25, 2013 - link

    98.75% use 1920x1200 or less. Most above that have TWO cards or more. No single card is overkill for 1080p if you like all your settings maxed. Every site that tested 1600p shows games turned down, never mind the quotes saying 4K is a DUAL-CARD situation at best.

    Check the Steam survey for the data, or even Ian's 2160p articles here. He has it in his 2nd article on that topic. He mentions 4% at 1920x1200 and up but neglects to say 1920x1200 and DOWN is 98.75%. Even his numbers are generous, putting 1920x1200 in the high category; even then it's less than 4% total for everything above. You don't even fit your own claim. You have two cards that soundly beat a 290X. You're in the super duper puny tiny teeny weeny minority...LOL.
  • Shark321 - Thursday, October 24, 2013 - link

    I think 90% is at 1080p.
  • jljaynes - Thursday, October 24, 2013 - link

    The Steam hardware survey more or less agrees with your 90%.
  • Pantsu - Thursday, October 24, 2013 - link

    On the other hand this is an enthusiast class card, and in this class 1080p probably doesn't have that big of a share. People buy these cards for 27", 30" and eyefinity/surround. Sure there is 120 Hz 1080p and S3D, but I don't think it's a bigger category.
  • andrewaggb - Thursday, October 24, 2013 - link

    exactly. If you're going high end you don't settle for a single 1080p monitor.... and 90% of the people on Steam don't have 290Xs or Titans either.
  • Ryan Smith - Thursday, October 24, 2013 - link

    2x in CF mode with Eyefinity would be 4K. On the PQ321 it technically operates as a pair of displays, and it's a higher resolution than 3x1080P.
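
    (The pixel math behind that, for anyone curious:)

        # 4K vs. a triple-1080p Eyefinity setup, by raw pixel count
        uhd = 3840 * 2160               # 8,294,400 pixels
        triple_1080p = 3 * 1920 * 1080  # 6,220,800 pixels
        print(f"4K pushes {uhd / triple_1080p:.2f}x the pixels of 3x1080p")  # ~1.33x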
  • zeock9 - Thursday, October 24, 2013 - link

    This is simply incredible, almost hard-to-believe news: AMD set its price at a lower-than-expected $549.
    As much as I've hated AMD for their marketing schemes of late, I'll have to applaud them for bringing this solution in at such a price point.

    Now I can't imagine what kind of counter the guys over at the green camp will bring to bear for us, since the faster 290X can be had for $100 less than the slower 780.

    1. They won't want the 780 Ti to overtake the Titan in performance at $650, or it will make the latter completely obsolete barring very special circumstances with prosumers. That means they would have to introduce a Titan Ultra at $1000 if they want a faster-than-Titan 780 Ti at $650 - though I can't imagine even a Titan Ultra keeping pace with the 290X at 4K res due to its lower ROP count.

    2. Or they can simply match the 290X's price point of $549 with the upcoming 780 Ti and make it slower than the Titan.

    It will really be interesting to see.
  • Shark321 - Thursday, October 24, 2013 - link

    GK110 taped out 2 years ago. Nvidia will have no problem matching the 290X with the 780 Ti and beating it by 30% in Q1-Q2 2014. AMD is, as always, too late to the game.
  • zeock9 - Thursday, October 24, 2013 - link

    I'm sure they can.

    What I'm most interested in is how they go about doing that in relation to their now oddly placed top dog, the Titan, and at what price point.
  • ninjaquick - Thursday, October 24, 2013 - link

    So the 780 Ti is going to magically be 25% faster than the Titan? I'd like to see where they manage to squeeze that much extra number crunching out of an ancient process node.
  • Kevin G - Friday, October 25, 2013 - link

    The problem for nVidia is that they have to wait for a new process node to go much beyond Titan's performance. GK110 is 550 mm^2, which is on the edge of what TSMC can reliably manufacture. Without 20nm, nVidia can't add more functional units to really increase performance. (They could enable one more SMX cluster, but that'd only be good for a <10% increase at equal clocks.)

    TSMC 20nm production is set to come online in early 2014, which would mean end products start shipping in Q2 at the earliest. nVidia has wisely moved their large-die releases to the end of the generation, so we're likely to see GM104 chips before GM100.
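
    (The arithmetic behind that <10% figure, using the known GK110 configuration - 15 SMX clusters on the die, 14 enabled on Titan:)

        # GK110 has 15 SMX clusters; Titan ships with 14 enabled.
        smx_enabled, smx_full = 14, 15
        gain = (smx_full / smx_enabled - 1) * 100
        print(f"max gain from enabling the last SMX: {gain:.1f}%")  # ~7.1%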
  • DMCalloway - Friday, October 25, 2013 - link

    The 780 Ti is going to be GK110-based and should be out mid-November, so I'm not sure where you're getting Q1-Q2 2014 from. They'll push the clocks, enable another SMX cluster, and price it at or below current GTX 780 prices. This of course is a good thing, because it brings the green team's pricing/performance in line with where it should've been at launch.
  • JDG1980 - Thursday, October 24, 2013 - link

    My prediction: The 780 Ti will have the same graphics specs as the original Titan, but will still have crippled GPGPU performance to maintain market segmentation. The new Titan will be the full GK110 chip with no parts disabled.
  • AFQ - Thursday, October 24, 2013 - link

    Since when did AnandTech start posting half-assed reviews?
  • Ananke - Thursday, October 24, 2013 - link

    Honestly, anything priced above a PS4 is dead in the market. This is a really impressive card; however, they will likely sell a couple thousand units and that's it. The fight is going to be in the $180-200 range, and I haven't seen anything substantial there yet. NVidia is doing great with the GTX 760 - it sells a lot and the margins are great.
    P.S. I will most likely get an R9 280X
  • piroroadkill - Thursday, October 24, 2013 - link

    Uh, not sure what you're talking about.
    The R9 270X is basically a 7870 on steroids (faster clocks, faster RAM), and is widely available for $200.
    The only NVIDIA card at that price is the 660. The 660 is crap in comparison to the 270X...

    The cheapest 760 is $250. Outside of the range you yourself defined.
  • TheJian - Friday, October 25, 2013 - link

    NV sold 100,000 Titans at $1000 in a few days, with many more production runs in the last 8 months. AMD will likely sell at least a few hundred thousand; at the current price I'd expect a few 100K at minimum. This is $450 less than Titan and is pretty close (excluding heat, noise, watts). It will sell pretty well, and I think they should have priced it $100 higher. NV wouldn't have needed to lower the 780, and AMD would have made some real money for a change. Their pricing model seems to indicate they don't like making any money EVER.
    http://investing.money.msn.com/investments/financi...
    10-yr summary... $6.5 billion lost in 10 yrs. Think they need to charge a bit more for their products? I mean, even a monkey would see LOSS, LOSS, billion LOSS, LOSS, multibillion LOSS, LOSS, etc., and say... Hmmm... we need to raise pricing or lose money forever...LOL.
  • althaz - Thursday, October 24, 2013 - link

    I can see the first page of the article, but after that all I get on every page is "[work in progress]" instead of actual content.
  • xtrememorph - Thursday, October 24, 2013 - link

    With a temp of 94 Celsius and power draw of 405W under heavy load, doesn't this feel like a card that has been heavily OC'd rather than a technological innovation? Price-wise... I like it!!
  • xtrememorph - Thursday, October 24, 2013 - link

    ok, regarding the price again... with that high wattage, in the long run it will be more expensive to run, right?
  • TheJian - Friday, October 25, 2013 - link

    LOL... YEP, over 4+ yrs you'd be right, as the wasted electricity adds up over that kind of time, assuming you were an avid gamer before even buying this type of card.
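
    To put rough numbers on it - these are all assumptions for illustration (~75 W of extra draw vs. a GTX 780, $0.12/kWh, 3 hours of gaming a day), not figures from the review:

        # Back-of-the-envelope running-cost difference; every input
        # here is an assumed value, not a measurement.
        extra_watts, hours_per_day, usd_per_kwh = 75, 3, 0.12
        kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * usd_per_kwh
        print(f"~{kwh_per_year:.0f} kWh/yr, ~${cost_per_year:.0f}/yr, "
              f"~${cost_per_year * 4:.0f} over 4 years")

    On those assumptions it's only about $10 a year - real money over the life of the card, but not enough to erase a $100 price gap.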
  • hellcinder - Thursday, October 24, 2013 - link

    I'm confused. You're talking about $550 being a great bargain, yet you can get a 7990 for that price.... Are you guys forgetting to take your ADD pills? What's so special about this 290X vs a 7990 when we throw in price models?
  • Shark321 - Thursday, October 24, 2013 - link

    7990 sucks. It's as loud as the 290x, draws even more power, and you get all the Crossfire problems.
  • zeock9 - Thursday, October 24, 2013 - link

    1. The 7990 still has frame pacing issues at resolutions above 1440p.
    2. CF scaling is much better for the 290X since it is a single card.

    If you plan on gaming at 1080p with a single-GPU solution, you don't need these top-end cards to begin with; they are premium enthusiasts' cards for those who can and will get the most out of them.

    So yes, they have plenty more to offer that 7990 currently can't.
  • t41nt3d - Thursday, October 24, 2013 - link

    You're obviously someone who can't afford the best GPUs, because you absolutely do need them at 1080p.

    All this nonsense from people with inferior cards thinking they can run max quality at 1080p astounds me, considering that in the most demanding games - and as is being proven with next-gen games - the best single GPUs at the moment will just about average 60fps at 1080p.
  • zeock9 - Thursday, October 24, 2013 - link

    And for people like you, there are mid-tier CF/SLI solutions like 7870 CF or 660 Ti SLI.

    Again, these premium cards are for those who want nothing but the best and go all out with their setups, and to those performance-oriented minds, $550 indeed is a bargain. Period.
  • konondrum - Thursday, October 24, 2013 - link

    "To say it’s been a busy month for AMD is probably something of an understatement. After hosting a public GPU showcase in Hawaii just under a month ago, the company has already launched the first 5 cards in the Radeon 200 series – the 280X, 270X, 260X, 250, and 240 – and AMD isn’t done yet."

    I'm sorry, but this statement is just plain silly marketing talk. Yes, I know that with Mantle and TrueAudio AMD is definitely trying to move forward and innovate... but seriously? The "first 5 cards in the Radeon 200 series" are already 2 years old (or what, at least 6-9 months for Bonaire?). I come to AnandTech for in-depth reviews and discussions of new technology; this is just lazily repeating AMD's marketing material.

    And while the overall performance of this card is impressive, the power consumption, thermals and noise levels are completely unacceptable. The GTX 480 was rightfully mocked at release, and this is at least as bad. This is definitely not the direction we need to be headed. I would never pay $500+ for a video card, but if I did, I sure as hell would be willing to pay a premium for a card that doesn't sound like a jet engine taking off.
  • Shark321 - Thursday, October 24, 2013 - link

    "The GTX 480 was rightfully mocked at release, and this is at least as bad." it's not "as" bad. A German review site compared the 290x to the 480 and in Uber mode the 290x is WAY louder than the 480.
  • ninjaquick - Thursday, October 24, 2013 - link

    I don't see the problem with this, though... 'tis the nature of the beast called GCN. AMD could have thrown the extra TrueAudio hardware into the mix for the full range, but it really isn't needed. There are far too few games that would implement it, and taping out a full range of GCN 1.1 cards would have cost more than you'd think.

    This node has been around for far too long, and any gains from an actual redesign would be minimal at best. The 290X performs exactly on par with what you'd expect from its core configuration; think of it like Intel's Core 2 Quad - just two Core 2 Duos welded together - except with thousands of units. Obviously much simpler, but that is the overall principle of GCN.
  • Whitereflection - Thursday, October 24, 2013 - link

    To sum it up: the best price/performance on the high-end market, and an absolute steal compared to the Titan. The heatsink could use some work, but I am sure we are going to see custom-cooler versions within a month or two, which will hopefully drop the load temperature under 70.
  • Shark321 - Thursday, October 24, 2013 - link

    There will be no custom cooling solutions (according to ComputerBase), just the same as with the Titan. There is limited chip supply, so 100% of the cards will be manufactured by AMD.
  • ninjaquick - Thursday, October 24, 2013 - link

    So, if I go and install a custom-built water-cooler on my 290X, AMD will come and take it away?

    Besides, AMD has never locked manufacturers out of using their own cooler designs (at least I can't recall them ever doing that).

    There won't be any non-reference cards in the initial sales (for early adopters); however, give it a month after release and ASUS/HIS/Sapphire will all have custom coolers.
  • JDG1980 - Thursday, October 24, 2013 - link

    Whether or not that is AMD's plan currently (and I'd like to see an official English-language source saying that), I doubt it will hold up after the first wave of reviews, which all seem to like the card but dislike the cooler. Nvidia held the line on Titan because its cooler was actually good. This one sucks, and I don't see why AMD would deliberately handicap their cards by refusing to allow, for example, Asus or MSI to design better-than-reference versions.
  • taserbro - Thursday, October 24, 2013 - link

    To be fair, when I shop for high end I don't look at the price tag, and when I'm looking for value I don't look at high end.
    The best performance per dollar in each generation will still come from multi-card setups of mid-range cards overclocked on custom designs, and the folks who bought Titans still got to actually use their cards for many, many months with an incontestable advantage.
  • favro - Thursday, October 24, 2013 - link

    The modern day Radeon 9700....
  • JDG1980 - Thursday, October 24, 2013 - link

    AMD definitely deserves kudos for getting back to parity on the single-GPU front. That said, I find the power consumption and especially the temperature figures to be somewhat problematic - I'd be a bit concerned about longevity when the chip runs at 94 degrees under load. Perhaps the standard 290 will get TDP down to something closer to the 250W level and reduce temperatures accordingly. I think they pushed the architecture a bit hard for this one, but I can't really blame them since gamers usually care about performance above all else and since they are constrained by the process node. If only Hector Ruiz hadn't sold off the fabs...
    I wonder what their profit margins are on the 290X. Between this and the Mac Pro design win, I'm hoping that AMD can roll some money back into R&D to stay competitive in the future.
  • Dribble - Thursday, October 24, 2013 - link

    Great performance; I want a $500 Titan now. That said, the "uber" figures should really read "water cooled", because the temps and noise are just silly with the stock cooler - and they're pretty bad even with the normal 290X.
  • BHZ-GTR - Thursday, October 24, 2013 - link

    Great price! I'm buying the R9 290X.

    Stronger than the GTX 780 and cheaper.
  • Mondozai - Thursday, October 24, 2013 - link

    Hellcinder, you insult people with ADD by assuming they are as stupid as you. The 7990 is not the better card here, as others have already pointed out.
  • thevoiceofreason - Thursday, October 24, 2013 - link

    Thoroughly disappointed with the power consumption. Compared to GK110: the same process node, similar clocks, a 20% smaller die, a billion fewer transistors - and it somehow uses more power?
  • YazX_ - Thursday, October 24, 2013 - link

    "The bigger question is whether they’re willing to compete with AMD on price"

    Exactly. I don't care if it is Nvidia or AMD; the only thing that matters is price per performance. So, as a switcher between ATI/AMD and Nvidia (lately Nvidia), if they don't drop the prices on the GTX 770/780, then I'm going to sell my card and get AMD - it wins in every single aspect.
  • Kougar - Thursday, October 24, 2013 - link

    130 comments, but the entire article is missing after the first page? Only seeing " [work in progress] "
  • Kougar - Thursday, October 24, 2013 - link

    It's been half a day, and I've noticed the missing pages are being filled in at a rate of about one an hour.

    I respectfully suggest not posting at midnight if the review isn't finished being written. I can understand last-minute revisions and fixes being needed, as that's always the case with these, but posting a review with most of the pages completely empty and then spending the rest of the day writing them is just bad no matter how one looks at it.
  • AlderaaN - Thursday, October 24, 2013 - link

    Thank you for the review!

    Looking forward to the 'XDMA: Improving CrossFire' section that's currently a work in progress.
  • Sancus - Thursday, October 24, 2013 - link

    Given that this card has no overclocking headroom, unfortunately all Nvidia has to do is bin some OCed 780s and they'll beat it in every benchmark while still being 10dB quieter.

    That's probably exactly what the 780ti is.
  • colonelclaw - Thursday, October 24, 2013 - link

    Great review of what looks like a very strong product from AMD. And about bloody time, too. As much as I like Nvidia cards, their pricing has recently verged on outright taking-the-piss.

    I'm now looking forward to two things - price drops from Nvidia and 3rd party cooling solutions for the 290X that don't make as much noise.
  • katalm - Thursday, October 24, 2013 - link

    Aaaaaaaaannnddd They're gone.... "Out of Stock" sayeth the great and powerful Newegg...
  • Da W - Thursday, October 24, 2013 - link

    Jesus!
    I'm happy for AMD and for this great price/performance ratio, but i don't want to boil water with my GPU!
  • Da W - Thursday, October 24, 2013 - link

    Lol @ Nvidia fanbois.
    Didn't you read the review?
    "AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA’s prosumer-level flagship. Against NVIDIA’s cheaper and more gaming oriented GTX 780 that becomes an outright lead, with the 290X leading by an average of 9% and never falling behind the GTX 780."
    That's it. Bottom line. No point in shooting at AnandTech saying they suck. No point in denying, or in bringing other unreal "facts" to try to prove your point. Nvidia should drop their prices, period.
  • TheJian - Friday, October 25, 2013 - link

    Please see all the games I listed where 780 wins from other sites :) Guru3d, Techreport, Techspot, Techpowerup, etc.

    AMD fanboys... don't you read MORE than one review? I could list more, but they're all in the posts above (Ctrl-F "jian"). Not tied with Titan, and certainly not when you realize it's a Tesla for $1500 off already and now comes with 3 AAA games. You're only seeing AnandTech's game choices; see all the others I listed. The other sites would beg to differ on both cards.
  • maecenas - Thursday, October 24, 2013 - link

    This is great for consumers; we're seeing some really healthy competition here, and hopefully this forces NVIDIA to lower prices. That power, temperature and noise section is a little off-putting, though. It'll be interesting to see what Asus and the other manufacturers can do with this card.
  • Gunbuster - Thursday, October 24, 2013 - link

    No Titan SLI results? Hmmmmm
  • mczak - Thursday, October 24, 2013 - link

    Not every GCN 1.1 chip has TrueAudio, so it's not correct to lump the two together.
    Kabini is very much GCN 1.1 but has no TrueAudio (at least as far as I know...).
    Kaveri will also be GCN 1.1, though I guess it might feature TrueAudio.
  • Jeff Lee - Thursday, October 24, 2013 - link

    Thank you for including the 5870 in this analysis. I still use my 5870, and it's certainly getting a little long in the tooth. Great article; love to see competition.
  • SpaceRanger - Thursday, October 24, 2013 - link

    I'd love to read the whole review, but after Page 2 all I get is:

    [work in progress]

    Oh well..
  • robb213 - Thursday, October 24, 2013 - link

    Well that, or just delusional fanboys (there are so many types...).

    I'm more pro-Nvidia myself, but facts are facts, and I will acknowledge them and advise others based on them without bias.

    Seems like a lot of people around here are acting like it's been years and years since AMD last had a lead over Nvidia, and that this is the celebration of the ages, when it's only been the length of one revision/new architecture. I mean, this is the same pattern as usual for every cycle. Yes, Nvidia is at fault for their pricing too, but it's not like AMD hasn't done the same previously, nor Nvidia in the past.

    The predictions about staying on 28nm have been correct, and I believe people will eventually label the 290X as the new 480 (everyone railed on the 480, then it eventually caught on). Eventually Maxwell will come, and I bet those cards will be more powerful too. Then along comes the next AMD lineup, more powerful still, and so the pattern restarts.
  • Kevin G - Thursday, October 24, 2013 - link

    Hrm, this article seems to be incomplete as I post this. I see three pages of [work in progress] and have noticed a few grammar mistakes. The pages with graphs are just raw data with no follow-up on the observations.

    From the 7th page:
    "The closest we came to a problem was with the intro videos for Total War: Rome 2, which have black horizontal lines due to the cards trying to the cards trying to AFR render said video at a higher framerate than it played at."

    The R9 290X Uber configuration isn't explained anywhere I can find. I presume it is a high-RPM fan setting for better boost performance, or an outright overclock to continually hit 1 GHz.

    Several of the graphs also have an asterisk by the R9 280X results and I have no idea what that is supposed to indicate.
  • robb213 - Thursday, October 24, 2013 - link

    Now, I haven't played Rome 2 since maybe a few days after its launch. Doesn't it still have a slew of optimization problems, among other graphical problems?

    Just wondering why they used Rome 2, a game still being heavily patched AFAIK, instead of Shogun 2, which is still demanding and runs great.
  • Ryan Smith - Friday, October 25, 2013 - link

    Patch 3, which was the patch released right before the 280X launch, settled the major GPU problems. I can't speak for the turn sequences, which are still CPU bottlenecked, but as far as GPU reviews go it's fine to use.
  • TheJian - Friday, October 25, 2013 - link

    Total War: Rome 2 has had at least 4 patches, with the 4th on 10/13. There might be a 5th by now.
    The latest: v1.4 = v1.4.0 = v1.00.7573 = build #7573.461942.
    But the site shows:
    https://help.sega.com/entries/22535104-Total-War-R...
    with a post date of the 18th, so that's new: 5 patches to date.
  • Arbie - Thursday, October 24, 2013 - link


    Thanks for the Crysis Warhead benchmark. The game still leads in some ways and I still enjoy playing it. But more relevant here is that we can see the progression of GPU power over a long time span. Given the slower pace of PC graphics now, some of us haven't replaced our cards in quite a while. With this benchmark I can compare apples-to-apples with anything I own.

    The 290X looks great fundamentally, but 93 deg C at full load is too hot and will lead to a short card lifetime. This will probably be addressed by vendor cooling designs, and I'd wait for that.
  • polaco - Thursday, October 24, 2013 - link

    This reminds me of the ATI 4850: at factory settings it reached almost 100°C. I tuned the BIOS to adjust the fan throttling hysteresis and max RPM, or something like that, and that brought my card down to 80°C. So I wouldn't be surprised if just tuning this card's settings a bit gets you lower temps, and PowerTune allows much cooler things than were available in the 4850's day. Vendor cooling solutions will probably do better, as you stated. :)
  • TheJian - Friday, October 25, 2013 - link

    Warhead is played by ZERO people on servers. How do you play alone? The servers I checked were empty ages ago, and still were just a month ago. They need to be using the games with the most active users, or the highest sales (which might include games whose active users we can't count, since single-player games aren't server-based, etc.). Warhead is played by nobody, so it shows nothing. It's the same as firing up Doom 1. Nobody cares.
  • NeonFlak - Thursday, October 24, 2013 - link

    So the review is released but unfinished? I'm getting work in progress for a couple pages.
  • WeaselITB - Thursday, October 24, 2013 - link

    This is a really poor showing on AnandTech's part. I don't come to AT because of the speed that your reviews are posted, I come because of the quality and depth of the commentary that makes up the reviews. Charts are meaningless without the commentary surrounding them.

    Please, in the future, throw up a one-page benchmark chart with a paragraph stating "preliminary results" or something, and come back with an actual full-fledged review. This "[work in progress]" crap is just that -- crap -- especially now that I'm reading this at 10:20am Central and still seeing a half-article. I know you guys are capable of better.
  • lamovnik - Thursday, October 24, 2013 - link

    Here is a 290X fan noise test: http://www.youtube.com/watch?v=T5MbOGoEMDY
  • Notmyusualid - Thursday, October 24, 2013 - link

    P1SSED MYSELF laughing!

    And I only came by to smile at the fan boi-ish comments.

    Well done.
  • Kutark - Thursday, October 24, 2013 - link

    Honestly, no. The fanboyism is rampant in these comments. And I don't know what they're smoking about it being faster than a Titan; in most cases it's barely equaling it and is usually slightly slower. Now, is that impressive for a card that's $600? Yes, absolutely. Am I impressed? No, not in the slightest. Titan was released in FEBRUARY. That was 8 months ago. And though I am pissed at Nvidia for making it so expensive, and subsequently the 780s/770s so expensive, the ONLY reason they were able to pull that kind of crap is because AMD was nowhere to be found. AMD literally couldn't compete until almost a year later? I'm sorry, I'm just not impressed. I am glad they're finally stepping up their game, because one company having no real competition is never a good thing for the consumer.
  • Notmyusualid - Friday, October 25, 2013 - link

    I have to admit the lower power consumption of the Titan would draw me in. Electricity costs are a joke in the UK now.
  • Sancus - Thursday, October 24, 2013 - link

    haha, awesome.
  • hoboville - Thursday, October 24, 2013 - link

    Great review Ryan, always in depth and insightful.

    One of the things I have noticed over the years is that, depending on the card (core count / TDP), the reference blower is rarely ever better than the aftermarket coolers made by ASUS, EVGA, etc. (can anyone else confirm/deny?). That is to say, they seem to provide both less cooling and more noise. My concern is that the 290X is hitting 90+ degrees C; some people with warmer residences may find themselves having trouble. Hopefully AMD will let their partners start making custom coolers by Black Friday!

    With regards to price/performance, this reminds me of how the EVGA 780 Superclocked ACX (I think) blows away the Titan on value by being nearly as fast and so much cheaper. I've heard rumors that AMD may be extending their Never Settle Forever bundle to Hawaii; have you heard anything new about this? Nonetheless, the value of an OC'd 780 would seem to be the same if the buyer were to eBay off those game keys. Hopefully the 780 Ti will drop the 780 down to a reasonable price with a game bundle, again giving us consumers more for our money.

    Two quick questions: How well does the 290X do for Litecoin / BTC? Will the XFX 290R include Battlefield 4?
  • FriendlyUser - Thursday, October 24, 2013 - link

    Great price and performance. Thanks for the timely review. I think this is a halo product, whose main role is to humiliate the Titan. The most interesting deal should be the 290 with a non-reference cooler.
  • g00ey - Thursday, October 24, 2013 - link

    Well it certainly killed my boxers, that's for sure...
  • Thomas1016 - Thursday, October 24, 2013 - link

    I bought an R9 290X this morning from Newegg. The bundled version is $579.00, the non-bundled is $549.00, so the prices that were quoted were in fact accurate.

    The one thing that does concern me a little is having only one fan on the card. But I also don't plan on tweaking it much, if at all.
  • Shadowmaster625 - Thursday, October 24, 2013 - link

    This GPU has such good memory bandwidth... so why don't they just shove 3 Steamroller modules in there, slap on a southbridge, and make it into an elite gaming PC motherboard? It wouldn't cost them much more than $100 extra to make this video card into a complete PC motherboard. The transistor count would only increase by 25%, so power would probably stay under 400W.
  • Kevin G - Friday, October 25, 2013 - link

    3 Steamroller modules would push the die size beyond what is manufacturable. Three modules would add ~90 mm^2 to the 428 mm^2 die before any support logic, and before the southbridge was added. When all is said and done, the die size would likely be north of 600 mm^2.

    Power draw under full load would easily exceed 450W. The only way to move that much heat would be liquid cooling. Including the south bridge would add too many IO pins on top of an already large IO configuration.
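    A rough back-of-envelope makes the point; the per-module and uncore figures below are guesses for illustration, not measured numbers:

        /* Hypothetical Hawaii + 3 Steamroller modules die budget (C sketch).
           Module and uncore areas are assumptions, not measured figures. */
        #include <stdio.h>

        int main(void) {
            double hawaii  = 428.0; /* Hawaii die size in mm^2, from the review */
            double modules = 90.0;  /* ~3 x 30 mm^2 per Steamroller module, a guess */
            double uncore  = 90.0;  /* guess: L2, memory/IO glue for the CPU side */
            double total   = hawaii + modules + uncore;
            printf("estimated die: ~%.0f mm^2\n", total); /* ~608 mm^2, larger than GK110's 561 */
            return 0;
        }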
  • labotsirc - Thursday, October 24, 2013 - link

    Does anyone know if the 290X will support OpenCL 2.0, with the new dynamic parallelism feature?
    If yes, then I could buy 2 of these instead of one Titan, which does support this feature (via CUDA's dynamic parallelism).
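    For context, here is a minimal kernel-side sketch of OpenCL 2.0 device-side enqueue (the OpenCL analog of CUDA's dynamic parallelism). The kernel names are made up, and whether Hawaii's drivers will expose this is exactly the open question:

        /* OpenCL 2.0 device-side enqueue: a parent kernel launches a child
           kernel without a host round trip. Illustrative names only. */
        kernel void child(global float *data) {
            data[get_global_id(0)] *= 2.0f;
        }

        kernel void parent(global float *data, int n) {
            if (get_global_id(0) == 0) {
                enqueue_kernel(get_default_queue(),
                               CLK_ENQUEUE_FLAGS_NO_WAIT,
                               ndrange_1D(n),
                               ^{ child(data); });  /* the block captures 'data' */
            }
        }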
  • kyuu - Thursday, October 24, 2013 - link

    The conclusion here is that this is a great card at a great price, but you should wait for custom cooling solutions since the reference cooler is inadequate. Unless you are going to build your own cooling solution or simply MUST have the fastest single-GPU card right now.
  • Da W - Thursday, October 24, 2013 - link

    The reference cooler is noisy as hell, but it's a blower. At least it doesn't dump all the heat inside your case and let your other case fans handle it. It depends what you're looking for.

    It still makes about as much noise as the 5870 did, and that was a commercial success.
  • slickr - Thursday, October 24, 2013 - link

    I didn't think AMD would deliver; in fact, from seeing some initial benchmarks, I thought AMD had taken over 6 months just to deliver a graphics card slower than Titan, and that even with a cheap price it wouldn't be enough. Boy, was I wrong.

    This card beats Titan in so many games and at so many resolutions, and it's almost $500 cheaper. It's also $100 cheaper than the GTX 780 and anywhere from 5% to 20% faster than the 780. That is just amazing.

    Hopefully this trickles down to the mid-range cards, and we'll see cards like the 280X go for less than $250.

    I mean, unless Nvidia repositions the Titan at $550 as well, I don't think it will sell very much at all. In the 290X you have a better-performing card at almost half the price. Nvidia has its work cut out for it, and I sure hope the 780 Ti really brings the performance and the price as well.
  • eanazag - Thursday, October 24, 2013 - link

    I'm sporting an Nvidia GPU in my rig. I don't see any option for Nvidia other than to cut the cost of the Titan, 770, and 780. I can't expect the 780 Ti's performance to trump the Titan's. I will say that there is power and cooling room for Nvidia to ratchet things up and make this interesting. This is a bold move on AMD's part and does wonders for consumers. Judging by some of the other comments: current news does not kill off either brand, by the way. PC gaming and desktops are not dead.....
  • kwrzesien - Thursday, October 24, 2013 - link

    Ryan, can we get a pipeline article or retweet this article when it is complete? Thanks!
  • spiked_mistborn - Thursday, October 24, 2013 - link

    Nice job AMD! Competition is good! Also, feel free to use my GSYNC idea about putting a frame buffer in the display and letting the video card control the refresh rate. This post is from March 2013. Apparently adding a dash to make it G-Sync makes it different somehow. http://techreport.com/discussion/24553/inside-the-...
  • Sorodsam - Thursday, October 24, 2013 - link

    I'm surprised no one's commented on the new "AMD Center", or this troubling text:

    "You've landed on the AMD Portal on AnandTech. This section is sponsored by AMD."

    There's a big difference between a site that runs an occasional AMD ad and a site with an entire section that's expressly "sponsored by AMD", especially considering AnandTech's (former?) guiding principle that product reviewers shouldn't be aware of who exactly is buying advertising and when. They can hardly be unaware of it now.
  • MrMaestro - Thursday, October 24, 2013 - link

    The reason people aren't commenting on it is because it's already been commented on. Take a look at the article announcing AMD Centre. That is the appropriate place for such comments.
  • anevmann - Thursday, October 24, 2013 - link

    Ryan, will this card require PCIe 3.0 for gaming?

    Can you do a test with and without PCIe 3.0? I really want this card, but I wanna know in advance if I have to upgrade my system.

    A
  • Ryan Smith - Monday, October 28, 2013 - link

    I can't promise when it will be done (given the insanity of our schedule over the next 3 weeks), but at some point we will follow this up with a reprise article, that among other things will cover PCIe bandwidth vs. Crossfire scaling, CF testing in quiet mode, and some noise equalization to see what fan levels it would take to match a GTX 780 and what the resulting performance would look like.

    Anyhow, for a single card setup none of my data thus far supports PCIe 3.0 being a requirement. We're not to the point yet where PCIe 2.0 x16 is a general gaming bottleneck.
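    For a sense of scale, the raw per-direction numbers from the PCIe specs work out as follows (spec figures, not measurements from this review):

        /* Per-direction PCIe bandwidth from the spec figures (C sketch). */
        #include <stdio.h>

        int main(void) {
            /* PCIe 2.0: 5 GT/s per lane, 8b/10b encoding */
            double pcie2_x16 = 16 * 5.0 * (8.0 / 10.0) / 8.0;    /* ~8.0 GB/s */
            /* PCIe 3.0: 8 GT/s per lane, 128b/130b encoding */
            double pcie3_x16 = 16 * 8.0 * (128.0 / 130.0) / 8.0; /* ~15.8 GB/s */
            printf("PCIe 2.0 x16: %.1f GB/s\n", pcie2_x16);
            printf("PCIe 3.0 x16: %.1f GB/s\n", pcie3_x16);
            return 0;
        }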
  • Hung_Low - Thursday, October 24, 2013 - link

    Is the 290X really the absolute maxed-out version of Hawaii, or did it also leave a lot of room the way Titan does with GK110?
    Perhaps the GPUs used in the 290X are the Hawaii chips with imperfections, with the highest-quality chips saved for an extreme-edition 290X/290X+?
  • Ytterbium - Thursday, October 24, 2013 - link

    The 290 is the binned version; the 290X is the top part.
  • Atiom - Thursday, October 24, 2013 - link

    Time to cut the Titan price to less than half!!!!!!
  • TheJian - Friday, October 25, 2013 - link

    Sure, as soon as the 290X gets the compute abilities in pro apps that Titan has (the Tesla side of Titan). 3 AAA games, less heat, less noise, fewer watts, PhysX, CUDA, pro-app performance, and 6GB of memory. All that adds up to HIGHER PRICING than the 290X. Tesla is $2500.
  • MADDER1 - Thursday, October 24, 2013 - link

    Awesome! We need this for CPUs too!
  • Ytterbium - Thursday, October 24, 2013 - link

    I'd prefer the review was finished before posting.
  • Hxx - Thursday, October 24, 2013 - link

    Why do people compare this to the Titan? Nobody buys a Titan unless they absolutely must have the fastest Nvidia card. It's like buying a Lambo: how many other cars can you buy with less money that are just as fast, if not faster? It's bragging rights for geeks, or in the Lambo's case, bragging rights for rich old folks.
  • Gigaplex - Thursday, October 24, 2013 - link

    What does that have to do with the complaint that the review shouldn't be posted before it has been finished?
  • aTaoZ - Thursday, October 24, 2013 - link

    Looks like the 290X's power regulation is really working: it's able to run at a specific temperature. Because of it, many games will run at core frequencies lower than 1GHz. Once the non-reference cooling solutions arrive, there will be even more performance coming out of these cards.

    In LegitReviews' article, they dropped the target temperature to 65°C, and performance dropped by 7-20% depending on the application.
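    The mechanism is easy to picture as a feedback loop. A toy sketch (my own illustration, not AMD's actual PowerTune algorithm) of why a lower temperature target translates directly into lower sustained clocks:

        /* Toy temperature-target clock governor; an illustration only, not
           AMD's actual PowerTune algorithm. A 65C target sheds clocks sooner,
           which is where the 7-20% performance drop comes from. */
        #include <stdio.h>

        #define CLK_MAX_MHZ 1000.0  /* 290X "up to" clock */
        #define CLK_MIN_MHZ 700.0   /* illustrative floor, not an AMD spec */

        double step_clock(double clk_mhz, double temp_c, double target_c) {
            double gain = 2.0;                     /* MHz shed per degree over target */
            clk_mhz += gain * (target_c - temp_c); /* back off when hot, recover when cool */
            if (clk_mhz > CLK_MAX_MHZ) clk_mhz = CLK_MAX_MHZ;
            if (clk_mhz < CLK_MIN_MHZ) clk_mhz = CLK_MIN_MHZ;
            return clk_mhz;
        }

        int main(void) {
            double clk = CLK_MAX_MHZ;
            /* pretend the GPU settles around 90C under load */
            for (int i = 0; i < 10; i++)
                clk = step_clock(clk, 90.0, 65.0); /* 65C target: clocks fall fast */
            printf("settled clock with 65C target: %.0f MHz\n", clk);
            return 0;
        }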
  • HisDivineOrder - Friday, October 25, 2013 - link

    I suspect the card was actually built to run at 80-85, but when they saw the performance wouldn't be hitting Titan-killer levels, they boosted it to 95 and said, "Yeah, it'll hold."

    I think some custom coolers, and especially water cooling, should make this card fly, but the default cooler is one of those tragedies that just make you shake your head.
  • Owls - Thursday, October 24, 2013 - link

    [comment in progress]
  • OverclockedCeleron - Thursday, October 24, 2013 - link

    That made my day!
  • pattycake0147 - Friday, October 25, 2013 - link

    Yes! Comment of the day.
  • TheinsanegamerN - Friday, October 25, 2013 - link

    [reply in progress]
  • OverclockedCeleron - Thursday, October 24, 2013 - link

    Well done AMD, keep up the good work! Now it's the CPU's turn ;).
  • aTaoZ - Thursday, October 24, 2013 - link

    Over on the Overclockers UK forum, Gibbo is hinting at the performance of the non-X version in his 290X overclocking review. It's highly possible the R9 290 will match the GTX 780 in performance at an even lower price.
  • steelmilkjug - Friday, October 25, 2013 - link

    It's good to see Nvidia and AMD jockeying for the high end. Competition always helps keep prices realistic. Every time I buy GPUs I go through the same mental struggle: Nvidia with better power/temp efficiency (roughly stated, I know), AMD with slightly better price/performance ratios, sometimes significantly better. In the end, it always comes down to one thing for me: drivers. Nvidia just seems to be more proactive in supporting their products; new drivers seem to come more often, and I usually feel the effect in games. Fewer dual-card issues, and seemingly broader support of games in general. Nvidia is specifically a graphics hardware company. AMD is not; they are a computer hardware company with a graphics department, and it shows. I'm an electrical engineer and I've seen how hard it is to get a job as an Nvidia EE. It's a world-renowned job, and they only pick the best in the world. I'm not sure AMD is quite as stringent.
  • manicmonday - Friday, October 25, 2013 - link

    So you think second tier engineers at AMD created a chip 25% smaller with better performance?
  • SunLord - Friday, October 25, 2013 - link

    Did Ryan quit or pass away a third of the way through writing this review? Otherwise this is pathetic. So much for being a top-tier review site.
  • piroroadkill - Friday, October 25, 2013 - link

    Yep. I gave them a while, and some pages have now been populated, but there are still too many empty ones. This is pretty poor.

    These days it seems like Apple gets all the coverage. Well, I couldn't really give a rat's ass about massively long Apple articles. After all, Apple just wraps things up (such as GCN, Intel CPUs) in shiny boxes.

    I think most of your readership is probably interested in the underlying tech, too.
  • Da W - Friday, October 25, 2013 - link

    Man, what do you do for a living? Quit bitching about other people's work; if you're not happy, go somewhere else and STFU.
  • pattycake0147 - Friday, October 25, 2013 - link

    Nope, piroroadkill is spot-on in speaking his opinion. Anand continually asks for reader feedback, and he's doing just that.

    The rate at which this article is being finished is piss-poor. Ryan said it would be finished in the morning on the day of posting, which meant within the next 12 hours or so. The main explanatory pages took about 24 hours to be completely fleshed out, and the graphs still don't have any text explaining the trends in performance. I actually value the author's commentary more than the graphs, and looking through a review which is still incomplete over 36 hours after posting is well below AnandTech's standards.

    I hate to bring it up, because I like reading the vast majority of content on AnandTech regardless of market or company, but I firmly believe piroroadkill is correct in saying that a new Apple product would have had a complete and thorough review shortly after the NDA was lifted.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He had three R9 290X's in one system. Crossing his chest, he took out his third and slid its PCIe into the test bed. Immediately, the room began to darken and a voice spoketh, "You dare install THREE R9 290X's into one system! You hath incurred the wrath of The Fixer, demon lord of the 9.5th circle of hell! Prepare for the doooooom!"

    Then the system erupted into flames, exploding outward with rapid napalm-like flames that sent him screaming out the door. Within seconds, the entire building was burning and within minutes there was nothing left but ashes and regrets.

    Ever since, he has been locked away in a mental health ward, scribbling on a notepad, "Crossfire," over and over. Some say on the darkest nights, he even dares to whisper a single phrase, "Three-way."
  • B3an - Saturday, October 26, 2013 - link

    LOL!
  • Ryan Smith - Monday, October 28, 2013 - link

    Hahaha!

    Thanks man, I needed that.
  • yacoub35 - Friday, October 25, 2013 - link

    It's a bit silly to list the 7970 as $549 when the truth is they can be had for as little as $200. And they're easily the best deal for a GPU these days.
  • yacoub35 - Friday, October 25, 2013 - link

    To clarify: a marketing piece lists "launch prices"; a proper review compares real-world prices.
  • yacoub35 - Friday, October 25, 2013 - link

    So double the ROPs on a new architecture and an extra GB of faster GDDR5 result in maybe 10-20 more frames than a 7970GE at the resolution most of us run (1920x). Somehow I don't think that's worth twice the price, let alone the full $549 for someone who already owns a 7970.
  • Jumangi - Friday, October 25, 2013 - link

    Only a clueless noob with too much money in their pocket would buy a 290x if they are running at 1920 resolution.
  • kyuu - Friday, October 25, 2013 - link

    If you're just looking to game at high details on a single 1080p monitor, then no, the 290X isn't interesting as you're spending a lot of money for power you don't need. If you're gaming at 1440p or higher and/or using Eyefinity, then it's a different story.
  • Hulk - Friday, October 25, 2013 - link

    I just wanted to thank Ryan for getting up the charts before the rest of the article. We could have either waited for the entire article or gotten the performance charts as soon as you completed them and then the text later. Thanks for thinking of us and not holding back the performance data until the article was finished. It's exactly that type of thinking that makes this site the best. I can imagine you starting to work on the text and thinking, "You know what? I have the performance data so why don't I post it instead of holding it back until the entire article is finished."

    Well done as usual.
  • kyuu - Friday, October 25, 2013 - link

    I agree. Ignore all the complainers; it's great to have the benchmark data available without having to wait for the rest of the article to be complete. Those who don't want anything until it's 100% done can always just come back later.
  • AnotherGuy - Friday, October 25, 2013 - link

    What a beast
  • zodiacsoulmate - Friday, October 25, 2013 - link

    Dunno, all the GeForce cards look like sh!t in this review, and the 280X/7970/290X look like heaven's gods...
    but my 6990 and 7970 never really made me happier than my GTX 670 system...
    well, whatever
  • TheJian - Friday, October 25, 2013 - link

    While we have a great card here, it appears it doesn't always beat the 780, and it gets toppled consistently by Titan in OTHER games:
    http://www.techpowerup.com/reviews/AMD/R9_290X/24....
    World of Warcraft (spanked at all resolutions by both the 780/Titan, even at 5760x1080)
    Splinter Cell: Blacklist (smacked by even the 780, and of course Titan)
    StarCraft 2 (by both the 780/Titan, even at 5760x1080)
    Titan adds more victories (the 780 too, depending on res; remember 98.75% of us run 1920x1200 or less):
    Skyrim (all res, Titan victory at TechPowerUp; oops, the 780 also wins at all res but 1600p in Skyrim)
    Assassin's Creed 3, CoD: Black Ops 2, Diablo 3, Far Cry 3 (though uber ekes out a victory at 1600p, the regular mode gets beaten handily in FC3; HardOCP shows the 780 & Titan winning apples-to-apples on min & avg, and TechSpot also shows a loss to the 780/Titan in FC3)
    HardOCP & Guru3D both show BioShock Infinite, Crysis 3 (Titan 10% faster at all res) and BF3 winning on Titan. HardOCP also shows apples-to-apples Tomb Raider and Metro: LL wins for Titan.
    http://www.guru3d.com/articles_pages/radeon_r9_290...
    http://hardocp.com/article/2013/10/23/amd_radeon_r...
    http://techreport.com/review/25509/amd-radeon-r9-2...
    Guild Wars 2 at TechReport is a big win for both the 780/Titan (both over 12%).
    Also, TweakTown shows a Lost Planet 2 loss to the lowly 770, let alone the 780/Titan.
    I guess there's a reason why most of these quite popular games are NOT tested here :)

    So while it's a great card, again it's not overwhelming, and it's quite the loser depending on what you play. In UBER mode, as compared above, I wouldn't even want the card (a heat, noise, and watts loser). Drop it to regular mode and there are far more losses than I'm listing above to the 780, and especially to Titan. Considering the overclocks from all sites, you are pretty much getting almost everything in uber mode already (sites have hit 6-12% max from OCing; I think that means they're shipping uber as OC'd cards, with not much more in the tank). So NV just needs to kick out the 780 Ti, which should knock out almost all of the 290X uber wins and make the wins they already have even bigger, thus keeping a $620-650 price. Also drop the 780 to $500-550 (it does have great games now, 3 AAA titles worth $100 or more).

    Looking at 1080p here (remember, 98.75% of us play at 1920x1200 or lower), the 780 does pretty well already, even at AnandTech. Most people playing above this have 2 cards or more. While you can jockey your settings around all day per game to play above 1920x1200, you won't be MAXING much out at 1600p with any single card. It's just not going to happen until maybe 20nm (big maybe). Most of us don't have large monitors YET, or 1600p+, and I'm guessing all new purchases will be looking at G-Sync monitors now anyway. Very few of us will fork over $550 and ALSO have the cash for a new 1440p/1600p monitor. So a good portion of us would buy this card and still be at 1920x1200 or lower until we have another $550-700 for a good 1440p/1600p monitor (and I say $550+ since I don't believe in these no-name Korean panels; the cheapest 1440p Newegg itself sells is a $550 Acer). Do you have $1100 in your pocket? Making that kind of monitor investment right now, I'd wait out G-Sync no matter what. If they get it AMD-compatible before 20nm Maxwell hits, maybe AMD gets my money for a card. Otherwise G-Sync wins hands down for NV for me. I have no interest in anything but a G-Sync monitor at this point, and a card that works with it.

    Guru3D OC: 1075/6000
    Hardwarecanucks OC: 1115/5684
    Hardwareheaven OC: 1100/5500
    PCPerspective OC: 1100/5000
    TweakTown OC: 1065/5252
    TechpowerUp OC: 1125/6300
    Techspot OC: 1090/6400
    Bit-tech OC: 1120/5600
    I left off direct links to these sites regarding OCing, but I'm sure you can all figure out how to get there (don't want the post flagged as spam with too many links).
  • b3nzint - Friday, October 25, 2013 - link

    "So NV just needs to kick up 780TI which should knock out almost all 290x uber wins, and just make the wins they already have even worse, thus keeping $620-650 price. Also drop 780 to $500-550"

    We're talking about a Titan killer here.
    Titan vs. the Titan killer, at 3840 res, at high or ultra:

    coh2 - 30%
    metro - 30%
    bio - (10%) but win 3% at medium
    bf3 - 15%
    crysis 3 - tie
    crysis - 10%
    totalwar - tie
    hitman - 20%
    grid 2 - 10%+

    2816 SPs, 64 ROPs, 176 TMUs, 4GB at 512-bit. The 780 or 780 Ti won't stand a chance. This is a Titan killer, dude, wake up. Only then are we talking CF, SLI, and 5760 res. But for a single card I'd go for this Titan killer. Good luck with G-Sync; I'm not giving up my Dell U2711 yet.
  • just4U - Friday, October 25, 2013 - link

    Well.. you have to put this in context. Those guys gave it their Editor's Choice award and an overall score of 9.3. They summed it up with this:

    "The real highlight of AMD's R9 290X is certainly the price. What has been rumored to cost around $700 (and got people excited at that price), will actually retail for $549! $549 is an amazing price for this card, making it the price/performance king in the high-end segment. NVIDIA's $1000 GTX Titan is completely irrelevant now, even the GTX 780 with its $625 price will be a tough sale."
  • theuglyman0war - Thursday, October 31, 2013 - link

    The flagship GTX *80 MSRP has been $499 for every upgrade I have ever made. After waiting out GK104 for the GK110 chip, only to get the insult of the 780's pricing, I will be holding off to see if everything returns to normal with Maxwell. Kind of depressing when others are excited about $550. As far as I know the market still dictates pricing, and my price is $499, provided AMD keeps offering up decent competition to keep the market healthy and respectful.
  • ToTTenTranz - Friday, October 25, 2013 - link

    How isn't this viral?
  • nader21007 - Friday, October 25, 2013 - link

    The Radeon R9 290X received Tom's Hardware's Elite award, the first time a graphics card has received this honor. Nvidia: why?
    Wiseman: because it outperformed a card that is nearly double its price (your Titan).
    Do you hear me, Nvidia? Please don't gouge consumers again.
    Viva AMD.
  • doggghouse - Friday, October 25, 2013 - link

    I don't think the Titan was ever considered a gamer's card... it was more of a "prosumer" card for compute. But it was also marketed to people who build EXTREME! machines for maximum OC scores. The 780 was basically the gamer's card... it has 90-95% of the Titan's gaming capability for only $650 (still expensive).

    If you want to compare the R9 290X to the Titan, I would look at the compute benchmarks. And in that, it seems to be an apples-to-oranges comparison... AMD and NVIDIA seem to trade blows depending on the type of compute.

    Compared to the 780, the 290X pretty much beats it hands down in performance. If I hadn't already purchased a 780 last month ($595, yay), I would consider the 290X... though I'd definitely wait for 3rd-party cards with better heat solutions. A stock card on the "Uber" setting is simply way too hot, and too loud!
  • rituraj - Saturday, October 26, 2013 - link

    So that "PROSUMER" cookie that nv tried to sell was just a grandma's cookie and it's proven now. I will buy the next maxwell flagship AND AT $600. (Lol.. I have to pay 20% more here in India)
  • dazaj - Friday, October 25, 2013 - link

    Why is there no Titan SLI in all these benches when there is 290X CF?
  • Ryan Smith - Friday, October 25, 2013 - link

    Because we only have 1 Titan.
  • just4U - Friday, October 25, 2013 - link

    They likely didn't have two on hand..
  • drinkperrier - Friday, October 25, 2013 - link

    Just one question:
    is it possible to buy this video card under the AMD brand, rather than ASUS, Sapphire, etc.?
  • Ryan Smith - Friday, October 25, 2013 - link

    Unfortunately no. AMD does not directly sell Radeon cards to consumers. They only directly sell Radeons to OEMs, while FirePros are directly sold to everyone.
  • rituraj - Saturday, October 26, 2013 - link

    The best thing I can get out of this release is that Nvidia is not going to be able to charge $1000 for its next flagship or ultra-flagship, because it's been proven that no matter what crazy performance it delivers, AMD (and therefore Nvidia too) can release it in the $500-600 range. Even if they do charge that much, people will just wait a few months for AMD to release an equally powerful card at $600 or $500. Good move, AMD...
  • SunLord - Saturday, October 26, 2013 - link

    It's Saturday, and I'm still not sure the review is done; or at least AnandTech has dropped explaining and commenting on the test results.
  • polaco - Saturday, October 26, 2013 - link

    Origin PC will lose lots of possible sales by having dropped AMD. Nvidia's monopolistic friends pay the consequences...
  • SolMiester - Monday, October 28, 2013 - link

    There is no way Origin PC or any other OEM would want to put this reference card in their systems... I can't wait to see the RMA stats on this card... AMD blew the card after making such a great GPU... how many times will they do this?
  • polaco - Saturday, October 26, 2013 - link

    This is an interesting article too for gamers that are looking for 4K:
    http://www.legitreviews.com/amd-radeon-r9-290x-vs-...
  • dwade123 - Saturday, October 26, 2013 - link

    All this shows is that GTX Titan is one efficient card. Better than both GTX 780 and AMD's offerings.
  • ehpexs - Saturday, October 26, 2013 - link

    Looks like AMD is a gen away from offering a CrossFire solution that can max out my triple Crossovers @ 7680x1440.
  • Th-z - Sunday, October 27, 2013 - link

    It seems AMD is pushing the 290X really hard, to the point of going beyond its efficiency curve to try to beat a larger chip from Nvidia with almost 1B more transistors. I wonder what the 290X would look like with fewer ROPs and more die area dedicated to the shader core, or if AMD went all in and designed a chip as large as Nvidia's top parts.
  • Ytterbium - Sunday, October 27, 2013 - link

    I'm sad they've gone to 1/8-rate FP64; the 280X is a better compute card!
  • Animalosity - Sunday, October 27, 2013 - link

    Why can't people just accept that AMD has beaten Nvidia in every shape and form this time? Yeah, it's always been back and forth, and it will be again in the future, but for now AMD has the crown for everything except power/sound levels. Keep in mind that not only does AMD own both next-gen consoles, they are also running every one of these benchmarks on beta drivers, which means they will only continue to get better. Add Mantle to the equation and Titan will have absolutely zero purpose in life. It was a good card. RIP, Kepler.
  • Vortac - Sunday, October 27, 2013 - link

    Well, let's point out again that Titan has much better FP64 performance, approx. 2.5x better than the 290X, so "absolutely zero purpose" is not entirely correct. Of course, if you don't care about compute, then obviously the 290X is the much better choice now.
  • Luke7 - Sunday, October 27, 2013 - link

    Are you talking about this?
    http://www.sisoftware.co.uk/?d=qa&f=gpu_financ...
  • Vortac - Sunday, October 27, 2013 - link

    In that interesting review Titan is pitted against the 7970, which has 1/4-rate FP64 and is indeed very good for double-precision calculations, especially with OpenCL. The 290X has 1/8-rate FP64 and its double-precision performance is worse than the 7970's, leaving Titan some space to breathe.
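    For a rough sense of the peak numbers behind that comparison (public shader counts and boost clocks; sustained throughput is lower, and Titan reportedly reduces clocks in its full-rate DP mode):

        /* Back-of-envelope peak FP64: shaders x 2 FLOP/clk (FMA) x clock x DP rate. */
        #include <stdio.h>

        static double dp_tflops(int shaders, double ghz, double dp_rate) {
            return shaders * 2.0 * ghz * dp_rate / 1000.0;
        }

        int main(void) {
            printf("GTX Titan (1/3):  %.2f TFLOPS\n", dp_tflops(2688, 0.876, 1.0 / 3.0));
            printf("HD 7970 GE (1/4): %.2f TFLOPS\n", dp_tflops(2048, 1.050, 1.0 / 4.0));
            printf("R9 290X (1/8):    %.2f TFLOPS\n", dp_tflops(2816, 1.000, 1.0 / 8.0));
            return 0;
        }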
  • Ytterbium - Monday, October 28, 2013 - link

    It's kind of funny that for compute you're better off buying the lower 280X cards. I guess AMD saw what Nvidia did with the Titan and thought "we'll have some of that"; there'll be a prosumer card with 1/4-rate FP64 for $1000.
  • SolMiester - Monday, October 28, 2013 - link

    You obviously don't know the 780 Ti is coming! Hell, even an OC'd 780 is better than the current Titan, which is a PROsumer card for CUDA dev... that's why it has different model naming..
    Anyway, if you can keep the 290X under 94°C without throttling it is indeed a great card; however, the reviews I have seen with sustained gaming have it throttling by up to 30%.
  • motqalden - Sunday, October 27, 2013 - link

    Lol R9 290X CF "Uber" 66.6 dB(A)
    Go Red Team ^.^
    >.> 666 + Red team

    Remember. Competition is a good thing!
  • Ryan Smith - Monday, October 28, 2013 - link

    And I ran that test 3 times to make sure it wasn't a fluke. It really is 66.6dB(A).
  • faster - Sunday, October 27, 2013 - link

    SOFT LAUNCH. Too bad the cards haven't been available anywhere from 10/24/2013 to 10/27/2013, besides a handful on the first day that went to the lucky 100.
  • AnnihilatorX - Monday, October 28, 2013 - link

    Minor typo:

    The closest we came to a problem was with the intro videos for Total War: Rome 2, which have black horizontal lines due to the cards trying to the cards trying to AFR render said video at a higher framerate than it played at.
  • Ryan Smith - Monday, October 28, 2013 - link

    Thanks.
  • AnnihilatorX - Monday, October 28, 2013 - link

    I do hope 3rd parties support watercooling more; there is a gap in the market for affordable prebuilt CPUs with water blocks.
  • muziqaz - Sunday, November 3, 2013 - link

    On release day on the OcUK website there were, I think, 2 cards available with prebuilt watercooling blocks. So I guess watercooling is being supported ;)
  • AnnihilatorX - Monday, October 28, 2013 - link

    oops, CPU-->GPU
  • SirRaulo - Tuesday, October 29, 2013 - link

    Game changer!

    A faster card that's $100 cheaper... even an Nvidia fanboy would know the difference... wait, a fanboy wouldn't... too bad.
  • willis936 - Tuesday, October 29, 2013 - link

    The game has changed twice a year for the past 20 years. If the change isn't changing one could argue that the game is staying the same.
  • apaceeee - Tuesday, October 29, 2013 - link

    Er...I don't think so ..
  • polaco - Tuesday, October 29, 2013 - link

    I see you used AMD Catalyst 13.11 (Beta 5) in the 290X benchmarks. It might be nice if you could post some numbers with the updated revision, as they claim performance improvements of 8% to 30% in several games.
    http://support.amd.com/en-us/kb-articles/Pages/lat...
    Thanks!
  • Ryan Smith - Tuesday, October 29, 2013 - link

    Those are versus Catalyst 13.9. There are no performance improvements in our game set between 13.11 v5 and v6.
  • polaco - Tuesday, October 29, 2013 - link

    Oh, OK, if you say so. I got confused because in one place it states that explicitly and in the other it doesn't:
    "Performance improvements for the AMD APU Series (comparing AMD Catalyst 13.11 Beta6 to AMD Catalyst 13.9)", while the other place just says
    "Performance improvements".
  • wiyosaya - Wednesday, October 30, 2013 - link

    Personally, I think it would have been interesting to see a GTX 580 thrown in for the compute benchmarks.
  • rogerthat1945 - Thursday, October 31, 2013 - link

    The ASUS GTX 780 went up in price by $120+ USD for me last night, when I was expecting a price drop this morning.

    Last week (and at least all last month), the ASUS GTX 780 price was around $745 USD (in yen) on Amazon JP.

    I put one in my shopping basket and browsed some more for extra items (Zx Evo headset considerations), then heard that the Nvidia prices were to be dropped for the GTX 780 range, so I held off going to checkout. However, this morning, when I went to pay at the advertised price drop, I found that Amazon has JACKED UP the price to $867 US. :no:
    http://www.amazon.co.jp/ASUSTeK-GTX780%E3%83%81%E3...
    The question is:

    "Where can I buy this card for a `proper` price (from which popular site) that will POST it via air mail to Japan (not a US military address)?" :ange:

    Every site I tried, from California to China, does not post to Japan.

    Amazon Japan, you are Kraaayyy-Zee crayon users. :pt1cable:
  • photek242 - Saturday, November 2, 2013 - link

    I currently have a GTX 680 SLI setup. Should I sell the 2 cards and buy an AMD R9 290X, or stay with the SLI setup?

    Here in Belgium the R9 290X goes for between 465 and 550 euros.

    TIA
  • muziqaz - Sunday, November 3, 2013 - link

    Ryan, I don't know if you are still reading this or not, but regarding Vegas Pro, I suppose other codecs do not use GPUs as expected. I use the MP4 format, and even though Sony and AMD tell me that GPUs do accelerate that format, it actually does not. I can't even get my 12-thread CPU fully loaded. Or maybe there is another codec supported by YouTube which might get some GPU acceleration if enabled? Maybe someone else can pitch in with suggestions? :)
  • mr_tawan - Tuesday, November 5, 2013 - link

    The AMD card may suffer from a loud cooler. Let's just hope the OEM versions ship with quieter coolers.
  • 1Angelreloaded - Monday, November 11, 2013 - link

    I have to be honest here: it is a beast. In fact, the only thing holding it back in my mind is the lack of feature sets compared to Nvidia, namely PhysX. To me that is a bit of a deal-breaker: for $150 more the 780 Ti gives me that with a lower TDP and sound profile, and since you can only pull so much from one 120V breaker without tripping it, and electrical modification is a deal-breaker for some people depending on where they live, power draw matters.

    Honestly, what I really need to see from a site is 4K gaming at max, plus 1600p/1200p/1080p benchmarks, with single cards as well as SLI/CrossFire, to see how they scale against each other. I'd also like to see a benchmark using Skyrim modded to the gills with high-resolution textures, to see how VRAM might affect these cards in games from this next-gen era, where the consoles can natively manage higher texture resolutions and the old standard of 1-2K textures doubles to 4K, or in a select few cases up to 8K depth. With a native 64-bit architecture you can also draw more system RAM into the equation, where Skyrim currently dies past 3.5GB.

    With Maxwell coming out, with a shared memory pool and a single-core microprocessor on the die itself, plus G-Sync for smoothness, we might see an over-engineered GPU capable of much more than we thought. AMD has their own ideas as well, which will progress; I have a strong feeling Hawaii is actually a reject of sorts, because they have to compete with Maxwell and engineer more into the cards themselves.
  • marceloviana - Monday, November 25, 2013 - link

    I'm just wondering why this card comes with 32 Gb of GDDR5 yet shows only 4 GB. The PCB shows 16 Elpida EDW2032BBBG chips (2 Gb each). This amount of memory would help a lot in large scenes with V-Ray RT.
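    The totals do agree once the units are untangled; memory chips are rated in gigabits (Gb), not gigabytes (GB):

        /* Unit sanity check: 16 chips x 2 gigabits each = 32 Gb = 4 GB. */
        #include <stdio.h>

        int main(void) {
            int gbits = 16 * 2; /* 16x Elpida EDW2032BBBG, 2 Gb per chip */
            printf("%d Gb = %d GB\n", gbits, gbits / 8);
            return 0;
        }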
  • Mat3 - Thursday, March 13, 2014 - link

    I don't get it. It's supposed to have 11 compute units per shader engine, making 44 on the entire chip. But the 2nd picture says each shader engine can only have up to 9 compute units....?
  • Mat3 - Thursday, March 13, 2014 - link

    2nd picture on page three I mean.
  • sanaris - Monday, April 14, 2014 - link

    Who cares? This card was never meant to compute anything.

    It was supposed to be "cheap but decent".
    Initially they set that ridiculous price, but now it goes for around $200-350 on eBay.
    At $200 it is worth the price, because it can only be used to play games.
    Whoever wants to play games at medium quality (not future titles) may prefer it.
