Best Video Cards: October 2014

by Ryan Smith on 10/30/2014 12:00 PM EST

  • r3loaded - Thursday, October 30, 2014 - link

    For 4K gaming, how would triple GTX 970s fare against dual GTX 980s? The former would be $1047 while the latter would be $1158, so it seems like the 970s may give better performance for less?
  • Zak - Thursday, October 30, 2014 - link

    But you'd need a more expensive motherboard and a larger case so that may even out the cost.
  • nevertell - Thursday, October 30, 2014 - link

    And more powerful power supply. But ultimately he'd be spending less money on heating.
  • DParadoxx - Thursday, October 30, 2014 - link

    I've been really happy with my 3x670s. If I were to upgrade today I'd say 3x970s would be the way to go.
  • DanNeely - Thursday, October 30, 2014 - link

    Unless things have improved in the last year or so 3/4 card multi-GPU scaling is a lot worse than 2 card SLI/xFire (which often approaches 2:1). It's unlikely that the 14/22% (stock vs equal clock speeds) theoretical advantage would actually be achieved in the real world.
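    The 14%/22% figures work out from the published reference specs (core count × clock), treating throughput as proportional to both. A back-of-envelope sketch only; it ignores memory bandwidth, real boost behavior, and multi-GPU scaling losses:

```python
# Back-of-envelope check of the theoretical advantage of 3x GTX 970
# over 2x GTX 980, using reference core counts and base clocks.
specs = {
    "GTX 970": {"cores": 1664, "base_mhz": 1050},
    "GTX 980": {"cores": 2048, "base_mhz": 1126},
}

def throughput(card, mhz=None):
    """Relative shader throughput: cores x clock (arbitrary units)."""
    s = specs[card]
    return s["cores"] * (mhz if mhz is not None else s["base_mhz"])

stock = 3 * throughput("GTX 970") / (2 * throughput("GTX 980"))
equal = 3 * throughput("GTX 970", 1000) / (2 * throughput("GTX 980", 1000))
print(f"stock clocks: {stock - 1:.0%} theoretical advantage")  # 14%
print(f"equal clocks: {equal - 1:.0%} theoretical advantage")  # 22%
```

    In practice, 3-way scaling losses typically eat most of that paper margin.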
  • chizow - Thursday, October 30, 2014 - link

    Agreed, it's especially bad with Nvidia right now beyond 2 cards. There's not much point in going beyond 2 cards in SLI right now. This may improve with driver updates but I wouldn't buy 3 cards hoping for that to happen, I'd wait for the driver updates to come first.
  • Creig - Thursday, October 30, 2014 - link

    For 4K gaming, AMD would be a better choice as Crossfire scales better than SLI. In addition, AMD cards perform better at 4K than Nvidia cards. You could run dual 290X cards for $600, which would perform nearly identically to $1,100+ dual GTX 980s. Not to mention the free game bundles AMD offers.
  • chizow - Thursday, October 30, 2014 - link

    And the downside of 2x290X is it consumes ~300W more than 2x980, so while winter is coming, that's still a lot of difference in heat. Nvidia has room to improve their SLI configuration especially in 4K, we're only 2 drivers in while 290X drivers are relatively mature on a 1 year old part.
  • Creig - Friday, October 31, 2014 - link

    Yes, 290X's consume more power than 980's. But the 290X's also consume a lot less money than 980's. At current pricing, it makes absolutely no sense to buy 980's for 4K gaming when you could get the same experience with a pair of 290X's for $500 less. Not to mention the free game bundles.
  • chizow - Friday, October 31, 2014 - link

    Uh yeah, the 290X costs a lot less now, after Nvidia undercut their prices and forced them to slash them, but that was not always the case. Just a little over a month ago, most AMD users paid more for a 290X than a GTX 980 and more for a 290 than for a GTX 970. At one point, not even a year ago, those very same cards were going for $550 and $750 respectively!

    Obviously there is some perceived value from the market for a part that is ~15% faster but uses ~65% of the power, or AMD would have kept their prices at $400/$550 instead of dropping them so drastically in the last couple of weeks. Instead you can find any 290X you want at $350 or less, and GTX 980s are still hard to come by even marked up over MSRP of $550.

    So yes the choice now becomes gaming in comfort vs. saving some money. There's certainly a LOT of great deals now thanks to the 970 launch that forced AMD to cut their prices. You can get cheap cards like the 290 and 780 for $250 or less, the 290X or 970 for $300 or less, and the 780Ti for $400 or less. You just have to decide on how much heat you can live with. :)

    Here's a pretty good testimonial from Kyle over at HardOCP, certainly known to be a pretty straight shooter regarding Nvidia/AMD over the years, sticking it to both equally at times regarding Fermi heat issues, AMD frame pacing/runt frames, etc.

    http://www.hardocp.com/article/2014/10/30/4_weeks_...
    " These 290X cards are HOT! If your computer is in a small room that sees ambient temperatures above 80F, you will not want a pair of these cards. I think you could live with one 290X but the heat that comes off these cards is insane. Luckily when I started testing these the temperatures were still warm here in Texas, so getting my office up to an ambient temperature of 78F was easy to do. A pair of these 290X in CrossFire can easily warm the room you are in up a few degrees. Under full load in Uber Mode, the exhaust temperature of these cards is over 150F. Yes, you can burn yourself on the exhaust ports of the cards should you be so inclined.

    A 300 watt delta between 290X CrossFire and 980 SLI is a huge number. It is easily recognizable when sitting next to the system. After a few hours of gaming with 290X CrossFire, you certainly had that sweaty gamer feeling about you. So I guess we can say that 980 SLI is the Gold Bond of the high end PC gaming world, because it kept me cool and dry. While this all seems like an attempt at being cute with words, it is not. 290X CrossFire makes me sweat in a room that is fully air conditioned after a few hours. GTX 980 SLI did not make me sweat. Simple as that."
  • Creig - Sunday, November 2, 2014 - link

    Since you seem to like quoting Kyle, I'll post this from the same article:

    "In most of the games tested GeForce GTX 980 SLI matched the same gameplay experience as AMD Radeon R9 290X CrossFire. This was surprising considering single-GPU GeForce GTX 980 is able to outperform single-GPU AMD Radeon R9 290X. We thought that naturally, putting two video cards together in SLI would equally excel past AMD Radeon R9 290X CrossFire. This wasn't the case. We got an unexpected result.

    Instead, GeForce GTX 980 SLI was on par, equal with AMD Radeon R9 290X CrossFire in performance, most of the time. There were some occasions that AMD Radeon R9 290X CrossFire was even better than GeForce GTX 980 SLI, for example in Watch Dogs. We didn't expect that, considering that game had the heavy hand of NVIDIA Game Works applied to it. Yet, the competition seems to be scaling much better with CrossFire on that game. It seems SLI isn't doing well there.

    We also saw problems in Alien: Isolation with SLI. Performance was more erratic, inconsistent compared to CrossFire. Performance even seemed to drop out when interacting with the computer terminals in the game under SLI. CrossFire had no problems with that."

    So it's plainly obvious that AMD 290X's in CrossFire are a much better value than 980's in SLI for 4K gaming. Yes, the 290X's put out more heat than the 980s. But a pair of 290X's is still $500 cheaper than a pair of 980's. FIVE. HUNDRED. DOLLARS. And you get the exact same gaming experience.

    I'm pretty sure that most people would be willing to put up with the extra heat output of 290X Crossfire in exchange for keeping an extra $500 in their wallets.
  • chizow - Sunday, November 2, 2014 - link

    And again, no one is questioning 290X has strong value characteristics *NOW*, thanks to the 970 and 980 invalidating AMD's pricing structure in order to compete.

    But clearly the market finds value in a card that is faster and also much more power efficient when it comes to gaming. Indeed, you can't say for sure most people are willing to put up with the extra heat output, because if you scan the various deal threads you will see many users are in fact torn over the prospect of adding a 2nd 290/X or 780/Ti rather than selling their existing card and just buying 2x970 instead.

    Regarding the SLI comments from Kyle, in context, of course, he says he is surprised because the 980 beats the 290X so handily in single-GPU performance. There's no doubt Nvidia's SLI scaling right now leaves a lot to be desired, but the fact is it's a new GPU on a new architecture. There's certainly optimism that Nvidia will improve their SLI scaling in upcoming driver revisions, and in the meantime, you still get to enjoy top-of-the-line performance in gaming comfort with 300W less under load! THREE HUNDRED WATTS!
  • Warren21 - Monday, November 3, 2014 - link

    Chizow, you shouldn't try so hard to hide the fact that you are and always have been biased to NVIDIA. You could have said all that you did of actual substance in about two sentences.
  • chizow - Monday, November 3, 2014 - link

    Where am I hiding this fact? I freely admit I've preferred Nvidia since G80 because they have consistently offered the best solution when it comes to gaming, driver/software support, and related gaming technologies. It's easy to prefer great products from any company when they consistently meet or exceed your needs and expectations! I'm sure Creig would have no problem admitting his bias for AMD either, for whatever his reason.

    But yes, here's the actual substance in 2 sentences:

    It's a great time to buy a video card thanks to Nvidia and their release of the 970 and 980, which has compressed the price of all cards from the previous generations from both AMD and Nvidia. If you don't mind the 250+W heat of the 780/Ti or close to 300W of heat from the 290/X, then there are some amazing deals to be had right around that $300 price point set by the 970! Everyone wins regardless of where your "bias" lies! ;)
  • Creig - Tuesday, November 4, 2014 - link

    Chizow, why do you keep going on about what pricing is *NOW* as if it's somehow good for Nvidia? You say the 970 and 980 invalidated AMD's pricing structure? Well, the 290 and 290X have now invalidated Nvidia's new cards' pricing. That's how it goes.

    The 980 is completely overpriced, thanks to AMD. You can pick up a pair of AMD 290s for $500 while a single Nvidia 980 is $550! You actually SAVE money by going with AMD's much faster solution. Only the 970 is worth considering. The 980 isn't even remotely competitive in the cost/performance category.
  • chizow - Tuesday, November 4, 2014 - link

    Because that's how it goes Creig. You should know this, other than the anomaly caused by the 7970's terrible launch pricing (which you probably defended), new cards push and compress the pricing down on the cards that preceded them. That's just called "progress".

    The 290 and 290X haven't invalidated anything, because for the bargain GPU shopper that doesn't mind 250+W space heaters, Nvidia has their own last-gen cards on offer as well, price cuts initiated a full month before AMD decided to get back in the game (wasn't this an AMD motto at one point?)! That's the beauty of it, you don't need to compare 2x290X to 2x980 when you can simply fall back to 2x780, 2x780Ti, or 2x970 all for similar pricing to 2x290X, and, you get all the benefits of Nvidia's superior driver support, features, and technologies as well!

    There's always going to be a premium on the halo part, and there's always going to be a trade-off falling back on older generations, and from what I have seen in the various deals forums and FS/FT threads, that is obviously something people consider in their upgrade tendencies. It's amazing how many 290/290Xs or even 780/780Tis you can find for even cheaper than retail. What are all these people running from, I wonder? :D
  • Creig - Wednesday, November 5, 2014 - link

    You think all the way back to the 7970's launch price and call it terrible while completely bypassing the Nvidia Titan which debuted at a breathtaking $999? Ok then, next subject.

    AMD has COMPLETELY invalidated Nvidia's lineup. Did you actually read the article we're commenting on? Out of the seven categories, AMD took the top spot in five of them.

    Budget - AMD Radeon R7 260X
    Mainstream sweet spot - AMD Radeon R7 265
    1080P Gaming - AMD Radeon R9 280
    1440P Power - AMD Radeon R9 290
    The performance sweet spot - Nvidia GeForce GTX 970
    Taking the single GPU crown - Nvidia GeForce GTX 980
    4K for me - AMD Radeon R9 290 Crossfire

    And the "Taking the single GPU crown" category is a hollow victory for the 980, as the "4K for me" category is not only faster, but cheaper by $19! Only the 970 stands alone as a win for Nvidia.

    Ultimately, the overall most important metric in video gaming is price/performance. That is an unarguable statement. And right now, AMD is the obvious leader.
  • chizow - Thursday, November 6, 2014 - link

    First of all, I've never defended Titan's pricing, ever; in fact, I've been very outspoken against Titan and even GK104 pricing. But nonetheless, it offered a massive increase in performance (over 2x from GF100/GF110), higher than the typical flagship-to-flagship gains, and Nvidia charged a massive premium for it along with its ostensible compute capability. But this premium could ONLY occur because AMD priced their 7970 so terribly; it allowed Nvidia's SECOND fastest Kepler SKU to not only beat it, but also allowed Nvidia to price it CHEAPER. So yes, Tahiti pricing was terrible as it offered the worst increase in price:perf for a new generation and node we've ever seen, but just as I figured, you'd probably defend it.

    And AMD completely invalidated Nvidia's lineup? LOL, utter nonsense but no surprise coming from you, what initiated these price cuts again? Oh right, the 970 launch coupled with Nvidia cutting the price on their 780 and 780Ti. AMD responded a month later because they had to, they simply could not compete at their old price points and they were forced to undercut Nvidia further simply because that's how the cards fall.

    So yes, AMD as usual takes the cheap, budget-oriented segments, but who can't afford a card that is just as fast or faster with better features, support, and game-related technology for $10-20 more? I mean, prices are so compressed now that the difference in price in the sub-$330 range is a burrito at Chipotle. Who can't afford an extra burrito or two?

    Again, there's really no point in trying to drag the 980 down to the value market. Nvidia has plenty of last-gen options if power consumption is not a concern for similar price points (780/Ti), and of course the 970 offers similar performance to the 290X at a similar price point, but with much lower power consumption. It's crazy to think you can run 2x970 for the same amount of power (probably even less) as a single 290X.

    I mean honestly anyone who wants a 290/X will have no trouble finding one, same goes for the 780/Ti for that matter, but the 970 is still really hard to get! You can find 290X for $250 or less on any FS/FT forum, bet you can't find (m)any 970s! Guess that counts for something huh? :)
  • Creig - Thursday, November 6, 2014 - link

    Here's a news flash for you, chizow. Nvidia is free to charge anything they want for their cards. AMD is not in control of Nvidia's pricing structure. Trying to blame AMD because Nvidia chose to price their Titan at $999 is utterly ridiculous. If Nvidia felt like releasing it at $250, AMD could not have done anything about it. Neither could they do anything about its actual $999 pricetag.

    Do you happen to recall the Titan Z? It was released at $3,000 while the faster AMD 295X2 was already available for $1,500. According to your theory, Nvidia should have lowered the price of the Titan Z because AMD had a competitive card already out there at half the price. Yet Nvidia still sold it for $3,000. It's Nvidia greed, pure and simple.

    And yes, Nvidia's lineup has been completely invalidated by AMD. Do you see the article? Five out of the seven categories were won by AMD. And the '4K for me' category is not only faster, but also CHEAPER than the 'Single GPU crown' GTX 980. I don't know why you're even trying to argue the results when they're posted right in front of you. AMD cards are the better value for nearly every segment of gaming according to Anandtech.

    As far as not finding a 970 in a FS/FT forum, it's only been released for a month while the 290/290X has been out for what? Over a year? How many people are going to buy a 970 and then try to sell it a month later? Try to keep it real, OK?
  • chizow - Thursday, November 6, 2014 - link

    And here's a news flash for you Creig, even if these cards are incomparable when it comes to support and features, Nvidia is still bound to AMD for as long as reviewers and casual users use "FPS" and simple "price vs. performance" metrics over all else, as you've attempted to do here. So yes, AMD does have a direct influence on Nvidia pricing, and vice versa, to imply otherwise is completely and utterly ignorant. Again, what caused AMD to cut their prices so drastically in the last month, since the GTX 970 launched? Are you going to stupidly sit here and claim they are both free to charge what they want? If that was the case, why didn't AMD hold course at their $400 and $550 price points for the 290/290X, respectively, they're free to charge what they want!

    The simple matter of it is, Nvidia could have NEVER gotten away with charging $500 for their 2nd tier ASIC if AMD had not priced Tahiti so terribly in the first place. But instead of looking like the villain in it all, Nvidia actually looked GOOD, because they beat AMD in price, performance, and thermals. The GPU trifecta. So yes, if Nvidia is able to charge $500 for their 2nd tier chip and AMD has no answer for their 1st tier chip until a good 1.5 years later, that opens the door for a stupidly priced ultra-premium part, which we saw in the GTX Titan. Again, none of this is possible if AMD prices their flagship 7970 where it belongs, at the historical $380-$420 price point, but instead, AMD asks $550 for a measly 20% increase over the 18 month old GTX 580! For someone who touts price:performance as the only important metric, you sure do have a hard time acknowledging Tahiti offered the worst increase in this metric, ever!

    Titan Z just built on this stupid pricing escalation, but given how poorly it sold, and the fact Nvidia fans should not be fooled by the same Titan trick more than once, it's highly doubtful Nvidia will try this again, and if they do, it certainly won't sell as well as the first Titan.

    And of course Nvidia's line-up hasn't been invalidated when, as I stated before, there is a card within $10-20 that performs as well or better than every listed card and offers better features as well. It's a subjective list, subject to price fluctuations, but none of this discounts the fact it was indeed Nvidia's release of the 970/980 and the price drops on the 780/780Ti that invalidated AMD's entire product stack, forcing them to slash prices.

    And keeping it real? Again, who cares how long it has been on the market? If what you said was true, y'know, how AMD offers the best parts with price:performance being the only important factor, then EVERYONE would be scrambling to buy these cards. Instead we see the opposite. Everyone is running AWAY from these hot last-gen cards and buying up the 970/980s in droves. We'll definitely try and keep this real, as this action will be reflected in Nvidia's earnings call later tonight.

    And finally, I do find it funny how you keep emphasizing 4K for the <1% of users who run this resolution. Do you run 4K? Of course you don't. Not to mention anyone who is willing to spend over $1500 on a 4K TV or monitor isn't going to be sweating over a few hundred dollars, and more than likely already bought their 290Xs at the $550-$750 they were selling for before the 970/980 cut their prices off at the knees a month ago.
  • chizow - Thursday, November 6, 2014 - link

    And right on cue, back in reality, Nvidia enjoys record revenue on the strength of Maxwell GTX graphics cards:

    http://www.anandtech.com/show/8698/nvidia-fy-2015-...
    "GeForce branded GPU revenue was up 36% based on the continued strength of PC gaming."

    So much for AMD invalidating Nvidia's product lineup! I guess the market does see the value in Maxwell parts and is willing to pay the premium, regardless of what a single buyer's guide recommends and the AMD faithful believe.
  • Creig - Friday, November 7, 2014 - link

    All that proves is that the Nvidia faithful will just keep drinking the green Kool-Aid, no matter what. Just like their Apple brethren.
  • chizow - Thursday, November 20, 2014 - link

    No, it just proves what you have stated about price being the only important factor in sales is nonsense, which is really no surprise! Fact of the matter is, for every 290/290X being sold at slashed prices, there are more being sold for a fraction of that on the used parts market. People can't get away from the heat (and probably drivers) fast enough, and it's winter in the northern hemisphere!
  • Creig - Friday, November 7, 2014 - link

    It's amazing that you're still trying to blame Nvidia pricing on AMD. Nvidia is the only one who can set prices for Nvidia products. Nobody put a gun to Jen-Hsun's head and said "Price your Titan at $999! Or else!". To say that it's AMD's fault is utterly ludicrous. Just because Tahiti was priced high doesn't automatically mean that Nvidia had to follow suit. They could have released the Titan at whatever price point they wished. They could have released it at $550 and totally cut AMD off at the knees. But they saw an opportunity to gouge their customers like never before and went for it. So please, enough with the "Titan pricing was all AMD's fault" rhetoric. Nvidia can set whatever price they wish for their cards, regardless of where AMD has priced theirs.

    Does Nvidia take AMD pricing into account? Of course. Does AMD take Nvidia pricing into account? Of course. Did AMD force Nvidia to charge $999 for a Titan or $3,000 for a Titan Z? Of course not. Nvidia did that all on their own.

    And why do you keep ignoring the fact that Anandtech chose AMD cards as the best value in five of the seven categories? If Nvidia's cards were the better value, don't you think they would have been chosen instead? But they weren't. Could that change next month? Yes, most definitely. But for now, it's AMD that apparently has more appealing offerings for the money spent.

    And please, stop going on about how the 970/980 solved world hunger and brought peace to mankind. Yes, the 970/980 are good cards and raised everybody's eyebrows with their low power consumption for their performance level. And yes, AMD saw the writing on the wall and lowered their prices. But this has been happening back and forth for decades now. AMD releases a new card, and Nvidia lowers prices. Nvidia releases a new card, and AMD drops prices. It's one of the oldest stories in the book. What matters is RIGHT NOW. RIGHT NOW, AMD has dropped prices low enough that it's Nvidia's 980 that appears overpriced relative to its performance level. If Nvidia chooses to drop the 980's price, things could change. The 970 seems to be sitting well relative to AMD's equivalent.

    Bringing up how many 970's are in FS/FT forums vs 290X/290s made no sense whatsoever. The 290X and 290 have been on sale for over a year. Bitcoin/Litecoin miners bought up every 290 series card that could be produced and still wanted more. Now that the Bitcoin mining craze has dropped off, a lot of these people have decided to sell off their cards and try to recoup as much money as possible. That alone accounts for a lot of the used 290 cards you see for sale these days. Nobody is going to buy a 970 and then try to sell it in a FS/FT forum a month later.

    And the only reason I brought up the "4k for me" category is because it's actually less expensive to go with the faster 290 CrossFire than a single 980. That represents an incredible value. Anybody considering a 980 would be better served buying a pair of 290s as they would receive better performance for less money. So as I stated earlier, a victory by the 980 in the "Taking the Single GPU Crown" category is a hollow victory at best. This leaves the 970 as Nvidia's only real showing in this month's roundup.
  • chizow - Friday, November 7, 2014 - link

    And it's amazing that you still can't acknowledge Tahiti's role in the current ultra-premium pricing anomaly started at 28nm because of Tahiti's launch prices! Here's a simple question, and answer this honestly: do you honestly think Nvidia could've sold their GTX 680, based on their 2nd fastest ASIC, for $500 if AMD had launched their 7970 at the expected price point of $380-$420? No, of course not, because they couldn't expect to sell many if they offered only a 5-10% improvement at a 30-40% premium. The GTX 680 sells for maybe $400-450 max, as was originally rumored, but instead, Nvidia took the opportunity caused by AMD's launch to basically jump their 2nd tier ASIC an entire SKU level and make it their flagship. This allowed Nvidia to not only create a super-premium SKU in Titan, but also to sell their significantly cut-down top-end ASIC as their flagship in the GTX 780 and delay it over a year.

    Even removing AMD from the equation, the GTX 680 actually offered the worst price:performance for any Nvidia-to-Nvidia flagship transition, offering only ~40-50% improvement at 100% of the cost of the GTX 480 and ~30-35% improvement at 100% of the cost of the half-gen GTX 580. Again, this was ONLY allowable due to Tahiti's horrible price:performance, which let Nvidia come out looking great because they beat AMD on price, performance, and thermals. But from a historical perspective, it was still a horrible increase in price:performance, as typical gains are anywhere from 80-100%, especially given this was both a node and generational upgrade from 40nm Fermi.

    And who is going on about the 970/980 solving world hunger? I never claimed anything of the sort. I simply pointed out they were DIRECTLY responsible for the price structure we see today that EVERY gamer can enjoy going into the holiday season. You have only now finally acknowledged this. The market has corrected itself, and while someone who prefers AMD can always point to the slightly cheaper prices in a singular buying guide, anyone else who prefers the premium features and support of Nvidia can find a card that offers similar performance at a very slight premium, maybe $10-20 max. So again, it's great that AMD can come up 1st where it doesn't matter (a buyer's guide based on FPS and price only, I guess), but where it matters, on the actual market, Nvidia continues to dominate at the registers.

    And how does bringing up FS/FT, eBay, and 2nd-hand vs. new market comparisons make no sense whatsoever? Again, YOU claimed price:performance was the ONLY important metric when it came to buying decisions, but these markets clearly show that is NOT the case. You want to say the 290/290X are invalidating Nvidia's product stack and the 970/980 specifically, but while the 290/X can be had for MUCH cheaper than the 970/980, they are the ones flooding these 2nd-hand markets and sitting readily available on any retailer site, while the 970/980 are nowhere to be found used and still see tight demand and/or inflated prices.

    And finally, with the 4K comparison again: 4K is not a territory where value means much of anything, unless you are looking at the incredibly small cross-section of gamers that has 4K (<1%), is too cheap to afford $500 GPUs, and doesn't already have 2+ 290/X, 780/Ti, or 970/980s. Especially when there are already plenty of compelling alternatives that don't include the 980 at a lower price point, again thanks to Nvidia's pricing on the 780/Ti and 970/980, further sweetened by their latest game offer.
  • Creig - Thursday, November 13, 2014 - link

    Oh please, Tahiti did NOT start the current ultra-high premium pricing. Nvidia has nearly always had the highest MSRP on release of a new card:

    $650 Nvidia GTX 280
    $300 AMD HD 4870
    $400 Nvidia GTX 285
    $380 AMD HD 5870
    $500 Nvidia GTX 480
    $240 AMD HD 6870
    $500 Nvidia GTX 580
    $370 AMD HD 6970
    $550 AMD HD 7970
    $500 Nvidia GTX 680
    $1000 Nvidia Titan
    $550 AMD R9 290X

    It hasn't been until recently that AMD started bumping their top end MSRP to the $500 level. Nvidia is the one who has historically released their top end cards at $500 or higher. If anything, it's Nvidia and the people who buy their cards that you should be thanking for higher prices. Look at how many Nvidia fans bought the Titan for $1,000. And less than two years later, its performance is easily matched by a $300 AMD R9 290X.

    Now we have reports that the Nvidia Titan II is in the works. I wonder how many people are going to plop down $1,000+ on those as well? And yet, I'm sure it's all the fault of the AMD 7970 to you, despite the fact that Nvidia themselves have been the ones with $500+ cards long before AMD ever debuted one.

    Once again, both companies are free to charge whatever they wish. If Nvidia wants to be considered the price/performance leader, then that's completely up to them. I just wouldn't hold my breath waiting for it.

    "AMD can come up 1st where it doesn't matter"? So best price/performance doesn't matter?

    Wow.

    Just.... wow.

    Obviously you're here to defend Nvidia to the death, despite the fact that you're commenting in an article that shows AMD being the better choice in five out of seven categories. And with the overall fastest category (290 Crossfire) being cheaper than the fastest single GPU category (980). Nothing you can say, no rhetoric you can dig up is going to change those results. If Nvidia had been the better choice, they would have been listed as the winners. But they weren't.

    End of story.
  • chizow - Friday, November 14, 2014 - link

    Yes, Nvidia has had the highest priced part, but that was because they had the FASTEST GPU for each generation as well. Tahiti *DID* start the trend because, even as your own list shows, AMD reset to their traditional price points and Nvidia reset to theirs, offering predictable gains in performance at the same price points (i.e. Moore's Law).

    Tahiti broke rank because it followed *NVIDIA's* high-end pricing at the start of a new generation and process node, yet it clearly did NOT offer anywhere close to the expected increase in performance for the increase in price, just 20% increase in performance at 10% increase in price at launch. Now, look back at that chart and see if there is any such minor increase from Nvidia to Nvidia part. There isn't! The differences in performance for Nvidia are 60-80% increase in performance for the same price.

    Even AMD's own performance increase from their last gen was bad. The 6970 (not 6870, btw) was $380 at launch, and compared to the 7970, you were looking at a ~45% increase in price for a ~50% increase in performance. Now look at AMD's previous gains from generation to generation and once again... you see Tahiti is the anomaly.
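    Using only the launch prices and the rough performance deltas quoted in this thread (claimed round numbers, not measurements), the change in performance-per-dollar each generation can be sketched:

```python
# Relative change in performance-per-dollar from one card to its
# successor, using launch prices and perf gains claimed in this thread.
def perf_per_dollar_gain(old_price, new_price, perf_gain):
    """perf_gain is fractional, e.g. 0.5 means the new card is 50% faster."""
    return (1 + perf_gain) * old_price / new_price - 1

# 6970 ($380) -> 7970 ($550), ~50% faster (figure claimed above)
print(f"6970 -> 7970: {perf_per_dollar_gain(380, 550, 0.50):+.0%}")  # +4%
# GTX 580 ($500) -> GTX 680 ($500), ~33% faster (figure claimed above)
print(f"580 -> 680:   {perf_per_dollar_gain(500, 500, 0.33):+.0%}")  # +33%
```

    On those numbers, Tahiti's perf-per-dollar gain over its predecessor was nearly flat, which is the anomaly being argued here.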

    Obviously fewer people are going to plop down $1k for a Titan II knowing full well there will be a Ti GeForce version of it for far less. Like I said before, Titan and Z will always exist, but they aren't going to be able to pull the same trick twice now that it's been played in the past.

    Nvidia certainly took this price escalation to another level, but AMD started it, without a doubt, because without the 7970 at the laughable $550 price tag (which I am sure you defended), there would have been no opportunity for the 680 at $500. If anything it would've been $400-$420 or so, or more likely Nvidia would have gone forward and launched the full GK104 as the GTX 670Ti as originally planned. But instead, Nvidia took the opportunity of a lackluster Tahiti to jump their entire product stack an entire SKU and to create an ultra-premium range in Titan.

    Again, both companies are NOT free to charge what they want if they want to actually sell these cards. If what you said was true, AMD would have been free to keep their cards at $550 and $400, but they dropped their prices, didn't they? Without such a massive lead on AMD with higher than usual pricing, they could NOT have asked $1K for a card like Titan. It was only because their 2nd fastest SKU was faster and still priced high enough to sustain their revenues that they could sit on GK110 and introduce a Titan at a much higher price point. I mean, if they can get $500 for their 2nd fastest SKU thanks to AMD's lackluster efforts, why not go for $1000 for what used to be their $500 GPU?

    And obviously you are here to defend AMD to the death, even though they were uncompetitive for months and still got slaughtered in the market, despite what this fine pricing guide has mandated. Unfortunately for AMD, it seems not enough people listened; the market clearly spoke and picked Nvidia, despite the premiums you and this pricing guide don't seem to find worthwhile.

    http://www.techpowerup.com/207180/big-swing-in-mar...
    "•AMD's discrete desktop shipments decreased 19% and notebook discrete shipments increased 10%. The company's overall PC graphics shipments decreased 7%."

    Oh, and HardOCP updated that review you love to link at 4K: when max overclocks are applied, GTX 980 SLI wins convincingly in all but 1 game (which didn't scale in SLI), and still uses 200W less to boot. TWO HUNDRED WATTS!!! Guess it might be worth the premium after all, and if not, there's always 2x970 for nearly the same price and performance but half the power. HALF. THE. POWER.

    http://hardocp.com/article/2014/11/11/nvidia_gefor...
  • Creig - Tuesday, November 18, 2014 - link

    Nvidia has had the "FASTEST GPU for each generation"? Really? FYI, the quotes below were taken directly from Anandtech video card reviews:

    "Let’s be clear here: the 5870 is the single fastest single-GPU card we have tested, by a wide margin. Looking at its performance in today’s games, as a $379 card it makes the GTX 285 at its current prices ($300+) completely irrelevant."

    “So at the end of the day AMD has once again retaken the performance crown for single-GPU cards, bringing them back to a position they last held nearly 2 years ago with the 5870. To that AMD deserves kudos, and if you’re in the market for a $500+ video card the 7970 is clearly the card to get – it’s a bit more expensive than the GTX 580, but it’s reasonably faster and cooler all at once.”

    “To that end at 2560x1440 – what I expect will be the most common resolution used with such a card for the time being – AMD is essentially tied with GTX Titan, delivering an average of 99% of the performance of NVIDIA’s prosumer-level flagship. Against NVIDIA’s cheaper and more gaming oriented GTX 780 that becomes an outright lead, with the 290X leading by an average of 9% and never falling behind the GTX 780.”

    Obviously AMD has held the fastest single-GPU crown in the past, and holds it now with its current top-end card, the R9 290X.

    And as far as price escalation goes, how in the world could AMD have started it? Did you even read the list of video card releases and their MSRP that I posted above? Nvidia had FIVE single GPU video cards debut at $500 or more while AMD has had TWO. It's pretty easy to see which company started the high pricing patterns we have to contend with today.

    The only real reason Nvidia could charge $1,000 for the Titan is that their fans are the same as Apple fans. They are willing to pay nearly anything as long as it has the right logo on the box. AMD has been the price/performance leader for many years, but it's hard to fight a herd mentality like Apple/Nvidia. A recent post in the [H]ardforums sums up this phenomenon:

    "If AMD releases 6 months ahead of nVidia 50% of the market will wait to pay 50% more for 12% more performance."

    That is the sole reason Nvidia cards are so expensive. It isn't that AMD cards aren't competitive; it's that Nvidia fans will fall all over themselves to pay extra for the same performance they could get from AMD cards. The old adage "You can lead a horse to water, but you can't make him drink" could have been written with Nvidia fans in mind.

    So now you bring up market share as if that can disprove the results of this article, which recommends AMD cards in five of the seven categories? Popularity does not automatically equal best deal. Which cards were chosen as the best deals in their respective categories? AMD's.

    Aaaaaand we're back to Nvidia fans claiming that power consumption is the most important feature overall, now that the 970/980 leads AMD. Back when it was AMD that had the most power-efficient cards, power consumption was of no importance whatsoever. But hey, just keep moving those goal posts. Whatever it takes to keep Nvidia on top.
  • chizow - Thursday, November 20, 2014 - link

    Wow, it's amazing how you continually try to comment given your lack of historical acumen relating to GPUs. But this should be no surprise, since you've repeatedly tried to dismiss the fact that AMD's "competitive" pricing was clearly a reaction to Nvidia invalidating their entire product stack with the GTX 970 and 980.

    It's shocking for anyone to take a review quote from a given point in time without considering its place in history. No one other than you would consider the 5870 the fastest card of any generation. It led 40nm/DX11 for all of 5 months because Nvidia was later to market, but once they answered, AMD had no response for the entire generation. Again, I already outlined the rules of engagement for GPUs, given that AMD and Nvidia are both bound to them: the process node at TSMC. For 40nm, Nvidia won with Fermi (GF100) and then refined Fermi (GF110), for which AMD had no response. AMD was first to market, but 40nm lasted from October 2009 until December 2011, and AMD led for only 5-6 months of that. You truly think they won the generation? lol. Once again, AMD has NEVER led at the end of a generation, nor have they led for the majority of any generation in single-GPU.

    And with the high price premiums, it should be obvious: Nvidia had the highest-priced cards because they commanded it with the highest-performing cards! Which once again flies in the face of what you claimed, because if AMD had the lead, they would have had the highest-priced cards. In fact, Nvidia held rather steady in the $500-650 range until the 7970 presented a price:performance anomaly, and suddenly we see a $1000 spike as a result! Simple cause and effect. Anyone with any historical perspective on GPU price:performance knew this was going to be the outcome, given the coup Nvidia was able to pull off by masquerading a 2nd-place part as their flagship, all because of Tahiti's poor price:performance as a pretender high-end part.

    Again, Titan could not have existed in the past, and Nvidia did not present a $1K part in the past; they could only do so because GK110 so utterly destroyed AMD's fastest card and, obviously, their own 2nd-fastest card at the same time. So sure, while Nvidia brought this greed to a new level, it was ONLY possible because AMD presented a pauper as a king with Tahiti.

    And I guess if we are going to pull quotes from random forums, we can pull all the quotes saying how Nvidia cards give a better experience, from drivers to support to features to thermals to new technologies, you name it! Again, AMD fans can claim all they like that there's no difference, but the people who buy Nvidia and stay with Nvidia know there are features Nvidia offers that AMD does not, features they can no longer live without! What is AMD going to offer someone like me who has come to rely on 3D Vision, G-Sync, SLI profiles, PhysX, GeForce Experience, and the myriad other features and support I have come to expect from my graphics hardware??? Slide decks, excuses, and an exceptionally dishonest spokesman in Richard Huddy? No thanks. But I know, in your eyes, all those features suck and aren't worth it. Go figure!

    And lastly, lolol. Market share and financials clearly illustrate that your claim that price:performance is the ONLY important metric in determining a sale is nonsense. Sounds like you need to do a better job of getting the word out! It seems you are stuck in the last decade, where FPS bar graphs were enough to tell the whole story. Guess what, they're not! Like I said, put 2 cards next to each other, even with slight differences one way or another in FPS and price, and Nvidia is going to win because they offer the better PRODUCT.

    Aaaaaand we're back to AMD fans claiming that power consumption isn't the most important feature overall, now that Nvidia leads AMD. Back when it was AMD that had the most power-efficient cards, power consumption was the most important thing ever. But hey, just keep moving those goal posts, whatever it takes to keep AMD relevant in the discussion. Good thing for Nvidia and their fans, they also have the most powerful GPUs on the planet, giving them the virtual GPU trifecta (price, performance, and thermals!) for the 2nd major generational launch in a row!
  • Creig - Friday, November 21, 2014 - link

    So even though AMD may have had the fastest card available at the time, Nvidia still won because they were faster for a greater length of time on whatever node they happened to be produced on? Honestly, nobody really cares what process node a product is on. AMD has held the single GPU performance title multiple times in the past, as proven with benchmarks. The length of time on a process node is irrelevant. People buy video cards for their performance and price, not for their process node. Come on.

    Since you seem so fixated on the 7970 as the harbinger of doom for all video card pricing, try re-reading the conclusion from the Anandtech article:

    "So at the end of the day AMD has once again retaken the performance crown for single-GPU cards, bringing them back to a position they last held nearly 2 years ago with the 5870. To that AMD deserves kudos, and if you’re in the market for a $500+ video card the 7970 is clearly the card to get – it’s a bit more expensive than the GTX 580, but it’s reasonably faster and cooler all at once.”

    I don't see the words "price:performance anomaly" anywhere in there. Instead, I see "if you're in the market for a $500+ video card the 7970 is clearly the card to get". AnandTech came right out and said point blank that the 7970 represented an acceptable price for its performance relative to the GTX 580, and that it was the fastest single-GPU card available at the time.

    And now, let's consider the following quotes from the AnandTech Nvidia Titan review:

    "With a price of $999 Titan is decidedly out of the price/performance race; Titan will be a luxury product, geared towards a mix of low-end compute customers and ultra-enthusiasts who can justify buying a luxury product to get their hands on a GK110 video card."

    "Back in the land of consumer gaming though, we have to contend with the fact that unlike any big-GPU card before it, Titan is purposely removed from the price/performance curve. NVIDIA has long wanted to ape Intel’s ability to have an extreme/luxury product at the very top end of the consumer product stack, and with Titan they’re going ahead with that."

    Well, there you have it. According to AnandTech, the 7970 was a reasonable price:performance card while the Titan was deliberately priced way above its performance level. Put simply, Titan and Titan Z pricing were nothing more than manifestations of Jen-Hsun Huang's ego gone wild. And the Nvidia fans bared their souls (and wallets) in tribute. The $550 AMD HD 7970 was in no way responsible for the astronomical pricing of the $1,000 Nvidia Titan. You may as well just admit defeat on that subject and move on. Because you simply can't prove otherwise.

    Yes, Nvidia has a myriad of extras. Just as AMD does (Mantle, FreeSync, Eyefinity, PowerTune, TrueAudio, XDMA Crossfire, HD3D). The difference is that AMD tends to keep its features open for everybody to use, while Nvidia deliberately locks its features away from everybody but those with a green logo on their video card. Heck, they even lock out features from their own customers when an AMD card is merely present in the computer! Hardware PhysX has been proven to work just fine on an Nvidia card with an AMD card as the primary renderer, as evidenced by the time Nvidia accidentally released a PhysX driver without the vendor ID lockout in place. But Nvidia immediately put the ID lockout back in for the next version. Nvidia has even gone on to actively degrade gaming performance on AMD cards through deliberate overtessellation, by forcing AA computation without allowing the AA mode to actually be selected, by not allowing AMD to see the source code of GameWorks libraries for optimization purposes, etc...

    Power consumption is not and never has been the most important measure of a performance video card. FPS and price are easily the overriding factors affecting video card purchasing decisions. Power consumption doesn't even come close. So by all means, keep going on about how power efficient Nvidia's cards are now that they (finally) have managed to overtake AMD in that respect.

    At the end of the day, the 980 is still too expensive, the 970 is a fairly good bargain and AMD has all the rest of the categories neatly wrapped up, as evidenced by the Best Video Cards: October 2014 review we're commenting in.
  • TEAMSWITCHER - Thursday, October 30, 2014 - link

    Now if only I could find a 4K monitor that was actually good...and didn't cost something north of $2000.
  • chizow - Thursday, October 30, 2014 - link

    Acer G-Sync 4K panel is getting a lot of positive reviews, of course requires Nvidia to maximize the benefit, but it only costs $800:

    http://hothardware.com/Reviews/Acer-XB280HK-4K-GSY...
    http://www.pcper.com/reviews/Displays/Acer-XB280HK...
  • Ryan Smith - Thursday, October 30, 2014 - link

    The biggest problem with a tri-970 setup right now is that attempting it with open air coolers is nothing short of madness. EVGA does have some blower designs, but as of when this article was written those cards are not in stock.
  • chizow - Thursday, October 30, 2014 - link

    MSI also has a reference blower that appears to use the stock 980 PCB and cooler, sans Titan shroud. TigerDirect had them in stock briefly but they've been OoS since.

    http://www.tigerdirect.com/applications/SearchTool...

    Also, prices have already compressed even more. 780s, 290s, 290X, 780Ti all can be had for some great deals. Great prices for everyone thanks to the 970 and the disruption/compression it caused last month!
  • chizow - Thursday, October 30, 2014 - link

    Also I wonder if the same doesn't apply with 2x290X or even 2x290, given that 2 of those cards probably draw more power than 3x970. HardOCP found 2x290X drew 300W more than 2x980s!

    Definitely an issue with open air coolers though; it's going to be really hard to dissipate ~600-900W of heat (2-3x 290s) from a case before those cards start throttling.
  • garadante - Friday, October 31, 2014 - link

    Those numbers HardOCP got are highly suspect. Other reviews show reference, uber-mode 290Xs pulling only 80-90 watts more than a 980, and that's before accounting for the fact that the 290X's power draw decreases when it has a half-decent cooler that doesn't run it at 95 C constantly. The difference in power consumption between SLI 980s and CrossFired 290Xs should be 150-200 watts at most. And while that seems high, the fact that the 290X pair is $500 cheaper means you'd have to run them at 100% load for about 60,000 hours at the electricity rate I pay to make up the price difference.
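    A quick sketch of that break-even arithmetic (the wattage delta, price gap, and electricity rate below are illustrative assumptions chosen to land near the poster's ~60,000-hour figure, not measured values):

    ```python
    # Rough break-even estimate: how long the cheaper-but-hungrier card pair
    # must run at full load before extra electricity erases its price advantage.
    # All inputs are assumptions for illustration only.
    price_difference_usd = 500.0   # assumed price gap between the two setups
    power_delta_watts = 200.0      # assumed extra draw of the cheaper pair
    rate_usd_per_kwh = 0.042       # assumed (fairly cheap) electricity rate

    extra_cost_per_hour = (power_delta_watts / 1000.0) * rate_usd_per_kwh
    break_even_hours = price_difference_usd / extra_cost_per_hour
    print(f"Break-even after ~{break_even_hours:,.0f} hours of full load")
    ```

    At a higher rate like $0.12/kWh the break-even drops to roughly 20,000 hours, which is why the rate assumption matters so much in these arguments.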

    Looking at HardOCP's reference 290X review, they have a 70-watt discrepancy with Anandtech's data in normal mode and 40 watts in uber mode. I'd question the validity of their power testing methodology considering that discrepancy (and how it seems to bias in favor of Nvidia cards). And I don't consider Anandtech's 290X power draw values very valid either, considering they continue to use the reference 290X numbers in every comparison review when the reference 290X is widely known to be utter garbage while the nonreference designs are leaps and bounds superior.

    Perhaps I sound like an AMD fanboy but honestly, I'm not. I'm middle of the road. I acknowledge both companies' strengths and weaknesses, but I see so many review sites' testing methodologies skew in favor of Nvidia with the 970/980 that it frustrates me immensely. I just want some comparison results against quality nonreference 980, 970, 290, and 290X cards, with some tests framerate-normalized to take CPU power draw out of the equation! I suspect the Maxwell efficiency figures that every review site is heavily lauding are blatantly inaccurate. Are the cards efficient? Yes. Do they blow AMD out of the water? Not as much as review sites seem to keep repeating.
  • JarredWalton - Friday, October 31, 2014 - link

    Fun fact: I have two Gigabyte R9 290X GPUs (purchased at retail). They're blowers and they get super loud under load, but I'm able to run a pair of them off a Corsair RM650 PSU. I've never actually checked power draw under load, but I have to be close to the limit on that PSU, considering the overclocked CPU. Hahaha. But, it has four 8-pin PEG connectors, and I'm only using two of them. :-)
  • garadante - Friday, October 31, 2014 - link

    That further leads me to believe the HardOCP numbers are at fault. But could I please please please request that you guys at AnandTech start using a proper nonreference 290X in your comparisons? A high-end one, because the difference between it and the reference card, even in uber mode, is significant, and I want to see comparisons of the 290, 290X, 970, and 980 under best-case conditions for all cards. Almost every review site's testing methodology I've seen biases in favor of Maxwell. Did you at AnandTech ever even test a nonreference 290X? I don't recall one, and I only recall 1 nonreference 290 being reviewed. Your card comparison graphs always seem to use reference data. We can all agree the reference cards are garbage, but I'd like objective comparisons with cards that are legitimately competitive and without (or with significantly reduced) the flaws of the reference designs. (Pardon any errors. Typing on the phone is not cooperating with me today.)
  • Impulses - Friday, October 31, 2014 - link

    Huh, perhaps I could've gotten away with my Corsair AX750 after all... When I bought my 2x R9 290 (GB Windforce), all the numbers floating around led me to believe I'd be right on the limit. I didn't wanna run the PSU at full load for hours, especially in Puerto Rico's 90 deg heat, so I upgraded.

    Ended up with ridiculous overkill in a 1200W Seasonic but I didn't want to do a small upgrade to 850W and potentially face the same situation in a few years. I had already upgraded from a nice Corsair HX520 to the AX750 when I bought 2x HD6950.

    There aren't many 9xxW units and the 1200W I ended up with was on sale and priced near a lot of the 1kW units, so... Could've saved myself the PSU upgrade with GTX 970s, or saved the same amount of money on the 290s by waiting some 4 months, but Maxwell was still an unknown over the summer.

    It was a pleasant surprise tho, even tho I've gone red the last two times I upgraded I've owned more NV cards overall and I'm glad they pushed pricing and efficiency aggressively, everyone wins in the end.
  • chizow - Friday, October 31, 2014 - link

    Wow, that's nuts, you have to be running close to or exceeding peak power draw on that 650! Do you run Vsync on in most of your games? You should really grab a Kill-a-Watt, they are only $20 or so on Newegg, and blowing a PSU could do a lot more harm to your system than a $20 meter + a decent 850W or better PSU!
  • TiGr1982 - Saturday, November 1, 2014 - link

    Yes, I'm also surprised that the RM650 manages to power TWO R9 290Xs at load, because my ONE R9 290 already makes my system consume around 400 W at full load (see above).
  • TiGr1982 - Saturday, November 1, 2014 - link

    Jarred,
    Can you please note the actual power consumption from the wall when you are running these TWO R9 290X with your RM650, as you are saying?

    I've got one reference R9 290 (with blower) and I'm running it with a Corsair CX600M PSU, but that's ONE card.
    Still, when, say, Crysis 3 is running uncapped at around 60 fps (at 1920x1200, my monitor's native resolution), my entire system is already drawing around 400-440 W according to my UPS watt counter (with an H75-cooled, slightly OCed Core i7-4790K, 32 GB of DDR3-2400 DRAM, three case fans, etc).

    Yes, the RM650 is a much higher-grade PSU than the CX600M, but that's TWO cards instead of ONE, so I can hardly imagine your situation. Probably your RM650 is already drawing around 650-700 W at load with TWO R9 290X cards - at its formal limits...
  • chizow - Friday, October 31, 2014 - link

    Well I will certainly give you the benefit of the doubt that you tend to be neutral, there's nothing in your post that would indicate otherwise. Personally, I freely admit I much prefer Nvidia products because they have given me the performance and features that keep me enthralled with PC gaming for the last 8 years or so since G80. I also find their initiatives and proactive investment in technology closely aligns to my interests when it comes to PC gaming.

    The one thing I would say regarding HardOCP's numbers is that while they tend to be a little heavier on the power consumption side, they are also a lot closer in terms of performance relative to the 980s in SLI. The problem I see with most 290X defenders is that they want to take its best-case performance without acknowledging the extreme power consumption that comes with it. Remember, the higher the performance level these cards run at, the higher their sustained and average TDP levels will be. In conditions that are not as well ventilated, like a closed case or a smaller test room, these cards may very well start throttling sooner. While that will look better for their power draw, it will also hurt their sustained benchmark performance. It's certainly possible HardOCP ran these cards in an environment conducive to running them at maximum clockspeeds for the entirety of the run, resulting in the shockingly high power consumption numbers.

    I would be curious to see some of the reviews where you say the custom-cooled versions use less power. I am aware that lower temps and lower leakage can result in some power savings, but I would also think higher sustained clockspeeds would lead to higher power consumption over time. Also, the issue with power consumption isn't really about money saved per year, imo; it's really about gaming in comfort for extended periods of time. Anyone who has used 2x250W or more GPUs in the past can attest to this; there's a pretty huge difference between 2x200W and 2x250W+.

    Lastly, the reason AT uses both the default and Uber settings is detailed here in their GTX 980 Review:

    http://anandtech.com/show/8526/nvidia-geforce-gtx-...
    "And on a testing note, as is standard for our reviews we are using our reference Radeon R9 290X for our 290X benchmarks. For this reason we are including both the standard and uber modes for the sake of clarity and completeness. The temperature throttling that the reference 290X suffers from is basically limited to just the reference 290X, as the non-reference/custom models use open air coolers that have no problem dissipating the 290X’s full heat load. Both modes are included for this reason, to demonstrate how a reference 290X performs and how a custom model would perform. At this point you can still buy reference 290X cards, but the vast majority of retail cards will be of the non-reference variety, where the 290X Uber mode’s results will be more applicable."
  • Bobberr - Thursday, October 30, 2014 - link

    These recommendations don't take into account monitors with higher refresh rates. A single 780 (which I own) often is not enough for +100fps at 1080p using a monitor that can display all the frames.
  • Salvor - Friday, October 31, 2014 - link

    People with high refresh rates are a tiny minority, thus it makes more sense to base recommendations at 60hz. Obviously if you want 120fps@1080p you'll need around twice the GPU power of everyone else running 60.
  • Bobberr - Friday, October 31, 2014 - link

    That being said, even the 280 at 1080p is questionable.

    Metro LL: 48fps

    Bioshock: 66.2fps

    BF4: 48.3/ 75fps (cut those numbers in half for multiplayer, which is what the game is made for)

    Crysis 3: 55fps

    TW Rome 2: 45.9 fps

    Thief: 46.6fps

    Don't know about you, but averages at or below 60 are no good to me. Can't stand drops into the 40 or 50fps range, it's noticeable. However, I'd like to take back my comment about the article not taking higher refresh rates into account:

    "but owners of 120Hz 1440p monitors will find that it has enough power to push past 60fps at 1440p in several games." Right there in the article. Reading fail.
  • garadante - Friday, October 31, 2014 - link

    As a 280 (7950) owner I can say it's almost always 60+ fps in my experience, just turn down AA. Unfortunately for my strong urge to upgrade my rig, it's a solid card that will last me a long time yet. But considering you can get a 290 for the low $200s, the 280X is currently pointless, and the huge gain in performance makes a $150+ 280 a difficult proposition. I'd consider Crossfire if and only if 280s drop to $120, maybe $130. Otherwise the near-term market is unfavorable unless you're on a budget you can't work around.
  • Impulses - Friday, October 31, 2014 - link

    Yeah, Newegg has two R9 290 at under $250 right now, it's been years since we've seen that much power (in relative terms) for so little... Maybe towards the mid life of the 6950 when they were still unlockable and right around that price point. The GPU industry is one big case of deja vu.
  • garadante - Friday, October 31, 2014 - link

    I'm wondering what sort of sales we'll see around Black Friday. Prices may be already near bottom but they could go down another $30-60 in rare cases on Black Friday/Cyber Monday. I hope not, honestly. I'd be too tempted to snag a sub $200 290!
  • Impulses - Saturday, November 1, 2014 - link

    I'm curious yet cringing at the same time, seeing as how I paid $350 x2 for mine a few months ago... That being said, I've rarely seen really great BF deals on flagship cards the last few years. This year might be unique cause of NV's release timing and how long the 290/290x have been out tho.
  • hangslice - Thursday, October 30, 2014 - link

    I know I'm a minority, but I would love to see fanless graphics cards added to this list.
  • metayoshi - Thursday, October 30, 2014 - link

    To add to your minority list, I would also like to see special categories such as fanless video cards, as well as best small form factor video cards for the "slimline" cases. I mean, I own a full blown desktop with a GTX 780, but I was definitely interested in having a slimline case for something else as well.
  • hulu - Thursday, October 30, 2014 - link

    Maybe add an HTPC category? Selections in this category would emphasize video (post-)processing, HDMI and audio capabilities.
  • jimjamjamie - Thursday, October 30, 2014 - link

    I managed to get an Asus R9 290 DCUII for £210 last week. Good times to be had all round.
  • Lithium - Thursday, October 30, 2014 - link

    This one is for laughs.
    You reviewers can't save AMD.
    Don't do that. Actually, do that; it's lame but fun.

    Only good products and technologies can save AMD, and there are almost none.
    So good luck with saving AMD.
  • SantaAna12 - Thursday, October 30, 2014 - link

    2x 980: $1,150
    2x 290X: $600

    http://hardocp.com/article/2014/10/27/nvidia_gefor...

    Personally, I do not see any value in the current 980 offerings.
  • chizow - Thursday, October 30, 2014 - link

    Well fortunately for you, the market doesn't agree with your outlook. The world is your oyster when it comes to 290X choices, meanwhile, the GTX 980 is still extremely hard to find in stock!
  • TiGr1982 - Thursday, October 30, 2014 - link

    Really depends on where you are. E.g., I saw Zotac's AMP! GTX 980 on the shelves for 620 CAD in Toronto downtown core just yesterday. 3 were in stock yesterday, 2 now (Canadians are not really PC HW buyers on average, I would say). No GTX 980 are in stock in some other stores though, indeed.
  • chizow - Sunday, November 2, 2014 - link

    Yep definitely, but there is also something about the Zotac cards that always keeps them in stock. I personally have never used them so I can't say, but I'm guessing a lack of brand recognition/trust coupled with high supply keeps their cards in stock longer and more consistently than other vendors'. Same for PNY. They tend to have good stock levels where other brands like EVGA, MSI, Asus, and Gigabyte sell out faster.
  • TiGr1982 - Monday, November 3, 2014 - link

    Well, to me, Zotac is fine (I have their TV box computer) - I see nothing wrong with Zotac. They also have Titans and even the Titan Z in their product lineup, for example (both are reference designs, though).
    PNY, in its turn, is also nVidia's partner for their professional Quadro cards, so there should be nothing wrong with PNY either, I suppose. The Quadro card business is no small feat - professional boards require professional reliability standards, right?
  • chizow - Monday, November 3, 2014 - link

    Again, just reiterating some of the comments/concerns I have seen regarding Zotac along with the fact they tend to be the only cards in stock at any given time. PNY is Nvidia's Quadro/Tesla partner, but I've read their support isn't anywhere close to that level on their consumer gaming GPUs. Case in point, they have the "Lifetime Warranty" that only lasts for the life of the product, which can be interpreted as anything really. I guess people just aren't willing to risk these things when they are relatively confident the card they really want will be in stock in a matter of days.
  • Kjella - Thursday, October 30, 2014 - link

    Well, there's always 2x970 for $700 unless you want that very last bit of power; it is certainly way more powerful than a single 980. It has the same amount of memory, the same features, and ~81% of the shaders for ~60% of the cost. Of course, speaking of "value" when you're talking about half a grand or more in graphics cards is a joke, but it's less excessive than 2x980. I was originally looking to buy the latter, but it doesn't justify the last $450.
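    For reference, the ratios quoted above work out like this (the shader counts are the published CUDA core counts for the 970 and 980; the pair prices are the ones quoted in this thread):

    ```python
    # Where the "~81% of the shaders for ~60% of the cost" figures come from.
    shaders_970, shaders_980 = 1664, 2048   # published CUDA core counts
    price_2x970, price_2x980 = 700, 1150    # pair prices quoted in the thread

    shader_ratio = shaders_970 / shaders_980   # ~0.8125
    cost_ratio = price_2x970 / price_2x980     # ~0.609
    print(f"{shader_ratio:.0%} of the shaders for {cost_ratio:.0%} of the cost")
    ```

    Raw shader count is only a rough proxy for performance, since clock speeds and SLI scaling shift the real-world gap.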
  • Impulses - Friday, October 31, 2014 - link

    With the current deals floating around, you can even score 2x290 for damn near $450, and they're already near 290X performance... That's pretty cheap for something that can chew thru games at 4K or Eyefinity res.

    I'm almost pissed I didn't wait a few more months to get mine, but Maxwell was still an unknown then and rumors were even putting it as a 2015 release so oh well... Would've probably ended up with 290s anyway, but saved $100 per card.

    Them's the breaks...
  • SantaAna12 - Sunday, November 2, 2014 - link

    "980 Did not make me sweat."

    A $7-a-year difference in power. No need to be dramatic.

    2x 290X: $600
    2x 980: $1,150

    And better framerates at 4K in some games.
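    For scale, a ~$7-a-year figure implies inputs along these lines (the hours per day, average power delta, and electricity rate here are illustrative assumptions, not the poster's actual numbers):

    ```python
    # What a ~$7/year power-cost difference implies, under assumed inputs.
    power_delta_watts = 100.0   # assumed average extra draw while gaming
    hours_per_day = 2.0         # assumed daily gaming time
    rate_usd_per_kwh = 0.10     # assumed electricity rate

    annual_kwh = (power_delta_watts / 1000.0) * hours_per_day * 365
    annual_cost = annual_kwh * rate_usd_per_kwh
    print(f"~${annual_cost:.2f} per year")  # ~$7.30 with these assumptions
    ```

    Heavier use or pricier electricity scales the figure linearly, so the same 100 W delta at 6 hours a day and $0.20/kWh is closer to $44 a year.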
  • chizow - Sunday, November 2, 2014 - link

    If you've ever run 2x250+W GPUs in a decent sized office, you'd know the difference in temps is what is dramatic, the difference in power costs is an afterthought for anyone who can afford $1000 in GPUs.
  • djallel84 - Sunday, November 2, 2014 - link

    How do you classify the GTX 970 as 1440p High and the AMD R9 290 as 1440p Med when they have equal performance????
  • Communism - Sunday, November 2, 2014 - link

    http://www.ebay.com/sch/i.html?_odkw=R9+290+-%22R9...

    http://www.ebay.com/sch/i.html?_odkw=R9+290x+-%22R...
  • Vidmo - Friday, November 7, 2014 - link

    Please add a LOW PROFILE card category to this round up.
  • Qmoney - Wednesday, January 14, 2015 - link

    Fanboy war, look out... Nvidia held the reins for a while. Now 4K gaming is coming and AMD is taking over. As a former Nvidia fanboy, if I had to pick today, I'm taking a 295X2 over SLI 980s. Quit defending a company that doesn't give a crap about you. Go with logic.
  • nefar - Wednesday, February 4, 2015 - link

    Monthly guide? Has there been another since October?
  • mikemcg - Friday, April 10, 2015 - link

    I'm a little confused. How can the 290X/295X2 be good 4K cards? From what I understand, they don't have HDMI 2.0 support, so you'd be limited to 30fps on your big-screen TV. And from my searching, there is no DisplayPort-to-HDMI 2.0 adapter that supports 4K at 60fps. I don't think the AMD cards are a good recommendation for 4K, unless you're willing to live with DisplayPort desktop monitors and forgo 4K on big-screen TVs.
  • alexbagi - Sunday, October 25, 2015 - link

    Loving the GTX 750 Ti! Best graphics card of 2015 for the money.
  • alexbagi - Sunday, October 25, 2015 - link

    http://www.144hzmonitors.com/graphics-card-buyers-...
