Best Video Cards: January 2014

by Ryan Smith on 1/17/2014 10:00 AM EST

  • xTRICKYxx - Friday, January 17, 2014 - link

    Any reason to purchase a 290X over a 290? I don't think the $100+ premium is worth it at all. I feel I would be hard-pressed to notice a performance difference at 4K with two 290s/290Xs in CrossFire.
  • Ian Cutress - Friday, January 17, 2014 - link

    I'll take one of each.
  • Ian Cutress - Friday, January 17, 2014 - link

    Darn, that was meant to be a general comment :) For 290s in CFX, I don't think we have 290 CF in the engine, just 290X CF.
  • jwaight - Friday, January 17, 2014 - link

    Go with the 290, then get an adapter to mount a CLC on the card for quiet(er) running. Still waiting on the aftermarket cards to bring this option to us in bulk.
  • ImSpartacus - Friday, January 17, 2014 - link

    I'm surprised CLC isn't more popular on the 290. The card is simply begging to be differentiated by cooling.
  • ZackBarletto - Tuesday, April 1, 2014 - link

    I agree! But aside from the difference in core clock, the slightest differences in other aspects of the card may stand out more at 4K. Overclocked (core and memory) at 1080p and 1440p, the 290 may even pass up the 290X... but at 4K, I'm sure the slightest differences in pixel fill rate, etc. become a tad more clear.
  • JBVertexx - Friday, January 17, 2014 - link

    Why is everyone looking at the AMD cards as either for Mining or Gaming? Then you conclude that for Gaming, they are too expensive.

    What if you buy an AMD card for GAMING, but in the off time you use it to MINE, and then use the proceeds to offset the cost of the card? Jeez, it's not rocket science to see that the AMD cards become the more economical choice for gamers who are willing to mine in their system's downtime.

    This past Christmas, I was going to buy a single GTX 760 for a PC for my 3 boys to share. But then I looked into Litecoin mining. I'm not Litecoin mining for "Income" or an "Investment", but merely to offset the purchase of the card.

    So using that logic, instead of buying the single GTX 760, I bought TWO AMD cards, an R9 270X and an R9 280X. Those two, coupled with a year-old 7850, are now mining in our house after hours. A simple startup script that runs each night starts up the mining. When the boys get home from school, after they finish homework, they can close out the mining and game to their hearts' content.
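
    For anyone who wants to set up something similar, here is a minimal sketch of what such a nightly launcher could look like, assuming cgminer for scrypt mining; the install path, pool URL, and worker credentials below are placeholders, not anything from the original post:

    ```python
    # nightly_mine.py - a sketch of a scheduled mining launcher (not the
    # poster's actual script). Path, pool URL, worker name, and password
    # are all made up; substitute your own.
    import subprocess

    CGMINER = r"C:\cgminer\cgminer.exe"  # hypothetical install location

    args = [
        CGMINER,
        "--scrypt",                                   # litecoin uses scrypt
        "-o", "stratum+tcp://ltc.pool.example:3333",  # placeholder pool URL
        "-u", "myworker.1",                           # placeholder worker name
        "-p", "x",                                    # many pools ignore this
        "-I", "13",                                   # GPU intensity; tune per card
    ]

    # Fire and forget: schedule this script for each evening with Windows Task
    # Scheduler (or cron on Linux), and kill cgminer when it's time to game.
    subprocess.Popen(args)
    ```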

    Meanwhile, we have accumulated $300 in LTC (no, we haven't actually sold them yet) to offset the original purchase, and on top of it all, they're getting a good education in currency markets and markets in general.

    Bottom line, you don't have to be EITHER a miner or a gamer. We're gamers who are just mining because we CAN with the AMD cards. Consequently, that was two additional AMD cards in demand vs. what would have been one NVIDIA card. So it might not just be "miners" driving up the demand for AMD cards. It might be gamers who are smart enough to see a way to offset the cost of their card.

    With that perspective, what card is the best value for the $$? In my mind, it's the AMD card.
  • Guspaz - Friday, January 17, 2014 - link

    Gaming and compute are different things. High-end AMD cards are better at compute than nVidia, but worse at gaming. People don't tend to buy a card intending to do both, so people buy differently depending on their use case.
  • JBVertexx - Friday, January 17, 2014 - link

    Seriously? According to all the articles I've read, the R9 280X falls in between the GTX 770 and 760, actually much closer to the 770. The R9 270X is just a bit slower than the 760 at $30-50 less.

    So general statements like that are just ignorant of the facts. It also depends on your specific scenario. For example, if you are venturing into 4K gaming (the true bleeding edge of high-end), two R9 290Xs will beat out two GTX 780 Tis at a MUCH lower cost.
  • dylan522p - Friday, January 17, 2014 - link

    You must be kidding. AMD cannot compete with Quadro and Tesla.
  • lwatcdr - Friday, January 17, 2014 - link

    That depends on whether your apps are OpenCL- or CUDA-based. NVIDIA's OpenCL performance is not great.
  • jublin - Friday, January 17, 2014 - link

    They are too expensive for gaming, when compared with the nVidia cards they're up against.
  • JBVertexx - Friday, January 17, 2014 - link

    Not if you mine on the side. See my initial post.
  • Drumsticks - Friday, January 17, 2014 - link

    Jublin just literally ignored everything you said. Lol.
  • lmcd - Friday, January 17, 2014 - link

    How much do you think you can mine anymore? Does it even offset your power bill?
  • eddman - Friday, January 17, 2014 - link

    Mining isn't viable anymore, especially for beginners. In its current state, it wouldn't even make enough money to pay for the electricity used.
  • talonz - Friday, January 17, 2014 - link

    True for coins such as Bitcoin, but Litecoin has no end to GPU mining in sight. The algorithm is not friendly to FPGAs or ASICs at the moment.
  • JBVertexx - Friday, January 17, 2014 - link

    Litecoin mining is viable; it's best to join a pool, though, if all you have is one or two GPUs. It's not something that's going to make you rich, but as a way to offset the cost of the card, it is definitely viable. I'm well on my way to paying for the two GPUs I purchased just before Christmas, even net of electricity costs.
  • pax256 - Tuesday, January 21, 2014 - link

    Assuming you can get any worthwhile mining going. I spent two weeks trying to get anything other than Bitminer going on my 6950, to no avail. Mining is arcane and difficult, and mining bitcoins directly makes no sense anymore. Most gamers aren't into writing their own conf files or figuring out why they can't connect to a pool. But there are enough geeks out there who can mine, buying dozens or hundreds of cards to set up in racks, to push the prices of the Radeons to insane levels. As it is, the prices of the 290s don't add up for gaming. $600 plus tax and shipping for the 290 non-X with a vendor air cooler (the reference cooler is terrible) is just too much.
  • JBVertexx - Wednesday, January 22, 2014 - link

    Not sure where the issues were. I'm not a techie by any means, but with a little help from Google, I was up and running with CGMiner in an afternoon. Bitcoin mining is a wasted effort, and even Litecoin mining I personally wouldn't try as an investment. My goal is simply to offset the cost of the cards, which is a much shorter-term goal, and one I am pretty close to completing. Any extra $$ after that will just be gravy.
  • jlallen213 - Thursday, March 6, 2014 - link

    Do you think a GTX 690 would be good enough for both mining and gaming?
  • imaheadcase - Friday, January 17, 2014 - link

    Be wary of the free games.

    If you don't get the code for it in the box, you have to contact Amazon.com to get it if you bought the card from there. Many customers never got the games until it was too late (the promotion had ended), so no codes worked.

    Amazon made good with me and just gave me Splinter Cell and Black Flag, though!
  • nathanddrews - Friday, January 17, 2014 - link

    I was hoping to pick up a 290 for around $450 once the modified coolers arrived, but the mining community has forced me to rethink this plan. It's for the best, though. I'll just wait and save up until we get some sub-$1,000 4K60 panels that also operate at 1080p120, then go with a CF/SLI solution. By then maybe we'll have a GCN die shrink or Maxwell, in addition to having some of this FreeSync/G-Sync mess sorted out.
  • cgalyon - Friday, January 17, 2014 - link

    My thinking is that the run on AMD cards may also damage their future potential as gaming cards. If the cards are priced out of reach of mainstream gaming audiences (due to the inflated prices), then there is less incentive for developers to work with Mantle (why spend the time if fewer gamers have the cards?). If Mantle actually presents a significant advantage for AMD's cards going forward, then this may delay its adoption. Basically, since AMD is not receiving any of the increased income from the price inflation, this run on their cards is overall a bad thing for them (other than the increase in inventory moved).

    On the other hand, if AMD wants to push their brand harder for GPGPU work, then this could help boost their brand image in that capacity (just not in gaming).
  • haukionkannel - Friday, January 17, 2014 - link

    Yes and no... There is demand for AMD cards for gaming too, but the retailers have pushed the price too high. It will be remedied in time. Normal prices will come back, because the miners already have all they want... or so I hope... Or AMD can increase production capacity (orders), because cards are flying off the shelves.
  • anandreader106 - Friday, January 17, 2014 - link

    cgalyon - When the miners move on to new cards, those 290s will find themselves on eBay, Craigslist, etc., and will ultimately fall into the hands of gamers. It's just a matter of when.

    Only my opinion of course.
  • eddman - Friday, January 17, 2014 - link

    I'm not sure if I'd want a card that has been worked almost to death at 100%, 24/7.
  • just4U - Tuesday, January 21, 2014 - link

    Yep... your best bet, if you ever see used 280/290 cards: DO NOT BUY.
  • Da W - Friday, January 17, 2014 - link

    AMD is worse at gaming ONLY IF you focus on 1080p charts, but then if you spend $500+ to play at 1080p, you're an idiot.
    AMD 290s scale better with resolution, matching or beating NVIDIA at 4K in the same games where they were beaten at 1080p.
    I don't buy Ryan Smith's argument that 4K benchmarks don't matter because 4K games are still unplayable with a single card. 30-50 fps is playable to me, and an extra 5-10 fps over another card is significant, especially if you are below 60 fps. Nobody seems to be talking about Eyefinity anymore, but I game with three screens; make it 3K res. That's about 17% less demanding on the GPU than 4K (quick math below). Now with AA off, you can pretty much play any game on high or ultra settings with a single 290/290X at 3600x1920.
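
    For anyone who wants to check that pixel math, a quick back-of-envelope, assuming three 1200x1920 portrait panels:

    ```python
    # Pixel counts: triple-panel "3K" Eyefinity surround vs. 4K UHD.
    eyefinity = 3 * 1200 * 1920  # 3600x1920 surround = 6,912,000 pixels
    uhd = 3840 * 2160            # 4K UHD = 8,294,400 pixels
    print(uhd / eyefinity)       # 1.2 -> 4K pushes 20% more pixels, so the
                                 # surround setup is ~17% less demanding
    ```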
  • nunomoreira10 - Friday, January 17, 2014 - link

    And what about 1080p 120Hz gaming?
    That's where a lot of hardcore gamers are going, and NVIDIA is better there.
  • nathanddrews - Friday, January 17, 2014 - link

    Exactly. I won't call myself a "hardcore gamer" because it sounds pretentious, but I try to get 120fps in all my games (when supported). It's hard to go back to anything less; I'll even sacrifice image quality and resolution in order to do it.
  • Akkuma - Tuesday, January 21, 2014 - link

    You're forgetting about all the people out there who stream or would like to. Those extra fps will help keep streamers from delivering poor framerates. Aside from streamers, MMOs can throw tons of characters onto the screen at once, so that needs to be factored in as well. Lastly, only roughly 4% of gamers run a single monitor above 1080p, and 2.5% of those are at 1920x1200, leaving only 1.5% running significantly higher than 1080p. And who knows how soon those extra fps may make the difference between a good and a subpar experience in a future game.
  • anandreader106 - Friday, January 17, 2014 - link

    Hey Ryan,

    Could you clarify what you mean by "Performance Band"? Sometimes I feel like I live in an alternate universe with these recommendations.

    I have a Radeon HD 7700 paired with a quad-core CPU and 8GB of RAM. I play at 1080p, and I never have to turn down settings from 'High', even with new titles.

    I understand that "down the road" that will obviously change. Could you specify the time frame you have in mind when you use that term?

    Thanks.
  • Ryan Smith - Friday, January 17, 2014 - link

    Performance bands are nothing more than our terminology for lumping together video cards with similar performance, and for what kind of settings you can expect to run on those cards. As you can see in the chart, we have bands for 1080p at low quality, 1080p at high quality, 1440p, and so on.

    In the case of the 7770, it can play some games at 1080p at high quality, but it's usually skirting 30fps averages with minimum framerates below that. And next-generation games (as in games coming out this year, particularly XB1/PS4 ports) will be more taxing yet.

    http://www.anandtech.com/bench/product/1079?vs=104...
  • nfriedly - Friday, January 17, 2014 - link

    Do cryptocoin miners tend to sell off older hardware eventually as the difficulty rises? If so, how long before there's a ton of R9 280 & 290 cards on the market for cheap?

    (And, on a related note, is there anything that's no longer the latest and greatest at mining that can now be picked up used for cheap?)
  • resiroth - Saturday, January 18, 2014 - link

    The cards tend to burn out, as they're run under pretty dire conditions: four cards in a case with fans at 100% and temps in the 85-90°C range, 24/7. It's not a card you would want to buy.
  • vdidenko - Friday, January 17, 2014 - link

    Do you think it's time to start mentioning the Linux driver support level (closed/OSS) in reviews, if not factoring it into ratings? That could apply to all components, but video cards are the most problematic in my experience.
  • Jeffrey Bosboom - Friday, January 17, 2014 - link

    This, especially with SteamOS becoming a thing.
  • eanazag - Friday, January 17, 2014 - link

    Good article. I have a GTX 660 and was considering mining on the side, but I looked at the difference in performance, and on a GTX 660 it looked like a waste of electricity. On that note, I'm tossing around the idea of a Kaveri for one of my systems, to game and mine.
  • happycamperjack - Friday, January 17, 2014 - link

    Nah, Kaveri would be horrible at mining, probably around 150 kH/s. A $200 270X would give you almost 500 kH/s. So I think the best gaming/mining combo right now, without paying the hyper-inflated premium, is probably three-way CrossFire 270X. You'd get around $250 from litecoin mining with that setup.
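
    As a sanity check on that figure (assuming it's meant per month), here is the standard pool-mining income estimate; every input below is an assumption, since network hashrate, coin price, and power rates move constantly:

    ```python
    # Back-of-envelope litecoin income for a 3x 270X rig. All inputs are
    # assumptions; plug in the live network hashrate, LTC price, and your
    # actual power rate before trusting the output.
    hashrate_khs = 3 * 500        # three 270X cards at ~500 kH/s each
    network_khs = 100e6           # assumed ~100 GH/s network hashrate
    block_reward = 50             # LTC per block (2014-era reward)
    blocks_per_day = 86400 / 150  # litecoin targets a 150-second block time

    ltc_per_day = (hashrate_khs / network_khs) * blocks_per_day * block_reward

    ltc_price = 25.0              # assumed $/LTC
    rig_watts = 600               # assumed wall draw for the whole rig
    power_cost = rig_watts / 1000 * 24 * 0.12  # per day, at an assumed $0.12/kWh

    net_month = (ltc_per_day * ltc_price - power_cost) * 30
    print(f"~{ltc_per_day:.2f} LTC/day, ~${net_month:.0f}/month net of power")
    ```

    With those inputs it lands in the same ballpark as the $250 figure above; halve the LTC price or double the network hashrate and the picture changes quickly.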
  • DanNeely - Friday, January 17, 2014 - link

    Are there any indications of when this year AMD/NVIDIA expect to launch their next-generation cards on a smaller process?
  • olegk - Saturday, January 18, 2014 - link

    In the $139-$209 category, it's not clear why you would suggest the Radeon R7 260X.

    A GTX 480 is almost twice as good and can be bought for $199 from Sears or Rakuten.

    In the $209-$329 category you suggest the Radeon R9 270X, while the GTX 770 is 50% better and can be bought for $315 at B&H. It looks like your price research got messed up, because you suggest it in the next category.
  • Friendly0Fire - Saturday, January 18, 2014 - link

    Recommending a card that's three generations old by now isn't advisable. It won't be able to keep up with a lot of the new tech. The 480 is also notorious for its horrible power usage and noise.

    Also, as far as I can tell, AT generally goes with the most common price rather than the lowest possible price, because the lowest prices generally don't last. Sure, you can get the 770 at B&H at that price now, but are you sure that price will hold for the whole month this guide is supposed to be valid for? By going with the general market price, they avoid such fluctuations and keep the guide valid for the entire period.
  • blzd - Sunday, January 19, 2014 - link

    It'd have to be a lot cheaper than $200 to make a GTX 480 seem appealing. Any money you save will be lost to your power bill over a year.
  • mapesdhs - Monday, January 20, 2014 - link


    $200 for a 480?? I get GTX 580 3GB cards for less than that, and they're much better than a 480. My gaming PC has two of them (Palit 783MHz versions); in SLI they're faster than a 780 at a fraction of the cost. My AE machine has four of them, all 832MHz MSI Lightning Xtremes (a quick test showed they run fine at 950MHz and should do over a GHz).

    Plenty of better options for bagging older cards than the noisy & hot 480.

    I wouldn't recommend normal 1.5GB 580s though; the VRAM is a bit low for any future-proofing. But a couple of 3GB 580s is sweet. I'm not playing the latest titles yet, but in FC2 with all details maxed out @ 1920x1200, 32x CSAA, all NVIDIA settings on Quality, etc., I get 130fps or so. Nice to be able to have Vsync on and never see any tearing, without spending a fortune. 8)

    Ian.
  • vishwa108 - Monday, January 20, 2014 - link

    Well presented, Ryan. Miles better in format and content than you-know-who's Tommee-robotica format. Honesty/sincerity is one thing "The Media" lacks, being review-freebie driven, just like its MSM cousins, who are politically/sheep-corralling driven.
  • [email protected] - Monday, January 20, 2014 - link

    FWIW, coin mining with a Radeon APU is pointless, as the power-to-coin ratio is overwhelming... the only way to mine now without using more power than the coin is worth is to use a dedicated ASIC. - Jabin Jay Trapp
  • JBVertexx - Wednesday, January 22, 2014 - link

    That's for Bitcoin mining. Most of the AMD GPUs that are being sold now are for Litecoin mining, which is viable.
  • ijh - Tuesday, January 21, 2014 - link

    Should I go with a 256-bit or a 384-bit memory interface?
  • Panta - Wednesday, January 22, 2014 - link

    You're absolutely right!
    BS article.
  • apertotes - Saturday, January 25, 2014 - link

    Hi Ryan, I know this article is a bit old now, but I wanted to comment before the next one is published. I would love it if you added a category to your analysis: 1080p 120Hz. It is not quite '1080p high', since that is usually reserved for gaming at 1080p with high quality, but with 60 fps as the goal.

    So I, being completely ignorant of all these things, have no way of foretelling whether 2x 270X in CF would be better than a single 290 or 780 if I wanted to play The Witcher 2 at high quality, 1080p, 120Hz, for example.

    I raise this topic because one of the things I think I've learned is that memory does not get you more FPS per se; it mostly helps at higher resolutions. But if I don't need more than 1080p, would two cheap cards in SLI or CF help at all, or do I need to go for a single faster card?

    Oh, and congrats on your outstanding work.
  • tecsi - Tuesday, February 18, 2014 - link

    What about video cards for 4K productivity?
  • staryards - Friday, February 28, 2014 - link

    This is an outstanding example of carbon-copying Tom's Hardware's "Best Graphics Cards for the Money" format, which has literally existed for years...

    For shame, Anandtech.
