Best Video Cards: June 2014

by Ryan Smith on 6/30/2014 11:00 AM EST

  • EzioAs - Monday, June 30, 2014 - link

    I would hope the next-gen GPUs have a bit more VRAM than their predecessors. Mid-range should have at least 3GB and high-end should have 4GB+
  • dsx724 - Monday, June 30, 2014 - link

    With the mining craze, 290Xs can easily be had for $400 new and $300 used. That blows everything Nvidia has to offer out of the water.
  • cgalyon - Monday, June 30, 2014 - link

    Could you point me to where the 290X can be found in that price range? (Not being snarky, I'm genuinely interested. Thanks!)
  • Flunk - Monday, June 30, 2014 - link

    http://www.ebay.com/sch/i.html?_trksid=p2050601.m5...
  • cgalyon - Monday, June 30, 2014 - link

    Thanks! Is it known whether Bitcoin mining was particularly hard on these cards, though? Like, is the life expectancy of these cards especially short because of hard use?
  • DanNeely - Monday, June 30, 2014 - link

    It shouldn't be any worse than running any other GP-GPU app on the card. I've been running boinc science apps on my GPUs since the GTX 260 (generally 2-3 years/card) when I'm not using them for gaming. Out of 5 cards I've had one fan failure and one card that failed in an OS crashing way; both occurred after >2 years of use. A few months of hard use shouldn't be enough to cause problems unless you're keeping your cards for well beyond the point they become obsolete.
  • Gasaraki88 - Tuesday, July 29, 2014 - link

    Bitcoin mining on video cards is no longer cost-effective. Even if electricity were free, it would take too long to make back the money you spent on the card.
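    As a rough illustration of the payback argument above, here is a minimal back-of-envelope sketch in Python. Every input figure is a hypothetical placeholder, not a real 2014 hash rate, coin price, or power draw.

        # Back-of-envelope mining payback estimate; all numbers below are
        # hypothetical placeholders, not real 2014 mining figures.
        def payback_days(card_cost, coins_per_day, coin_price, watts, price_per_kwh):
            """Days for mining profit (revenue minus power cost) to repay the card."""
            daily_revenue = coins_per_day * coin_price
            daily_power_cost = (watts / 1000.0) * 24 * price_per_kwh
            daily_profit = daily_revenue - daily_power_cost
            if daily_profit <= 0:
                return float("inf")  # the card never pays for itself
            return card_cost / daily_profit

        # With made-up numbers, power costs alone can wipe out the profit:
        print(payback_days(400, coins_per_day=0.005, coin_price=100,
                           watts=250, price_per_kwh=0.12))   # inf
        # Even with free electricity, payback is still very slow:
        print(payback_days(400, coins_per_day=0.005, coin_price=100,
                           watts=250, price_per_kwh=0.0))    # 800.0 days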
  • DrApop - Tuesday, July 1, 2014 - link

    Blows everything out of the water... doing what? Playing with your game controller?

    Not in rendering, video encoding, or Blender modelling.
  • Samus - Thursday, August 21, 2014 - link

    I think what he meant is that the latest generation of AAA games are all optimized for Radeons, with many supporting Mantle.
  • LoneWolf15 - Sunday, August 24, 2014 - link

    I got my two factory hot-clocked 280X cards for $175 each, in mint, in-box condition. Hard to argue with that.
  • lever_age - Monday, June 30, 2014 - link

    Minor mistake / correction: the top table lists a GTX 740 and GTX 730, but they're low enough in the stack that Nvidia doesn't give them GTX branding. They're GT. A "GTX 740" also shows up in one of the paragraphs.
  • Ryan Smith - Monday, June 30, 2014 - link

    Thanks. I write "GTX" so often that I don't even register that as an error.
  • Morawka - Monday, June 30, 2014 - link

    I think all the high-end cards need to match the PS4 and Xbox One's 8GB mark, especially with tons of people moving to 4K gaming this year.

    Nvidia and AMD have quite literally had to shelve their designs due to TSMC being badly behind on 20nm. TSMC missed their forecast by almost a year now and the clock keeps ticking.

    I wonder if Nvidia is pulling the 880 off the shelf and saying, "You know, we might need to tweak this design because of the way 4K gaming is really taking off."
  • BrightCandle - Monday, June 30, 2014 - link

    It's 5GB, and that is the total for the game's RAM and GPU VRAM combined. The rest goes to the operating system and isn't available to the game. It's more likely that 4GB is the peak VRAM usable by a console port, and so far all console ports have needed 2GB for their console-quality graphics and only required more for the extra PC graphics options. Thus it's not a driving force for GPUs; PC games are the driving force for more VRAM, and it will increase as it always does generation to generation.
  • Makaveli - Monday, June 30, 2014 - link

    Thanks for making that correction, BrightCandle; you are 100% correct.

    "Especially with tons of people moving to 4k gaming this year."

    Morawka, what tons of people? You mean with all the 30Hz 4K displays out, and all the single GPUs that can push those frame rates? I think you are getting a little ahead of yourself.
  • Morawka - Monday, June 30, 2014 - link

    The $600 60Hz Asus monitor just came out; others will no doubt follow over the next few months.
  • euler007 - Monday, July 14, 2014 - link

    No serious gamer will go with that 60Hz panel; they will go with the ROG Swift PG278Q.
  • nashathedog - Wednesday, July 16, 2014 - link

    There are plenty of 60Hz 4K gaming monitors available, and there are even V-Sync versions coming in the autumn. As for BrightCandle's "so far all console ports" statement: what console ports? There's only been one so far that was primarily a next-gen build, and that was Watch Dogs, and it used around 3.5GB at 1080p. The unified RAM in the next-gen consoles does leave us with the possibility that the next-gen ports may be RAM hogs unless they're reined in. But nothing is known for sure, as we've only had one example so far.
  • nashathedog - Wednesday, July 16, 2014 - link

    What console ports? There's only been one so far that was primarily a next-gen build, and that was Watch Dogs, and it used around 3.5GB at 1080p on the PC. The unified RAM in the next-gen consoles does leave us with the possibility that the next-gen ports may be RAM hogs unless they're reined in when ported. But developers are lazy, and as we saw, Watch Dogs was an example of a port that wasn't properly optimized for PC. Nothing is known for sure, as we've only had one example so far.
  • eddman - Monday, June 30, 2014 - link

    Well, on the plus side, TSMC's delay is forcing them both to tweak and improve their current designs and reduce power consumption as much as possible.

    I suppose not having the luxury of a new process node is a good thing, from time to time.
  • StevoLincolnite - Friday, July 4, 2014 - link

    Nope.

    The difference between a PC and a console is that a console has a shared memory pool, while a PC has dedicated memory for specific devices.

    Essentially, with a console, that 8GB of memory will *never* be used completely for graphics duties; the GPUs are simply far too underpowered for that.
    On top of that, something like 3-3.5GB of RAM is used exclusively for the OS and other tasks, which leaves 4.5-5GB of RAM for rendering, AI, caching, sound, networking, you name it.

    A modern high-end-ish PC will have a 3GB+ video framebuffer plus 8-16GB of system RAM; usually the OS will gobble up roughly 2GB of that system memory, with the game's non-graphics assets chewing up another 4GB+ in a demanding scenario.
    The only time 3GB should be an issue is with poorly made console ports or when running at 1440p and higher resolutions.
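    A minimal sketch of the budget math being described above, using the approximate figures quoted in the comment (rough estimates, not official platform specs):

        # Console: one shared 8GB pool, minus the OS reservation, covers
        # rendering, AI, caching, sound, networking, etc. all at once.
        console_total_gb = 8.0
        console_os_reserve_gb = 3.0          # quoted as roughly 3-3.5GB
        console_game_pool_gb = console_total_gb - console_os_reserve_gb

        # PC: separate pools; VRAM is dedicated to graphics, system RAM to the rest.
        pc_vram_gb = 3.0                     # "high-end-ish" card per the comment
        pc_system_ram_gb = 8.0
        pc_os_use_gb = 2.0
        pc_game_ram_gb = pc_system_ram_gb - pc_os_use_gb

        print(f"Console: {console_game_pool_gb:.1f}GB shared for everything in the game")
        print(f"PC: {pc_vram_gb:.0f}GB dedicated VRAM + {pc_game_ram_gb:.0f}GB system RAM for the game")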
  • Gasaraki88 - Tuesday, July 29, 2014 - link

    What? First, the consoles SHARE that memory between the system and the GPU, so 8GB is nothing. Second, the consoles can't do 4K at all; they can barely do 1080p at 60FPS. Third, 4K gaming is not "taking off". The cards required to run 4K games at decent quality are too expensive.
  • zoxo - Monday, June 30, 2014 - link

    I won't game at 4K in the foreseeable future; still, with texture sizes growing, more VRAM is probably nice for future-proofing in that regard.
    The other aspect is GPGPU, where more RAM hardly ever goes amiss. So I'd very much appreciate more VRAM as a default on all future cards, both desktop and mobile...
  • dannydeluxe - Monday, June 30, 2014 - link

    Why did you write "especially if it’s going to be paired up with a higher quality and more expensive IPS/IGZO monitor."? IGZO is not the same thing as IPS; IGZO can be TN too.
  • Clorex - Wednesday, July 2, 2014 - link

    TN IGZO? How does that work? I thought IGZO was distinct from IPS and TN. So what kind of panel tech does IGZO refer to anyway?
  • Zizy - Thursday, July 3, 2014 - link

    TN/IPS (and *VA) refer to the liquid crystal arrangement; IGZO refers to the thin-film transistors (TFTs) used to drive those pixels. The two aren't related.
  • Jovec - Monday, June 30, 2014 - link

    VRAM requirements are going to continue to rise for reasons other than improved visuals; namely, it's cheaper for devs to require more VRAM than it is to do any targeted optimization. Just look at the last generation, with 512MB of total RAM, and the relatively low quality of ports on PCs with 4GB+ of RAM and 2GB+ of VRAM.
  • SwissCM - Monday, June 30, 2014 - link

    There was essentially a fire sale on eBay for R9 280/290 cards about a month ago thanks to coin miners selling off. I managed to snag a 280X (with custom cooling and a factory overclock) for $150. One hundred and fifty.

    It was apparently used but looked brand new, though out of the box. Works perfectly. I wish stuff like this happened more often.
  • theNiZer - Monday, June 30, 2014 - link

    The price/performance approach of this article is a little narrow-minded, IMHO. Of course AMD will 'win', but as a long-term AMD owner I must say: driver quality actually matters! Hence I would suggest AnandTech take the 'price/performance and quality' route. Or has the deal with AMD (you know, the 'AMD Center' is paid, right?) blinded the writers?
  • theNiZer - Monday, June 30, 2014 - link

    Just to clarify: I buy Nvidia now. Paying a little more for software quality makes sense to me after much frustration with immature AMD drivers in the past.
  • anandreader106 - Monday, June 30, 2014 - link

    The price/performance approach is not so much narrow-minded as it is quantifiable. You can't quantify "quality" the way you're implying.

    I do not have a problem with AMD drivers, so where you might give AMD an "F", I might grade them an "A". I would give Nvidia drivers an "A" as well, though they are not perfect either.

    If you have the knowledge base to make a better informed buying decision for your needs and Nvidia fits your bill, then there's nothing wrong with that. For others that are looking for a simple measure of "bang for buck" that doesn't require in-depth knowledge, AMD wins......for now.
  • Wreckage - Monday, June 30, 2014 - link

    Exactly. There is a reason AMD cards are cheaper: if they had better cards they could sell them for more. Price per FPS is a terrible benchmark. You can strap a rocket to a golf cart and make it faster than a Corvette, yet no one would ever recommend the golf cart over the 'Vette.
  • piroroadkill - Monday, June 30, 2014 - link

    Disagree. I don't know what the big problems are with AMD's drivers, but they work fine for me, and I'm sure they do for countless others.

    NVIDIA's drivers have been absolutely terrible at times too, causing constant display driver resets.
  • andrewaggb - Monday, June 30, 2014 - link

    I'd have to agree. I switched from Nvidia because of driver problems and BSODs back in the UT2004 days.

    I've been using AMD ever since (currently running a 270X, 265, 7850, and 6870 in my gaming PCs), and my kids play lots of titles on Steam and Origin without problems. A few months ago a beta driver broke Minecraft, but their next beta fixed it. That's the only problem in recent memory that affected us.

    Obviously your mileage may vary.
  • andrewaggb - Monday, June 30, 2014 - link

    About the Minecraft thing: it was also fixable by installing the previous driver. I just wanted to play around with Mantle, so I was trying the beta drivers out with Battlefield 4.
  • dstarr3 - Monday, June 30, 2014 - link

    I still think my Viper V770 owns all other cards.
  • vargis14 - Tuesday, July 1, 2014 - link

    Dstar,
    You got me for a couple of minutes until I realized you were referencing a roughly 15-year-old card :)
    Brings back fond memories of my SLI Voodoo2 8MB cards in a P2 300 Gateway computer, which needed two 80mm fans added to the side panel to keep them from locking up from heat. I also remember how they were such a big hit at a Warbirds convention that we had to run to the local Circuit City in North Carolina and buy every 80mm fan and grill they had, along with a stop at a hardware store for a drill and a jigsaw. I think I did 10 side-panel jobs, and around 20 other people ended up finding more fans and doing their cases too. Ahh, the good old days.
  • plonk420 - Monday, June 30, 2014 - link

    The 7790, aka the 260X, is a decent living-room gaming card... before a spare 720p TV moved out of my current arrangement, I was getting ~220fps in SSFIV. A few minutes in DXHR and Assassin's Creed seemed perfectly smooth, though neither game really grabbed me.

    I moved the computer to a larger 1080p TV and had to bump SSFIV down to 2xAA from 4x or 8x, but I'm still getting over 120fps. And what I thought was ~40fps in Saints Row 4 (everything maxed except SSAO off and shadows on medium) was actually 25-30fps, and it shockingly felt "smoothish".
  • fenix840 - Monday, June 30, 2014 - link

    I disagree with the guy who said 4K gaming is taking off and new cards should have 8GB. 4K is just beginning to hit the mainstream and will probably remain prohibitively expensive for at least another couple of years. I think that whether people like it or not, 1080p will remain the dominant resolution used by the masses at least until the end of the current console cycle (another 6-9 years).
    As to Ryan's question, I can say that the whole Watch Dogs 3GB-for-ultra thing has definitely made me wait for Nvidia's next offering. I had wanted to get a 780, but now I don't feel I'd be future-proofed for 2 years with the 3GB standard. I think for me 4GB is now the minimum and 6GB is the max. The consoles can't utilize more than 6GB, so I think you'd be hard-pressed to ever be VRAM-bottlenecked with 6GB... at 1080p. Which, again, is what I think MOST gamers play at.
  • siriq - Friday, July 4, 2014 - link

    Well, this is just the beginning: http://www.slashgear.com/tcl-ue5700-ultra-hd-tvs-t... 4K is already knocking at the door, and it is relatively cheap now and only getting cheaper.
  • Wolfpup - Tuesday, July 1, 2014 - link

    In theory I wouldn't mind buying a new GPU for 24/7 Folding @ Home to replace my GT 430 in a low end system that can't deal with much power draw...but what's astonishing is the GT 430 looks like it's still basically the same card as at least one version of the GT 730! Now that's some serious rebadging LOL

    Seems like there ought to be a Maxwell part that massively outperforms my Fermi-based, 96-core GT 430 in the same power envelope... I hope.
  • vargis14 - Tuesday, July 1, 2014 - link

    VRAM amount is why, before this past Christmas, I had to upgrade my 3+ year old 2600K @ 4.8GHz system's SLI'd 1GB EVGA 560 Ti SC cards to a pair of 4GB EVGA GTX 770 Classifieds.
    They perform outstandingly, and I am presently saving up for the 34" LG UM95, a 21:9 3440x1440 ultra-widescreen IPS monitor. I am going with this resolution because it is only about 2.4 times the pixels of 1080p rather than 4 times like a 4K UHD panel, and I love the 21:9 format for the extra peripheral vision along with tons of desktop real estate. I read the review of the monitor here on AnandTech; the backlight problems are being addressed, and if you purchase one, make sure the production date is at least June. The April models have the backlight problem. Then again, there might be a better or cheaper 34" 3440x1440 monitor out by the time I can afford it, but as of now my heart is set on LG's UM95. For those looking for 34" of ultra-widescreen bliss who cannot push as many pixels, LG also has the UM65, a 34" 2560x1080 panel, for around $300 less ($599 vs. $899). I could be happy with the lower-resolution model, but I would prefer the 1440p panel over the 1080p one.
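    For anyone who wants to check the pixel-count comparison above, here is a quick sketch; the resolutions are the standard 1080p, 3440x1440 ultrawide, and 4K UHD figures, nothing else is assumed.

        # Pixel counts for the resolutions mentioned above.
        def megapixels(width, height):
            return width * height / 1e6

        mp_1080p     = megapixels(1920, 1080)   # ~2.07 MP
        mp_ultrawide = megapixels(3440, 1440)   # ~4.95 MP
        mp_uhd_4k    = megapixels(3840, 2160)   # ~8.29 MP

        print(f"3440x1440 is {mp_ultrawide / mp_1080p:.1f}x the pixels of 1080p")  # ~2.4x
        print(f"3840x2160 is {mp_uhd_4k / mp_1080p:.1f}x the pixels of 1080p")     # 4.0x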
  • Freakie - Tuesday, July 1, 2014 - link

    So... what about the Titan Black? That is hands-down the single-GPU king. I can understand leaving out the Titan Z, as the price of that thing is just utterly ridiculous, but I should think the Titan Black should be on the list, especially if you absolutely do not want (or, due to space requirements, don't have the room for) more than one card.
  • SirKnobsworth - Tuesday, July 1, 2014 - link

    The Titan Black won't offer significantly better gaming performance than the 780 Ti. The main differences are more double-precision floating-point compute power and more memory (6GB vs. 3GB), neither of which usually matters much in gaming.
  • DrApop - Tuesday, July 1, 2014 - link

    Game, game, game.

    Some of us do actual work with graphics cards, yet there's not a single word here on the rendering capability of these cards. If you own Adobe CS6 or below, AMD is worthless. If you use Blender, CUDA is your only option... and Fermi trumps Kepler.

    How about a graphics card roundup from the perspective of people who use these cards for work, or for rendering video for a small business? Not all of us can afford a $3000 graphics card.

    To be honest, I am sick and tired of all the talk about graphics cards and gaming. People have been doing video, modelling, and other types of high-quality rendering for a number of years now, yet they have been completely ignored except for a benchmark or two.
  • Anonymous Blowhard - Wednesday, July 2, 2014 - link

    I'd say a good 95% of video card buyers never do anything but game on theirs, maybe 4% do the odd video render, and the rest is you.

    That's why you can't find content or reviews with a non-gaming focus.

    Besides, the answer for professional 3D is virtually always "go buy the fastest Quadro you can afford."
  • DrApop - Wednesday, July 2, 2014 - link

    Have you watched YouTube lately? How many rendered videos by the average Joe are there? There have to be millions. Yes, I do agree that gaming is a huge market. But with AMD and Nvidia releasing what seems like 5-10 cards every time they do some little update (even just a rebadge with a new model number), one would think they would tweak one of their cards to strengthen video/modelling rendering and market it as such.

    A dude/dudette who does even small wedding/event videos (of which I am not one) would benefit from such a video card... and likely can't afford a workstation card like a Quadro.
  • Iketh - Tuesday, July 8, 2014 - link

    LOL, you're killing me... you've dug yourself into a false reality.

    Most people render with their CPUs... you get better quality video that way.

    GPUs can help with the editing, but a hyper-threaded quad-core is more than plenty.
  • bnjohanson - Sunday, July 13, 2014 - link

    You're a fringe share of the market; there are publications dedicated to just this, and that is where you should roam. Combining, comparing, and contrasting rendering with gaming in the same article is like Road & Track magazine comparing the performance of a Porsche 911 with a Ford F-250. It would be whiplash...
  • Samus - Monday, July 7, 2014 - link

    Where the frak are the new Maxwell cards already? It's ridiculous that they're sitting on this flyass architecture and the fastest thing they have is a 69-watt 750 Ti.
  • leminlyme - Tuesday, September 2, 2014 - link

    "Finally, the GTX 780 Ti in SLI is also going to be a viable alternative here. From a performance perspective it will trail the AMD setups by 5% or so at 4K, so while it can’t match the AMD setups hit-for-hit it doesn’t significantly fall behind, making it practical to get similar performance in the NVIDIA ecosystem."

    What's all this then? Unless you pick games with that ridiculously heavy AMD favor (which are jokes), 780 Ti SLI beats 290X CF in power every time. If you use non-reference cards, it still wins. On reference, it still wins. The only thing that beats 780 SLI per number of GPUs is the 295X2, which actually costs around 125% as much as a 780 Ti SLI configuration and has a mess of other downsides, for something like a 1-6% performance advantage.

    http://www.anandtech.com/bench/product/1073?vs=118...

    Those frame time variances though.
