Anonymous Blowhard - Wednesday, April 30, 2014 - link
At least the actual purchase prices of the AMD cards are starting to come down to about 10% above the MSRP/linked price.
Team Green has some inherent advantages as well:
- Linux drivers that aren't completely terrible
- Shadowplay for streamers
- GameStream, which combined with the free LimeLight app gives you streaming without shelling out for a SHIELD
DigitalFreak - Wednesday, April 30, 2014 - link
PhysX support.
hammer256 - Wednesday, April 30, 2014 - link
So are the AMD Linux drivers still terrible? I knew that was kinda the case a few years ago, but I would imagine AMD's current driver efforts would have helped the Linux side of things too. Anyone have some experience with that?
Flunk - Wednesday, April 30, 2014 - link
I have 2x R9 280Xs and yes, AMD's Linux drivers are still terrible, but better than my last experience: I didn't have to hex-edit any binary files this time. To be fair, Nvidia's drivers also need some work.
Yorgos - Wednesday, April 30, 2014 - link
I doubt that you even are able to change a directory in linuX using a terminal. Anyway, here are a tone of benchmarks on linuX graphics drivers, even comparison with the respective windows ones. http://goo.gl/8zJPQ and here are some with the latest drivers from both companies. http://goo.gl/jmkFTy
hpglow - Wednesday, April 30, 2014 - link
Putting a capital x on the end of Linux and mispelling ton definatly adds to your credability. The CLI is the only place to use Linux x server junk.
tuxRoller - Wednesday, April 30, 2014 - link
Yeah, don't tell Pixar they are doing it wrong.
cnccnc - Thursday, May 22, 2014 - link
You definitely spelled "definatly" wrong.
bryanlarsen - Wednesday, April 30, 2014 - link
Remember that there are two sets of AMD drivers on Linux: open and closed source. The nVidia drivers are better than the AMD closed source drivers, but the AMD open source drivers are much better than the nVidia closed source drivers in terms of stability and working out of the box without having to fiddle with stuff. The capabilities and speed of the open source driver are much inferior to those of the closed source drivers, but that improved immensely in Linux kernel 3.13.
designerfx - Wednesday, April 30, 2014 - link
Actually, Shadowplay is only for supported games, and AMD offers something similar via their Raptr application thingy - but I'm not sure if it's equivalent or not.
HisDivineOrder - Wednesday, April 30, 2014 - link
The less said about Raptr, the better. ;)
Mondozai - Thursday, May 1, 2014 - link
"The less said about Raptr, the better."Nah, Raptr's fine. It is improving on a steady basis.
Their method of optimizing performance - crowdsourced data and automated pattern analysis - is also better from a long-term standpoint than Nvidia's stone-age method of having tired, overworked interns test games individually.
Because0789 - Wednesday, April 30, 2014 - link
Supported games? You can even stream the desktop, so it can now work with literally anything.
designerfx - Wednesday, April 30, 2014 - link
So, Shadowplay is available for AMD cards past the 77xx series:
http://www.reddit.com/r/Games/comments/1w4di9/howt...
Because0789 - Wednesday, April 30, 2014 - link
That is nice for AMD users but can't really be called "Shadowplay". One feature missing right off the bat is streaming straight to Twitch just by logging in, not to mention the "always recording" and "start recording whenever" features.
MadAd - Saturday, May 3, 2014 - link
So it's missing a couple of bells and whistles; at least the main functionality is available.
TheJian - Monday, May 5, 2014 - link
Forgot G-Sync, which will keep you playing with your card longer as it ages. Both G-Sync and FreeSync will cost you a new monitor, but G-Sync is the real deal.
Also, a major point Anandtech seems to always miss... NV = no phase 3 drivers (yet?) ;) Will there be a phase 4? How much longer before they finally get everything working on the last-gen cards? Enduro was still having problems too, last I checked. Raptr is currently a joke vs. NV's solution. NV usually has a profile for a game on release, or many times while a game is in BETA (Watch Dogs, for example, supported in 337.50 already). You can wait 6 months to a year on AMD for that (see the HardOCP driver recap for both sides).
Valve chose NV for a reason. You get what you pay for.
You also forgot CUDA, which for the first time AFAIK is in a game now (NASCAR '14). If this is the direction of the future, AMD will fall further behind, as CUDA has a TON of people that already know how to use it very well (7 yrs of playing with it, taught in 400+ universities). It now runs on mobile too with K1, so that should massively add to its units in the wild, giving even more reasons for devs to use it (already 350 mil+ CUDA GPUs out there).
EzioAs - Wednesday, April 30, 2014 - link
The MSI R9 280 sells for $279 on Newegg and has a 3-free-games offer. Would you recommend between the R9 270X and R9 280X?
Flunk - Wednesday, April 30, 2014 - link
The 280X is much more powerful; it has twice the shaders and more memory bandwidth, for a start.
Alexvrb - Wednesday, April 30, 2014 - link
Although he could have worded it better, it was clear what he meant. He mentioned the 280 (vanilla), and he was asking if it is worth recommending as a card in between the 270X and the 280X. If you can stretch to afford the $300 280X, it's worth it.
But if your budget is right between the 270X and the 280? Go for the 280. It's not that much slower than a 280, and it's a big step above the 270X. You can get a few models on Newegg right now for $270 plus a $10-20 rebate card... so effectively ~$250.
Alexvrb - Wednesday, April 30, 2014 - link
Correction, meant to say that the 280 is not that much slower than the 280X. It's a good compromise if you can't afford the 280X.
ASUS GTX 770 DirectCU II 2 GB: $299.99 after $10 MIR.
Slightly faster than the 280X for the same price, but the 280X may be slightly more future-proof due to its 3 GB of VRAM vs 2 GB on the 770. The 770s OC very well, though, which may widen the performance delta further. Pick your poison.
http://www.newegg.com/Product/Product.aspx?Item=N8...
dstarr3 - Wednesday, April 30, 2014 - link
I'm running the EVGA GTX 770 Superclocked and I don't have an ounce of regret in terms of performance per cost. The card's a monster. The 280X may be a bit cheaper, but it's also a bit slower, hotter, and louder. I'd gladly pay the slight price premium to have a card that runs faster, cooler, and quieter. And I have.
Mondozai - Thursday, May 1, 2014 - link
You're not really getting it if you're comparing a reference card to a factory-overclocked card that carries a price premium for better binning and higher-quality components for better overclocking.
willis936 - Wednesday, April 30, 2014 - link
Where on earth is Maxwell? I think someone somewhere should do a speculative or investigative article on why 20nm silicon is just flat out not here yet. I can't find any details on why.
Hace - Wednesday, April 30, 2014 - link
20nm isn't here because of TSMC, apparently. There are even rumors of Nvidia developing more 28nm variants of Maxwell; otherwise they simply won't have anything on the market for a good long time.
willis936 - Wednesday, April 30, 2014 - link
Yes, TSMC is late, but why are they late? This affects the usual flow of tech, and it's important to take a high-level look at what's going on and what it means. So far there's been a frustratingly small amount of talk even about the fact that the process is late, let alone why, and what it means.
DanNeely - Wednesday, April 30, 2014 - link
TSMC is late because TSMC is always late. They've been over-promising and under-delivering (if not outright cancelling) on every process node they've sold for nearly a decade. Anyone who expects them to meet their stated launch dates for volume production of new processes now is as delusional as the people who're picking the dates in the first place.
kwrzesien - Wednesday, April 30, 2014 - link
And I expect Nvidia to have predicted this, which they probably did to some extent, judging by the fact that they went ahead with the small Maxwell chip at 28nm and built the 750 Ti. In fact, that was so successful I'm hoping they will also go ahead and build medium Maxwell on the 28nm node for the 860 and 860 Ti; who cares if the die is a little larger when the process is mature and yields are high? Besides, they seem to be getting great power efficiency with the 750 Ti, and that heat is spread over a larger die at 28nm vs 20nm, so it is easier to dissipate off the chip. I think an 860 Ti on 28nm would be a rocking card. Strike now, Nvidia!
HisDivineOrder - Wednesday, April 30, 2014 - link
Especially now that TSMC has bigger fish to fry than GPUs.
Even when they get their nodes working, they now favor mobility/ARM/SoCs over GPUs. This is why nVidia did their test run of Maxwell on 28nm and, once it was a big success, started transitioning more Maxwell products over to 28nm.
CiccioB - Wednesday, April 30, 2014 - link
It's difficult to believe the GeForce 750 Ti isn't suggested. In its class it is the best card so far: small, cheap, low-power, powerful, and it comes with all the bells and whistles provided by Nvidia.
The fact that this site has an AMD-sponsored section, and that in this hit parade there's an AMD card in each and every segment (even when it does not deserve it), may not be seen as a simple coincidence.
garadante - Wednesday, April 30, 2014 - link
Except AMD legitimately has better price/performance than Nvidia at the low and mid range now that the coin craze is fading, and historically has. No Nvidia card to my knowledge beats its AMD counterpart in terms of pure price/performance. Only in power consumption/thermals/acoustics will they win, and for those last two categories typically only against reference designs.
Morawka - Wednesday, April 30, 2014 - link
Yeah, but Nvidia is winning where it matters: performance per watt. Mobile sales will be triple what these desktop cards will ever sell.
Alexey291 - Saturday, May 3, 2014 - link
So, erm, why does it matter if they win in mobile cards when the article discusses discrete GPUs?
I'm sure the mobile market has been discussed time and again on this webby as much as any other. But the article is about discrete GPUs...
CiccioB - Monday, May 5, 2014 - link
CiccioB - Monday, May 5, 2014 - link
You may be aware that the price of AMD cards reflects the fact that they come with fewer features than Nvidia ones.
Hotter chips, louder fans, delayed game support... "PhysX, who cares?" I keep hearing... but if you do not care about image quality and video effects, why are you buying a new discrete card with tons of RAM, shaders and GPU power? Keep your DX9 card and live with that! That's the cheaper solution with the best price/performance you can get!
A VW Golf is a good car, fast and comfortable enough, but it's not on par with a BMW. Yes, it has better price/performance, but that doesn't make it better than a BMW!
AMD's low prices are determined by market demand. That's a thing only fanboys forget. Nvidia solutions come with better quality across many things, from tools, drivers and general SW support up to ergonomics: see what AMD did with the 290X fan and heatsink compared with Nvidia's 780/780 Ti/Titan. The former is a toy next to the coolers on the latter cards. Yes, the 290X is cheaper, but that is for reasons you may be ignoring, and it doesn't make that crappy card better than Nvidia's.
About sponsorship, you may agree that having a sponsored section raises some doubts about objectivity.
The BBC has stated this many times: if someone pays you to advertise their products, you cannot be free to express your opinions freely and without bias. What is your sponsor paying for otherwise?
AnandTech was once one of the best sites for highly objective and technical hardware reviews. Now it is not much different from many other sites that do biased reviews, though those sites do not state who is sponsoring them. As AnandTech displays this sponsored section right on the front page, they should also state in all their reviews, comparisons, and guides like this one who is sponsoring them, and that their reviews are therefore not completely free and may reflect the desires of the sponsors that pay them.
Such a disclaimer is not uncommon: there are review sites run by employees of a company that clearly state that they are employees and that their opinion is therefore automatically biased.
In particular for this guide, it seems that Nvidia does not produce anything good or competitive other than the 780/780 Ti cards. Unfortunately, market statistics show a very different picture, so it is highly possible that the criteria used to evaluate these products are not the ones users apply when buying their new gfx cards. One may wonder why these criteria are used instead of those that better match common users' requirements. And pointing to an evident cause of bias is not that wrong, IMHO.
anandreader106 - Wednesday, April 30, 2014 - link
Why do people like you post remarks about an article without reading it? Everything you said about the 750 Ti is mentioned and discussed. What is wrong with you?
HisDivineOrder - Wednesday, April 30, 2014 - link
Shhhhh. You're giving it away...
The checks don't clear if people figure out the game being played. Anand needs his precious Red checks. They are precious to him.
They help him afford all that Apple hardware he loves to "review."
The funny thing is I remember what Anandtech was like before Anand decided to "experiment" and "try a Mac." That was back before he outgrew the enthusiast scene.
anandreader106 - Wednesday, April 30, 2014 - link
anandreader106 - Wednesday, April 30, 2014 - link
Just..... wow..... Ryan, please ignore these trolls. It was an excellent buyers' guide.
ezorb - Friday, May 2, 2014 - link
I disagree, this is not trolling. Listen: the dalliance into Macs and mobile is a distraction from the reason 99% of the people who read this site are here, enthusiast PCs.
Alexey291 - Saturday, May 3, 2014 - link
*Yawn* Yes, yes, if you don't agree then the opponent is a shill by default.
Where I come from there's a saying: "call a man an idiot and you don't have to talk to him again - why would you talk to an idiot?" (it's a rough translation - bear with me here, lol). Something similar applies here.
ezorb - Friday, May 2, 2014 - link
Agreed, it's unseemly to have an AMD section and then find arguably inferior AMD products in each and every single category, with a mention of Nvidia once in a while. Sure, AMD is competitive on price if you let AMD define the price points.
So because they have the "AMD Center" they are obligated to ignore AMD in order to avoid looking "unseemly" to you? Right.
Pretty sure that the author of the article defined the price points. You even admit that it's "arguable" if the AMD products are inferior (not that I can find any fault with the author's recommendations or his reasoning), but obviously he's a shill if he doesn't agree with you?
Anyway, I think the most convincing bit of evidence that there is no shilling going on is that Anandtech is continually accused of being in Intel's, AMD's, Nvidia's, Apple's, Samsung's, Google's, and Microsoft's pockets simultaneously, depending on the day and the article.
Impulses - Wednesday, April 30, 2014 - link
A review or overview of some of the different cooler variants for the 290 would be interesting... I started looking over the Amazon/Newegg reviews (I know, not the best source, I was just price checking) and there were all sorts of conflicting reports about certain coolers, particularly in CF configurations... Lots of reviews bashed what I thought were mainstays like the ASUS Cu; dunno if they were just sandwiching cards or what, but they'd often go on to comment that switching brands/coolers fixed it, so...
Nvidia can't compete on price, huh?
According to your own Bench there's a 50W difference at load. Let's use 30 hours/week of load for ease, which, if you use the machine for video editing, is very low.
At $0.10/kWh, 50 W costs 0.5¢ per hour. 0.5¢ × 30 hours = 15¢ per week.
15¢ × 50 weeks (being conservative again) = $7.50 per year.
Keep your card 4 years: $7.50 × 4 = $30.
No yeah, you're right, Nvidia can't compete on price at all /s
Then you factor in the superior stability of Nvidia products, the superior support cycle (drivers), the added features that Nvidia offers, AND the fact that a lower-power card will be quieter, or silent... the silence alone is priceless.
I could see that argument for the 270x vs GTX660 or even GTX760, but not the 750ti compared to the 265. Maxwell is just too good of an architecture.
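For anyone who wants to rerun this arithmetic with their own numbers, here is a minimal sketch of the calculation above. The 50 W delta, $0.10/kWh rate, and 30 hours/week are the figures from this comment; the function itself is just unit conversion, and the $0.43/kWh case (a rate reported further down the thread) shows how strongly the result depends on local electricity prices.

```python
# Yearly cost of a GPU power-draw delta; a sanity check on the math above.
def yearly_cost(delta_watts, rate_per_kwh, hours_per_week, weeks_per_year=50):
    kwh_per_year = delta_watts / 1000 * hours_per_week * weeks_per_year
    return kwh_per_year * rate_per_kwh

print(yearly_cost(50, 0.10, 30))      # 7.5   -> $7.50/year, i.e. $30 over 4 years
print(yearly_cost(50, 0.43, 30) * 4)  # 129.0 -> the same usage at $0.43/kWh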
kyuu - Wednesday, April 30, 2014 - link
1) $30 in electricity over 4 years is really not even worth considering, even assuming your math is correct (which I doubt).
2) Your penultimate paragraph about superior stability and all that is pure personal opinion, not fact. I've been running a 6950 for years and have had zero problems with stability, drivers, or support. I dunno what added features you're talking about, either. PhysX? Who cares. Quieter? Maybe if you're using a reference cooler (but who does that?).
Hrel - Wednesday, April 30, 2014 - link
$30 is a lot of money when you aren't rich. For my personal use it'd be at least double that, which for AMD would bump the cost of the 265 up above the 270X. Which is the point: if you're going to do a price comparison, you have to take into account the cost of ownership.
Yes, my math is correct. 10¢/kWh is the national average in America; it's higher through much of the country and lower in only a few places. PhysX, Shield, CUDA. I only actually use PhysX, but they're still possible value points for some.
Yes, I buy my cards and do no further modifications. Ain't nobody got time for dat.
Ananke - Wednesday, April 30, 2014 - link
You assume 30 hrs/week of that GPU at max load. Nobody runs a non-professional video card under professional software at 100% load, and the majority of people don't have time to play games 7 hrs every day either. So your math should really put the break-even point for the price discrepancy at a 25-40 year horizon :):)
The GTX 750 Ti is a beautiful card, but it is not price competitive, that's the truth. You pay a premium for the brand. It should be around $100-110 max.
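As a rough check on that horizon, the same arithmetic can be run in reverse to solve for the break-even point. The 50 W delta and $0.10/kWh come from the thread; the $30 upfront premium is an assumption for illustration, taken from the 4-year figure above.

```python
# Years until an assumed price premium is paid back by power savings.
def breakeven_years(premium_usd, delta_watts, rate_per_kwh, hours_per_week,
                    weeks_per_year=50):
    yearly_savings = delta_watts / 1000 * hours_per_week * weeks_per_year * rate_per_kwh
    return premium_usd / yearly_savings

print(breakeven_years(30, 50, 0.10, 30))  # 4.0  -> the heavy-use case above
print(breakeven_years(30, 50, 0.10, 5))   # 24.0 -> casual use, near the 25-40 yr estimate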
Hrel - Wednesday, April 30, 2014 - link
No, I'm not assuming max load at all. I took the figure from Bench, loaded with Crysis, FAR from max.
7 hours a day?.... 7 times 7 is 49...
Play a game or render video for 20 hours total over the weekend and 2 hours/day during the week. 30 hours. I render video during the week as well, so I'm 40+.
Despite this, I agree that it's too expensive for the class of performance it offers, but that's true of ALL GPUs right now. By at least $50/card.
Hrel - Wednesday, April 30, 2014 - link
Then there are people who play MMOs, like ESO. 100+ hours/week.
Dude, if you're playing an MMO for 100+ hours a week, then $30 more a year in electricity is the least of your worries.
anandreader106 - Wednesday, April 30, 2014 - link
Hrel - 30 hours per week, every week, for 4 years is A LOT of gaming! You are a hardcore gamer with a lot of free time. I think you might be overestimating how many people fit your usage model.
Anandtech doesn't post hypothetical savings because they vary by user and situation. (e.g. I live in an apartment and my utilities are covered in my rent.)
I much prefer the current process of using Performance/Dollar as the primary metric, with an honorable mention for the Performance/Watt leader.
TheJian - Monday, May 5, 2014 - link
Well, they do post crap claiming above 1920x1200 is important to everyone. In reality, less than ~2.5% of us are above this, so who are they talking about (and a large chunk of that 2.5% is 1440p; very few users are above that)? They have been claiming this and recommending crap based on it since the 660 Ti article, when Ryan was claiming people were massively buying off eBay, giving some dude in Korea your Visa #... ROFL. To them it's OK to buy a $100 CPU to pair with ANY single-GPU card (to be clear, the word ANY is used, which is ridiculous). See their 1440p articles... LOL. I could go on...
None of this article takes into account how happy you'll be with the purchase overall. Phase 3 drivers are not my idea of fun, and even those may not fix all the DX9/10 games, etc. If it takes you 2-3 yrs to get your drivers out of phase 3, you should be charging less. If you have a crap Raptr vs. NV's solution, you should charge less. If you have Enduro problems while the other guy's solution doesn't, again, charge less. Instead of just optimizing drivers that affect ALL games, you come out with a new API nobody needed. As shown by Guru3D etc., Mantle shows great gains in a CPU-limited game, but it's in 2 games. NV has shown they can do the same in DX11 already, not to mention even better if OpenGL is used, which has the same draw-call advantage Mantle does. I'd rather have most of my games affected, as opposed to waiting for AMD to get more Mantle games out for the FEW cards that can use it.
http://www.guru3d.com/news_story/geforce_337_50_be...
Hitman/Crysis 3 have huge gains as they are CPU-limited here.
http://www.tomshardware.com/news/nvidia-geforce-33...
Massive Star Swarm gains in DX11. So no need for Mantle, right? Already winning that one vs. Mantle, so what is the point? I'm sure after NV's OpenGL speech we'll see many using that avenue soon also. Again, no Mantle needed.
http://www.geforce.com/whats-new/articles/nvidia-g...
Check the 3 Mantle games. None of them beat this new driver. They also show the same with Star Swarm, so no surprise Tom's Hardware saw the same massive increase. What is comic is they did this in DX11 in all 3 cases. What can be done with OpenGL? A few lines of code in their speech showed 4x faster (literally a few lines changed got this), and he commented he could easily have gotten 2x that with a little more effort (not bad for a phone call's worth of effort). Devs don't have to do squat to get this to work. It just works, and for all DX11 cards. Mantle = DEV WORK for every game (and only for GCN).
Ryan claimed the Xbox One would use Mantle... WRONG. Hypothetical crap all over that Mantle article that didn't pan out. Reality is, Mantle is dead. If OpenGL or NV's new drivers don't kill it, DX12 will. If that doesn't work, MS can afford to pay devs to avoid Mantle until DX12 takes over, as they can afford to spend billions to block your API and laugh (TTM income is $22.5 billion; a few billion to block your API is a joke to them if desired).
http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc...
OpenGL draw call vid.
One more note: many people play more than 30 hrs/week. These are precisely the people buying discrete cards. I can do that in a weekend in the right game... LOL. I just did it with HOMM 6 CE last weekend. :) I don't do that all year, though, but only because I have other things I can't/won't drop, and I'm not figuring in the PRO stuff that uses the GPU here either (which is some of the stuff I can't/won't drop). Content creators can easily pile up that time with CUDA apps (Photoshop/Premiere/AE, etc.) without touching a game at all.
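Stripped of the vendor war, the technical claim in this subthread is that a draw call costs fixed CPU overhead per submission, which is what Mantle, the 337.50 DX11 driver, and the OpenGL multi-draw techniques all attack. A toy model (not any real graphics API; the overhead numbers are invented purely for illustration) shows why merging submissions helps a CPU-limited frame:

```python
# Toy model of CPU-side draw-call overhead - not a real graphics API.
# Each submission pays a fixed driver/validation cost; per-object work
# can't be merged. A CPU-limited frame therefore speeds up roughly in
# proportion to how many draws are batched into each submission.
PER_CALL_OVERHEAD_US = 20  # assumed fixed cost per submission (invented)
PER_OBJECT_WORK_US = 1     # assumed unmergeable per-object cost (invented)

def frame_cpu_time_us(objects, batch_size):
    calls = -(-objects // batch_size)  # ceiling division
    return calls * PER_CALL_OVERHEAD_US + objects * PER_OBJECT_WORK_US

for batch in (1, 16, 256):
    t = frame_cpu_time_us(objects=10_000, batch_size=batch)
    print(f"batch={batch:>3}: {t / 1000:.1f} ms of CPU submission time per frame")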
Hrel - Wednesday, May 7, 2014 - link
Excellent response.
Alexey291 - Saturday, May 3, 2014 - link
So what you're saying is: someone who spends at least 4.5 hours a day gaming would buy a cheapo 750 Ti? *I don't buy that*
Conversely, if 30 bucks over 4 (FOUR) years is a lot of money, would you even buy a brand-new GPU? Surely it's second-hand hardware all the way (if at all)...
So yeah nonsense argument is nonsense.
Hrel - Wednesday, May 7, 2014 - link
Used hardware is prone to failure, so it's more expensive over time.
Backedupdrive - Wednesday, April 30, 2014 - link
So, to make your point on price points, you'd factor 8 bucks a year into the total price of the hardware. Should they place yellow stickers on parts now saying you will save X amount each year? I am pretty sure most people lose/waste at least 8 dollars a year to washing or vending machines. If you are worried about your power consumption for years to come, you just might want to rethink buying a discrete video card.
+1
And he also might want to look into finding a better job.
willis936 - Thursday, May 1, 2014 - link
And after he's reconsidered owning a graphics card and found a new job, he'll buy a $600 card, because dollars don't matter and cost doesn't matter. I don't know why you guys are saying cost doesn't matter, but you are. Cost matters.
4 years? Are you serious?
Hrel - Wednesday, April 30, 2014 - link
It applies to regular people, not computer hobbyists. So yeah, 4 years is on the low end of an upgrade cycle for most.
Troll. A 4-year electricity cost of $30 factors into an upfront video card purchase? I can't afford $30 spread over 4 years on my $200 luxury item purchase. Gtfo.
I would kill for those dime-sized kilowatt-hours. They cost $0.43 where I live. Always worthy of consideration, although not always a deciding factor on their own.
Thank you for a monthly GPU guide. Much better than Tom's Hardware's. Really appreciate how succinct it is.
P.S.: Could Anandtech please consider a monthly CPU recommendation (for gaming, HTPC, productivity)?
NARC4457 - Monday, May 5, 2014 - link
I'll second that request. I think that would be a good addition.
mauritz123 - Friday, May 23, 2014 - link
Does AMD have dynamic GPU boost software like Nvidia's GPU Boost 2.0? I have an R9 270X and it would be really cool to get top performance out of it, but I can't find anything like GPU Boost for AMD??