As long as you're not giving them additional voltage (which you can't do on this card): yes. GDDR5 does not consume all that much power, even if it is relatively more than DDR3. The airflow off of the fans is plenty for stock voltage.
No Mantle results, likely because Mantle didn't give a great showing last time in the AMD Mantle review. If I remember correctly, Thief may even have had a performance regression with Mantle in use.
Because Mantle only has benefit in CPU-capped performance. When you run benchmarks on an i7 or better, Mantle has no tangible benefit and sometimes has regressions.
Apologies. The Mantle results have to be added manually since our graphing system can't handle multiple results for the same card automatically. I had actually entered in the data but neglected to regenerate the graphs.
OK, first of all, EVGA Precision or that type of software has nothing to do with that. Also, he has a valid concern about the VRAM temps and the VRMs. And don't call people idiots! That's my job! IDIOT!
Memory chips don't really need heatsinks, but memory VRMs do - so if overclocking memory, you have to watch their temperature and take care of their proper cooling.
I am happy to see both Nvidia cards come out packing heat. For once Nvidia isn't totally price gouging at launch. In fact, we should see AMD prices come down soon as AMD reviews Nvidia's sales numbers. The 290 and 290X are pushing 10 months on the clock at pretty much launch prices; it is about time the price had a reason to come down. The 970 is the card to do it.
Presumably, Nvidia already has the 980 Ti taped out for AMD's next response card. I am disappointed to see that there is no built-in audio decoder like AMD's. It means there is little hope of seeing many developers leverage audio hardware to reduce CPU load in games with 5.1+ setups.
It is a GM204 upper-midrange chip, as all Gxx04 chips were before it... This is like the launch of the 680 and 670 GK104 cards; then in the refresh, the 770 and 760 used those same chips when the 780's GK110 high end came out with the true flagship core.
There will be a GM210 flagship taking the Titan and x80 names, and the GM204 will go into the x70 and x60 cards at a lower price; that refresh is what I am interested in.
However, like the 680 before it, it is also a Gxx04 chip, so there will be a Gxx10 chip above it suited to cards like the 980 Ti, as was the case with the 780 Ti.
We'll probably see a 980 Ti in the form of a 20nm part. Even if there is no architectural difference, they could tweak it for a smaller process while possibly boosting the clock. Who knows... 1500MHz GPU & 8GHz GDDR5?
I'm not up to date, but if crypto-currency miners are still interested in Radeon cards, AMD might not rush to lower the price; or if they do, the retailers will just pocket the difference and sell at the same price.
A 10% OC won't heat a chip to the point of damage. You will most likely first reach the point when the memory fails to function.
Btw Ryan, you keep saying it's "clocked at 7GHz" but I don't think that's true. Probably need to add the word "effective". Is the real clock 1/2 or 1/4 that?
You'll frequently see 1/4 in OC tools. In the end it doesn't matter, both numbers are valid: one is the frequency of the data on the bus (the high marketing number) and the other one is the frequency of the memory chips.
So you're saying it doesn't matter to say "clocked at 7GHz" when the actual clock is 1.75GHz. Well it matters to me! What if the chip multiplies the core clock by 2 internally; should we then say the core clock is 2.2GHz instead of 1.1GHz?
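For anyone puzzling over the two numbers being thrown around here, a quick sketch of how GDDR5's clocks relate (purely illustrative; the figures are the commonly quoted ones for this card):

```python
# GDDR5 transfers data on both edges of a write clock (WCK) that runs at
# twice the command clock (CK), so the data rate per pin is 4x the "real"
# clock actually programmed into the memory controller.
DATA_RATE_GTPS = 7.0            # the marketed "7GHz" figure (GT/s per pin)
ck_ghz = DATA_RATE_GTPS / 4     # command clock: 1.75 GHz (what OC tools show)
wck_ghz = DATA_RATE_GTPS / 2    # write clock: 3.5 GHz

BUS_WIDTH_BITS = 256            # GTX 970/980 memory bus width
bandwidth_gbs = DATA_RATE_GTPS * BUS_WIDTH_BITS / 8   # bytes/s across the bus

print(ck_ghz, wck_ghz, bandwidth_gbs)   # 1.75 3.5 224.0
```

So both numbers describe the same hardware; "7GHz effective" and "1.75GHz actual" differ only in which clock edge you count.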
Thank you for including a full stable of previous-gen video cards to compare it to! In particular, the 670 & 770! Gives us a better idea of how it performs!
Seconded. Still running a 6950 flashed to 6970 so having the stock 6970 as a point of reference made this the easiest buy ever. Roughly twice the performance with lower power, heat, and noise? Yes.
I have the same card (Sapphire 6950 Dirt 3 edition, unlocked to 6970), and one of the two fans died last week (hitting 90-100°C in games). So with that I went to see what I could replace it with, and the 970 ticks all my boxes.
Certainly, but considering AMD has implemented some of the same things, I'd expect an equivalent price-to-performance ratio from AMD for their new cards. Cut the 290's bus down to 256-bit, clock the RAM to 7GHz, and with the bandwidth compression you'd get a cheaper card with the same performance.
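Rough back-of-the-envelope numbers for that hypothetical cut-down 290 (the ~25% effective-bandwidth gain from delta color compression is an assumed figure in line with what NVIDIA claims for Maxwell, not anything AMD has published):

```python
def bandwidth_gbs(bus_bits, data_rate_gtps):
    """Raw memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_bits / 8 * data_rate_gtps

r9_290 = bandwidth_gbs(512, 5.0)             # stock R9 290: 320 GB/s raw
hypothetical_cut = bandwidth_gbs(256, 7.0)   # proposed 256-bit @ 7GHz: 224 GB/s raw

# Assumed ~25% effective gain from Maxwell-style delta compression
effective = hypothetical_cut * 1.25          # ~280 GB/s "effective"
print(r9_290, hypothetical_cut, effective)
```

Under that assumption, the narrower bus still falls short of the 290's raw figure, so whether performance would really match depends on how often the compression pays off.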
Still, nice all around to have choices, been looking at cards myself, and am going to build a system for my brother quite soon. Yay price wars!
What you describe is what Tonga should have been. Didn't turn out so well :/ Sure, the 285 is priced below GM204 cards, but the chip is almost as large and hence costs AMD the same to produce it. They SHOULD play in the same league.
"Crysis 3 Summary" - The GTX 670 trails the R9 290XU by 10%....
It should be the GTX 970 :)
Also, on the page with Company of Heroes: the charts do not display correctly, or more specifically, their headers (the thick blue bar/heading with info about resolution, etc.) are cropped out on all the images except the first one.
"AMD would have to cut R9 290X’s performance by nearly $200 to be performance competitive, and even then they can’t come close to matching NVIDIA’s big edge in power consumption."
I have the reference-style 4GB EVGA GTX 760 with the short PCB, but it was discontinued shortly after launch. I got some 670/760 water blocks for SLI from Swiftech and found that only Zotac was making a short-PCB 4GB GTX 760 card like my EVGA, even though it has fewer memory chips (probably worse for overclocking). Because the vast majority of GTX 760 cards had reference 680 PCBs, it is very difficult to tell which "reference" 760 this article is talking about. The rare short PCB or the longer one?
The short PCB and the stretched PCB were virtually identical, so to answer the question I'm technically comparing it to the short PCB, but either comparison is valid. The stretched section only contains a handful of additional discrete components; it's mostly to allow fitting an open air cooler.
The Radeon 290x is nearly a year old now. It would be surprising if an Nvidia GPU that sacrifices DP capability wouldn't be significantly quicker per mm^2 at this point.
The improvement in performance per watt is notable, and NVIDIA deserves much credit for their work in this area.
"considering the fact that GTX 970 is a $329 card I don’t seriously expect it to be used for 4K gaming" You should, and Nvidia should market it as lowering the entry price for 4K. It's about the least you need for 4K; it can do 4K, just not max settings everywhere, for just $329. Pair it with some below-$500 screen deal and 4K gaming is a lot more accessible than it was two weeks ago. We've looked at what's the bare minimum for 1080p for years and we're used to it; now it's 4K's turn to become more mainstream, and we need to get used to looking at it the same way. In the US, and even more so in China, 4K screens are not that prohibitive anymore: $400-500 for a screen, $330 for a 970, an overclocked dual core, and you can do budget 4K gaming at $1.2k. 4K should be one of the reasons some people who usually buy $200 cards might go for the 970 this time around. And with 20nm, 4K should become a lot more affordable, so it's time to stop thinking of it as a small niche.
Can we get power usage with non reference 290/290X's? If I recall correctly, power usage drops something like 15-25 watts when it's running at closer to 70 C than 95 C as reference cooling profiles make it run.
Will the zero fan speed capability be something that will be delivered via a driver update? I hope so. Or will it require a "Rev B" of the hardware too?
I don't need silence, but quiet is nice. I assumed the 970 would be quite a bit quieter than the 980 due to lower TDP. The test results surprise me (not in a good way).
This makes me even more impressed with the 980. But it still costs $200 more. Tough choice.
You'd think they should be able to get the fans to at the very least spin down a lot further than that at idle; there seem to be at least three 970 cards capable of running on purely passive cooling at idle now (Asus, MSI, and Palit).
So EVGA put all this work into lowering fan power consumption but forgot about idle noise? I find this perplexing. And I believe they had this same problem with older models as well.
Also, I don't like the idea of passive cooling. Running the card at 50°C for a long time is not good for longevity. I had a passive Gigabyte card in the past that, after a few years, was showing colored pixels on the screen.
Better to use a low rpm (<1000) for quiet operation. You're not going to hear the difference anyway because you have other components making some kind of noise in the case.
Why use two 6-pin PCIe power connectors when a single 8-pin would do the job just fine? Would certainly cut down on the BOM, and of course cable clutter.
Remember you guys saying Nvidia is working from the low end up, no longer top down. Well, they should start releasing cards that way. I don't care about your overpriced space heaters, I care about the cards between $100 and $200. Release those first!
They did -- the 750 and 750 Ti are the first-generation Maxwell cards, released earlier this year. They didn't break new ground in price/performance, but they did in performance per watt.
I think it's finally time to upgrade my vintage 460 to one of these 970s. I don't plan on upgrading my ancient i7 930 (Nehalem, Bloomfield) just yet.
I think I can eke more life out of my rig by bumping the GPU and keeping the rest of the innards the same for another year or two. I'm just surprised that I was able to get 4 good years out of that little 460 (mated with a 1080p monitor).
There won't be a $200 champion. The bang-for-buck cards are in the $300-and-above price tier right now and for the foreseeable future. It was the R9 290, and now it's the 970. The next one, I predict, will be the 290's replacement: it will have the texture compression of the 285 and be their second-down part.
The $200 champion appears to really be the 280 atm for those sitting on older cards. I'd like to say the 285, but it's higher up in the price bracket. The 960 should be interesting when it comes out, though I doubt it will be a $200 card. Looking at the 970, I'm guessing the 960 will sit at the $250 price point.
EVGA uses double ball bearing fans. That's where the extra noise comes from. Much noisier at idle but far more durable. I've really been pleased with the ACX cooler on my 770.
Actually, the noisy ACX on my 770 is the reason I won't be buying another EVGA anytime soon. They have given me a silent BIOS, which is great, but the ACX is still by far the loudest noisemaker in my PC. It has also been proven by countless reviews: the idle and even load noise levels were higher than those of the stock GTX 770. And they are making the same mistake AGAIN; I really can't believe it. Well, they seem to be coming around after seeing all the competitors, who apparently understand what it means to deliver silent cards.
The ACX cooler isn't bad, it's actually rather good - the only problem is that they never understood what a "silent" card means. They were always going for lower temperature over lower noise levels - even in idle, which made absolutely no sense.
In all honesty I cannot see how comparing a _reference_ R9 290X on Uber to this particular 970 is valid.
We really need a similar open air cooled R9 290X to really see how power, temps, and noise compare....Everyone knows that AMD's reference blower for the 290X just isn't up to the task of cooling that beast.
True. The R9 290 reference cooler is one of the worst options to choose, and non-reference coolers have been much better! But still, the 970 is a hard nail in the 290X's coffin!
Good point. I guess some manufacturers just want that entry product in their stack of offerings and go with the reference design.
Thanks Ryan for the hard work on the nVidia 980/970 release, these articles were excellent. In the future perhaps consider a followup test comparing a bunch of cards to more so evaluate their coolers and OC potential. That could be very interesting taking some of the top and mid cards from each manufacturer and doing a quantitative analysis across the board.
I think it would be a great idea to comment on and analyze the effects of overclocking (extra OC through AB or PX) when even the non overclocked settings end up getting throttled.
For me, the most important thing about overclocking when the card is factory overclocked already is how much the throttling changes when the power target is increased. Any comments, Mr. Smith?
Increasing the power target helps, but it does not fully alleviate the issue. A 10% increase just isn't enough to eliminate all TDP throttling, thanks in big part to the fact that power consumption grows with the square of the voltage. GM204 would ideally like quite a bit of power to sustain a heavy workload at 1.243v. Which is why that's officially in boost territory, as NVIDIA only intends that voltage/bin to be sustained in light workloads.
Wow I figured that the 970 would run into far less issues sustaining max boost than the 980. But I guess it is drawing nearly as much power. I don't want to see anyone complaining about AMD cards and boost anymore, heh.
Anyway, the 970 still provides the absolute best bang for the buck, and I'm stunned they didn't price it at $400. It's fast, reasonably priced, and runs cool and quiet. It is also easy on power requirements, though I always overbuy on PSU anyway for headroom. Easy recommendation for anyone buying in this price range!
Square of voltage, what are you smoking? P = IV = I^2 R = V^2 /R. The IC isn't a resistor. Typically current stays close to the same as you increase supply voltage.
You're right, thanks! Thinking about it, dynamic power increases by the square, and static is by a direct proportion, so total should be between the two. Dynamic probably dominates so it's probably much closer to the square.
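A toy model of that dynamic/static split (the capacitance and leakage constants below are made up purely to show the scaling, not real GM204 figures):

```python
def total_power(v, f_ghz, c_eff=1.0, i_leak=5.0):
    """Dynamic power scales with f*V^2; static (leakage) power roughly with V.
    c_eff and i_leak are arbitrary illustrative constants."""
    dynamic = c_eff * f_ghz * v**2
    static = i_leak * v
    return dynamic + static

base = total_power(1.0, 1.2)
boosted = total_power(1.1, 1.2)   # +10% voltage at the same clock

# The dynamic part grows 21% (1.1^2), the static part only 10%,
# so the total lands somewhere in between.
print(boosted / base)
```

With these constants the total grows by roughly 12%; the more the dynamic term dominates, the closer the growth gets to the square of the voltage increase.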
What really bothers me is that EVGA is getting lazy, reusing older PCBs. This one looks like a 760's... The VRM and phases look very primitive next to a card like the Asus Strix GTX 970. There was a time when EVGA used to wow me with custom designs; the last few years, not so much, as they invariably use reference boards. The issue I have with most of the reference boards is that coil buzz is noticeable. The Asus and MSI boards are using custom digital VRMs and super alloy caps... Anyhow, nice review.
I'm confident that if we had two of what were the usual "AIB OC customs" of both a 970 and a 290, things might not appear so skewed. First, as much as folks want this level of card to get them into 4K, they're not there... So it really just boils down to seeing what similarly generic OC customs offer and watching them spar back and forth at 2560x1440, depending on the titles.
As for power, I wish these reviews would halt the inadequate testing, like it's still 2004! Power (for the complete PC) should be benchmarked for each game, recording the oscillation of power in real time at millisecond resolution, then outputting the mean over the test duration. As we know, boost frequency fluctuates across every title, so the mean for each game is different. Each game's mean can then be summed and averaged over the number of titles, which would offer the most straightforward evaluation of power while gaming. Also, as most folks today "sleep" their computers (and not many idle for more than 10-20 min), I believe the best calculation for power is what a graphics card sips while doing nothing, which is something like 80% of each month. I'd rather see how AMD ZeroCore impacts a machine's power usage over a month's time, versus the savings only during gaming. Consider gaming 3 hours a day, which constitutes 12.5% of a month: does the 25% difference in power while gaming beat the 5W saved with ZeroCore during the 80% of the month spent idle? Saving energy while using and enjoying something is fine, but wasting watts while doing nothing is incomprehensible.
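Running that scenario through the numbers (all inputs are the comment's own assumptions plus an assumed ~60W load-power gap between cards; none of these are measured values):

```python
HOURS_PER_MONTH = 30 * 24           # 720 hours
gaming_h = 3 * 30                   # 3h/day -> 90h, ~12.5% of the month
idle_h = HOURS_PER_MONTH * 0.8      # "doing nothing" ~80% of the month

gaming_delta_w = 60                 # assumed load-power gap between two cards
zerocore_saving_w = 5               # ZeroCore saving at long idle (per comment)

gaming_kwh = gaming_delta_w * gaming_h / 1000    # monthly kWh from the load gap
idle_kwh = zerocore_saving_w * idle_h / 1000     # monthly kWh from ZeroCore
print(gaming_kwh, idle_kwh)
```

Under these assumptions the load-power gap (5.4 kWh/month) outweighs the ZeroCore idle saving (2.88 kWh/month), but the gap narrows quickly for people who game less than 3 hours a day, which is the commenter's point.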
Ehh, I recently bought 2x custom 290s, but I've no doubt that even with a decent OC the 970 can at the very least still match them in most games... I don't regret the 290s, but I also only paid $350/$360 for my WF Gigabyte cards; had I paid closer to $400 I'd be kicking myself right about now.
Maxwell truly is an impressive architecture, I just wish Nvidia would stop further gimping double precision performance relative to single precision with each successive generation of their consumer cards. GF100/110 were capped at 1/8, GK110 was capped at 1/24, and now GM204 (and likely GM210) is capped at 1/32... What's still yet to be seen is how they're capping the performance on GM204, whether it's a hardware limitation like GK104, or a clock speed limitation in firmware like GK110.
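For scale, here's roughly what those DP caps mean in theoretical throughput (core counts and approximate boost clocks are the published reference figures; the DP numbers simply apply the ratios listed above):

```python
def gflops(cores, clock_ghz, fp64_ratio):
    """Theoretical GFLOPS: 2 FLOPs per core per clock (fused multiply-add)."""
    sp = 2 * cores * clock_ghz
    return sp, sp * fp64_ratio

# (CUDA cores, approx. clock in GHz, consumer DP ratio)
cards = {
    "GTX 580 (GF110, 1/8)":    (512, 1.544, 1 / 8),   # shader clock
    "GTX 780 Ti (GK110, 1/24)": (2880, 0.928, 1 / 24),
    "GTX 980 (GM204, 1/32)":   (2048, 1.216, 1 / 32),
}
for name, spec in cards.items():
    sp, dp = gflops(*spec)
    print(f"{name}: {sp:.0f} SP / {dp:.0f} DP GFLOPS")
```

The striking part is that absolute consumer DP throughput has barely moved across three generations, even as SP throughput roughly tripled.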
Nvidia: You peasants want any sort of reasonable upgrade in FP64 performance? Pay up.
"Company X: You peasants want any sort of reasonable upgrade in product Y? Pay up."
Well, that's capitalism for ya... :p. Seriously though, if less DP ability means a cheaper GPU then as a gamer I'm all for it. If a dozen niche DP hobbyists get screwed over, and a thousand gamers get a better deal on a gaming card then why not? Remember what all that bit mining nonsense did to the North American prices of the Radeons?
You seem to be under the illusion that Nvidia intended to keep shooting themselves in the foot forever by releasing their high-end GPGPU chip under a gaming designation and relying on the driver (which is easy to hack) to keep people from buying a gamer card for workstation loads. Face it: they wised up and charge extra for FP64 and the higher RAM count now. No more cheap workstation cards. The benefit, as already described, is cheaper gaming cards that are designed to be more efficient at gaming and leave the workstation loads to the workstation cards.
This is only partially true, and I think D. Lister basically suggested the same thing so I'll just make a single response for both. The argument for price and efficiency would really only be the case for a GK104 type scenario, where on die FP64 performance is physically limited to 1/24 FP32 due to there being 1/24 the Cuda cores. But what about GK110? There is no reason to limit it to 1/24 SP other than segmentation. There's pretty much no efficiency or price argument there, and we see proof of that in the Titan, no less efficient at gaming and really no more expensive to manufacture outside the additional memory and maybe some additional validation. In other words there's really no justification (or at least certainly not the justification you guys are suggesting) for why the GTX780 Ti couldn't have had 1/12 SP with 3GB GDDR5 at the same $700 MSRP, for instance. Of course other than further (and in my opinion unreasonable) segmentation.
This is why I was wondering how they're capping performance in GM204.
It would not surprise me if GM204 is crippled in FP64 in a similar way to GK104, with a physically limited number of FP64 cores. Regarding GK110, how the dies are selected between FP64-crippled and professional cards is not known. You can imagine a case where dies with defects in the FP64 cores can still be used in gamer cards, and thus yield is a bit higher. But that's pure speculation, of course.

Either way, Nvidia does this because it makes them more money, and they can get away with it. If you remember your class in microeconomics, when an industry is in a state of monopoly or oligopoly, segmentation is the way to go for profit maximization. Unless AMD is willing to not segment their products, there is no pressure for Nvidia to change what they are doing. So we can argue that consumers are the losers in this state of things, and generally in monopoly and oligopoly that is indeed the case. But in this specific case with FP64, I have to ask: are there many/any consumer-relevant applications that could really benefit from FP64? I'm curious to know. I would say that for these companies to care, an application needs general relevance on the same order of magnitude as graphics.

Those of us who use the GPU in scientific computation such as simulations are the real losers in this trend. But then again, we were fortunate to have had this kind of cheap, off-the-shelf hardware that was so powerful for what we do. Looks like that ride is coming to an end, at least for the foreseeable future. Personally, my simulation doesn't really benefit from double precision, so I'm pretty lucky. Even then, I found that stepping from the GTX 580 to a GTX 680 core didn't improve performance at all. The silver lining was that the GTX 690 had much better performance than the GTX 590 for me, and I was able to get 4 GTX 690s for some excellent performance. A GTX 990 would be tempting, or maybe just wait for the 20nm iteration...
Of course GM204 is crippled in FP64. That's where nVidia is finding the improved power budget and reduction in wattage requirements. Frankly, I think it's pretty cheesy, and I've stopped listening to people creaming their jeans about how fabulous nVidia's low power is compared with AMD's. Of course it's going to lower its power requirements if you cripple the hell out of it. Duh. The question is whether you will line up to get shafted with all the other drones, or protest this stupidity by buying AMD instead and give nVidia the finger, as they rightly deserve. If we don't, AMD will have to take the FP64 circuitry out of their cards to compete.
What I said earlier had nothing to do with efficiency. If you were a prosumer and were in the market for double precision hardware... why would you want a $3000 pro GPU when you can get nearly the same performance from a <$1000 consumer variant? Not everyone cares for ECC VRAM. HPC guys et al would be all over it, resulting in an unfairly inflated retail value for the rest of us. When that happens, Nvidia is the one that gets the bad rep, just like AMD did during the bit mining fad. Why do you believe it is so important anyway?
Looking at the PCB, the FTW version does not have more VRMs than the SC or normal EVGA model. I only see four chokes, which is what the other cards have. MSI has 6 VRMs. I'm wondering if EVGA is also using the same low-end analog VRMs that the SC and regular EVGA cards use as well. All other 970's use higher end VRMs.
Also, we really need a round up of all the brands on here. Seeing the FTW version vs reference doesn't paint a usable picture for those looking to make a purchase.
PNY has a 970 with the full complement of outputs: 3 DP, 1 HDMI 2.0, and 1 DVI. It really pisses me off that most of the top-tier makers like EVGA and ASUS decided to switch that to 1 DP, 1 HDMI, and 2 DVI...
They did it because they have customers with at most one DP monitor, and those using multiple monitors mostly still have DVI. Also, they have all these connectors they bought up lying around, so...
Well, it was quite a run for me and AMD. I was a GeForce user back in the day with the Geforce 2 and then the Ti 4600. In 2002 I switched up to AMD when they released the 9700 Pro and never looked back. I have been waiting and waiting for the R9 cards to drop in price to go into my new X99 build. I waited for these 900 cards thinking the response would be somewhat quick as far as an announcement for a competing Radeon or at least a price drop for the 290X. But it never came. (Except that the 390 may be factory water cooled.... which was an "uh oh.." in my mind as far as heat and power is concerned). So 12 years later, I am now back to NVidia as I just ordered my GTX 980 yesterday. I think that NVidia finally released a card at a price and power consumption that just cannot be ignored. A truly impressive feat they have pulled off with the Maxwell line, and I have chosen to reward that effort with my business. Who knows, MaxWELL in a HasWELL-e build... perhaps fate was involved? It is to be named my Wellness system =P Will be quite an upgrade from a i7 920 and 6950. I can't wait to get it assembled!! And by the way, thanks for the 2 great reviews Ryan!
I'm curious why you'd spend all that money on a CPU/RAM/motherboard config and then get a gaming card. You could have saved a ton of money: buy a 4790, an MSI Gaming series board, and DDR3 RAM, and get two 980s. If gaming is your focus, you'd have the faster gaming CPU and graphics setup that way.
I would agree if I were a 'tick tock' cadence upgrader, and if gaming were my only focus. I weighed going 4790 for a few months. But coming off of a X58 build that I've owned for 6 years it may be possible that this build stays with me just as long, and I believe a 6 core will be the better option than a 4 core in the long run. As far as a gaming card, this is a desktop build not a workstation.
Color me intrigued by the 970. I'm considering a pair of these in SLI for 4k gaming (as other reviews indicate they scale quite well), but I'm running a Sandy-Bridge era rig, with x8/x8 PCIe 2 as the only SLI option, and would like to know if I'd run into a PCIe bandwidth bottleneck on these cards. Any chance of a quick scaling test making it into the (assumed to be coming) SLI review?
I love this card and am really excited. BUT if this cannot run on a single PCIe connector despite being 145W only that is a real deal breaker as I'm definitely not going to replace my excellent PSU just because some moron decided to plan in some additional headroom...
If your PSU doesn't have two 6-pin PCIe connectors, it can't be that excellent. The LEPA 500W is $36, will run the 970 no problem, and is a decent supply. I shudder to think what you have if it can't handle this card.
An efficient, quiet, low-power PSU from "be quiet!" for my HTPC, which would be perfectly capable of dealing with the GTX 970. There are actually many PSUs which only have one PCIe connector, because it's the norm, not the exception, that you'll need at most one. So I take it that all of you "shuddering" all day thinking of what horrible PSUs others might have, have your high-end gaming rigs and don't give a crap about silence and/or efficiency. Good for you, but maybe schedule a reality check sometime.
Yeah, but that "moron" who planned additional headroom is catering to the needs of the biggest market for these cards, who are gamers likely to overclock at some point. Not the niche need of using it in an htpc case with a low power single pcie psu.
You could use an adapter to get an extra 6-pin connector, which should work fine considering the low power draw of the 970. That is, if your excellent PSU has an HDD power dongle to spare.</snark>
Looking at the 970 cards on Newegg, any idea why all brands have the 2x DVI, 1x DisplayPort configuration? While nVidia.com lists the 970 specs as having 3x DisplayPort, 1 DVI port.
I was hoping to get a 970 with 3 display ports, so I'm wondering what is going on.
It seems the ACX cooler, like the EVGA 760 ACX, has a design/manufacturing flaw; even the 2.0 version has the heatpipes-vs-GPU positioning flaw, after thoroughly reading the reviews & user feedback (EVGA GeForce GTX 970 ACX has misaligned GPU vs heatpipes): http://www.overclock.net/t/1514624/eteknix-possibl... and http://www.guru3d.com/news-story/evga-geforce-gtx-... and http://forums.evga.com/m/tm.aspx?m=2221919&p=1 and http://www.eteknix.com/evga-gtx-970-feature-manufa... Therefore I would rather go with another brand, like the Asus Strix series, MSI Gaming series, or the Gigabyte G1 Gaming series... less risk, especially since warranty service means an RMA. As soon as they are back in stock I'll order one. Meanwhile, maybe that's the reason they did not show the PCB of the ACX 2.0 and showed us the FTW instead. I was interested in purchasing the ACX version but am now in a dilemma, since the issue has not been mentioned on the tech sites I usually trust!
Looks good! Mostly a side-grade for my 290's, but I knew that already. Probably the best purchase I made in a while, those 290's are gonna last me a bit. May have to ebay a 3rd for some tri-fire. I'll go nVidia next round depending, but I'm really impressed with the 980/970, but 290's right now are a steal!
Did the prices drop significantly? I'm scared to look, I got mine for $350 (after rebate) and $360 about a month ago, didn't see anything priced that low immediately after the 980/970 launch but... You'd think AMD would now have to at least match 970 pricing with the 290 in order to sell any at all.
FWIW, I did think the price I paid for the 290s was pretty terrific and haven't regretted it yet... When looking at SLI vs CF tests the performance gaps from game to game are even narrower, a huge AMD price drop would be slightly irksome tho I understand it's bound to happen. OTOH, being so close to the holiday season I doubt we'd see any good sale prices atop any price drops or beyond the 980/970 price points.
The 980 and 970 make me excited for what nVidia (and hopefully AMD) has in store for 20nm next year. I'm not going to upgrade from my water cooled 7970 (35% chip and 20% memory OC) for another 28nm GPU, even if nVidia managed to squeeze a lot out of the process.
Wow, I didn't expect the 970 to pull up stats like this compared to the 780. I don't know why, lol. The power consumption given the horsepower this card throws down is rather impressive compared to AMD's R9s. I may have to get one of these. I'm stuck with one monitor @ 1920x1080, and this card will give me 60 FPS in just about everything, it seems; probably overkill. But it seems like it'll hold me over until I have to do a complete rebuild down the road (when I upgrade to 4K). I've got a couple years :)
I know AT's benchmarks portray the Radeon 6970 as 'slow as balls' when it comes to gaming recent titles, but as someone who games more than most (on a 1080p monitor), I still find very little incentive to upgrade. Maybe if I gamed at higher resolutions... Haven't really encountered a game (worth playing) that doesn't run on ultra settings in 1080p. Then again, Steam has been shifting my dollars away from top-tier games towards indie games for a while now. I own 425 games on Steam and all of them give me no issue on my 'old' graphics card. Not knocking the new GeForce cards at all... just wish developers would push the envelope a bit more.
Well, the old VLIW4-based HD 6970 is, say, something around 1/3 (33%) slower than the GCN-based R9 280, aka HD 7950 Boost (which I have, BTW). But I believe the 6970 is more or less still good enough for 1080p, and besides, the drivers for the HD 6970 are probably pretty mature, since the HD 6970 is nearly 4 years old now (plus, Trinity and Richland also have VLIW4-based GPUs, just 1/4 of the HD 6970's Cayman GPU).
" For much of the last year NVIDIA has been more than performance competitive but not price competitive with AMD."
------
That's been a major bone of contention with me. Their pricing was very much in line and competitive during the 460-80 days. Then they went out to lunch through the 560-80 era, marginally came back (almost, not quite...) during the 660-80, only to head way out to lunch again with the 700 series. It boggles the mind and makes it hard to purchase when AMD is offering great deals on their end with similar performance.
This is a solid upgrade; just wondering if it's worth it coming from a GTX 670 as well. Slightly more power consumption for much greater performance, and similar idle numbers.
$330 for this is a steal. In very few cases does one even need a 980. Even so, I am still buying the 980; I keep my GPUs for a long time. Still using a Radeon 5870 now.
Thanks Ryan! Great article; enjoyed reading it as much as I did the one on the GTX 980. The GTX 970 proves to be an excellent card in terms of VFM, a rare event in the high-end GFX card market.
I would love to see GTX 970 in an SLI benchmark and see how it handles UHD (read 4K) games. With its price point and performance it begs for a dual SLI setup.
I've got two reference cooler EVGA 970s (superclocked) coming from Newegg on Tuesday. I'm not a big overclocker on GPUs since I'm on air and want all possible heat blown out the back, but I can't wait. Coming from a single 680, having recently moved up to 1440p, and not having to upgrade my Corsair 750W gold PSU, it's just an absolute no-brainer.
Great review Ryan and thanks for continuing to show older games like Crysis Warhead and Grid 2 (which I use as a reference to compare with Grid Autosport benches)!
Great article as usual. I just signed up to ask: Will you do any reviews/comparisons of the semi-reference cards with the cheaper blower style coolers for the 970? There are quite a few options out there (at least two non-ACX EVGA cards for example). I would love to know just how much difference there is in temps and noise, and possibly performance between the various cooler types.
Looks like NVIDIA did pretty awesome dealing with the surprise that they had to produce another generation on TSMC 28nm. Frankly these will probably be the best cards made for years to come since they really have 28nm figured out and Maxwell is bringing huge performance/watt. It will be interesting to see if they even make a 960 - would it be a further crippled GM204 or something else, maybe the first 20nm chip?
So in classic AnandTech style it would be awesome to get an article on the inside story at NVIDIA about what they have gone through with Apple sucking up the first batch of 20nm at TSMC. I know they made some public noise about it - and think about it from the corporate perspective - they were used to getting first dibs on each die shrink and using that in the top-tier products. Now they are stuck a node behind, they may have to prioritize Tegra and mobile chips on 20nm first and leave desktop parts always a year behind. If that keeps workstation parts behind as well I can see why they would be pissed.
Why did you not mention that EVGA has been caught with their chips not aligned on the heatsink correctly? (Though they replied that it's how it's supposed to be.)
Asus is always just a solid company to fall back on, and Gigabyte is generally the same way.
Having always been an MSI guy, I've not really considered going with another vendor... until now. This looks like a nice card which also happens to conveniently match my color scheme whereas the red coloring of the MSI Gaming line sadly does not. Still, the overclocks are pretty much a wash and the only real differences seem to be in the cooling solution. The ACX 2.0 seems to be on par with the MSI, so I suppose I could go either way.
Is it the case that the ACX card uses only 4 power phases which is why overclocking it beyond the factory setting isn't going to work very well? There is no mention of power phases in your article.
Enthusiasts don't care about TDP that much. The 290X is held back by its HSF cooling (Uber mode is actually stock advertised speeds) while the GTX 970 is not. Water-cool the 290X and OC it to 1200MHz and it will match a 980, surpassing it at 4K resolution easily.
I had an MSI GTX 970 and found that under heavy load the core clock was fluctuating and causing FPS drops. After having read this article, I now understand that it's due to the TDP limit. Is this something that will/can be fixed, or is it permanent?
Those prices are damn cheap. I would say what buys a GTX 980 in the U.S. wouldn't even buy a GTX 970 in Brazil. I'm living in Brazil right now and ordered an EVGA GTX 970 SC. So how much did I pay for the GTX 970? Nothing less than $750 USD. At $750 USD the GTX 970 is still considered cheap for us Brazilians, in the world's most expensive country. The EVGA GTX 980 is going for around $1100 USD. Not kidding, check for yourself.
Yeah, that GTX 970 FTW is a damn good card. I was shocked at how small the performance gap was between the 970 FTW and the 980 (although this has always been the case with the FTW versions).
I'm MAD! When I first looked at the Gigabyte GTX 970 on amazon.com a month ago, it was $308, now it's $348. Just because sales are good, does that mean they should gouge the consumer for all they can?
Kalessian - Friday, September 26, 2014 - link
Is it really safe to overclock the memory like that when there aren't any heatsinks on them? Also, 1st?
Ryan Smith - Friday, September 26, 2014 - link
As long as you're not giving them additional voltage (which you can't do on this card): yes. GDDR5 does not consume all that much power, even if it is relatively more than DDR3. The airflow off of the fans is plenty for stock voltage.
Viewgamer - Friday, September 26, 2014 - link
Why no Mantle benchmarks for Thief?
winterspan - Friday, September 26, 2014 - link
I'm assuming because this is an Nvidia review...
eanazag - Friday, September 26, 2014 - link
No Mantle is likely because it didn't give a great showing last time, in the AMD Mantle review. If I remember correctly, Thief maybe even had a performance regression with Mantle being used.
Ammaross - Thursday, October 2, 2014 - link
Because Mantle only has a benefit in CPU-capped scenarios. When you run benchmarks on an i7 or better, Mantle has no tangible benefit and sometimes has regressions.
Viewgamer - Friday, September 26, 2014 - link
Or even Mantle benchmarks for BF4, for that matter?
Ryan Smith - Friday, September 26, 2014 - link
Apologies. The Mantle results have to be added manually since our graphing system can't handle multiple results for the same card automatically. I had actually entered in the data but neglected to regenerate the graphs.
SeanJ76 - Monday, February 9, 2015 - link
Sounds like you're not using EVGA PrecisionX 4.2.1; you can add as much voltage as you like to the GTX 970 FTW.........idiot.....
P39Airacobra - Sunday, November 29, 2015 - link
OK, first of all, EVGA Precision or that type of software has nothing to do with it. Also, he has a valid concern about the VRAM temps and the VRM. And don't call people idiots! That's my job! IDIOT!
vred - Friday, September 26, 2014 - link
Memory chips don't really need heatsinks, but memory VRMs do - so if overclocking memory, you have to watch their temperature and take care of their proper cooling.
eanazag - Friday, September 26, 2014 - link
I am happy to see both Nvidia cards come out packing heat. For once Nvidia isn't totally price gouging at launch. In fact, we should see AMD prices come down soon as AMD reviews Nvidia's sales numbers. The 290 and 290X are pushing 10 months on the clock at pretty much launch prices. It is about time the prices had a reason to come down, and the 970 is the card to do it.

Presumably, Nvidia already has the 980 Ti taped out for AMD's next response card. I am disappointed to see that there is no built-in audio decoder like in AMD's cards. It means there's little hope of seeing many developers leverage audio hardware to reduce CPU load in games with 5.1+ setups.
JlHADJOE - Saturday, September 27, 2014 - link
I doubt we'll see a 980 Ti. Like the 680 before it, the 980 is already a fully enabled chip.
Kraelic - Saturday, September 27, 2014 - link
It is a GM204 upper-midrange chip, as were all Gxx04 chips before it... this is like the launch of the 680 and 670 GK104 cards; then in the refresh the 770 and 760 used those same chips when the 780's GK110 high end came out with the true flagship core.

There will be a GM210 flagship taking the Titan and x80 names, and the GM204 will go into the x70 and x60 cards at a lower price; that refresh is what I am interested in.
randomhkkid - Saturday, September 27, 2014 - link
However, like the 780 before it, it is also a Gxx04 chip, so there will be a Gxx10 chip above it suited to cards like a 980 Ti, as was the case with the 780 Ti.
randomhkkid - Saturday, September 27, 2014 - link
Beaten to it ;)
Samus - Sunday, September 28, 2014 - link
We'll probably see a 980 Ti in the form of a 20nm part. Even if there is no architectural difference, they could tweak it for a smaller process while possibly boosting the clock. Who knows... 1500MHz GPU & 8GHz GDDR5?
squngy - Thursday, November 20, 2014 - link
I'm not up to date, but if crypto-currency miners are still interested in Radeon cards AMD might not rush to lower the price, or if they do the retailers will just pocket the difference and sell at the same price.
AnnonymousCoward - Saturday, September 27, 2014 - link
A 10% OC won't heat a chip to the point of damage. You will most likely first reach the point where the memory fails to function.

Btw Ryan, you keep saying it's "clocked at 7GHz" but I don't think that's true. You probably need to add the word "effective". Is the real clock 1/2 or 1/4 that?
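(As an aside, the arithmetic the thread is circling around can be sketched like this - a rough illustration assuming the standard GDDR5 arrangement where the write clock runs at twice the command clock:)

```python
# Rough sketch of GDDR5 clock domains (generic, not tied to any one card):
# the command clock (CK) drives a write clock (WCK) at 2x CK, and data
# transfers on both edges of WCK, so the "effective" marketing rate is
# 4x the command clock.
def gddr5_rates(command_clock_ghz):
    wck_ghz = command_clock_ghz * 2       # WCK runs at twice CK
    effective_gbps = wck_ghz * 2          # DDR: data on both edges of WCK
    return wck_ghz, effective_gbps

wck, effective = gddr5_rates(1.75)
print(wck)        # 3.5
print(effective)  # 7.0 -> the "7GHz" number
```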
MrSpadge - Saturday, September 27, 2014 - link
You'll frequently see 1/4 in OC tools. In the end it doesn't matter; both numbers are valid: one is the frequency of the data on the bus (the high marketing number) and the other is the frequency of the memory chips.
AnnonymousCoward - Sunday, September 28, 2014 - link
So you're saying it doesn't matter to say "clocked at 7GHz" when the actual clock is 1.75GHz. Well, it matters to me! What if the chip multiplies the core clock by 2 internally; should we then say the core clock is 2.2GHz instead of 1.1GHz?
Black Obsidian - Monday, September 29, 2014 - link
And thus the reporting of EFFECTIVE clock speeds. Not everyone does (or can) understand the complexities of the underlying architecture.
jtrdfw - Wednesday, September 30, 2015 - link
Yes. Heatsinks on memory are pretty much a scam.
MagickMan - Friday, September 26, 2014 - link
How about a 970 OC vs 290 OC comparison? I don't have a favored GPU, I just care about bang for buck.
The_Assimilator - Friday, September 26, 2014 - link
What would be the point? The 970 already equals or beats the 290X, and you don't get much from overclocking Hawaii GPUs, apart from more heat.
poohbear - Friday, September 26, 2014 - link
Thank you for including a full stable of previous-gen video cards to compare it to! In particular the 670 & 770! Gives us a better idea of how it performs!
Tetracycloide - Friday, September 26, 2014 - link
Seconded. Still running a 6950 flashed to a 6970, so having the stock 6970 as a point of reference made this the easiest buy ever. Roughly twice the performance with lower power, heat, and noise? Yes.
roxamis - Monday, September 29, 2014 - link
I have the same card (Sapphire 6950 Dirt 3, unlocked to 6970) and one of its 2 fans died last week (hitting 90-100 deg C in games). So I went to see what I can replace it with, and the 970 ticks all my boxes.
krazyfrog - Friday, September 26, 2014 - link
The price-to-performance ratio is strong with this one.
Frenetic Pony - Friday, September 26, 2014 - link
Certainly, but considering AMD has implemented some of the same things, I'd expect an equivalent price-to-performance ratio from AMD for their new cards. Cut the 290's bus down to 256-bit, clock the RAM to 7GHz, and with the bandwidth compression you'd get a cheaper card with the same performance.

Still, it's nice all around to have choices. I've been looking at cards myself, and am going to build a system for my brother quite soon. Yay price wars!
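(The back-of-the-envelope math behind that suggestion works out as follows; the cut-down card is the commenter's hypothetical, not an actual product:)

```python
# Peak memory bandwidth in GB/s: bus width in bits divided by 8
# (bits -> bytes), times the effective data rate in Gbps.
def peak_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbps(256, 7.0))  # 224.0 (GTX 970/980-style setup)
print(peak_bandwidth_gbps(512, 5.0))  # 320.0 (reference R9 290X setup)
```

So the narrower bus at a higher data rate still gives up raw bandwidth; the suggestion only works if delta color compression makes up the difference.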
MrSpadge - Saturday, September 27, 2014 - link
What you describe is what Tonga should have been. Didn't turn out so well :/

Sure, the 285 is priced below the GM204 cards, but the chip is almost as large and hence costs AMD about the same to produce. They SHOULD play in the same league.
thepaleobiker - Friday, September 26, 2014 - link
"Crysis 3 Summary" - The GTX 670 trails the R9 290XU by 10%....It should be the GTX 970 :)
Also, on the page with Company of Heroes - The Charts do not display correctly, or more specifically, their headers (the thick Blue bar/heading with info about resolution etc?) are cropped out on all the images except the first one.
Regards,
Vishnu
krazyfrog - Friday, September 26, 2014 - link
Also, last page, fifth paragraph: "AMD would have to cut R9 290X’s performance by nearly $200 to be performance competitive, and even then they can’t come close to matching NVIDIA’s big edge in power consumption."
Should be '290X's price', I believe.
CZroe - Friday, September 26, 2014 - link
I have the reference style 4GB EVGA GTX 760 with the short PCB, but it was discontinued shortly after launch. I got some 670/760 water blocks for SLI from Swiftech and found that only Zotac was making a short-PCB 4GB GTX 760 card like my EVGA, even though it has fewer memory chips (probably worse for overclocking). Because the vast majority of GTX 760 cards had reference 680 PCBs, it is very difficult to tell which "reference" 760 this article is talking about. The rare short PCB or the longer one?
Ryan Smith - Friday, September 26, 2014 - link
The short PCB and the stretched PCB were virtually identical, so to answer the question I'm technically comparing it to the short PCB, but either comparison is valid. The stretched section only contains a handful of additional discrete components; it's mostly to allow fitting an open air cooler.
Atari2600 - Friday, September 26, 2014 - link
The Radeon 290X is nearly a year old now. It would be surprising if an Nvidia GPU that sacrifices DP capability weren't significantly quicker per mm^2 at this point.

The improvement in performance/watt is notable and Nvidia deserves much credit for their work in this area.
Phasenoise - Friday, September 26, 2014 - link
Oh my, the name of that card. "EVGA GeForce GTX 970 FTW ACX 2.0"

Reads like my niece's texting log. omg lol gtx ftw, btb.
jjj - Friday, September 26, 2014 - link
"considering the fact that GTX 970 is a $329 card I don’t seriously expect it to be used for 4K gaming"

You should, and Nvidia should market it as lowering the entry price for 4K. It's kind of the least you need for 4K; it can do 4K, just not max settings everywhere, for just $329. Pair it with some below-$500 screen deal and 4K gaming is a lot more accessible than it was 2 weeks ago.
We've looked at what the bare minimum for 1080p is for years and we are used to it; now it's 4K's turn to become more mainstream and we need to get used to looking at it the same way.
In the US, and even more so in China, 4K screens are not that prohibitive anymore. $400-500 for a screen, $330 for a 970, an overclocked dual core, and you can do budget 4K gaming at $1.2k.
4K should be one of the reasons some people who would normally buy $200 cards might go for the 970 this time around.
And with 20nm, 4K should become a lot more affordable, so it's time to stop thinking of it as a small niche.
garadante - Friday, September 26, 2014 - link
Can we get power usage with non-reference 290/290Xs? If I recall correctly, power usage drops something like 15-25 watts when it's running closer to 70C than the 95C that reference cooling profiles make it run at.
justaviking - Friday, September 26, 2014 - link
Zero Fan Speed Idling...Will the zero fan speed capability be something that will be delivered via a driver update? I hope so. Or will it require a "Rev B" of the hardware too?
I don't need silence, but quiet is nice. I assumed the 970 would be quite a bit quieter than the 980 due to lower TDP. The test results surprise me (not in a good way).
This makes me even more impressed with the 980. But it still costs $200 more. Tough choice.
Ryan Smith - Friday, September 26, 2014 - link
It will apparently be delivered via a vBIOS update, judging from what is being said on EVGA's forum.
justaviking - Saturday, September 27, 2014 - link
Excellent. Thank you.
Gunbuster - Friday, September 26, 2014 - link
Too bad it seems the ACX 2.0 is a loud-ass cooler; I have used EVGA in the past but that moves it off my list.
Qwertilot - Friday, September 26, 2014 - link
You'd think they should be able to get the fans to, at the very least, spin down a lot further than that at idle - there seem to be at least three 970 cards capable of running on purely passive cooling at idle now (Asus, MSI, and Palit).
maximumGPU - Friday, September 26, 2014 - link
Very tempting for us 670 owners!

Although I will look for models with quieter coolers. Seems silly to have a loud one with such a low TDP.
Dahak - Friday, September 26, 2014 - link
Did I miss the information about the compatibility issues that was indicated in the 980 review? Or is it going to be in another article?
JarredWalton - Friday, September 26, 2014 - link
It was briefly discussed on page 3: http://www.anandtech.com/show/8568/the-geforce-gtx...
Basically, it was mostly a problem with the ASRock motherboard Ryan uses for GPU testing.
sweeper765 - Friday, September 26, 2014 - link
So EVGA put all this work into lowering fan power consumption but forgot about idle noise? I find this perplexing. And I believe they had this same problem with other, older models as well.

Also, I don't like the idea of passive cooling. Running the card at 50C for a long time is not good for longevity. I had a passive Gigabyte card in the past that after a few years was showing colored pixels on the screen.
Better to use a low rpm (<1000) for quiet operation. You're not going to hear the difference anyway because you have other components making some kind of noise in the case.
Tetracycloide - Friday, September 26, 2014 - link
In fairness, idle noise is much easier, just a BIOS change. Load noise required hardware revisions.
The_Assimilator - Friday, September 26, 2014 - link
Why use two 6-pin PCIe power connectors when a single 8-pin would do the job just fine? Would certainly cut down on the BOM, and of course cable clutter.JarredWalton - Friday, September 26, 2014 - link
There are plenty of older (but still decent) PSUs that only have 6-pin PEG connectors, and 6-pin to 8-pin adapters are never a good idea IMO.
cobalt42 - Friday, September 26, 2014 - link
I believe at least one released card does use a single 8-pin connector.
jmke - Tuesday, September 30, 2014 - link
The Asus Strix 970 DC2OC uses a single 8-pin connector.
Hrel - Friday, September 26, 2014 - link
Remember you guys saying Nvidia is working from the low end up, no longer top down? Well, they should start releasing cards that way. I don't care about your overpriced space heaters, I care about the cards between $100 and $200. Release those first!
cobalt42 - Friday, September 26, 2014 - link
They did -- the 750 and 750Ti are the first generation Maxwell cards, released earlier this year. They didn't break new ground in price/performance, but they did in price/watt.
anandreader106 - Friday, September 26, 2014 - link
You mean performance/watt.
Houdani - Friday, September 26, 2014 - link
I think it's finally time to upgrade my vintage 460 to one of these 970s. I don't plan on upgrading my ancient i7 930 (Nehalem, Bloomfield) just yet.

I think I can eke more life out of my rig by bumping the GPU and keeping the rest of the innards the same for another year or two. I'm just surprised that I was able to get 4 good years out of that little 460 (mated with a 1080p monitor).
CaptainSassy - Friday, September 26, 2014 - link
Exactly the same rig and monitor, but I will continue sitting on it :D $329 is still pricey; I'll wait for a true $200 champion.
wetwareinterface - Saturday, September 27, 2014 - link
There won't be a $200 champion. The bang-for-buck cards are in the $300-and-above price tier right now and for the foreseeable future. It was the R9 290 and now it's the 970. The next one, I predict, will be the 290's replacement; it will have the texture compression of the 285 and be their second-from-the-top part.
just4U - Sunday, September 28, 2014 - link
The $200 champion appears to really be the 280 atm for those sitting on older cards. I'd like to say the 285, but it's higher up in the price bracket. The 960 should be interesting when it comes out, though I doubt it will be a $200 card. Looking at the 970, I'm guessing it will sit at the $250 price point.
dj christian - Friday, September 26, 2014 - link
What type of fans does it use? Ball bearing or sleeve bearing?bardolious - Friday, September 26, 2014 - link
EVGA uses double ball bearing fans. That's where the extra noise comes from. Much noisier at idle but far more durable. I've really been pleased with the ACX cooler on my 770.
Chloiber - Sunday, September 28, 2014 - link
Actually, the noisy ACX on my 770 is the reason I won't be buying another EVGA anytime soon. They have given me a silent BIOS, which is great, but the ACX is still by far the loudest noise maker in my PC. It has also been proven by countless reviews: the idle and even load noise levels were higher than those of the stock GTX 770. And they are making the same mistake AGAIN - I really can't believe it.

Well, they seem to be coming around after seeing all the competitors, who seem to understand what it means to deliver silent cards.
The ACX cooler isn't bad, it's actually rather good - the only problem is that they never understood what a "silent" card means. They were always going for lower temperature over lower noise levels - even in idle, which made absolutely no sense.
Iketh - Monday, September 29, 2014 - link
Enjoy the other brands' fans going out in 1-2 years.

Also, if you read the article (updated 9/26, two days before your post), EVGA is releasing a passive idle BIOS, which makes your whole post pointless.
creed3020 - Friday, September 26, 2014 - link
In all honesty I cannot see how comparing a _reference_ R9 290X in Uber mode to this particular 970 is valid. We really need a similar open-air-cooled R9 290X to really see how power, temps, and noise compare... Everyone knows that AMD's reference blower for the 290X just isn't up to the task of cooling that beast.
haukionkannel - Friday, September 26, 2014 - link
True. The R9 290 reference cooler is one of the worst options to choose, and non-reference ones have been much better! But the 970 is still a hard nail in the 290X's coffin!
Lithium - Friday, September 26, 2014 - link
Yep. But the reference 290X is still selling, and it's used to bring the price as low as $449.
So it's valid.
creed3020 - Monday, September 29, 2014 - link
Good point. I guess some manufacturers just want that entry product in their stack of offerings and go with the reference design.

Thanks Ryan for the hard work on the NVIDIA 980/970 release; these articles were excellent. In the future, perhaps consider a followup test comparing a bunch of cards to evaluate their coolers and OC potential. That could be very interesting: taking some of the top and mid cards from each manufacturer and doing a quantitative analysis across the board.
AkibWasi - Friday, September 26, 2014 - link
The 970 has 52 FP64 CUDA cores, right? Why doesn't the block diagram show those?
Ryan Smith - Friday, September 26, 2014 - link
NVIDIA does not include the FP64 CUDA cores in their diagrams for consumer chips. This has been the case as far back as GK104.
AkibWasi - Saturday, September 27, 2014 - link
Don't those 896 (64 per SMX) yellow-colored boxes in the Titan's diagram indicate FP64 cores?
Ryan Smith - Saturday, September 27, 2014 - link
Correct. NVIDIA only includes those cores on diagrams for their compute/pro GPUs.
dexgen - Friday, September 26, 2014 - link
I think it would be a great idea to comment on and analyze the effects of overclocking (extra OC through AB or PX) when even the non-overclocked settings end up getting throttled.

For me, the most important thing about overclocking, when the card is factory overclocked already, is how much the throttling changes when the power target is increased. Any comments, Mr. Smith?
Ryan Smith - Friday, September 26, 2014 - link
Increasing the power target helps, but it does not fully alleviate the issue. A 10% increase just isn't enough to eliminate all TDP throttling, thanks in big part to the fact that power consumption grows with the square of the voltage. GM204 would ideally like quite a bit of power to sustain a heavy workload at 1.243v. Which is why that's officially in boost territory, as NVIDIA only intends that voltage/bin to be sustained in light workloads.
Alexvrb - Saturday, September 27, 2014 - link
Wow, I figured that the 970 would run into far fewer issues sustaining max boost than the 980. But I guess it is drawing nearly as much power. I don't want to see anyone complaining about AMD cards and boost anymore, heh.

Anyway, the 970 still provides the absolute best bang for the buck and I'm stunned they didn't price it at $400. It's fast, reasonably priced, and runs cool and quiet. It also is easy on power requirements, though I always overbuy on PSU anyway for headroom. An easy recommendation for anyone buying in this price range!
AnnonymousCoward - Saturday, September 27, 2014 - link
Square of voltage? What are you smoking? P = IV = I^2*R = V^2/R. The IC isn't a resistor. Typically current stays close to the same as you increase supply voltage.
Ryan Smith - Saturday, September 27, 2014 - link
The formula for dynamic power consumption:

P = C * V^2 * f
Where C is capacitance, f is frequency, and V is voltage. Those high boost bins are very expensive from a power standpoint.
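(To put rough numbers on that formula - an illustration only, since it ignores the static leakage component discussed further down the thread - the relative cost of a boost bin works out like this:)

```python
# Dynamic power scales as P = C * V^2 * f. Holding capacitance fixed,
# the relative change from a voltage/frequency bump is:
def relative_dynamic_power(voltage_ratio, freq_ratio):
    return voltage_ratio ** 2 * freq_ratio

# e.g. a 10% clock bump that needs 5% more voltage:
print(relative_dynamic_power(1.05, 1.10))  # ~1.213 -> ~21% more power
```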
AnnonymousCoward - Sunday, September 28, 2014 - link
You're right, thanks! Thinking about it, dynamic power increases with the square, and static power in direct proportion, so the total should be between the two. Dynamic probably dominates, so it's probably much closer to the square.
Footman36 - Friday, September 26, 2014 - link
What really bothers me is that EVGA is getting lazy, reusing older PCBs. This one looks like a 760's... The VRM and phases look very primitive next to a card like the Asus Strix GTX 970. There was a time when EVGA used to wow me with custom designs; the last few years, not so much, as they invariably use reference boards. The issue I have with most of the reference boards is that coil buzz is noticeable. The Asus and MSI boards are using custom digital VRMs and super alloy caps... Anyhow, nice review.
Iketh - Monday, September 29, 2014 - link
I'm sure it has to do with their big heatsink design + bracing so the card doesn't flex.
Casecutter - Friday, September 26, 2014 - link
I'm confident that if we had two of what were the normal "AIB OC customs" of both a 970 and a 290, things between them might not appear so skewed. First, as much as folks want this level of card to get them into 4K, they're not there... So it really just boils down to seeing what similarly generic OC customs offer and letting them "spar back and forth" @2560x1440 depending on the titles.

As to power, I wish these reviews would halt the inadequate testing, like it's still 2004! The power (of the complete PC) should be benchmarked for each game, recording in real time the oscillation of power in milliseconds, then outputting the 'mean' over the test duration. As we know boost frequency fluctuates across every title, so the 'mean' for each game is different. Then each 'mean' can be added, and the average across the number of titles would offer the most straightforward evaluation of power while gaming. Also, as most folks today "sleep" their computers (and not many idle for more than 10-20 min), I believe the best calculation for power is what a graphics card "suckles" while doing nothing, like 80% of each month. I'd like to see how AMD ZeroCore impacts a machine's power usage over a month's time, versus the savings only during gaming. Consider gaming 3 hours a day, which constitutes 12.5% of a month: does the 25% difference in power while gaming beat the 5W saved with ZeroCore over 80% of that month? Saving energy while using and enjoying something is fine, but wasting watts while doing nothing is incomprehensible.
Impulses - Sunday, September 28, 2014 - link
Ehh, I recently bought 2x custom 290s, but I've no doubt that even with a decent OC the 970 can at the very least still match them in most games... I don't regret the 290s, but I also only paid $350/360 for my WF Gigabyte cards; had I paid closer to $400 I'd be kicking myself right about now.
Iketh - Monday, September 29, 2014 - link
Most PCs default to sleeping during long idles, and most people shut theirs off.
dragonsqrrl - Friday, September 26, 2014 - link
Maxwell truly is an impressive architecture; I just wish Nvidia would stop further gimping double precision performance relative to single precision with each successive generation of their consumer cards. GF100/110 were capped at 1/8, GK110 was capped at 1/24, and now GM204 (and likely GM210) is capped at 1/32... What's still yet to be seen is how they're capping the performance on GM204 - whether it's a hardware limitation like GK104, or a clock speed limitation in firmware like GK110.

Nvidia: You peasants want any sort of reasonable upgrade in FP64 performance? Pay up.
D. Lister - Friday, September 26, 2014 - link
"Company X: You peasants want any sort of reasonable upgrade in product Y? Pay up."

Well, that's capitalism for ya... :p. Seriously though, if less DP ability means a cheaper GPU, then as a gamer I'm all for it. If a dozen niche DP hobbyists get screwed over, and a thousand gamers get a better deal on a gaming card, then why not? Remember what all that bit mining nonsense did to the North American prices of the Radeons?
D. Lister - Friday, September 26, 2014 - link
Woah, it seems they do tags differently here at AT :(. Sorry if the above message appears improperly formatted.
Mr Perfect - Friday, September 26, 2014 - link
It's not you, the italic tag throws in a couple extra line breaks. Bold might too, I seem to remember that mangling a post of mine in the past.
D. Lister - Sunday, September 28, 2014 - link
Oh, okay, thanks for the explanation :).
wetwareinterface - Saturday, September 27, 2014 - link
^This. You seem to be under the illusion that Nvidia intended to keep shooting themselves in the foot forever by releasing their high-end GPGPU chip under a gaming designation and relying on the driver (which is easy to hack) to keep people from buying a gamer card for workstation loads. Face it, they wised up and charge extra for FP64 and the higher RAM count now. No more cheap workstation cards. The benefit, as already described, is cheaper gaming cards that are designed to be more efficient at gaming and leave the workstation loads to the workstation cards.
dragonsqrrl - Saturday, September 27, 2014 - link
This is only partially true, and I think D. Lister basically suggested the same thing, so I'll just make a single response for both. The argument for price and efficiency would really only be the case for a GK104-type scenario, where on-die FP64 performance is physically limited to 1/24 FP32 due to there being 1/24 the CUDA cores. But what about GK110? There is no reason to limit it to 1/24 SP other than segmentation. There's pretty much no efficiency or price argument there, and we see proof of that in the Titan: no less efficient at gaming and really no more expensive to manufacture outside the additional memory and maybe some additional validation. In other words there's really no justification (or at least certainly not the justification you guys are suggesting) for why the GTX 780 Ti couldn't have had 1/12 SP with 3GB GDDR5 at the same $700 MSRP, for instance. Of course, other than further (and in my opinion unreasonable) segmentation.

This is why I was wondering how they're capping performance in GM204.
hammer256 - Saturday, September 27, 2014 - link
It would not surprise me if GM204 is crippled in FP64 in a similar way to GK104, with a physically limited number of FP64 cores.

Regarding GK110, how the dies are selected between FP64-crippled and professional cards is not known. You can imagine a case where dies with defects in the FP64 cores can still be used in gamer cards, and thus yield a bit more. But that's pure speculation, of course.
Either way, Nvidia does this because this makes them more money, and they can get away with it. If you remember from your class in micro-economics, when the industry is in a state of monopoly or oligopoly, segmentation is the way to go for profit maximization. Unless AMD is willing to not segment their products, there is no pressure for Nvidia to change what they are doing.
So we can argue that consumers are the losers in this state of things, and generally in monopoly and oligopoly that is indeed the case. But in this specific case with FP64, I have to ask: are there many/any consumer relevant applications that could really benefit from FP64? I'm curious to know. I would say that in order for these companies to care, the application need to have sufficient general relevance in the same order of magnitude as that for graphics.
Those of us who use the GPU in scientific computation, such as simulations, are the real losers in this trend. But then again, we were fortunate to have had this kind of cheap, off-the-shelf hardware that was so powerful for what we do. Looks like that ride is coming to an end, at least for the foreseeable future. Personally, my simulation doesn't really benefit from double precision, so I'm pretty lucky. Even then, I found that stepping from the GTX 580 to a GTX 680 core didn't improve performance at all. The silver lining was that the GTX 690 had much better performance than the GTX 590 for me, and I was able to get 4 GTX 690s for some excellent performance. A GTX 990 would be tempting, or maybe just wait for the 20nm iteration...
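As a side note on the segmentation math in this sub-thread: theoretical peak throughput is just cores × clock × 2 (one FMA per cycle), scaled by the FP64:FP32 rate cap. A rough sketch using the commonly cited spec-sheet figures (the clocks are illustrative, not measurements):

```python
# Illustrative only: theoretical peak GFLOPS from core count, boost clock,
# and the FP64:FP32 rate cap. Spec-sheet arithmetic, not a measurement.
def peak_gflops(cuda_cores, clock_ghz, fp64_ratio=1.0):
    # one fused multiply-add per core per cycle = 2 FLOPs
    return cuda_cores * clock_ghz * 2 * fp64_ratio

# GTX 780 Ti (GK110, FP64 capped at 1/24 in the GeForce product)
fp32 = peak_gflops(2880, 0.928)          # ~5345 GFLOPS
fp64 = peak_gflops(2880, 0.928, 1 / 24)  # ~223 GFLOPS
# GTX Titan (same GK110 silicon with the 1/3 rate enabled)
titan_fp64 = peak_gflops(2688, 0.837, 1 / 3)  # ~1500 GFLOPS
print(fp32, fp64, titan_fp64)
```

The point being argued above falls out of the numbers: the roughly 7x FP64 gap between the Titan and the 780 Ti comes purely from the rate cap, not from different silicon.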
anubis44 - Wednesday, October 22, 2014 - link
Of course GM204 is crippled in FP64. That's where nVidia is finding the improved power budget and reduction in wattage requirements. Frankly, I think it's pretty cheesy, and I've stopped listening to people creaming their jeans about how fabulous nVidia's low power is compared with AMD's. Of course it's going to lose some of its power requirements if you cripple the hell out of it. Duh. The question is whether you will line up to get shafted with all the other drones, or protest this stupidity by buying AMD instead and give nVidia the finger for this, as they rightly deserve. If we don't, AMD will have to take its FP64 circuitry out of their cards to compete.
D. Lister - Sunday, September 28, 2014 - link
What I said earlier had nothing to do with efficiency. If you were a prosumer in the market for double-precision hardware... why would you want a $3000 pro GPU when you can get nearly the same performance from a <$1000 consumer variant? Not everyone cares for ECC VRAM. HPC guys et al. would be all over it, resulting in an unfairly inflated retail value for the rest of us. When that happens, Nvidia is the one that gets the bad rep, just like AMD did during the bitcoin mining fad. Why do you believe it is so important anyway?
Subyman - Friday, September 26, 2014 - link
Looking at the PCB, the FTW version does not have more VRM phases than the SC or normal EVGA model. I only see four chokes, which is what the other cards have. MSI has 6. I'm wondering if EVGA is also using the same low-end analog VRMs that the SC and regular EVGA cards use. All other 970s use higher-end VRMs.
wetwareinterface - Saturday, September 27, 2014 - link
The FTW is not the top-end designation; it isn't even better than the SC cards in most cases. It's lower clocked than the SC and just has extra RAM. For EVGA, the custom-clocked cards are, in order:
SC
FTW
SSC
SC Signature
Classified
Again, the FTW can have lower clocks than the SC, or the same clocks, but usually has more RAM.
Subyman - Saturday, September 27, 2014 - link
I never said it was. The article mentioned it had one more power phase than the others, but from the pictures it obviously doesn't.
Subyman - Friday, September 26, 2014 - link
Also, we really need a roundup of all the brands on here. Seeing the FTW version vs. reference doesn't paint a usable picture for those looking to make a purchase.
Mr Perfect - Friday, September 26, 2014 - link
Is anyone going to pair this with the 980's blower? That would be quite impressive. Oh, and get the 970's I/O up to par. Again, the 980's configuration would be better. Dual DVI indeed...
Margalus - Friday, September 26, 2014 - link
PNY has a 970 with the full complement of outputs: 3 DP, 1 HDMI 2.0, and 1 DVI. It really pisses me off that most of the top-tier makers like EVGA and ASUS decided to switch that to 1 DP, 1 HDMI, and 2 DVI...
pixelstuff - Friday, September 26, 2014 - link
Same here. Annoyed.
wetwareinterface - Saturday, September 27, 2014 - link
They did it because their customers have at most one DP monitor, and those using multiple monitors still mostly have DVI. Also, they have all these connectors they bought up lying around, so...
Gigaplex - Monday, September 29, 2014 - link
It's trivial to convert from DP to DVI, but not the other way around.
Mr Perfect - Monday, September 29, 2014 - link
Hmm, it has a blower too. It looks like their own design, though; I wonder if it matches the 980's blower in noise and cooling.
ggathagan - Friday, September 26, 2014 - link
Gigabyte's version has the same I/O setup as the 980.
Mr Perfect - Monday, September 29, 2014 - link
Good find. That one's an option then.
HanzNFranzen - Friday, September 26, 2014 - link
Well, it was quite a run for me and AMD. I was a GeForce user back in the day with the GeForce 2 and then the Ti 4600. In 2002 I switched to AMD when they released the 9700 Pro and never looked back. I have been waiting and waiting for the R9 cards to drop in price to go into my new X99 build. I waited for these 900 cards thinking the response would be somewhat quick as far as an announcement for a competing Radeon, or at least a price drop for the 290X. But it never came. (Except that the 390 may be factory water cooled... which was an "uh oh.." in my mind as far as heat and power are concerned.) So 12 years later, I am now back to NVidia, as I just ordered my GTX 980 yesterday. I think NVidia finally released a card at a price and power consumption that just cannot be ignored. A truly impressive feat they have pulled off with the Maxwell line, and I have chosen to reward that effort with my business. Who knows, MaxWELL in a HasWELL-E build... perhaps fate was involved? It is to be named my Wellness system =P It will be quite an upgrade from an i7 920 and 6950. I can't wait to get it assembled!! And by the way, thanks for the 2 great reviews Ryan!
wetwareinterface - Saturday, September 27, 2014 - link
I'm curious why you'd spend all that money on a CPU/RAM/motherboard config and then get a gaming card. You could have saved a ton of money and bought a 4790, an MSI Gaming series board, and DDR3 RAM, and then bought two 980s. If gaming is your focus, you'd have the faster gaming CPU and graphics setup that way.
HanzNFranzen - Saturday, September 27, 2014 - link
I would agree if I were a 'tick-tock' cadence upgrader, and if gaming were my only focus. I weighed going 4790 for a few months. But coming off an X58 build that I've owned for 6 years, it's possible this build stays with me just as long, and I believe a 6-core will be the better option than a 4-core in the long run. As for a gaming card, this is a desktop build, not a workstation.
asgallant - Friday, September 26, 2014 - link
Color me intrigued by the 970. I'm considering a pair of these in SLI for 4K gaming (as other reviews indicate they scale quite well), but I'm running a Sandy Bridge-era rig with x8/x8 PCIe 2.0 as the only SLI option, and would like to know if I'd run into a PCIe bandwidth bottleneck on these cards. Any chance of a quick scaling test making it into the (assumed to be coming) SLI review?
boozed - Friday, September 26, 2014 - link
Improvements in performance and efficiency have finally progressed to the point that it makes sense to upgrade that four-year-old system. Wow.
Daniel Egger - Friday, September 26, 2014 - link
I love this card and am really excited. BUT if this cannot run on a single PCIe connector despite being only 145W, that is a real deal breaker, as I'm definitely not going to replace my excellent PSU just because some moron decided to plan in some additional headroom...
Razorbak86 - Friday, September 26, 2014 - link
"single PCIe connector... excellent PSU"
Good one. LOL
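For what it's worth, the headroom argument in this sub-thread is simple arithmetic against the PCIe power spec (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin auxiliary connector); a minimal sketch:

```python
# PCIe CEM power budget: 75 W from the x16 slot, plus 75 W per 6-pin
# and 150 W per 8-pin auxiliary connector.
def board_power_budget_w(six_pin=0, eight_pin=0):
    return 75 + 75 * six_pin + 150 * eight_pin

# One 6-pin barely clears the GTX 970's 145 W TDP...
print(board_power_budget_w(six_pin=1))  # 150
# ...while two 6-pins leave ~80 W of overclocking headroom.
print(board_power_budget_w(six_pin=2))  # 225
```

Which is why a single-connector PSU is marginal on paper even though the card's stock draw fits.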
wetwareinterface - Saturday, September 27, 2014 - link
If your PSU doesn't have two 6-pin PCIe connectors, it can't be that excellent. The LEPA 500W is $36, will run the 970 no problem, and is a decent supply. I shudder to think what you have if it can't handle this card.
Daniel Egger - Saturday, September 27, 2014 - link
An efficient, quiet, low-power PSU from "be quiet!" for my HTPC, which would be perfectly capable of dealing with the GTX 970. There are actually many PSUs with only one PCIe connector, because the norm, not the exception, is that you'll need at most one. So I take it that all of you "shuddering" all day thinking of what horrible PSUs others might have, have your high-end gaming rigs and don't give a crap about silence and/or efficiency -- good for you, but maybe schedule a reality check sometime.
maximumGPU - Sunday, September 28, 2014 - link
Yeah, but that "moron" who planned additional headroom is catering to the needs of the biggest market for these cards: gamers likely to overclock at some point. Not the niche need of using it in an HTPC case with a low-power, single-PCIe PSU.
D. Lister - Monday, September 29, 2014 - link
You could use an adapter to get an extra 6-pin connector, which should work fine considering the low power draw of the 970. That is, if your excellent PSU has an HDD power dongle to spare.</snark>
pixelstuff - Friday, September 26, 2014 - link
Looking at the 970 cards on Newegg, any idea why all brands have the 2x DVI, 1x DisplayPort configuration? While nVidia.com lists the 970 specs as having 3x DisplayPort, 1x DVI. I was hoping to get a 970 with 3 DisplayPorts, so I'm wondering what is going on.
yefi - Friday, September 26, 2014 - link
The overclock on this is pretty weak. I get 1314MHz core with a max boost of 1554MHz with the MSI Frozr 970. I see reviewers hitting ~1500MHz also.
mr.techguru - Saturday, September 27, 2014 - link
It seems the ACX cooler, like the EVGA 760 ACX, has a design/manufacturing flaw; even the 2.0 version has the heatpipe vs. GPU positioning flaw. After thoroughly reading the reviews and user feedback (EVGA GeForce GTX 970 ACX has a misaligned GPU vs. heatpipes): http://www.overclock.net/t/1514624/eteknix-possibl... and http://www.guru3d.com/news-story/evga-geforce-gtx-... and http://forums.evga.com/m/tm.aspx?m=2221919&p=1 and http://www.eteknix.com/evga-gtx-970-feature-manufa... Therefore I would rather go with another brand like the Asus Strix series, MSI Gaming series, or Gigabyte G1 Gaming series: less risk, especially since warranty service means RMA. As soon as they are available in stock, I'll order one. Meanwhile, maybe this is the reason they did not show the PCB of the ACX 2.0 and showed us the FTW instead. I was interested in purchasing the ACX version but am now in a dilemma, since the issue has not been mentioned on a tech site that I usually trust!
bsim500 - Saturday, September 27, 2014 - link
Good review. Is there going to be a GTX 960 to fill the rather large gap between the GTX 970 and 750 Ti?
LancerVI - Saturday, September 27, 2014 - link
Looks good! Mostly a side-grade for my 290s, but I knew that already. Probably the best purchase I've made in a while; those 290s are gonna last me a bit. May have to eBay a third for some tri-fire. I'll go nVidia next round depending, but I'm really impressed with the 980/970. 290s right now are a steal!
Impulses - Sunday, September 28, 2014 - link
Did the prices drop significantly? I'm scared to look. I got mine for $350 (after rebate) and $360 about a month ago, and didn't see anything priced that low immediately after the 980/970 launch, but... You'd think AMD would now have to at least match 970 pricing with the 290 in order to sell any at all.
Impulses - Sunday, September 28, 2014 - link
FWIW, I did think the price I paid for the 290s was pretty terrific and haven't regretted it yet... When looking at SLI vs. CF tests, the performance gaps from game to game are even narrower. A huge AMD price drop would be slightly irksome, tho I understand it's bound to happen. OTOH, being so close to the holiday season, I doubt we'd see any good sale prices atop any price drops or beyond the 980/970 price points.
Death666Angel - Saturday, September 27, 2014 - link
The 980 and 970 make me excited for what nVidia (and hopefully AMD) has in store for 20nm next year. I'm not going to upgrade from my water-cooled 7970 (35% chip and 20% memory OC) for another 28nm GPU, even if nVidia managed to squeeze a lot out of the process.
StormFuror - Saturday, September 27, 2014 - link
Wow, I didn't expect the 970 to put up stats like this compared to the 780. I don't know why, lol. The power consumption given the horsepower this card throws down is rather impressive compared to AMD's R9s. I may have to get one of these. I'm stuck with one monitor @ 1920x1080. This card will give me 60 FPS in just about everything it seems, probably overkill. But it seems like it'll hold me over until I have to do a complete rebuild down the road (when I upgrade to 4K). I've got a couple years :)
eek2121 - Sunday, September 28, 2014 - link
I know AT's benchmarks portray the Radeon 6970 as 'slow as balls' when it comes to gaming recent titles, but as someone who games more than most (on a 1080p monitor), I still find very little incentive to upgrade. Maybe if I gamed at higher resolutions... I haven't really encountered a game (worth playing) that doesn't run on ultra settings at 1080p. Then again, Steam has been shifting my dollars away from top-tier games toward indie games for a while now. I own 425 games on Steam and all of them give me no issue on my 'old' graphics card. Not knocking the new GeForce cards at all... just wish developers would push the envelope a bit more.
TiGr1982 - Sunday, September 28, 2014 - link
Well, the old VLIW4-based HD 6970 is, say, around 1/3 (33%) slower than the GCN-based R9 280 aka HD 7950 Boost (which I have, BTW). But I believe the 6970 is more or less still good enough for 1080p, and besides, the drivers for the HD 6970 are probably pretty mature since it is nearly 4 years old now (plus, Trinity and Richland also have VLIW4-based GPUs, just 1/4 of the HD 6970's Cayman GPU).
just4U - Sunday, September 28, 2014 - link
The biggest thing for me is this..." For much of the last year NVIDIA has been more than performance competitive but not price competitive with AMD."
------
That's been a major bone of contention with me. Their pricing was very much in line and competitive during the 460-480 days. Then they went out to lunch through the 560-580 era, marginally came back (almost, not quite..) during the 660-680, only to head way out to lunch again with the 7xx series. It boggles the mind and makes it hard to purchase when AMD is offering great deals on their end with similar performance.
hamiltus - Sunday, September 28, 2014 - link
So my main question is: is the GTX 970 a good upgrade from the GTX 670 for 1440p/4K gaming?
coldpower27 - Sunday, September 28, 2014 - link
This is a solid upgrade; just thinking about whether this is worth it over the GTX 670 as well. Slightly more power consumption for much greater performance, and similar idle numbers. 970 has a nice ring to it. :)
Laststop311 - Sunday, September 28, 2014 - link
$330 for this is a steal. In very few cases does one even need a 980. Even so, I am still buying the 980; I keep my GPUs for a long time, still using a Radeon 5870 now.
The-Fox - Sunday, September 28, 2014 - link
Thanks Ryan! Great article; enjoyed reading it as much as I did the one on the GTX 980. The GTX 970 proves to be an excellent card in terms of value for money, a rare event in the high-end GFX card market.
I would love to see GTX 970 in an SLI benchmark and see how it handles UHD (read 4K) games.
With its price point and performance it begs for a dual SLI setup.
Nfarce - Sunday, September 28, 2014 - link
Guru3D has done it. It's highly impressive.
Nfarce - Sunday, September 28, 2014 - link
I've got two reference cooler EVGA 970s (Superclocked) coming from Newegg on Tuesday. I'm not a big overclocker on GPUs as I'm on air and want all possible heat blown out the back, but I can't wait. Coming from a single 680, having recently moved up to 1440p, and not having to upgrade my Corsair 750W Gold PSU, it's just an absolute no-brainer. Great review Ryan, and thanks for continuing to show older games like Crysis Warhead and GRID 2 (which I use as a reference to compare with GRID Autosport benches)!
Scimitar11 - Sunday, September 28, 2014 - link
Great article as usual. I just signed up to ask: will you do any reviews/comparisons of the semi-reference cards with the cheaper blower-style coolers for the 970? There are quite a few options out there (at least two non-ACX EVGA cards, for example). I would love to know just how much difference there is in temps and noise, and possibly performance, between the various cooler types.
AndrewJacksonZA - Monday, September 29, 2014 - link
Ryan, are you using the horrendously bad stock AMD coolers for the 290X noise and temperature readings?
kwrzesien - Monday, September 29, 2014 - link
Looks like NVIDIA did pretty awesome dealing with the surprise that they had to produce another generation on TSMC 28nm. Frankly, these will probably be the best cards made for years to come, since they really have 28nm figured out and Maxwell is bringing huge performance/watt. It will be interesting to see if they even make a 960 - would it be a further cut-down GM204 or something else, maybe the first 20nm chip? So in classic AnandTech style, it would be awesome to get an article on the inside story at NVIDIA about what they have gone through with Apple sucking up the first batch of 20nm at TSMC. I know they made some public noise about it - and think about it from the corporate perspective - they were used to getting first dibs on each die shrink and using that in the top-tier products. Now they are stuck a node behind, and they may have to prioritize Tegra and mobile chips on 20nm first, leaving desktop parts always a year behind. If that keeps workstation parts behind as well, I can see why they would be pissed.
ppi - Monday, September 29, 2014 - link
When can we expect image quality tests?
mr.techguru - Tuesday, September 30, 2014 - link
Why did you not mention that EVGA has been caught with their chips not being aligned on the heatsink correctly? (Though they replied that it's how it's supposed to be.) Asus is always a solid company to fall back on,
and gigabyte is generally the same way.
As for the 970's... MSI>Gigabyte>ASUS>EVGA.
EVGA's Problem: http://www.guru3d.com/news-story/evga-geforce-gtx-...
Everything you need to know about the MSI 970: http://www.guru3d.com/articles_pages/msi_geforce_g...
Gigaplex - Wednesday, October 1, 2014 - link
Neither of those links work.
dibbademevos - Tuesday, September 30, 2014 - link
hi
dibbademevos - Tuesday, September 30, 2014 - link
hi
SkyBill40 - Wednesday, October 1, 2014 - link
Having always been an MSI guy, I've not really considered going with another vendor... until now. This looks like a nice card which also happens to conveniently match my color scheme, whereas the red coloring of the MSI Gaming line sadly does not. Still, the overclocks are pretty much a wash and the only real differences seem to be in the cooling solution. The ACX 2.0 seems to be on par with the MSI, so I suppose I could go either way.
Oxford Guy - Saturday, October 4, 2014 - link
Is it the case that the ACX card uses only 4 power phases which is why overclocking it beyond the factory setting isn't going to work very well? There is no mention of power phases in your article.Kanuj5678 - Sunday, October 5, 2014 - link
GTX 970 beats the shit out of everything, and that too in style, with the lowest TDP.
Cheers,
Kanuj
ambientblue - Wednesday, April 29, 2015 - link
Enthusiasts don't care about TDP that much. The 290X is held back by its HSF cooling (Uber mode is actually the stock advertised speed) while the GTX 970 is not. Water-cool the 290X and OC it to 1200MHz and it will match a 980, surpassing it at 4K resolution easily.
igyb - Tuesday, October 7, 2014 - link
Is the GTX 970 just an underclocked 980? I might just get that because I can't really afford a 980.
Kimtastic - Tuesday, October 21, 2014 - link
Dear Ryan, I have an MSI GTX 970 and found that under heavy load the core clock fluctuates and causes FPS drops. After having read this article, I now understand that it's due to the TDP limit. Is this something that will/can be fixed, or is it permanent?
I would be grateful for your advice. Many thanks.
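Regarding the question above: when the card bumps into its TDP ceiling, the driver flags a software power-cap throttle reason, which you can watch with nvidia-smi while the game runs. A rough sketch, assuming the query fields available in drivers of this card's era (the helper names here are my own, not from the article):

```python
# Sketch: poll nvidia-smi for clock, power draw, and whether the driver
# is currently throttling under the software power cap (TDP limit).
# Assumes an NVIDIA driver with nvidia-smi on PATH; the query field names
# are an assumption based on nvidia-smi's documented --query-gpu fields.
import subprocess

QUERY = "clocks.sm,power.draw,clocks_throttle_reasons.sw_power_cap"

def parse_sample(csv_line):
    """Turn one 'csv,noheader,nounits' sample into (mhz, watts, power_capped)."""
    clock, power, cap = [field.strip() for field in csv_line.split(",")]
    return int(clock), float(power), cap == "Active"

def sample_gpu():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + QUERY,
         "--format=csv,noheader,nounits"], text=True)
    return parse_sample(out.strip().splitlines()[0])

if __name__ == "__main__":
    mhz, watts, capped = sample_gpu()
    print(f"{mhz} MHz, {watts} W, TDP-throttled: {capped}")
```

If the cap shows Active while clocks dip, raising the power target slider in a tool like EVGA Precision or MSI Afterburner (within the card's allowed range) is the usual remedy; the behavior itself is by design, not a defect.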
hoohoo - Thursday, October 23, 2014 - link
Thank you for including an HD 7970 in the test!
Shoiti2 - Monday, November 3, 2014 - link
Those prices are damn cheap. I would say that what buys a GTX 980 in the U.S. wouldn't even buy a GTX 970 in Brazil. I'm living in Brazil right now and ordered an EVGA GTX 970 SC. How much did I pay for the GTX 970? Nothing less than $750 USD. A GTX 970 at $750 USD is still considered cheap for us Brazilians; this is the world's most expensive country.
The EVGA GTX 980 costs around $1100 USD. Not kidding, check for yourself.
thesid - Monday, November 10, 2014 - link
I have a rig with an i7-2600K, 16 GB of DDR3-1600 (PC3-12800) RAM, and a GTX 570. If I upgrade to the GTX 970, will it be bottlenecked by the rest of the components?
atl - Friday, January 30, 2015 - link
Under power consumption, I would like to have a comparison of GPU-only consumption as well, not just the whole system.
SeanJ76 - Monday, February 9, 2015 - link
Yeah, that GTX 970 FTW is a damn good card. I was shocked at how close the performance of the 970 FTW was to the 980 (although this has always been the case with the FTW versions).
sheriff12 - Tuesday, July 7, 2015 - link
I'm MAD! When I first looked at the Gigabyte GTX 970 on amazon.com a month ago, it was $308; now it's $348. Just because sales are good, does that mean they should gouge the consumer for all they can?
thelategamer - Friday, May 5, 2017 - link
Here's my honest review of the GTX 970 and upgrading to SLI 970s rather than getting underperforming 1070s and 1080s - http://www.thelategamer.com/video-game-review/late...