And you need to quit trolling Anandtech. $200? For a 290X? LOL!! It's nearly the same speed as Nvidia's $1,000 Titan! If any company ever needed to drop pricing, it's Nvidia.
Let's be realistic: based on the GTX 970 starting at $330, when the GTX 960 is released it will be around $250-260 (no less than that, I suppose) to replace the older Kepler-based GTX 760.
Yeah, that may be so, but here they're all on back-order and they range from 370-420 in price. So it seems like it's the same as the 670/770 launch. Our dollar (CAD) is a bit down from what it was though, so that's a factor too.
Big orders mean big discounts. That is the beauty of game bundles: AMD/Nvidia are able to buy the games for you much, much cheaper than you could (in the same time frame).
The cards are fairly comparable, so I'd say the quicker AMD gets those prices down the better. I've been looking at the 970s, but I'd opt for a 290X, all pricing being equal.
Why in the world would you opt for a 290X over a GTX 970 at an equal price? The 970 has lower power consumption, better overclocking, better acoustics.... it's simply better every way possible.
Once you OC the 970 or 980, there is no contest whatsoever. We just have to keep in mind that most people don't overclock unless the card ships that way by default.
Looking strictly at frames per second, the 290X is neck and neck with the 970 in a lot of games, behind a little in some games, ahead by a lot in others (especially at 4K). Plus you get a ton of free games with the 290X. It's not a terrible choice if all you care about is pure performance.
An OC 290X can even hang with a stock 980 in some games for $200 less, which is a lot of money, halfway to a second 290X. Again, for the enthusiast overclocker, the 290X is not even in the running. The number of cases where a 290X is a better choice than a 970 or 980 is very, very small, but I can understand why some people would jump on the lower price.
There are a lot of options if you're going 970 (performance-wise): the 290, 290X, and 780 would all be excellent alternatives, in my opinion. Once you start eyeing that 980 though... there isn't anything comparable (right now) for single GPUs.
Power consumption is the last thing I think about in my gaming rig because, at the end of the day, you're talking maybe an extra dollar or two a month with heavy usage.
For 1-card configs this is generally true, you probably won't notice too much difference. However, in the case of a 970 vs. 290X, the difference is ~100W, which is definitely noticeable. Just turn on a 100W lightbulb in your office/gaming room and you'll notice a difference in just a few minutes. And if you start getting into multiple-card configs, that becomes a massive difference.
Also, the main concern isn't really power costs or the electric bill for most people. At least not for me. It's mainly about ambient room temperatures and using your computer in a comfortable environment. Going low power does make a really big difference. Sure, winter is coming and you can just crack a window, but the difference in room temps is really tangible.
"Just turn on a 100W lightbulb in your office/gaming room and you'll notice a difference in just a few minutes" You must be out of your mind. Please. Don't listen to this guy, he has no idea what he is talking about. The graphics card isn't going to make a noticeable difference in ambient room temperature unless you live in a sealed box.
LOL, I guess you haven't used a lightbulb in a while, it's hard to find 100W bulbs nowadays (they're actually quite dangerous), but you can get a decent comparison by using a couple 60W. Or just use 2 more 24" or larger monitors. They're about 50W each. Just goes to show you people don't seem to understand how big of an impact all these electronics have on ambient air temperatures.
You must have AC, garbagedisposal, and not realize how much harder it needs to work depending on the CPU usage. I don't have AC, and I can definitely tell the difference in the summer when my i5 and HD 4600 are working hard playing Civ V compared to web browsing.
Speaking of Civ 5.. my system bogs down even with a 4790K and a Radeon 280. I actually got a message stating my system was running poorly after several hrs of play. Never saw that before. Ah well.. Beyond Earth, only 8 more sleeps! :D
Average US electricity cost is 13 cents/kWh, so the 100W difference between the cards is 1.3 cents every hour. Let's say you game 40 hours/week, which if you're spending this kind of money on a GPU should be a minimum. 52x40=2080; for ease and vacations we'll say 2000 even. It costs an additional 1.3 cents per hour to run AMD, and over 2000 hours that's $26/year. Hardly negligible. Especially considering that extra $26/year isn't getting you anything - you're paying more for nothing at all, talk about stupid.
Where do you live that you *only* pay 13 cents/kWh? On my bill I have 17 for the power itself, an additional 9 for the transmission cost, 5 on top of that for some odd "renewable energy surtax" and an extra 2 for "local line maintenance". Why my line maintenance is based on the amount I use, and not a flat fee, is beyond me. Sum total, though, my power runs 33-34 cents/kWh, not including the administration fees.
Yeah, people say "this card is 20 dollars cheaper, it has such better price/performance" and in the next breath say something like "the electricity savings are only 3-4 dollars a month, big deal". Given that people will keep the cards for 2+ years, 3 dollars a month in electricity savings is $72 over 2 years, which almost certainly more than offsets a 20 dollar higher price up front.
And that doesn't even consider extra room cooling costs. For a heavy gamer, it probably is worth it to get a card that uses 100 less watts and performs the same as another card even if the first card costs 50 dollars more.
I don't think it works quite that way.. I usually have my computer on 24/7, but I was working out of town for 3 months.. the savings on my electrical bill wasn't as much as I'd hoped it would be. I was thinking around $50 a month.. it turned out to be $12-ish. There are many factors that come into play, I'm sure, and you can't just assume an instant savings of any given amount.
Not sure what your poor estimate of your total electric usage while away, compared to when you're there, has to do with a pretty well-established difference in energy usage between two systems. It's simple: you pay the utility d dollars per kWh used. If your system means you use x fewer kWh in a month than another system, you are going to pay dx dollars less than if you had the other system. Those are the factors that come into play.
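A minimal sketch of the arithmetic this sub-thread keeps doing by hand. All inputs (the ~100W gap, the 13 and 33 cents/kWh rates, 2000 gaming hours a year) are the posters' own ballpark figures, not measurements:

```python
# Extra running cost of a card that draws `watts_diff` more at load.
# Rates and hours are the hypothetical numbers from the discussion above.
def annual_cost_difference(watts_diff, dollars_per_kwh, hours_per_year):
    """Extra dollars per year: kW difference x hours x price per kWh."""
    return (watts_diff / 1000.0) * hours_per_year * dollars_per_kwh

print(annual_cost_difference(100, 0.13, 2000))  # about $26/year at the US average rate
print(annual_cost_difference(100, 0.33, 2000))  # about $66/year at the 33-cent rate
```

At 13 cents/kWh the 100W gap really is only ~$26/year, but at the 33-34 cent rates quoted above it more than doubles, which is part of why posters reach different conclusions from the same wattage.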
1. I don't overclock my cards and don't buy pre-overclocked cards.
2. The performance is fairly equal, so if the price is as well.. why not?
3. Backorder on 970s, whereas there is stock on 290/290X.
4. Go big or go home. I'd take a 980, but going down to the 970 you have just opened up several different options.
"The 970 has lower power consumption, better overclocking, better acoustics.... it's simply better every way possible."
Except in Company of Heroes 2, which is the only game I really ever play on the PC. However, I'm holding out for the 20nm R9 390/390X cards in early 2015 anyhow. They should blow the GTX970/980 cards completely out of the water.
According to "bench" comparisons on Anand.. it consumes less... but to say "MUCH"? Well.. no.. not really. But yeah.. it's a cooler-running card that does consume less power, which is always a good thing provided the performance is there, and it is.
lol, it's almost 1/2 the power consumption when comparing the 290X to the 970. Keep in mind AT benches are total system power. Most would consider 1/2 "MUCH".
No they're not.. geez, you're taking one of the highest-overclocked 290Xs and using it as a comparison. I am looking at stock speeds, which is the baseline you should be comparing.
Look Chi, there is no question that the 970 is a great card.. but it's not the 980. It's slower than the 780 Ti. "IF" you can get a 780 Ti for the same price (..new) it's a great deal. "IF" you can get a 290X for the same price (..definitely new, don't want a miner's card) or cheaper, then it's an option.
"IF" you can get a 780 or 290 for $50-100 less.. they're also great options, but you'll take a 10-20% hit in performance.
No, I am not using the highest-overclocked 290X; if you look at the AnandTech GTX 980 review you would understand:
http://www.anandtech.com/show/8526/nvidia-geforce-... "At this point you can still buy reference 290X cards, but the vast majority of retail cards will be of the non-reference variety, where the 290X Uber mode’s results will be more applicable."
You can't take the 290X's best-case performance and disregard the additional heat and power that come with it. So yes, even under Uber mode, the 970's power draw is MUCH lower.
No one contested any of the rest of what you are saying; I'm just pointing out that your attempt to correct another poster is clearly inaccurate. Power draw is MUCH lower for the 970 than for the 290X.
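For what it's worth, the "total system power" caveat works out like this. A sketch of the back-out arithmetic, where the ~150W rest-of-system baseline is the poster's rough assumption and the wall-socket totals (397W vs. 300W under load) are the figures quoted from AnandTech's bench, not fresh measurements:

```python
# Estimate a card's own draw from a wall-socket (total system) measurement
# by subtracting an assumed baseline for CPU, board, drives, fans, etc.
def gpu_only_estimate(total_system_watts, baseline_watts=150):
    """Card-only draw = measured total minus assumed rest-of-system draw."""
    return total_system_watts - baseline_watts

r9_290x_uber = gpu_only_estimate(397)  # 247W estimated for the 290X Uber
gtx_970 = gpu_only_estimate(300)       # 150W estimated for the 970
print(gtx_970 / r9_290x_uber)          # roughly 0.6 - the basis of the "half" claim
```

Note how sensitive the ratio is to the assumed baseline: the same two wall readings give a 97W absolute gap either way, but the *relative* gap between the cards grows as the baseline estimate grows.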
Whilst I'll admit that there's little support for it, there's one thing that AMD sports within the 290X that NVIDIA cannot claim to have - audio processing.
Quieter 290Xs can be found than the reference model.
Interesting, and definitely good for consumers. Speaking of video cards, how about another good video card roundup with recommended cards at the different levels? I'm thinking of getting a new video card soon, and I'd like to see the recommendations. Unless you're hearing about some big new tech coming from either or both, in which case wait until they release it so I have a better comparison :)
Yeah, the HIS can be had for $290 CAD in some places too... but that 760? Pfft.. good luck getting it at $200 in Canada. It's consistently been a $300 card, and looking over recent pricing (..for us) you can buy a 285 cheaper than a 760, which is just plain stupid.
Add to that, the 770 and 780 Ti certainly aren't offering anything close to what I'd consider a bargain here either.. it's not till you get to a 780 that you start to go hmmm.. I wonder. But that's comparable to the 970 in performance, with less RAM.
No surprise AMD has had to drop prices further, sooner, because they are still competing against Nvidia's last-gen which are getting steep price cuts to move while Maxwell parts are out-of-stock. 780Ti can be had for $400-$420, 780 can be had for $300-$330, so until these parts also sell out AMD will have to compete by dropping the 290X and 290 below those price points.
Well.. if we could actually get a 780 Ti for $400.. that would be quite nice. Much depends on where you are, I think. Here in Canada the cheapest 780 Ti is $520 (out of stock, of course..), with most just dropping their price to 980 levels to make it seem like a deal. Newegg in the States has the cheapest 780 at $430. A good option if you're looking at the 970 too, since it beats it by 8-13% in performance.
Forget the rebates.. they're a non-starter. But I'd take a 780 Ti over a 970 if the price was the same. Wouldn't you? It's 7-13% faster.
The 780 (non-Ti) is comparable to the 290 (non-X). They'd have to undercut by a decent amount to make them tempting buys.. $260ish I'd say would be a sweet spot.
Oh, and rebates are no problem if you can print a few forms, sign your name, and mail an envelope within the requisite time period. Takes like 5 minutes, I get every one of them I send out for.
I get every rebate I send out too, months later and sometimes in the form of an annoying prepaid credit card. Pay now and we might give you some of the money back later is the most annoying practice in the industry.
God, I wish there was an edit button.. anyway, it's odd. You know, I remember people talking about the 780.. it being 20% slower than the 780 Ti. But the 970, which is slower than the 780 Ti, appears to walk all over the 780. Something odd about that.. have to run the math, I suppose.
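Running that math with the rough percentages quoted around the thread - a 780 about 20% slower than a 780 Ti, a 970 about 10% slower - both being posters' ballpark figures rather than benchmark results:

```python
# Normalize each card to a 780 Ti = 1.0 baseline using the thread's
# ballpark figures (assumptions, not measured benchmark data).
gtx_780_ti = 1.00
gtx_780 = 0.80  # "20% slower than the 780 Ti"
gtx_970 = 0.90  # "~10% slower than 780 Ti"

# 0.90 / 0.80 = 1.125, i.e. the 970 comes out ~12.5% ahead of the 780.
print(gtx_970 / gtx_780 - 1)
```

So a ~12.5% edge is consistent with both quoted gaps; "walks all over the 780" would overstate it.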
I've actually seen in most of the hot deals threads people are more interested in the 970 over the 780Ti, even though I personally think the 780Ti is a much better built card. Just that power diff and the new Maxwell features I guess in addition to only being ~10% slower than 780Ti, but also a good bit cheaper.
How is 370 a good deal? The cards have practically equal performance, but in every other metric the GTX 970 is the winner: it uses less power, gives off less heat, makes less noise. If anything the GTX 970 should be the more expensive card, since it ties or beats the 290X in every way. In all reality, if the GTX 970 is 330 then the R9 290X should be like 290 or 300, because the GTX 970 is a slightly better card. Plus the 390X is just around the corner, so anyone looking to buy a Radeon now is really best off waiting for their next generation.
Firstly, not all of AMD's partner cards are loud! Sapphire, HIS, ... make models that are moderately quiet. Secondly, what almost every review forgets to mention is that AMD R9 cards still have better compute performance. While it's not important to some, as time passes it will get more and more important, especially now that the current-generation consoles have enough GPU power (and not a comparable amount of CPU power).
Get those Maxwell GPUs working on compute and their power usage quickly jumps up. Get those Hawaii GPUs on a better implementation of PowerTune and their power usage will drop a little.
Really, all the cries of "AMD iz doomed!!!!1" are premature. I'd be surprised if they didn't have a high-end answer coming within six months.
I'm just curious: do you have any source on a next-gen Radeon, e.g. the 390X or whatever, which you say is just around the corner? Or is it just rumours currently?
Yes, indeed, everybody in the know is under NDA, I guess - this is the usual practice. However, right now I think it may be beneficial for AMD to tease something, somehow, to prevent some number of people from switching to nV by buying GM204 - if, of course, AMD really has something new to tease.
Can someone explain to me why you would place power efficiency high on the list of reasons to pick one over the other? Is electricity that expensive in some places?
It's not the electricity usage, mainly - it's noise, heat, and in some cases the need to upgrade the power supply (read: buy a new, better PSU and do something with the old one, e.g. sell it for pennies).
It's more about heat output and ambient room temps. Generally, 1 card up to 250W is not going to make a huge difference for most high-end users, that's tolerable I'd say for most, especially if the difference between options is <50W. But once you start tacking on multiple GPUs, with multiple displays and a high-end rig backing it, the difference in ambient room temps in your PC gameroom/office and the rest of your house can be massive.
I generally tell people to do the "lightbulb test" to illustrate the difference. Using a 100W old-school incandescent bulb will definitely impact room temps within just a few minutes, to the point you can feel it. Choosing a card that consumes 250W over one that consumes 150W is no different in that respect - it's the same extra 100W heating the room.
Most people are more tolerant in the winter time, when they can just use the PC as a source of heat, or open a window, but in warmer summer months, there is no escaping it unless you crank up the AC.
I can't say I've ever felt a temperature difference from having a 100W incandescent bulb, I suppose the size of the room and its proximity would be important factors, however I have been in a small bathroom with a 250W bulb before and it was quite a thing to behold, both in terms of light and heat output.
If AMD were to eke out decent performance gains from their drivers over the next few months, the apparent power consumption gap may not matter so much. Here's hoping they have a team dedicated to looking at their D3D performance.
Haha yeah 250W might leave you with a bit of a tan even! But yes proximity/size of room definitely matters before you feel it, I use a floor standing lamp next to my PC desk and as soon as I turn it on, I feel the difference in the room in a matter of minutes (my home office is maybe 10'x14'). Over time however, the ambient temps in the room will rise.
But there are other examples people can relate to from their own upgrading/buying experiences. For example, going from an OC'd X58 rig to my current Z87 rig dropped light-usage power draw by ~75W, from 200W to ~125W. Also noticeable. And a few years ago, going from 2x GTX 480 to 2x GTX 670 made a massive difference in ambient room temps, to the point I won't ever go 2x 250W GPUs again. I've since swapped those 2x 670s for a single 980 and again, big drop, but I will probably go back to 2x 980 in the next few months.
Yes, I agree; regarding Hawaii, its power/heat situation is the reason I decided not to swap my current quiet ~200 W HD 7950 Boost (with Dual-X fans), bought a while ago, for a Hawaii card (especially since I've barely used it recently due to a lack of time).
That, and the 290/290X are right at the limits of their thermal output; quite a lot of them throttle, which means there's no overclocking headroom unless you swap the cooler it comes with.
A 290 or X variant with the stock cooler is a non-starter for most, I think. They have to overhaul that big time. Nvidia hit the right mark with their stock high-end cooler: even though there is way better out there, it has a nice bling feel to it and cools reasonably well, to the point where no one minds owning it. AMD has to follow suit.
IMO, just spending a little more $ for reduced power consumption to have a lesser environmental impact is virtuous behaviour. However the 970 also happens to be cheaper than the 290X; so you save $ on the purchase, save $ on power bills and save the environment a little. What's not to like?
But say the prices were reversed, I'd still opt for a 970 because of the lower power consumption.
Good environmental behavior is a good thing. I'm just afraid videocards don't matter much in terms of environmental impact, as long as people here in North America like to use vehicles with 5.7 L engines for personal transportation :)
Oooh, I just bought a new F-150 and I think it has a V8 variant of that :D (I do carry 32-ft ladders and usually about 800 lb of equipment, btw, so it's warranted.) I was using an '08 Ford Ranger before that with a V6 in it, and this one is getting roughly 25% better fuel economy.
They say the 2015s will be even better, dropping 700 lb in weight with a redesign, and while I was tempted to wait... I thought, hmm, if this is a near-total overhaul, what are the chances it comes with a few kinks that take time to work out?
The environmental impact of any single source will tend to be small in comparison to the aggregate of all sources. Change has to start somewhere.
The most important thing is a message is being sent. A big no no to energy/power inefficiency. When consumers have a preference for lower power consumption, corporations will try and work towards satisfying that preference. Personally I'd like to see the direction into reduced energy consumption spill into other industries too.
There are benefits to you even if you're an individual who is completely indifferent to environmental concerns. Electricity (or any source of energy) will be cheaper when there is less demand for it. It will save everyone $ on power consumption.
Yeah, I'll take the 290X, but please keep that glorified DLC of a game with the incomprehensible Aussie slang, 90's sitcom humor, and the low gravity "feature" that first came out with the original Unreal Engine in '99 - and I say so as a huge fan of the first two games. :(
meacupla - Tuesday, October 14, 2014 - link
The gaming bundles from both nvidia and AMD were not very interesting to me, so unfortunately, that extra "value" is lost on me.Just give me a good card, that I don't have to pay a kidney for, and we'll call it a day.
Wreckage - Wednesday, October 15, 2014 - link
AMD needs to drop the 290X to $200 and let people go and buy the games they like.Creig - Wednesday, October 15, 2014 - link
And you need to quit trolling Anandtech. $200? For a 290X? LOL!! It's nearly the same speed as Nvidia's $1,000 Titan! If any company ever needed to drop pricing, it's Nvidia.Hrel - Wednesday, October 15, 2014 - link
In fairness Nvidia is being very aggressive with the 970 and 980, even if the 760 is still over-priced.I'd really like to see them release the GTX960 at an MSRP of $200, tops!
TiGr1982 - Wednesday, October 15, 2014 - link
Let's be realistic: based on GTX 970 starting at $330, when GTX 960 will be released, it will be around $250-260 (I suppose, no less than that) to replace older Kepler-based GTX 760.just4U - Wednesday, October 15, 2014 - link
The 970 is being priced comparable to the 770/670 that came before. The 980 is competitively priced.Because0789 - Wednesday, October 15, 2014 - link
Nope, the 670 & 770 launched at $400 and then the 770 dropped to $330 and then got replaced by the 970 launched at $330.just4U - Thursday, October 16, 2014 - link
Yeah that may be so but here their all on back-order and they range from 370-420 in price. So it seems like it's the same as the 670/770 launch. Our dollar (CAD) is a bit down from what it was though so that's a factor to.meacupla - Wednesday, October 15, 2014 - link
$1000 titan? titan was discontinued, and for a while too, bro.And did you totally miss out on the GTX970?
They are far cheaper than what they replaced, which was GTX780.
przemo_li - Wednesday, October 15, 2014 - link
If they bought those games for full price....But they have not nor have Nvidia.
Big order mean big discount. That is beauty of game bundles.
AMD/Nvidia are able to buy them for You for much, much cheaper, then You could (in same time frame).
just4U - Tuesday, October 14, 2014 - link
The cards are fairly comparable so I'd say the quicker AMD gets those prices down the better. I've been looking at the 970s but I'd opt out for a 290x all pricing being equal.tviceman - Wednesday, October 15, 2014 - link
Why in the world would you opt for a 290x over a gtx970 at an equal price? The 970 has lower power consumption, better overclocking, better acoustics.... it's simply better every way possible.nathanddrews - Wednesday, October 15, 2014 - link
Once you OC the 970 or 980, there is no contest whatsoever. We just have to keep in mind that most people don't overclock unless the card ships that way by default.Looking strictly at frames per second, the 290X is neck and neck with the 970 in a lot of games, behind a little in some games, ahead by a lot in others (especially at 4K). Plus you get a ton of free games with the 290X. It's not a terrible choice if all you care about is pure performance.
An OC 290X can even hang with a stock 980 in some games for $200 less, which is a lot of money, halfway to a second 290X. Again, for the enthusiast overclocker, the 290X is not even in the running. The number of cases where a 290X is a better choice than a 970 or 980 is very, very small, but I can understand why some people would jump on the lower price.
just4U - Wednesday, October 15, 2014 - link
There are a lot of options if your going 970 (performance wise..) 290, 290x, 780 would all be excellent alternatives in my opinion. Once you start eyeing that 980 though... There isn't anything comparable (right now..) for single GPUs.
FITCamaro - Wednesday, October 15, 2014 - link
Power consumption is the last thing I think about in my gaming rig because, at the end of the day, you're talking maybe an extra dollar or two a month with heavy usage.hojnikb - Wednesday, October 15, 2014 - link
yeah, but with higer power also comes louder system :)anubis44 - Wednesday, October 22, 2014 - link
None of the aftermarket R9 290/290X cards (eg. Gigabyte, Asus, MSI, Sapphire, XFX) are loud. They're pretty close to silent. Get your facts straight.chizow - Wednesday, October 15, 2014 - link
For 1 card configs this is generally true, you probably won't notice too much difference, however, in the case of a 970 vs. 290X, the difference is ~100W, which is definitely noticeable. Just turn on a 100W lightbulb in your office/gaming room and you'll notice a different in just a few minutes. And, if you start getting into multiple card configs, that becomes a massive difference.Also, the main concern isn't really power costs, or electric bill for most people. At least not for me. It's mainly about ambient room temperatures and using your computer in a comfortable environment. Going low power does make a really big difference, winter is coming sure, so you can just crack a window, but the difference in room temps are really tangible.
garbagedisposal - Wednesday, October 15, 2014 - link
"Just turn on a 100W lightbulb in your office/gaming room and you'll notice a different in just a few minutes"You must be out of your mind.
Please. Don't listen to this guy, he has no idea what he is talking about. The graphics card isn't going to be making a noticeable difference in ambient room temperature unless you live in a sealed box.
chizow - Wednesday, October 15, 2014 - link
LOL, I guess you haven't used a lightbulb in a while, it's hard to find 100W bulbs nowadays (they're actually quite dangerous), but you can get a decent comparison by using a couple 60W. Or just use 2 more 24" or larger monitors. They're about 50W each. Just goes to show you people don't seem to understand how big of an impact all these electronics have on ambient air temperatures.Yojimbo - Thursday, October 16, 2014 - link
You must have ac, garbage disposal, and you don't realize how much harder it needs to work depending on the cpu usage. I don't have ac and I can definitely tell the difference in the summer when my i5 and HD4600 are working when playing civ v compared to web browsing.just4U - Thursday, October 16, 2014 - link
OT:Speaking of Civ 5 .. my system bogs down even with a 4790K and a Radeon280. I actually got a message stating my system was running poorly after several hrs of play. Never saw that before. AH well.. Beyond Earth only 8 more sleeps! :D
Hrel - Wednesday, October 15, 2014 - link
Average US electricity cost is 13 cents/kwh. So the 100W difference between the cards is 1.3 cents every hour. Let's say you game 40 hours/week, which if you're spending this kind of money on a GPU should be a minimum. 52x40=2080, for ease and vacations we'll say 2000 even. It costs an additional 1.3 cents to run AMD every hour, over 2000 hours that's $26/year. Hardly negligible. Especially considering that extra $30/year or so isn't getting you anything, you're paying more for nothing at all, talk about stupid.takeship - Wednesday, October 15, 2014 - link
Where do you live that you *only* pay 13 cents/kwh? On my bill I have 17 for the power itself, an addition 9 for the transmission cost, 5 on top of that for some odd "renewable energy surtax" and an extra 2 for "local line maintenance". Why my line maintenance is based on the amount I use, and not a flat fee, is beyond me. Sum total though, my power runs 33~34cents/kwh, not including the administration fees.Yojimbo - Thursday, October 16, 2014 - link
Yeah, people say "this card is 20 dollars cheaper it has such better price/performance" and in the next breath say something like "the electricty savings are only 3-4 dollars a month, big deal". Being that people will keep the cards for 2+ years, 3 dollars a month in electricty savings is $36 over 2 years, which almost certainly more than offsets a 20 dollar higher price up front.Yojimbo - Thursday, October 16, 2014 - link
Sorry, 72 dollars over 2 years. In this case it's quite a significant difference.Yojimbo - Thursday, October 16, 2014 - link
And that doesn't even consider extra room cooling costs. For a heavy gamer, it probably is worth it to get a card that uses 100 less watts and performs the same as another card even if the first card costs 50 dollars more.just4U - Thursday, October 16, 2014 - link
I don't think it works quite that way.. I usually have my computer on 24/7 but was working out of town for 3months.. the savings on my electrical bill wasn't as much as I'd hoped it would be. I was thinking around $50 a month.. turned out to be 12ish. There are many factors that come into play I am sure and you can't just assume a instant savings of any given amount.Yojimbo - Friday, October 17, 2014 - link
Not sure what your poor estimation of your total electric usage while away compared to when you're there has to do with a pretty well-established difference in energy usage between two systems. It's simple. You pay the utility d dollars per kWhr used. If your system means you use x less kWhr in a month than another system, you are going to pay dx dollars less than if you had the other system. These are the factors that come into play.just4U - Wednesday, October 15, 2014 - link
1. I don't over clock my cards and don't buy pre overclocked cards.2. The performance is fairly equal so if the price is as well.. why not?
3. Backorder on 970s whereas there is stock on 290/x
4. Go big or go home. I'd take a 980 but going down to the 970 you have just opened up several different options.
anubis44 - Wednesday, October 22, 2014 - link
"The 970 has lower power consumption, better overclocking, better acoustics.... it's simply better every way possible."Except in Company of Heroes 2, which is the only game I really ever play on the PC. However, I'm holding out for the 20nm R9 390/390X cards in early 2015 anyhow. They should blow the GTX970/980 cards completely out of the water.
HJTh3Best - Tuesday, October 14, 2014 - link
No thanks, I'm all green but not because of the Nvida brand, its because the GTX 970 consume MUCH less power than the R9 290X.just4U - Tuesday, October 14, 2014 - link
According to "bench" comparisons on Anand.. it consumes less... but to say "MUCH" well.. no.. not really.. but yeah.. it's a cooler running card, that does consume less power which is always a good thing provided performance is there and it is.mindbomb - Tuesday, October 14, 2014 - link
The difference is pretty significant. It's like a 100 watt difference at load. 100 watts can power like 3 computers with ulv haswells.przemo_li - Wednesday, October 15, 2014 - link
Isn't that stock Nvidia card?OCed GPUs have worse pow/perf ratio. (Power consumption go up faster then frequency)
chizow - Tuesday, October 14, 2014 - link
lol its almost 1/2 the power consumption when comparing the 290X to 970, keep in mind, AT benches are total system power. Most would consider 1/2 "MUCH".just4U - Wednesday, October 15, 2014 - link
12watts difference at idle and 60watts in games according to the bench. That's not half Chi.chizow - Wednesday, October 15, 2014 - link
http://anandtech.com/bench/product/1355?vs=1059290X Uber (this is what aftermarket cards are clocked at) draws 97W more in Furmark and 97W in Crysis than 970.
Back out ~150W for the rest of the system and you see, it is about half
(300-150)=150 and (397-150)=247
(284-150)=134 and (381-150)=231
"MUCH" less power. ;)
just4U - Wednesday, October 15, 2014 - link
no their not.. geez your taking one of the highest overclocked 290x and using it as a comparison. I am looking at stock speeds which is the baseline you should be comparing.Look Chi, There is no question that the 970 is a great card.. but it's not the 980. It's slower than the 780ti. "IF" you can get a 780ti for the same price (..new) it's a great deal. "IF" you can get a 290x for the same price (.. definitely new don't want a miners card) or cheaper then it's a option.
"IF" you can get a 780 or 290 for 50-100 less.. their also great options but you'll take a 10-20% hit in performance.
garbagedisposal - Wednesday, October 15, 2014 - link
Your love for nvidia is so strong. You must think about their products every day before you go to sleep. What a sad existence.chizow - Wednesday, October 15, 2014 - link
No, I am not using the highest overclocked 290X; if you look at the AnandTech GTX 980 Review you would understand:
http://www.anandtech.com/show/8526/nvidia-geforce-...
"At this point you can still buy reference 290X cards, but the vast majority of retail cards will be of the non-reference variety, where the 290X Uber mode’s results will be more applicable."
You can't take the 290X's best-case performance and disregard the additional heat and power that come with it. So yes, under Uber mode, the 970's power draw is MUCH lower.
No one contested the rest of what you are saying; I'm just pointing out that your attempt to correct another poster is clearly inaccurate. The 970's power draw is MUCH lower than the 290X's.
Creig - Wednesday, October 15, 2014 - link
Well, if you really wanted to save electricity, you should just stick to integrated video then.
Wreckage - Wednesday, October 15, 2014 - link
The 290x is louder and puts out more heat. It performs about the same, but costs more and has fewer features.
silverblue - Wednesday, October 15, 2014 - link
Whilst I'll admit that there's little support for it, there's one thing that AMD sports within the 290X that NVIDIA cannot claim to have - audio processing. Quieter 290Xs than the reference model can be found.
Drizzt321 - Tuesday, October 14, 2014 - link
Interesting, and definitely good for consumers. Speaking of video cards, how about another good video card roundup with recommended cards at the different levels? I'm thinking of getting a new video card soon, and I'd like to see the recommendations. Unless you're hearing about some big new tech coming from either or both, in which case wait until they release it so I have a better comparison :)
meacupla - Tuesday, October 14, 2014 - link
I've seen the XFX R9 290 for as low as C$280, which is like US$250.
just4U - Tuesday, October 14, 2014 - link
Yeah, the HIS can be had for $290 CAD in some places too... but that 760? Pfft.. good luck getting it at 200 in Canada. It's consistently been a $300 card, and looking over recent pricing (..for us) you can buy a 285 cheaper than a 760, which is just plain stupid.
just4U - Tuesday, October 14, 2014 - link
Add to that, the 770 and 780Ti certainly aren't offering anything close to what I'd consider a bargain here either.. it's not till you get to a 780 that you start to go hmmm.. I wonder. But that's comparable to the 970s in performance, with less RAM.
anubis44 - Wednesday, October 22, 2014 - link
Not to mention that any decent GTX970 is very close to or over C$400 in Canada. Not exactly the 'bargain' it's supposed to be in all the reviews.
anubis44 - Wednesday, October 22, 2014 - link
You can get an XFX R9 290 for C$269 at NCIX right now, with $30 MIR:
http://www.ncix.com/detail/xfx-radeon-r9-290-doubl...
chizow - Tuesday, October 14, 2014 - link
No surprise AMD has had to drop prices further, sooner, because they are still competing against Nvidia's last-gen cards, which are getting steep price cuts to move while Maxwell parts are out of stock. The 780Ti can be had for $400-$420, the 780 for $300-$330, so until those parts also sell out, AMD will have to compete by dropping the 290X and 290 below those price points.
just4U - Wednesday, October 15, 2014 - link
Well.. if we could actually get a 780Ti for $400.. that would be quite nice. Much depends on where you are, I think. Here in Canada the cheapest 780Ti is $520 (out of stock of course..), with most just dropping their price to 980 levels to make it seem like a deal. Newegg in the States has the cheapest 780 at 430. A good option if you're looking at the 970 too, since it beats it by 8-13% in performance.
chizow - Wednesday, October 15, 2014 - link
Newegg has the 780Ti for $400 after rebate, 2 options from PNY for $399.99 and another for $409.99:
http://www.newegg.com/Product/ProductList.aspx?Sub...
There have even been a few drops to $380 in Shell Shocker or Deals of the Day.
780s are going for $300, some as low as $280 with promo codes.
just4U - Wednesday, October 15, 2014 - link
Forget the rebates.. they're a non-starter, but I'd take a 780Ti over a 970 if the price was the same. Wouldn't you? It's 7-13% faster. The 780 (non-Ti) is comparable to the 290 (non-X). They'd have to undercut by a decent amount to make them tempting buys.. $260ish I'd say would be a sweet spot.
chizow - Wednesday, October 15, 2014 - link
Oh, and rebates are no problem if you can print a few forms, sign your name, and mail an envelope within the requisite time period. Takes like 5 minutes; I get every one I send out.
Flunk - Thursday, October 16, 2014 - link
I get every rebate I send out too, months later and sometimes in the form of an annoying prepaid credit card. "Pay now and we might give you some of the money back later" is the most annoying practice in the industry.
just4U - Wednesday, October 15, 2014 - link
God I wish there was an edit button.. anyway, it's odd, you know. I remember people saying the 780 was 20% slower than the 780Ti, but the 970, which is also slower than the 780Ti, appears to walk all over the 780. Something odd about that.. have to run the math I suppose.
chizow - Wednesday, October 15, 2014 - link
I've actually seen in most of the hot deals threads that people are more interested in the 970 over the 780Ti, even though I personally think the 780Ti is a much better built card. Just the power difference and the new Maxwell features, I guess, in addition to being only ~10% slower than the 780Ti while a good bit cheaper.
Laststop311 - Wednesday, October 15, 2014 - link
How is 370 a good deal? The cards have practically equal performance, but in every other metric the GTX 970 is the winner: uses less power, gives off less heat, makes less noise. If anything, the GTX 970 should be the more expensive card since it ties or betters the 290X in every way. In all reality, if the GTX 970 is 330, then the R9 290X should be like 290 or 300, because the GTX 970 is a slightly better card. Plus the 390X is just around the corner, so anyone looking to buy a Radeon now is really best off waiting for their next generation.
milli - Wednesday, October 15, 2014 - link
Firstly, not all of AMD's partner cards are loud! Sapphire, HIS, ... make models that are moderately quiet. Secondly, what almost every review forgets to mention is that AMD R9 cards still have better compute performance. While it's not important to some, as time passes it will get more and more important, especially now that the current-generation consoles have enough GPU power (and not a comparable amount of CPU power).
przemo_li - Wednesday, October 15, 2014 - link
And that Mantle thing in a few games adds an extra bit.
silverblue - Wednesday, October 15, 2014 - link
Get those Maxwell GPUs working on compute and their power usage quickly jumps up. Get those Hawaii GPUs on a better implementation of PowerTune and their power usage will drop a little. Really, all the cries of "AMD iz doomed!!!!1" are premature. I'd be surprised if they didn't have a high-end answer coming within six months.
TiGr1982 - Wednesday, October 15, 2014 - link
I'm just curious if you have any source on a next-gen Radeon, e.g. the 390X or whatever, which, you say, is just around the corner? Or is it just rumours currently?
The Von Matrices - Wednesday, October 15, 2014 - link
If they have any official information, it's highly likely that they're under an NDA anyway and can't tell you anything official.
TiGr1982 - Wednesday, October 15, 2014 - link
Yes, indeed, everybody in touch has an NDA, I guess - this is the usual practice. However, right now, I think it may be beneficial for AMD itself to tease something somehow, to keep some people from switching to nV by buying GM204 - if, of course, AMD has something really new to tease.
IntelligentAj - Wednesday, October 15, 2014 - link
Can someone explain to me why you would place power efficiency high on the list of reasons to pick one over the other? Is electricity that expensive in some places?
TiGr1982 - Wednesday, October 15, 2014 - link
It's not mainly the electricity usage - it's noise, heat, and in some cases the need to upgrade the power supply (read: buy a new, better PSU and do something about the old one, e.g. sell it for pennies).
chizow - Wednesday, October 15, 2014 - link
It's more about heat output and ambient room temps. Generally, one card up to 250W is not going to make a huge difference for most high-end users; that's tolerable I'd say for most, especially if the difference between options is <50W. But once you start tacking on multiple GPUs, with multiple displays and a high-end rig backing it, the difference in ambient room temps between your PC gameroom/office and the rest of your house can be massive. I generally tell people to do the "lightbulb test" to illustrate the difference. Using a 100W old-school incandescent bulb will definitely impact room temps within a few minutes, to the point you can feel it. It's no different with choosing between a card that consumes 150W and one that consumes 250W.
Most people are more tolerant in the winter time, when they can just use the PC as a source of heat, or open a window, but in warmer summer months, there is no escaping it unless you crank up the AC.
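For the pure electricity-cost side of the question, a quick sketch helps (the hours-per-day and price-per-kWh figures below are illustrative assumptions, not quotes from anyone's bill):

```python
# Rough yearly cost of a given power-draw difference while gaming.
# Hours per day and electricity price are illustrative assumptions.
def yearly_cost_usd(delta_watts, hours_per_day=3, price_per_kwh=0.12):
    """Cost per year of an extra delta_watts of draw during gaming hours."""
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# The ~100 W load difference discussed above:
print(f"~${yearly_cost_usd(100):.2f}/year")  # roughly $13/year
```

So on the power bill alone the gap is modest; the heat, noise, and PSU arguments above carry most of the weight.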
silverblue - Wednesday, October 15, 2014 - link
I can't say I've ever felt a temperature difference from having a 100W incandescent bulb - I suppose the size of the room and its proximity would be important factors. However, I have been in a small bathroom with a 250W bulb before, and it was quite a thing to behold, both in terms of light and heat output. If AMD were to eke out decent performance gains from their drivers over the next few months, the apparent power consumption gap may not matter so much. Here's hoping they have a team dedicated to looking at their D3D performance.
chizow - Wednesday, October 15, 2014 - link
Haha, yeah, 250W might even leave you with a bit of a tan! But yes, proximity/size of room definitely matters before you feel it. I use a floor-standing lamp next to my PC desk, and as soon as I turn it on, I feel the difference in the room within a matter of minutes (my home office is maybe 10'x14'). Over time, the ambient temps in the room rise. But there are other examples people can relate to from their own upgrading/buying experiences. For example, going from an OC'd X58 rig to my current Z87 rig dropped light-usage power draw by ~75W, from 200W to ~125W. Also noticeable. And a few years ago, going from 2x GTX 480 to 2x GTX 670 made a massive difference in ambient room temps, to the point I won't ever go 2x 250W GPUs again. I've since swapped those 2x 670s for a single 980 and again, big drop, but I will probably go back to 2x 980 in the next few months.
TiGr1982 - Wednesday, October 15, 2014 - link
2x GTX 480 - that was quite a grill, I guess :) (the 480 was fast for its time, but not exactly power efficient even by 40 nm GPU standards :) )
chizow - Wednesday, October 15, 2014 - link
Hehe, yep, extremely hot and power hungry, but not quite as bad as the 290/290X ;) But yes, that's exactly why I won't go 2x 250W GPUs again.
TiGr1982 - Wednesday, October 15, 2014 - link
Yes, I agree; regarding Hawaii, its power/heat situation is the reason I decided not to swap my current quiet ~200W HD 7950 Boost (with Dual-X fans), bought a while ago, for a Hawaii card (especially since I've barely used it recently due to a lack of time).
meacupla - Wednesday, October 15, 2014 - link
Heat and power constraints in an mITX system? That, and the 290/290X are right at the limits of their thermal output; quite a lot of them throttle, which means there's no overclocking headroom unless you swap the cooler it comes with.
just4U - Wednesday, October 15, 2014 - link
A 290 or X variant with the stock cooler is a non-starter for most, I think. They have to overhaul that big time. Nvidia hit the right mark with their stock high-end cooler. Even though there is way better out there, it has a nice bling feel to it and cools reasonably well, to the point where no one minds owning it. AMD has to follow suit.
aphroken - Wednesday, October 15, 2014 - link
IMO, just spending a little more $ for reduced power consumption to have a lesser environmental impact is virtuous behaviour. However, the 970 also happens to be cheaper than the 290X, so you save $ on the purchase, save $ on power bills, and save the environment a little. What's not to like? But say the prices were reversed: I'd still opt for a 970 because of the lower power consumption.
TiGr1982 - Wednesday, October 15, 2014 - link
Good environmental behavior is a good thing. I'm just afraid video cards don't matter much in terms of environmental impact, as long as people here in North America like to use vehicles with 5.7 L engines for personal transportation :)
just4U - Thursday, October 16, 2014 - link
Oooh, I just bought a new F-150 and I think it has a V8 variant of that :D (I do carry 32-ft ladders and usually about 800 lb of equipment, btw, so it's warranted). I was using an '08 Ford Ranger before that with a V6 in it, and this one is getting roughly 25% better fuel economy. They say the 2015s will be even better, dropping 700 lb in weight with a redesign, and while I was tempted to wait... I thought, hmm... if this is a near-total overhaul, what are the chances it's going to come with a few kinks that take time to work out?
aphroken - Thursday, October 16, 2014 - link
The environmental impact of any single source will tend to be small in comparison to the aggregate of all sources; change has to start somewhere. The most important thing is that a message is being sent: a big no-no to energy/power inefficiency. When consumers have a preference for lower power consumption, corporations will try to work towards satisfying that preference. Personally, I'd like to see the push toward reduced energy consumption spill into other industries too.
There are benefits to you even if you're an individual who is completely indifferent to environmental concerns. Electricity (or any source of energy) will be cheaper when there is less demand for it. That saves everyone $ on power.
D. Lister - Friday, October 17, 2014 - link
Yeah, I'll take the 290X, but please keep that glorified DLC of a game, with the incomprehensible Aussie slang, 90's sitcom humor, and the low-gravity "feature" that first came out with the original Unreal Engine in '99 - and I say so as a huge fan of the first two games. :(
SeanJ76 - Sunday, October 26, 2014 - link
AMD=GARBAGE
SeanJ76 - Sunday, October 26, 2014 - link
The 290X is easily beaten by the $330 GTX 970... why would anyone buy a 290X??