"This is what we call the trifecta of graphics hardware: better performance, lower power, and lower prices. When NVIDIA unveiled the GTX 750 Ti back in February, it achieved the same trifecta for the $150 market segment" No it didn't. Lower power, yes, but performance per $ of the GTX 750 Ti was below that of its competitors.
Sorry, I guess I should have clarified that a bit more. I've tweaked the text now. It's true that AMD had (and still has) very competitive parts for the $150 market, though it's really a case of making a sacrifice on power to get higher performance. GM107 is a more balanced part overall I think, but like all $150 GPUs you have to compromise on some areas.
In my opinion, the point of GeForce and Radeon is gaming, so gaming performance should be the last function to be compromised. Reduced power/heat is nice, but not if it doesn't fulfill its primary function at the appropriate price.
"Cool, is that the new 750 Ti?" "Yeah dude, it only draws like 60W so it doesn't even need an extra power connector!" "How much did it cost?" "$165... but the long-term costs of running the card will..." "How does it play?" "Well, it's OK, but it's slower than AMD's less expensive cards."
Requiring only single PCI-E connector is critical since most people have a PSU that have one of these. I upgraded to the GTX 660 since it was the best performing card at the time that only required one PCI-E connector.
I regularly see EVGA 750 Ti's for $120-$130 AR. I honestly didn't know they MSRP for $150 until now. I've purchased two for builds and they've always been less than that when I looked.
Yeah I got an EVGA 750TI FTW for $125 including shipping and tax. $165 is definitely too much too pay. For the price I got it, it's a great card and overclocks very well.
Probably a 990 before TitanZ anytime Nvidia wants to bring it to market. We shouldn't be asking for another $3k failboat though. Titan 2 and Z2 will probably be 2015 on 20nm with GM200 Big Maxwell.
Yeah, "Big Maxwell" is still not here I don't think -- this is the Large Maxwell, but the replacement for GK110 is probably going to come out in another 6-9 months knowing NVIDIA. They've already shown with GK110 that they can do an even larger chip, so I suspect we'll see something like 3072 CUDA cores in a 550-600 mm2 chip next year. Will it be GM200 or GM210, though, that's the question!
I should also note that all signs are NVIDIA and AMD are completely skipping 20nm and will wait for the FinFET version (which is sort of 16nm, depending on how you want to look at it). I don't think we'll see that until late 2015 at best, but TSMC is welcome to prove me wrong.
would be good if you could point to what those signs are! some of us are hesitant about buying right now since we don't want to get burned by a 20nm maxwell refresh showing up in a few short months.
Don't wait for the next round if you see a good reason to upgrade now. Otherwise wait, as always. The reason is pretty simple: it costs nVidia a lot to perform such a large scale launch and produce those new chips (big Maxwell 2 and GM206 are also coming, says the rumor mill). They wouldn't do this if they'd plan to introduce new chips soon. The GPU update cycles have become to long, and the cost for them to high to perform any short-lived stunts. That's why AMD was not updating their entire lineup straight to GCN 1.1, and is not performing a full update to GCN 1.2 now either.
The only emotion you should ever expect TSMC to invoke is disappointment. I hate to say that the people claiming that Moore's law is dead were right but it's time to face some facts here.
Ah yes, good point, forgot the latest rumors were that GM200/210 would also be a monster 600+mm^2 chip on 28nm. With Maxwell's efficiency, it should still come in at ~GK210 TDP and will probably be a similar increase over GK210 as GM204 was over GK104.
Indeed, since "medium" Maxwell GK204 (GTX 980/970) has 16 SMMs, the really big one (GM210 or GM200, whatever the naming will be) is expected to have anything from 24 to 30 SMMs, putting the CUDA core count from 3072 to 3840. If the last case, then one will get 3840 CUDA cores for 4K, 3840x2160 :)
I have never actually been able to test 10-bit color, but my understanding is that this is an application issue along with a driver issue -- current hardware can already do 10-bit color, but NVIDIA at least only exposes this on Quadro cards. But there are some that say it works on GeForce with Linux drivers. TBH, if you're an imaging professional that really needs 10-bit color output and you know how to get it, then you can probably afford paying a premium to get a Quadro. And if you're not an imaging professional, I think the real world benefit will be negligible at best. (Aren't the 10-bit LCDs really just 8-bit panels with FRC anyway?)
Thx a lot Jarred, i was thinking because all the >40" 4K TVs are advertised having 10bit panels & Billions of colors. Together with 4K specifications with vastly improved color space support for UHD (Rec.2020) & BluRay 4K (xvColor) (and finally 60p & 4:4:4 chroma subsampling) i thought the only missing piece to be the gpu... As a prosumer obviously $4000 for a gtx780 equivalent gpu for gaming is just silly. But i am still interested in a 4K 10bit capable wide color gamut TV since thats a 10 year investment and thus be future proof... so thanks for pointing my eyes to FRC and 10bit PR niceties!
GTX 980 is GM204, not GM207; there is a doubt there will be GM207 (2nd gen small Maxwell) because there is already 1st gen small Maxwell out there, GM107 (GTX 750 Ti/GTX 750), and it seems to be good enough already in its class.
"NVIDIA doesn't disappoint, however, dropping power consumption by 18% relative to the GTX 780 Ti while improving performance by roughly 10% and dropping the launch price by just over 20%."
That 18% change is total system power consumption. Since you are comparing performance changes due to the graphics card alone, it is appropriate to compare the change in power consumption due to the graphics card alone.
The GTX 970 looks to be about as close to the perfect GPU as one could expect in 2014. I'm tempted to upgrade, but my OC'd Radeon 7850 should tide me over until Pascal. By then hopefully there will be some DX12 games to take advantage of all this great hardware.
I'm as excited now as I was for the GTX 670 launch (which is a lot). While AMD tempted me temporarily with their R9 2XX series, their throttling issues and huge power consumption kept me away. I will have a GTX 970. I just hope AMD responds with either a big price drop or new GPUs to keep the performance/price war going.
I'd hate to be schmuck who finally decided to cave and go ahead and purchase a new PC with a Titan Black GPU, and less than two weeks later see he could have afforded dual 980s for the same price, less power consumption and undoubtedly far better performance.
My question to you would be; if you're both wealthy enough to afford those GPUs and enough of an enthusiast to be reading and commenting this article, why on earth would you have bought such pricey hardware literally weeks before a new launch? It's not as if it was a well kept secret, it's been known for months that September was the golden month. Can never get my head around people spending so much money with so little research into the product beforehand.
Super computer builders are going to love the 980, or its fully unlocked Quadro/Titan variant with 8-12GB GDDR5. A SC built with those, plus the new Xeons, is going to put up some record breaking numbers in performance and efficiency.
Presumably, based on how things were with Kepler GPUs, GK204 Maxwell GPU has relatively low FP64 performance, so its usage in HPC (High Performance Computing) will be somewhat limited.
The supposed Big Maxwell (GM200 or GM210, whatever the name), however, is then expected to be a computing monster with full FP64 performance.
Wow, I'm pointlessly saying what I said in response to the other argument...that is really impressive. Noticeably better performance than a Geforce 780 TI for 64 watts LESS power on the same die process.
NICE!
Even if the 980 was the end of it, it would STILL be a nice replacement given significantly less power with better performance, but of course things get even more exciting when you consider they can probably go bigger than the 980 next year...think of a part with 2880 or whatever cores with Maxwell's efficiency... It's equally as exciting as notebooks. From my mobile 680 to the mobile 880 they've managed to (surprisingly) eke out quite a nice boost of performance between having one more set of cores active and surprisingly faster clocks, but Maxwell will be a really nice boost on top of that.
Hmm...I wonder how the Geforce GTX 750 does with Folding at Home versus a Geforce GT 430... I just realized that the 750 MIGHT be low power enough to work in a lame "slim" system with half height cards and a small PSU. I've had it Folding for the past 4.5 years on a GT 430, without much better I could stick in there...I need to look into this.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
41 Comments
Back to Article
Kalelovil - Friday, September 19, 2014 - link
"This is what we call the trifecta of graphics hardware: better performance, lower power, and lower prices. When NVIDIA unveiled the GTX 750 Ti back in February, it achieved the same trifecta for the $150 market segment"No it didn't. Lower power, yes, but performance per $ of the GTX 750 Ti was below that of its competitors.
JarredWalton - Friday, September 19, 2014 - link
Sorry, I guess I should have clarified that a bit more. I've tweaked the text now. It's true that AMD had (and still has) very competitive parts for the $150 market, though it's really a case of sacrificing power efficiency to get higher performance. GM107 is a more balanced part overall, I think, but like all $150 GPUs you have to compromise in some areas.

nathanddrews - Friday, September 19, 2014 - link
In my opinion, the point of GeForce and Radeon is gaming, so gaming performance should be the last function to be compromised. Reduced power/heat is nice, but not if it doesn't fulfill its primary function at the appropriate price.

"Cool, is that the new 750 Ti?"
"Yeah dude, it only draws like 60W so it doesn't even need an extra power connector!"
"How much did it cost?"
"$165... but the long-term costs of running the card will..."
"How does it play?"
"Well, it's OK, but it's slower than AMD's less expensive cards."
crimson117 - Friday, September 19, 2014 - link
"But it was the best card that my Dell could handle without upgrading the PSU, and it beats my old integrated GPU by far, so I went with it!"davidcTecher - Saturday, September 27, 2014 - link
Requiring only a single PCI-E connector is critical, since most people have a PSU with just one of these. I upgraded to the GTX 660 since it was the best-performing card at the time that only required one PCI-E connector.

MadMan007 - Friday, September 19, 2014 - link
Hmm, then good thing I got my MSI GTX 750 Ti for <$100 all told, after an AMEX discount, MIR, and selling the game coupon that came with it.

nathanddrews - Friday, September 19, 2014 - link
You are clearly in the minority on that one, but I congratulate you!

Samus - Saturday, September 20, 2014 - link
I regularly see EVGA 750 Ti's for $120-$130 AR. I honestly didn't know the MSRP was $150 until now. I've purchased two for builds and they've always been less than that when I looked.

aenews - Saturday, September 20, 2014 - link
Yeah, I got an EVGA 750 Ti FTW for $125 including shipping and tax. $165 is definitely too much to pay. For the price I got it, it's a great card and overclocks very well.

Samus - Saturday, September 20, 2014 - link
Wait, come again, a GTX 750 Ti is as fast as a GTX 660 and R7 270X, both of which cost more?

Kalelovil - Saturday, September 20, 2014 - link
No, it isn't.

RU482 - Friday, September 19, 2014 - link
Any idea if a Maxwell refresh of the Titan-Z is in the works?

chizow - Friday, September 19, 2014 - link
Probably a 990 before a Titan Z, anytime NVIDIA wants to bring it to market. We shouldn't be asking for another $3k failboat though. Titan 2 and Z2 will probably be 2015 on 20nm with GM200 Big Maxwell.

JarredWalton - Friday, September 19, 2014 - link
Yeah, "Big Maxwell" is still not here I don't think -- this is the Large Maxwell, but the replacement for GK110 is probably going to come out in another 6-9 months knowing NVIDIA. They've already shown with GK110 that they can do an even larger chip, so I suspect we'll see something like 3072 CUDA cores in a 550-600 mm2 chip next year. Will it be GM200 or GM210, though, that's the question!JarredWalton - Friday, September 19, 2014 - link
I should also note that all signs are NVIDIA and AMD are completely skipping 20nm and will wait for the FinFET version (which is sort of 16nm, depending on how you want to look at it). I don't think we'll see that until late 2015 at best, but TSMC is welcome to prove me wrong.

maximumGPU - Friday, September 19, 2014 - link
Would be good if you could point to what those signs are! Some of us are hesitant about buying right now, since we don't want to get burned by a 20nm Maxwell refresh showing up in a few short months.

MrSpadge - Friday, September 19, 2014 - link
Don't wait for the next round if you see a good reason to upgrade now. Otherwise wait, as always. The reason is pretty simple: it costs NVIDIA a lot to perform such a large-scale launch and produce those new chips (big Maxwell 2 and GM206 are also coming, says the rumor mill). They wouldn't do this if they planned to introduce new chips soon. The GPU update cycles have become too long, and the costs too high, to perform any short-lived stunts. That's why AMD did not update their entire lineup straight to GCN 1.1, and is not performing a full update to GCN 1.2 now either.

willis936 - Friday, September 19, 2014 - link
The only emotion you should ever expect TSMC to invoke is disappointment. I hate to say that the people claiming Moore's law is dead were right, but it's time to face some facts here.

chizow - Saturday, September 20, 2014 - link
Ah yes, good point, forgot the latest rumors were that GM200/210 would also be a monster 600+mm^2 chip on 28nm. With Maxwell's efficiency, it should still come in at ~GK110 TDP and will probably be a similar increase over GK110 as GM204 was over GK104.

TiGr1982 - Monday, September 22, 2014 - link
Indeed, since "medium" Maxwell GK204 (GTX 980/970) has 16 SMMs, the really big one (GM210 or GM200, whatever the naming will be) is expected to have anything from 24 to 30 SMMs, putting the CUDA core count from 3072 to 3840. If the last case, then one will get 3840 CUDA cores for 4K, 3840x2160 :)chizow - Friday, September 19, 2014 - link
Great write-up Jarred. Concise and packed full of info. You sure that's only 1k words? :)

JarredWalton - Friday, September 19, 2014 - link
I rounded down from 1040 (1049 if you count the title). Hahaha.

MadMan007 - Friday, September 19, 2014 - link
So it's 1 kibiword user space, with some overprovisioning?

bernstein - Friday, September 19, 2014 - link
@JarredWalton do you know if the GTX 980 (GM207) can output 10-bit color depth on a 10-bit panel?

JarredWalton - Monday, September 22, 2014 - link
I have never actually been able to test 10-bit color, but my understanding is that this is an application issue along with a driver issue -- current hardware can already do 10-bit color, but NVIDIA at least only exposes this on Quadro cards. But there are some that say it works on GeForce with Linux drivers. TBH, if you're an imaging professional that really needs 10-bit color output and you know how to get it, then you can probably afford paying a premium to get a Quadro. And if you're not an imaging professional, I think the real-world benefit will be negligible at best. (Aren't the 10-bit LCDs really just 8-bit panels with FRC anyway?)

bernstein - Monday, September 22, 2014 - link
Thx a lot Jarred. I was thinking about this because all the >40" 4K TVs are advertised as having 10-bit panels and "billions of colors." Together with the 4K specifications' vastly improved color space support for UHD (Rec.2020) and Blu-ray 4K (xvColor), and finally 60p and 4:4:4 chroma subsampling, I thought the only missing piece was the GPU... As a prosumer, obviously $4000 for a GTX 780-equivalent GPU for gaming is just silly. But I am still interested in a 4K 10-bit capable wide color gamut TV, since that's a 10-year investment and thus future-proof... so thanks for pointing my eyes to the FRC and 10-bit PR niceties!

TiGr1982 - Monday, September 22, 2014 - link
GTX 980 is GM204, not GM207; there is doubt there will be a GM207 (2nd-gen small Maxwell) because there is already a 1st-gen small Maxwell out there, GM107 (GTX 750 Ti/GTX 750), and it seems to be good enough already in its class.

chizow - Saturday, September 20, 2014 - link
Haha nice, impressive amount of info in such a low word count. Again, nice job.

wtfbbqlol - Friday, September 19, 2014 - link
"NVIDIA doesn't disappoint, however, dropping power consumption by 18% relative to the GTX 780 Ti while improving performance by roughly 10% and dropping the launch price by just over 20%."That 18% change is total system power consumption. Since you are comparing performance changes due to the graphics card alone, it is appropriate to compare the change in power consumption due to the graphics card alone.
Stochastic - Friday, September 19, 2014 - link
The GTX 970 looks to be about as close to the perfect GPU as one could expect in 2014. I'm tempted to upgrade, but my OC'd Radeon 7850 should tide me over until Pascal. By then, hopefully there will be some DX12 games to take advantage of all this great hardware.

tomvs123 - Friday, September 19, 2014 - link
I'm as excited now as I was for the GTX 670 launch (which is a lot). While AMD tempted me temporarily with their R9 2XX series, their throttling issues and huge power consumption kept me away. I will have a GTX 970. I just hope AMD responds with either a big price drop or new GPUs to keep the performance/price war going.

Treynolds416 - Saturday, September 20, 2014 - link
1000 words? You could have cut a couple to make it 980.

Subyman - Saturday, September 20, 2014 - link
Cutting a couple would make it 998 words though. :P

Phelim - Saturday, September 20, 2014 - link
I'd hate to be the schmuck who finally decided to cave and purchase a new PC with a Titan Black GPU, and less than two weeks later see he could have afforded dual 980s for the same price, with less power consumption and undoubtedly far better performance.

Yeah, I'd hate to be that idiot.
*Hangs head*
JarredWalton - Saturday, September 20, 2014 - link
You still get much higher FP64 performance at least. If you need that, it's useful to have Titan.

D. Lister - Sunday, September 21, 2014 - link
Aww man... :(

Syphadeus - Sunday, September 21, 2014 - link
My question to you would be: if you're both wealthy enough to afford those GPUs and enough of an enthusiast to be reading and commenting on this article, why on earth would you have bought such pricey hardware literally weeks before a new launch? It's not as if it was a well-kept secret; it's been known for months that September was the golden month. I can never get my head around people spending so much money with so little research into the product beforehand.

D. Lister - Sunday, September 21, 2014 - link
Supercomputer builders are going to love the 980, or its fully unlocked Quadro/Titan variant with 8-12GB GDDR5. An SC built with those, plus the new Xeons, is going to put up some record-breaking numbers in performance and efficiency.

TiGr1982 - Sunday, September 21, 2014 - link
Presumably, based on how things were with Kepler GPUs, the GM204 Maxwell GPU has relatively low FP64 performance, so its usage in HPC (High Performance Computing) will be somewhat limited.

The supposed Big Maxwell (GM200 or GM210, whatever the name), however, is then expected to be a computing monster with full FP64 performance.
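A rough back-of-the-envelope on why that matters (core count and boost clock are the GTX 980's published specs; the peak-rate formula is the usual cores x clock x 2 FLOPS for FMA, so treat this as a sketch):

```python
# Peak-throughput sketch: consumer GM204 executes FP64 at 1/32 the FP32
# rate, whereas big compute Kepler (GK110 in Tesla/Titan trim) runs 1/3.

def peak_gflops(cuda_cores, clock_ghz, flops_per_core_per_clock=2.0):
    # 2 FLOPS per core per clock assumes fused multiply-add (FMA)
    return cuda_cores * clock_ghz * flops_per_core_per_clock

fp32 = peak_gflops(2048, 1.216)   # GTX 980: 2048 cores, ~1216 MHz boost
fp64 = fp32 / 32                  # GM204's 1:32 FP64 ratio
print(f"FP32 ~{fp32:.0f} GFLOPS, FP64 ~{fp64:.0f} GFLOPS")
```

So roughly 4.9 TFLOPS single precision collapses to about 156 GFLOPS double precision, which is why GM204 is a poor fit for FP64-heavy HPC despite its efficiency.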
Wolfpup - Wednesday, September 24, 2014 - link
Wow, I'm pointlessly saying what I said in response to the other argument... that is really impressive. Noticeably better performance than a GeForce 780 Ti for 64 watts LESS power on the same die process.

NICE!
Even if the 980 was the end of it, it would STILL be a nice replacement given significantly less power with better performance, but of course things get even more exciting when you consider they can probably go bigger than the 980 next year... think of a part with 2880 or whatever cores with Maxwell's efficiency...

It's equally exciting for notebooks. From my mobile 680 to the mobile 880 they've managed to (surprisingly) eke out quite a nice boost of performance between having one more set of cores active and surprisingly faster clocks, but Maxwell will be a really nice boost on top of that.
Wolfpup - Wednesday, September 24, 2014 - link
Hmm... I wonder how the GeForce GTX 750 does with Folding@home versus a GeForce GT 430... I just realized that the 750 MIGHT be low-power enough to work in a lame "slim" system with half-height cards and a small PSU. I've had it Folding for the past 4.5 years on a GT 430, without anything much better I could stick in there... I need to look into this.