Dragonborn - Wednesday, August 22, 2012 - link
What does OEM mean?

euler007 - Wednesday, August 22, 2012 - link
Original Equipment Manufacturer.

Roland00Address - Wednesday, August 22, 2012 - link
What this means in English is that you are going to see these cards in prebuilt HP, Dell, Alienware, Gateway, Acer, Asus, etc. machines, but not on the normal retail market where you can buy cards by themselves at Newegg or Best Buy.

tipoo - Wednesday, August 22, 2012 - link
System builders can use it; you can't.

Unclean - Wednesday, August 22, 2012 - link
http://lmgtfy.com/?q=oem

Patflute - Wednesday, August 22, 2012 - link
It better be $199. If it is, I'll get it! (Retail version)

maroon1 - Wednesday, August 22, 2012 - link
I think this should perform somewhere between the HD 7850 and HD 7870. If it cost $200, I would certainly buy it.

silverblue - Wednesday, August 22, 2012 - link
Depending on the title, that could be a small gap (Crysis: Warhead)... or a rather significant one (DiRT 3).

CeriseCogburn - Thursday, August 23, 2012 - link
AMD has a crysis, they had better use their head to go to war, and that doesn't mean slinging PR dirt 3 times on the superior competitor nVidia.

dmnwlv - Friday, August 24, 2012 - link
I cannot see where the crisis is or how nvidia is superior, especially when these 660s arrived half a year late (AMD is already preparing the 8000 series)... are you just an nV fan? :)

abhaxus - Saturday, August 25, 2012 - link
You clearly are very much a stranger to sarcasm.

Samus - Monday, August 27, 2012 - link
Yeah, for $200 it'd be worth it. It's sure to be at least as fast as a 560... right?

Marlin1975 - Wednesday, August 22, 2012 - link
$200 would be a good price point, but hard for Nvidia. It's the same core as the $500 card, so their cost is not very low. The only way I can see it landing where it needs to be, the $200 range, is if yields are way up on the 28nm line.

DanNeely - Wednesday, August 22, 2012 - link
I suspect this card actually means the opposite: continuing disappointing yields have resulted in enough chips with problem areas to offer a further cut-down card. Based on the relative die areas involved, I suspect they probably have more chips that will fit into this bin than into the 660 Ti's.

kpb321 - Wednesday, August 22, 2012 - link
It may also be that yields have improved, but they've built up a stockpile of chips that aren't up to snuff for a 660 Ti yet are still usable. This might also be a relatively short-lived OEM product that ends up being replaced by a card based on a new chip instead of harvested GK104 chips.

Assimilator87 - Wednesday, August 22, 2012 - link
Considering the die size is smaller than GF114, it probably doesn't lose too much profit going into a $200 card.

CeriseCogburn - Thursday, August 23, 2012 - link
LMAO - the usual rage3d amd fans wallowing in their piss faced hatred, pretending to be a CEO or a profit-margin professional attempting to tell the successful company what to do and how to run their business, as if AMD ever made a profit (which they haven't) while nVidia ALWAYS does.

The little idiot rascals should do their dummy speculation when amd releases a losing-money card - oh, that's every single time!

How about the little parrot-headed fools send their resumes to AMD?

Lots of talkie talk ahead for them... oh wait... when amd releases a card they tell us the oceans have parted... no time for telling amd how to run its losing business model better - why try to improve upon failure... they're so smart they can analyze how nVidia can't possibly make money this time (again) and tell us all how nVidia should run their biz - because they're really, really, really smart - way smarter than the successful, profit-margin-aplenty nVidia.

Yeah, I am so glad those amd fans are so doggone schmartie schmart.

jkostans - Friday, August 24, 2012 - link
I can't imagine the time you wasted coming up with that unreadable comment. You look like an idiot.

CeriseCogburn - Saturday, August 25, 2012 - link
Hey, looking like an IDIOT is called "fitting in with the stupid crowd of PURE SPECULATION morons". At least while looking the same, I am the only one who is correct.

Sucks to be you.

Galidou - Wednesday, August 29, 2012 - link
You're the fifth person saying he's either stupid, crazy, or simply a madman, between the comments on the 660 Ti article and now here.

My philosophy teacher once told me something I really liked, and to this day it's my favorite statement. He said: if someone says you're a horse, you can doubt that it's true. If a second person walks up to you and says you're a horse, you can still doubt it. But if a third person says the same thing again, you should start thinking about buying a saddle.

He's at his fifth time in less than a week, but there's nothing to be done with him; he still thinks he's sane while spreading so much hate about AMD the way he does. The best that writing the way you do will accomplish is that people will be REALLY afraid to buy Nvidia cards, for fear of becoming just a little bit more like you.

HisDivineOrder - Wednesday, August 22, 2012 - link
A $200 price point might be a huge loss, or it might not. It depends on whether the 680 core that wound up being reused across the 670, 660 Ti, and now the 660 OEM is truly the 560/560 Ti/460 replacement it seems to be.

If so, then it's not a huge loss for nVidia to use these flawed chips for lower-end cards, because they were built from the ground up to be mainstream GPUs to begin with.

Sure, they're "losing" money by not selling them as 670s or 680s, but the fact is you can only sell so many $400 and $500 GPUs; the market for that is only so big. Lots of consumers want to buy 560/560 Ti/460-class GPUs, and the market for THAT is much bigger. If nVidia has a good supply of chips just waiting on consumers to buy them, they might chop some performance off and throw 'em out there rather than let chips collect dust waiting for consumers to decide to buy $400+ video cards. Especially when said chips are so much smaller than the competition's GPUs, and nVidia has basically one product line at 28nm while their competitor is busy churning out several 28nm products, each one being squeezed by nVidia's solitary line.

I suspect these chips are a lot cheaper than you're giving them credit for. You imagine this like it's a 580-sized GPU when in fact it's a 560-sized GPU that they're upcharging simply because their competitor's chip lacked performance and couldn't keep up. I suspect nVidia makes a lot of money selling these chips at $200, $300, $400, and $500, because the chip was built for cards in the $200-$300 range to begin with.
Great news for nVidia. Not so great for AMD or the consumers, but AMD's slacking off is what got us here.
Malphas - Thursday, August 23, 2012 - link
"A $200 price point might be a huge loss or it might not. ""I suspect these chips are a lot cheaper than you're giving them credit for."
I don't think people realise how cheap chips actually are to manufacture, with the sale price actually being used to contribute to R&D, marketing, building fabs, etc. rather than the cost of the chip itself. Chips are usually sold for at least ten times more than it costs to manufacture them, so even though it might seem as though selling a binned chip that sells for $500 in a high end card for $200 in a low end card is getting close towards selling for a loss, it's almost certainly still selling for much more than cost price (which is probably under $50).
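To put rough numbers on that, here is a back-of-the-envelope sketch of per-die silicon cost from wafer price, die size, and yield. GK104's ~294mm² die area is public; the wafer price and yield figures below are purely illustrative assumptions, since foundries don't publish them.

```python
import math

# Illustrative assumptions only - TSMC does not publish these figures.
WAFER_COST_USD = 5000.0   # assumed price of one 300mm 28nm wafer
WAFER_DIAMETER_MM = 300.0
DIE_AREA_MM2 = 294.0      # GK104 die size (public spec)
YIELD_FRACTION = 0.6      # assumed fraction of dies that are usable

def gross_dies_per_wafer(diameter_mm: float, die_area_mm2: float) -> int:
    """Standard approximation: wafer area / die area, minus an
    edge-loss term proportional to circumference / die side."""
    radius = diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

gross = gross_dies_per_wafer(WAFER_DIAMETER_MM, DIE_AREA_MM2)      # ~201
good = int(gross * YIELD_FRACTION)                                 # ~120
print(f"silicon cost per good die: ${WAFER_COST_USD / good:.0f}")  # ~$42
```

Even with a pessimistic yield assumption, the raw silicon lands well under the $200 price of the finished card; the board, memory, cooler, and partner margins make up the rest.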
CeriseCogburn - Thursday, August 23, 2012 - link
It seems that way because A M *'in D always manages to lose money on every chip they put in every card.

LOL

Oh dat....

While nVidia always makes that sweet and large profit margin...

Oh doggone dat! (angry face)

The commies no like.

Malphas - Friday, August 24, 2012 - link
Your posts are embarrassing and childish. Being a fanboy of a corporation (be it nVidia or AMD, or anyone else) is imbecilic.

CeriseCogburn - Saturday, August 25, 2012 - link
No, my points are 100% CORRECT.

The fanboys and FOOLS are the idiots telling each other, with exactly ZERO data, what the cost of bringing a card to market is.
Here, I'll help the crybaby idiots like you.
Go to Charlie's and dig up his cost analysis (with DOLLAR DATA) on the two card wars. It's Fermi vs amd's junk, I believe.

At least there, one could start to talk like they had a SINGLE FACT to use.

Without that, dig up the wafer cost, the manufacturing cost, the good parts per wafer, AND THEN YOU NEED THE PRICE CHARGED BY AMD OR NVIDIA...

Good luck without some big bucks to buy up the facts, because they ARE NOT AROUND.

If you really want to do the work, buy the sales data, then extract the discrete cards, then go to the same quarter and get the profit/loss chart from amd or nvidia, then good luck separating out discrete cards from that... let alone a SINGLE release.
No, YOU'LL BE CALLING WIKILEAKS FOR ANY ANSWER AT ALL.
Galidou - Wednesday, August 29, 2012 - link
"The fanboys and FOOLS are the idiots"

You just said it yourself: you're the biggest Nvidia fanboy I've ever seen in my whole life. So in the end, from what I've said and the above statement, we can only deduce one thing, and I leave it up to you...

He never said your points were incorrect. He only said they are presented childishly, and for that, they've got to be the most childish things I've ever read about video cards. I've been arguing with you just to point out how embarrassing you make things, ever since we had that first discussion about the 7970. To this day, you haven't realized a freaking thing. You're either really dumb or a kid in his teenage hormonal crisis.

He doesn't want to go check your comments, because they come wrapped in so much hate that no one cares about you. Speak like a normal person and maybe you'll get some credit.

Malphas - Thursday, August 23, 2012 - link
And yes, in addition to that, like you said, the GK104 was originally intended as a midrange part before Nvidia realised how much they'd overestimated the competition and decided to rebrand it as high end instead. The GTX 680 was originally intended as a 560 Ti replacement, and thus intended to sell at that kind of price point. So in this instance the profit margin is no doubt even larger than it normally is with chips.

CeriseCogburn - Saturday, August 25, 2012 - link
I can't believe how stupid you people are, and how incorrect rumors drive your gourds like cowboys whipping pigs into the corral.

Galidou - Wednesday, August 29, 2012 - link
I'll give you one freaking clear example of what you could do to have credibility, so people will read you without saying you're crazy. If you don't care about my example NOW, then you're just lost.

Your comment above would have had the same effect if you had just written:

"This is just speculation, Malphas - rumors. We can't say what the GTX 680 was supposed to replace, as it is a totally different part from the last gen. But based on how it performs, they surely knew it wasn't meant to replace the midrange GTX 560 Ti."

There you go: polite, yet still as effective, readable, and respectful. Your point would be understood, and people would not get mad at you for telling them:

"Stupid gourds cowboys whipping pigs"... We got it, your insult vocabulary has been exposed; we know what you're capable of. Now write things with more sense, please, would you?

leliel - Wednesday, August 22, 2012 - link
The 550 Ti/660 Ti-style asymmetric VRAM setup makes my skin crawl. I'm sure it's a great marketing point, but I'd rather not pay for extra memory when there's a performance hit involved, especially when I doubt I do anything that uses more than 1.5GB in the first place. Very interested in a retail product and OCability!
The 660 Ti has been shown here to beat the 7970 sometimes, and to have exactly zero performance hit - but we can all thank the reviewer for making his DISPROVEN speculation foremost in your mind by mentioning it on almost every page and then commenting on how surprising it was when it didn't show up.

More fuel for idiot amd fanboys, exactly what the pr doctor ordered.

RickLaRose - Saturday, August 25, 2012 - link
Oh brother...

Beat the Radeon HD 7970 in what?

You do realize that with new drivers the Radeon HD 7970 GHz Edition is the single fastest GPU overall, right? And that the Radeon HD 7970 gives the GTX 680 a run for its money with the new driver plus a lower price point?

You do realize these things, right?

And we're still talking video games here. If we look toward GPGPU tasks, it's not even close. Kepler is a horrible GPGPU architecture; it is actually a step backwards from the GTX 580 in terms of computational performance on both single- and double-precision workloads.

Unless all you do is gaming, or there is some specific nVIDIA feature you require and you're only willing to purchase a single card (other than PhysX there are no really useful features - maybe CUDA, but everyone uses OpenCL now anyway), or you have some aversion to AMD drivers (odd, considering nVIDIA has been having more issues in that department for the last year or year and a half), there is no real compelling reason to purchase an nVIDIA card.

I'm just being honest. Some people care about GPGPU performance. You really ought to consider that before you post incoherent garbles of text that make you appear as though you're suffering from some severe mental illness.
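On the double-precision half of that claim, the public spec sheets do back it up. A quick sketch using the published core counts, shader clocks, and FP64 rate caps (peak figures only, not measured real-world throughput):

```python
# Peak FP64 throughput = cores * clock (GHz) * 2 FLOPs per FMA / DP ratio,
# where "DP ratio" is how many CUDA cores share one double-precision unit.
def peak_dp_gflops(cuda_cores: int, clock_ghz: float, dp_ratio: int) -> float:
    return cuda_cores * clock_ghz * 2 / dp_ratio

print(f"GTX 580 (Fermi):  {peak_dp_gflops(512, 1.544, 8):.0f} GFLOPS")   # ~198
print(f"GTX 680 (Kepler): {peak_dp_gflops(1536, 1.006, 24):.0f} GFLOPS") # ~129
```

Peak single-precision actually went up on the 680; the single-precision regression people complain about is in sustained compute efficiency rather than raw FLOPS.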
CeriseCogburn - Monday, August 27, 2012 - link
1. Go read the 660 Ti charts to find out, mr clueless. Oh, I forgot, you're being honest, and therefore immense ignorance is required.

2. Compute / amd = WinZip.

3. After that, nVidia wins everything.

4. Go look at a real compute amd card; they need software. The release announcement/review is up.

5. If you disagree, tell the author of that article you haven't and likely won't read - heck, you didn't even look at the 660 Ti charts here in its release review.

6. The above proves you're a loon, with a big mouth.

Midwayman - Friday, September 7, 2012 - link
Sure, it beats the 7970 in a couple of tests, but I wouldn't venture to say the 660 Ti review puts it anywhere near the 7970 on average. It's much closer to the 7870/7950.

CeriseCogburn - Monday, August 27, 2012 - link
http://www.anandtech.com/show/6191/amd-announces-f...

"The real differentiator may come down to software, with AMD having invested virtually everything into OpenCL. (LOL - THAT MEANS ALMOST NOTHING SINCE THEY ARE BROKE) Success here means that AMD needs to continue turning developers away from CUDA and towards OpenCL, as they can't sell hardware to developers until developers can run their projects on AMD's hardware in the first place." (And since they cannot offer a tenth of the support nVidia does, they will fail, and whoever falls for the lure - the poor open source neckbeards who dial, email, and complain a lot, LOL - will get burned.)

Looks like amd will continue to lose money and then go bankrupt.

Nvidia is ahead in OpenCL anyway. ROFL

Jim746 - Sunday, February 10, 2013 - link
I purchased a retail version to see if it was faster and it was actually slower, so I returned it.