boozed - Thursday, February 8, 2018 - link
*looks at Apple*
Call that a gross margin?
*holds up quarterly report*
THAT's a gross margin.
xype - Friday, February 9, 2018 - link
*looks back at Nvidia*
Cute profit, there!
MrSpadge - Friday, February 9, 2018 - link
Well, you can't have that many apples in the world.
Samus - Friday, February 9, 2018 - link
Well, considering NVidia doesn't really even 'make' anything, a billion dollars of profit in a quarter is pretty good for shuffling GPUs from TSMC to their partners.
Apple, on the other hand, sells a variety of products direct to consumer. NVidia makes reference designs; their partners are the ones who actually produce them. NVidia doesn't sell anything direct to consumer... unless you consider the Shield an actual product?
edzieba - Friday, February 9, 2018 - link
Well, considering Apple doesn't really even 'make' anything, 20 billion dollars of profit in a quarter is pretty good for shuffling hardware from Foxconn to their partners.
UltraWide - Friday, February 9, 2018 - link
LOL you guys are killing me with the humor! :clap:
Samus - Friday, February 9, 2018 - link
Edzieba, your play on words reeks of a desperate attempt for attention. Apple doesn't shuffle anything to their “partners”. Nobody is licensed to make Apple products except Apple. They don't have partners. You need authorization just to sell Apple products! Apple sells direct to customer (CTO) and has an entirely different business model, ecosystem, and portfolio to nvidia. My whole point was a rebuttal to MrSpadge's attempt to compare the two companies... that apparently went right over your head.
When nvidia opens a chain of stores, cuts off their partners, exclusively makes its own products a la 3dfx, and burns all its OEM contracts, then perhaps they will become more like Apple lol
JoeyJoJo123 - Monday, February 12, 2018 - link
Get #rekt applel shill.
vladx - Sunday, February 18, 2018 - link
Lmao Samus - Apple shill confirmed.
sowoky - Friday, February 9, 2018 - link
https://www.nvidia.com/en-us/shop/
Not to mention, much of their revenue is not from consumer products, but from automakers, auto suppliers, OEMs, enterprises, cloud providers, etc.
CiccioB - Friday, February 9, 2018 - link
More than half of their revenue comes from the consumer market, as you can see in the last available quarterly report.
Of course, if you consider that they surely do not have a 60% margin on consumer products, you can expect the margins on the other half of the revenue to be really large, so they make more money from the smaller but highly specialized businesses.
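As a purely illustrative split (not actual reported figures): if consumer is roughly 55% of revenue at, say, a 55% gross margin, the other ~45% of revenue would need to run at around a 70% margin for the blend to land near the reported ~62% (0.55 × 55% + 0.45 × 70% ≈ 62%).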
jmunjr - Saturday, February 10, 2018 - link
How is the Shield not an actual product? I've had my Shield TV for a couple years now and it's by far the best streaming media player there is. It's available all over the place as well, including local stores like Best Buy.
mode_13h - Tuesday, February 20, 2018 - link
Did you... um, time-warp from the 1950s... or something? I guess you think mining, agriculture, and manufacturing are the only sectors of the economy that actually matter?
Just try typing that on a keyboard plugged into a PC with a blank slab of silicon for a CPU. And no OS, software, or websites to visit, either. And I hope you like driving to all the factories that make the parts, because delivery companies don't "make" anything, either.
webdoctors - Friday, February 9, 2018 - link
Net Income $1118M
Gross Margin 61.9%
Niiiiiiiiiiiiiiiiiiiice.
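To put those numbers in rough perspective (a back-of-envelope sketch, using the roughly $2.9B of quarterly revenue mentioned elsewhere in the thread): a 61.9% gross margin implies about $1.8B of gross profit and about $1.1B of cost of goods sold; operating expenses and other items then take that $1.8B down to the $1118M of net income.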
CiccioB - Friday, February 9, 2018 - link
The gross margin is incredible.
That's what sums up the differences with respect to AMD.
Sn3akr - Friday, February 9, 2018 - link
"We don't like miners sucking up all the GFX available"(But we sure do like the profit more, so we're not gonna do anything about it, but we'll still pretend to feel bad for gamers)
souperior intellect - Friday, February 9, 2018 - link
Market forces dude - suck it up!
mapesdhs - Friday, February 9, 2018 - link
And as long as increasing supply means risking a glut of used cards in the future, damaging whatever product lineup is current at the time, they will not make more cards to ease pricing. I refuse to call them GPUs, because they're not anymore (as the opening of the vid ref'd below shows); modern designs are AI/compute engines (hence the good match for mining), with the table scraps coupled to a display output for gamers. We haven't had new 3D realism feature innovation in gaming for years.
https://www.youtube.com/watch?v=PkeKx-L_E-o
Plus, the modern fad for high-frequency displays and VR, and gamers themselves focusing so much on games that don't need new features, means game devs are loath to make games that are significantly more taxing than previous titles; we'll never see another Crysis, since the kind of frame rates seen in reviews of the day would be viewed very poorly by modern tech sites, catering as they do to the vocal and wealthy crowd who use FreeSync/G-Sync on high-res displays, or VR. It's a vicious circle which won't be broken until gamers start demanding different types of games that need new features to support more immersive worlds, such as properly modelled fluids and malleable environments (mud, fire, water, lava, smoke, flood, tsunami, volcanic eruption, earthquake, etc.), the sort of thing that would make games like Tomb Raider much more interesting. So many combat games, but none yet that allow one to dig pits or tunnels, construct roads, etc.? IMO the original Red Faction was more interesting in that regard than newer games. Typical AAA FPS titles don't need anything more than is already available, especially since the marketing focus is so much on online deathmatch, and now also microtransactions, instead of a genuinely engaging single-player experience.
In theory there's a market opportunity for a 3rd party to design an all-new GPU that focuses on *gaming*, but I can't see that happening, not unless a company with money to spend decides it's worth creating something worthwhile. IBM could do it; they could hire the talent, etc., but it's not a market level they care much about.
Collectively, the only thing gamers can do that might make any difference is to simply not buy the products at the currently inflated prices. To some extent this is happening; used pricing on eBay has shot up recently (used 980 Tis are fast approaching their original MSRP), but not enough to persuade AMD or NVIDIA to change their approach. NVIDIA could certainly design a new, genuine gaming GPU if they wanted to, but I don't think they care.
The giddy heights of the money available from enterprise/AI/compute markets remind me of the way SGI became so obsessed with big, lucrative sales that it ignored mass-market opportunities for cheaper products that would have provided a sustainable product line and user base in the long term; back then, NVIDIA stepped in and grabbed the limelight. Now it seems NVIDIA is chasing the same money rainbow. Understandable, as it's great for shareholders, but in the long term, who knows, and in the meantime it's driving gamers nuts and making PC building very expensive. The staggering increases in RAM pricing add fuel to the fire, ditto the degraded value of modern SSDs.
CiccioB - Friday, February 9, 2018 - link
Your comment is a completely clueless rant.
What is the meaning of "NVIDIA could certainly design a new, genuine gaming GPU if they wanted to, but I don't think they care"?
This is not limited to nvidia, of course.
What have you not understood about modern games basing most of their work on exploiting the computational capacity of the modern GPU? What do you think Vulkan and DX12 were designed for? To run the same pre-made, driver-executed DX11 algorithms (which nonetheless already use as much computational capability as possible)?
Why do you think modern games run faster on a 10 TFLOPS GPU than on a 5 TFLOPS one? Because they have an equal number of ROPs? Because the new GPUs are not "genuine gaming GPUs" like the old ones?
What's the point of posting so many lines of crap when you do not even know what you are talking about?
The fact is a simple one: modern games have constantly required more and more computational capability from GPUs to run at decent frame rates. GPUs have fattened up to satisfy those requirements. The great work put into this has transformed GPUs from a merely gaming-focused product into a more general (and more powerful) computational product usable outside the gaming market as well.
However today has not understood why gaming and mining are the same market is the same one that thought that GPGPU was only about executing FP64 calculations and believed that only AMD HW had such features, while all those GPUs doing only FP32 calculations were toys.
They are also the same people who now can't understand why nvidia is making almost $3B of revenue with $1B of net income while the other company playing in the same market is struggling to reach break-even, all while the GPU market is booming.
CiccioB - Friday, February 9, 2018 - link
"However today" was meant to be "Whoever today".What about the edit feature?
maximumGPU - Friday, February 9, 2018 - link
Agreed, his comment was utter nonsense. Every quarter it's the gaming department that brings in the most revenue, and he claims nvidia doesn't care lol. As you explained, games drove GPUs to become more powerful, to the point where they started being used for other tasks that could exploit them. Kudos to nvidia for seeing the huge potential of that trend and bringing in things like CUDA, AI applications, etc.
mode_13h - Wednesday, February 21, 2018 - link
GPGPU is older than CUDA and OpenCL. Check out Brook, Sh, Cg, etc. if you don't believe me.
gpgpu.org used to be the main news source for developments in the field. It's still up and searchable, but I can't see a way to see beyond one page of history or search results. I guess you can use archive.org to see old postings. I know it goes back at least as far as 2003.
Credit where it's due: Nvidia definitely foresaw the AI trend and was among the first to get on board.
niva - Friday, February 9, 2018 - link
I actually agree with his comment. The thing is, once crypto-currency mining slows down, or upon release of the next gen of cards, things may change drastically in the entire market in terms of supply and demand. Sure, you can go balls deep into the current market thinking it won't turn, but if it does, you're talking billions of dollars of investment with no way to recoup it. From the business perspective it's best to just go steady. Put out what you know there's steady demand for. Let the market dictate the prices. If this trend persists, adjust slowly.
You guys think this is a joke? It's how huge businesses have gone bankrupt in the past.
CiccioB - Saturday, February 10, 2018 - link
No one has denied that the mining business is putting stress on the GPU market, with many questions about the future.
The only possible (not to say mandatory) solution is increasing the production of whatever is now the bottleneck in the supply chain.
There are no other solutions, like believing in separate products for gaming and for mining. They use the same resources, so the HW must be the same.
And I do not see any OEM creating a "mining Tesla card without video connectors" to sell at half the price, which is the only way to make miners buy those instead of the "normal video cards" they can later resell. It would be stupid for their own business (lower margins) and would not really solve the problem if the bottleneck is the number of GPUs or RAM chips being produced.
Reading this kind of comment, I think many have not understood much about the GPU market over these years.
cocochanel - Saturday, February 10, 2018 - link
mapesdhs has a few solid points. Most if not all PC games today are console ports, many loaded with DRM like Denuvo and the like. And most games, regardless of platform, are mediocre. The PS4 has had a few good titles in the last 10 years, but that's about it.
VR was supposed to take off, but so far it hasn't happened. As the cost of headsets is coming down, hopefully we'll see a change.
Game developers are still stuck in DX11, even though DX12 and Vulkan are vastly superior.
And as Sn3akr points out, Nvidia and AMD complain about miners but are happy with the profits.
So why would they increase supply?
The really sad part is that, with the exception of Intel, here in the West we don't make stuff anymore. We just "design". OLED screens? Samsung. SoCs? TSMC and Samsung. Memory? SSDs? NVMe drives? The same story. Final assembly? You know the answer. If these companies charge as much as they want, who do we blame it on? They've got us by the balls and gouge us the same way Intel and Nvidia have been doing it for so long.
And that's not about to change anytime soon. Look at Intel for example. Instead of going after TSMC and Samsung, they go after Nvidia. Way to go.
cocochanel - Saturday, February 10, 2018 - link
Oh, I forgot about Apple. They "design" a lot of stuff too. They have over 100 billion sitting in the bank and even more stashed away in offshore accounts. With that kind of money, they could build some plants here in the good old USA, develop their own technologies, and make a lot of stuff.
But why do that?
Instead, let's sell overpriced smartphones and 15-inch laptops for 3000 bucks. But hey, they did build a nice oval campus. To "design" more stuff.
mode_13h - Wednesday, February 21, 2018 - link
How can you possibly be so hostile to design? Everything that makes one CPU (or GPU) good and another bad is down to design. Same with software. Or cars, even.
Design actually matters. When people talk about these tech companies designing, they don't just mean the product packaging or the case.
mode_13h - Wednesday, February 21, 2018 - link
They want to increase supply, because most of the margin from recent price increases is going to board partners and the retail channel - not AMD and Nvidia. The best way for the chip makers to increase revenues is to make more chips.
And I'm quite suspicious that the real villain behind lagging quality in PC games is the rise of the iGPU. These things are much weaker than even the GPUs of the consoles you deride. Even the latest AMD APU can barely keep up with a GT 1030.
BTW, Intel *does* have a foundry business. Way to go on your research.
cpy - Friday, February 9, 2018 - link
Gaming my ass, miners take all the cards nowadays.
milkod2001 - Friday, February 9, 2018 - link
Does it matter who you sell your products to? NO
mode_13h - Wednesday, February 21, 2018 - link
They actually do care, because they know it's hurting the future gaming market. They might not care that *much*, but they do care.
Gunbuster - Friday, February 9, 2018 - link
Indeed. No doubt it takes pressure off your supply chain logistics when you roll cards from one factory in Asia over next door into a mining operation ;)
souperior intellect - Friday, February 9, 2018 - link
2018!!!
Are these future predictions?
Yojimbo - Friday, February 9, 2018 - link
Never fails...
Fiscal year 2018, not calendar year 2018. NVIDIA's FY 2018 started around February 2017 and just ended.
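In other words, the "Q4 FY2018" being reported here covers roughly November 2017 through late January 2018, and FY2019 is the year that has just begun.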
nfriedly - Friday, February 9, 2018 - link
I have a feeling that, to a large extent, Nvidia is just getting future profits in early due to crypto mining. I think that eventually, crypto mining is going to come crashing down, and the market will suddenly be flooded with used GPUs.
When that happens, Nvidia's sales are going to take a hit. They'll probably still be profitable overall, but I expect their margins will take a serious hit as prices come down to compete with all the used mining cards.
(This kind of happened once already, except it didn't last very long.)
CiccioB - Friday, February 9, 2018 - link
It depends on when the crash happens.
If Ampere is already out and selling by then, a flood of old, worn, slower used GPUs will not be a problem, just as the used market is never a problem whenever a new series is launched.
The one that will be hit hardest is AMD: with nothing new coming (and that will arrive only in 2019), it will see the market flooded with the very same GPUs it is still selling as new. That would push its GPU prices really low, lower than the ones it is charging now, which already grant it no margins at all.
StrangerGuy - Saturday, February 10, 2018 - link
AMD is completely toast if mining dies. From Steam stats, NV is now outnumbering them 10:1 in installed base. Heck, even the 1080 Ti has beaten any AMD card ever released in that metric, and don't tell me nobody buys 1080 Tis for mining.
mode_13h - Wednesday, February 21, 2018 - link
I don't understand how AMD doesn't have any new gaming products this year. 2016 = Polaris. 2017 = Vega + rebadged Polaris. How can they not have anything for 2018?
This is a joke. It's hard to see how they take the gaming market seriously.
mode_13h - Wednesday, February 21, 2018 - link
Oh, and 2015 = Fury.
Don't understand how they could let the pipeline run dry.
StrangerGuy - Friday, February 9, 2018 - link
But-but-but the stingy AMD fanboys told us NV would be dead in short order because nobody wants to pay more for G-Sync.
What is even more ironic is that it looks entirely possible for the Switch's Tegra to make more overall profit for NV than the PS4/XB1 APUs do for AMD.
boozed - Saturday, February 10, 2018 - link
They did?
SlyNine - Saturday, February 10, 2018 - link
Boy, I sure bet Nvidia just hates those miners.
Mikuni - Saturday, February 10, 2018 - link
Nobody wants those burned-out cards from crypto mining; the second-hand market will be meaningless if crypto crashes.
mode_13h - Wednesday, February 21, 2018 - link
If a second-hand card is cheap enough, gamers will still buy them. Like, if you had only enough money for a GTX 1050, but used 980 Tis were also selling for that much, most would judge it worth the risk to get one.
Even if you have to replace the fans and thermal paste, there'll be a huge market for DIY kits, and many will still think the trouble is worth it for the amount of benefit.
sna1970 - Sunday, February 11, 2018 - link
Q4 2018???
HELOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
CiccioB - Monday, February 12, 2018 - link
You are the one who needs to wake up and read something more than comics. It was already explained in a previous post: fiscal years are different from calendar years. Nvidia's fiscal year ends at the end of January, and the company chose to use the ending year to identify the period instead of the starting year.