21 Comments
awehring - Thursday, February 9, 2017 - link
Gross margin up 3.5% year-over-year means that Nvidia could charge very high prices for its products. Who doubts that they did? Time (and much room) for competition from AMD.
Yojimbo - Friday, February 10, 2017 - link
It has a lot to do with their changing market mix. Their highest-margin business is the data center business, which had three times the revenue this past quarter compared to the year-ago quarter. They are making fewer lower-margin OEM sales. And since the release of the Pascal cards they've had the high end of the GPU market to themselves, so a greater percentage of their gaming sales comes from the higher-margin high end (not that the Fury line ever sold all that much). Additionally, the entire market has been moving more towards the high end in the last few years.

All of that has nothing to do with a greater selling price for equivalent SKUs because of a lack of competition. Such a greater selling price doesn't seem to exist. 10 series prices are pretty much in line with 700 series prices from 3 to 4 years ago, which was a time when there was more competition from AMD.
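To make the mix-shift argument concrete, here is a minimal sketch. The segment revenues ($M) are taken or implied from figures cited later in this thread (datacenter $97M → $296M, OEM $198M → $176M, gaming ~$810M → ~$1,348M); the per-segment gross margins are purely hypothetical placeholders, since NVIDIA does not disclose them:

```python
# Minimal sketch: blended gross margin can rise even when every
# segment's own margin stays constant. The margin percentages are
# hypothetical; the revenues come from figures in this thread.

def blended_margin(segments):
    """Revenue-weighted gross margin across segments."""
    revenue = sum(rev for rev, _ in segments.values())
    profit = sum(rev * margin for rev, margin in segments.values())
    return profit / revenue

# (revenue in $M, hypothetical gross margin)
year_ago = {"gaming": (810, 0.55), "datacenter": (97, 0.75), "oem": (198, 0.30)}
latest   = {"gaming": (1348, 0.55), "datacenter": (296, 0.75), "oem": (176, 0.30)}

print(f"year-ago blended margin: {blended_margin(year_ago):.1%}")  # ~52.3%
print(f"latest blended margin:   {blended_margin(latest):.1%}")    # ~55.8%
```

With identical per-segment margins in both periods, the blended margin still rises by roughly 3.5 points with these made-up numbers, purely because the high-margin datacenter segment tripled while low-margin OEM shrank. That illustrates (without proving) the mix-shift point.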
testbug00 - Friday, February 10, 2017 - link
Don't forget that Pascal also brought the Founders Edition (FE) into the mix, which "steals" margin from AIBs.
BurntMyBacon - Friday, February 10, 2017 - link
Many good points here that I generally agree with. However:

@Yojimbo: "All of that has nothing to do with a greater selling price for equivalent SKUs because of a lack of competition. Such a greater selling price doesn't seem to exist. 10 series prices are pretty much in line with 700 series prices from 3 to 4 years ago, which was a time when there was more competition from AMD."
These numbers were compared against last quarter and last year, so why confuse things by comparing to 3 to 4 year old graphics cards? There was in fact a measurable if not significant price bump compared to the 900 series (at least at the upper end). The prices also seem to be staying higher measurably if not significantly longer than they did with the 900 series. More importantly, the die size (a significant cost contributor) for the 1000 series is markedly smaller than that of its (roughly) price-equivalent predecessors in the last generation. Most importantly, having a higher gross margin suggests pretty directly that they were bringing in more money relative to the cost of the products they were selling.
Now, as you said, there are a lot of good reasons outside higher SKU profits (whether from higher prices or lower manufacturing costs) for their higher gross margins. Certainly the 205.1% Y/Y ($199M) growth in revenue for the datacenter segment was hugely beneficial here. I just wanted to point out that higher profits on SKUs were still a measurable if not significant factor. Keep in mind, while the gaming segment's revenues grew by only 66.4% Y/Y, NVIDIA brought in more additional revenue ($538M) Y/Y in this segment than the entire datacenter segment's revenue ($296M).
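As a quick sanity check on those Y/Y figures, here is the arithmetic using only the numbers quoted in this thread (the $97M prior-year datacenter revenue is implied by the $296M total and ~205% growth):

```python
# Arithmetic check of the Y/Y figures quoted above.
dc_prior, dc_latest = 97, 296           # datacenter revenue, $M
dc_growth = dc_latest - dc_prior        # +$199M
print(f"datacenter growth: ${dc_growth}M = {dc_growth / dc_prior:.1%} Y/Y")
# prints ~205.2%; the quoted 205.1% reflects unrounded revenue figures

gaming_growth, gaming_pct = 538, 0.664  # $M added and Y/Y growth rate
gaming_prior = gaming_growth / gaming_pct
print(f"implied prior-year gaming revenue: ${gaming_prior:.0f}M")   # ~$810M
print(f"implied latest gaming revenue:     ${gaming_prior + gaming_growth:.0f}M")  # ~$1348M
# Gaming's absolute growth ($538M) exceeds the *entire* datacenter
# segment's latest-quarter revenue ($296M), despite the lower growth rate.
```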
Yojimbo - Friday, February 10, 2017 - link
"These numbers were compared against last quarter and last year, so why confuse things by comparing to 3 to 4 year old graphics cards."The comparison to 3 or 4 year old graphics cards had nothing to do with refuting margins numbers, but rather to do with the conclusion the original poster made by looking at the margin numbers. It is independent evidence to refute his conclusion and it doesn't "confuse things".
"There was in fact a measurable if not significant price bump compared to the 900 series (at least at the upper end)"
The fact that there was an increase compared to the 900 series, a series that also faced limited competition at the high end, does not refute my evidence that, over the long term, decreased competition has not led to more expensive graphics cards.
"More importantly, the die size (significant cost contributor) for the 1000 series is markedly smaller than the last generation (roughly) price equivalent. "
What is of overriding importance here is that the node used for the 1000 series was very new, whereas the node used for the 900 series was very old. Older nodes have lower costs and higher yields. That's why the die size of the 1000 series needed to be smaller (the toy cost model at the end of this comment makes this concrete). Notice that the 900 series, while cheaper than the 700 series, had significantly larger die sizes, even though both were manufactured on the 28nm node. And the 700 series had larger die sizes than comparably positioned SKUs within the 600 series, which were also manufactured on the 28nm node.
"Most importantly, having higher gross margin suggests pretty directly that they were bringing in more money vs the cost of the products they were selling."
That's a circular argument. This is the statement you are meant to be proving; you can't just restate it and say it asserts itself.
"Keep in mind, while the gaming segment revenues grew by only a 66.4% Y/Y, nVidia brought in more additional revenue ($538M) Y/Y in this segment than the entire datacenter segment ($296M)."
Yes, and hence the data center only accounts for part of the margin increase. Also accounting for it is the fact that NVIDIA is selling far fewer 710s, 720s, 730s, 740s, 830Ms, 840Ms, 850Ms, 910Ms, 930Ms, 940Ms, etc., and a whole lot more 1060s, 1070s, and 1080s, which are the equivalents of 760s, 770s, 780s, 960s, 970s, 980s, 980Ms, 970Ms, etc.
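As referenced above, here is a toy cost-per-die model for the die-size/node point. The GM204 (~398 mm²) and GP104 (~314 mm²) die areas are public figures, but the wafer prices and defect densities below are invented for illustration, and the yield model is a simple Poisson approximation:

```python
# Back-of-envelope sketch of why a new node pushes die sizes down.
# Wafer costs and defect densities are hypothetical; real figures
# are not public. Yield uses a simple Poisson approximation.
import math

def die_cost(wafer_cost, die_area_mm2, defects_per_cm2, wafer_diameter_mm=300):
    """Approximate cost per good die: wafer cost / (gross dies * yield)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    gross_dies = wafer_area / die_area_mm2               # ignores edge losses
    yield_rate = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    return wafer_cost / (gross_dies * yield_rate)

# Mature 28nm-class node: cheap wafers, low defect density, big die (GM204, ~398mm^2)
print(f"mature node, 398mm^2 die: ${die_cost(3000, 398, 0.10):.0f}")
# New 16nm-class node: pricier wafers, higher early defect density,
# smaller die (GP104, ~314mm^2)
print(f"new node, 314mm^2 die:    ${die_cost(6000, 314, 0.25):.0f}")
# The same big die on the immature node would be disproportionately expensive:
print(f"new node, 398mm^2 die:    ${die_cost(6000, 398, 0.25):.0f}")
```

The dollar amounts are meaningless; what matters is the relative gap. On an immature node the yield penalty grows exponentially with die area, which is why a new node pushes designs toward smaller dies.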
Yojimbo - Friday, February 10, 2017 - link
To be a little clearer about what I said about the circular argument: it's circular insofar as it actually is an argument. In fact, it doesn't explicitly say anything at all beyond a simple definition of what a higher margin is. But you seem to be claiming that simply stating the definition somehow leads to the conclusion you want, i.e., that those higher margins must come from higher margins on similarly positioned SKUs due to less competition, and that is not true. In the post you replied to, I already listed alternative ways margins could rise without that claim being true.
Yojimbo - Friday, February 10, 2017 - link
"Finally, the OEM and IP segment had revenues of $176 million, down from $198 million a year ago. Although this dropped, it’s likely due to the loss of the $66 million/quarter they were receiving from Intel as part of the settlement agreement made in 2011, which Intel paid it’s last payment in January 2016."No, this reported quarter still includes the same $66M Intel IP payment they've been claiming each quarter. Next quarter will include a partial payment and that will be the last of it.
NVIDIA's OEM revenue has been decreasing for a number of quarters, by design. They say their OEM sales are lower margin than their other segments and that they are focusing on higher-margin businesses.
Brett Howse - Friday, February 10, 2017 - link
Thanks. I assumed it was over since Intel made its last payment in January 2016, but they've been recognizing it as deferred revenue, so there was a bit more remaining. I'll update the piece.
p1esk - Friday, February 10, 2017 - link
I just bought 4 Titan X cards for neural network simulations. I wonder if they count that as "gaming".
Yojimbo - Friday, February 10, 2017 - link
I was wondering how they account for Titan X or Quadro cards used for training neural networks or other non-graphics, non-visualization tasks. I know remote visualization via GRID is counted as data center and not professional visualization, because someone asked that in the Q3 conference call. My guess is that if you're not someone like Baidu buying the Titan Xs, it would be counted under gaming. That Quadro GP100 card is a bit of a conundrum, though. Well, a workstation probably isn't part of a data center, so I'd guess it goes under Professional Visualization.
Yojimbo - Friday, February 10, 2017 - link
There are also people using 1080s for cryptocurrency mining. They might have an estimate of how large that market is, but they probably still report it as gaming.
testbug00 - Friday, February 10, 2017 - link
Titan cards count under the "gaming" segment, last I checked. FWIW.
CiccioB - Friday, February 10, 2017 - link
The new Titan X has lost the GeForce mark, however, which identifies all gaming boards. It may be that they count it as Data Center, as Professional Visualization is not possible with it.
testbug00 - Friday, February 10, 2017 - link
CORRECTION: NVIDIA's 4Q2017 CFO commentary says that OEM/IP includes the Intel payment.
http://files.shareholder.com/downloads/AMDA-1XAJD4...
"License revenue from our patent license agreement with Intel remained flat at $66 million for the fourth quarter and $264 million for fiscal 2017."
At the time of writing, the article claims 4Q2017 does not include the Intel payment.
Vatharian - Friday, February 10, 2017 - link
I actually made enough on my NVIDIA stock after that news that I can afford to swap my 2x 1070 for 2x 1080. Way to go!
Achaios - Friday, February 10, 2017 - link
People really beg NVIDIA to take their 1000 euros to buy a 1080.

I frequently check the stock at AMAZON.DE and AVIDES.DE, two of the biggest PC electronics retailers in Germany, and the 1080s are almost always nearly out of stock.
We are talking about insane sales.
Achaios - Friday, February 10, 2017 - link
Meanwhile, it will take 6 months or more before we see the new Vega GPUs in retail stores, so NVIDIA is looking at two more quarters of explosive profit margins and unbelievable profits, largely due to the monopoly they have and the lack of competition.
StrangerGuy - Sunday, February 12, 2017 - link
Flame me if you want, but if you are NV in a market where volume is declining or stagnant... would you rather target the rich demographic, where $400+ is chump change for the best card, and be labelled nGreedia, or would you rather sell more ~$150 Polaris cards to the fickle fanboys while losing money in the process?

Simple math here.
vladx - Friday, February 10, 2017 - link
What a time to be an Nvidia shareholder.
TelstarTOS - Friday, February 17, 2017 - link
The result of their rip-off GPU prices.
adib - Monday, March 6, 2017 - link
Call me stupid, but doesn't Q4 2017 mean the fourth quarter of 2017, which should end in December 2017? I'm writing in March 2017 and December hasn't arrived yet.

So how can "...2017 was even better..." be right when 2017 hasn't ended yet?