milkywayer - Monday, June 25, 2018 - link
Here's to some reduction in the rip-off by GPU makers. It hurt paying $1030 for a 1080 Ti direct from EVGA just 3 months ago.
Amoro - Monday, June 25, 2018 - link
I'm fairly certain it's retailers that are setting the prices and not the manufacturers (board partners or NVIDIA/AMD).
Also, nobody forced you to pay $1030 for that 1080 Ti. You're only encouraging that type of behavior.
jordanclock - Monday, June 25, 2018 - link
Those prices aren't the fault of GPU makers, nor are they the fault of AIBs. It's a combination of increased RAM prices and huge demand from cryptominers.
baka_toroi - Monday, June 25, 2018 - link
Demand that's currently dwindling dramatically due to all cryptos crashing.
DanNeely - Monday, June 25, 2018 - link
Yup. Over the weekend I saw an article on one of the more rumor-driven tech sites claiming an unnamed AIB had sent back 300k Pascal series cards (GPU dies?) to NVIDIA. That suggests the crypto bubble, at least for NV, is rapidly deflating if not completely popped. More speculatively, it suggests that, like AMD during the last bubble, NV ramped volume just in time for demand to slump, leaving it stuck with an excess of parts. The same site was also speculating that if the actual supply glut of Pascal is large enough, NV might delay releasing its 11xx/20xx series cards to sell off the remaining Pascal inventory before its value drops due to the new cards coming out.
eastcoast_pete - Monday, June 25, 2018 - link
Hope you're right! Wouldn't mind NVIDIA having a glut of unsold GPUs, which might (finally) bring the prices of their cards back down to planet Earth. Unfortunately, the dGPU duopoly is still just that for now. Last but not least, I hope regulators and the media investigate whether NVIDIA and AMD are "discussing pricing" in private while being adversarial in public. With billions of dollars at stake, the temptation to keep prices up must be high.
qlum - Monday, June 25, 2018 - link
However, unlike AMD last time, NVIDIA has all the time in the world to sell off the cards, since AMD isn't at a level where NVIDIA needs to rush out something new. They can just delay the release until the old cards are sold.
AssassinX - Tuesday, June 26, 2018 - link
Just to put the 300k number in perspective, NVIDIA ships millions of cards per quarter; you are talking about something like 5% of one quarter's shipments. It is also safe to assume a majority of these cards are 1050s, with 1070/1080 Tis being the smallest percentage. I doubt the rumor that a 300k return would cause NVIDIA to delay their next launch, considering the sales they will still get from miners and gamers. To me it seems more like clickbait fake news.
Source:
https://www.anandtech.com/show/10864/discrete-desk...
I didn't link any recent shipment numbers since they're on external sites, but the numbers aren't hard to find.
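For a quick sanity check on that 5% figure, here's the arithmetic (a rough sketch; the ~6M cards/quarter number is an assumed round figure consistent with "millions per quarter", since NVIDIA doesn't publish exact unit shipments):

```python
# Back-of-the-envelope: how large is a 300k return relative to one
# quarter of shipments? The 6M/quarter figure is an assumption, not
# a published number.
returned_cards = 300_000
assumed_quarterly_shipments = 6_000_000

share = returned_cards / assumed_quarterly_shipments
print(f"300k returns ~= {share:.1%} of a quarter's shipments")
# -> 300k returns ~= 5.0% of a quarter's shipments
```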
cwolf78 - Monday, June 25, 2018 - link
GPU makers will continue to rip off everyone as long as there are people like you willing to pay whatever they ask. Just saying...
PeachNCream - Monday, June 25, 2018 - link
On the bright side, you can use the price you paid and the "pain" of the purchase as a way to cleverly hint at your income level, your willingness to spend on PC parts, and the specs of your system as a way to covertly try to impress Anandtech readers that bother to acknowledge your comment.
milkywayer - Monday, June 25, 2018 - link
You'd be surprised how far down I am on the income level compared to the rest of the readers here, but that's not the point. There need to be rules to stop such price gouging. I'm pretty sure that if I hadn't bought that unit, someone else would have, regardless.
grant3 - Monday, June 25, 2018 - link
Why? It's a serious question.
You just said the card was worth $1030, not only to you, but to "someone else".
Please explain why "rules" are needed to disrupt transactions that both parties think are fair?
jardows2 - Monday, June 25, 2018 - link
This could make for some nice refresh products. Take the basic GPU chip and modify it to work with GDDR6 for a quick performance bump. I wouldn't expect anything dramatic, but it might enable a new mid-range product with better availability and pricing around MSRP!
eastcoast_pete - Monday, June 25, 2018 - link
Question: what is the power consumption of GDDR6 compared to fast DDR4? Just wondering. For me, one takeaway from Microsoft's Xbox One X was that GDDR5 memory can work well as RAM for the CPU, as underwhelming as those Jaguar cores are (I know, it's mostly about graphics with consoles). Now, imagine a 16-core Threadripper workstation with 16 or 32 GB of fast and wide GDDR6 RAM; that baby should fly.
DanNeely - Monday, June 25, 2018 - link
GDDR is higher in power consumption than DDR. Running the data bus several times faster isn't free. (And not just in requiring the chips to be soldered down instead of on removable DIMMs.)
AFAIK GDDR memory controllers are also significantly larger than their DDR equivalents.
eastcoast_pete - Monday, June 25, 2018 - link
Hi Dan, I know that GDDR RAM uses more power than DDR4, but how much more? That's why I asked; for, let's say, 32 GB, are we talking 20 W, 50 W, or 100 W more? Also, I'm not sure it has to be soldered down; higher-end routers (of the industrial/professional variety) also use GDDR, and I would assume those buyers like to be able to swap a bad module out without having to replace the entire board. But then again, I don't work for Cisco or Juniper.
The use scenario I have in mind would be near-real-time UHD video encoding using a software-based approach; that requires processor oomph and sufficiently fast memory.
Lastly, even if the GDDR RAM ate an extra 100 W or so, we are talking about processors in the 250W+ TDP class, so an extra 100 W might be well worth it. I agree, though, that it wouldn't make sense as system RAM in an ultraportable.
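To put rough numbers on why wide GDDR as system memory appeals for bandwidth-hungry work like software UHD encoding, here's a peak-bandwidth sketch (the bus widths and data rates are illustrative, assumed configurations, not a specific product):

```python
# Peak theoretical bandwidth = (bus width in bits / 8) * data rate per pin.
# Illustrative configs: dual-channel DDR4-3200 vs. a 256-bit GDDR6
# setup at 14 Gbps/pin. Both are assumed, plausible configurations.
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Return peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

ddr4 = peak_bandwidth_gbs(128, 3.2)    # 2x 64-bit channels, DDR4-3200
gddr6 = peak_bandwidth_gbs(256, 14.0)  # 8x 32-bit chips at 14 Gbps
print(f"DDR4-3200 dual channel: {ddr4:.1f} GB/s")   # ~51.2 GB/s
print(f"GDDR6 256-bit @ 14Gbps: {gddr6:.1f} GB/s")  # ~448.0 GB/s
```

On these assumptions, the GDDR6 setup offers nearly 9x the peak bandwidth, which is the appeal; the open question in this thread is what that costs in power and latency.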
Drumsticks - Tuesday, June 26, 2018 - link
I have to imagine it's at least 75W+. HBM brings 25-50W savings over GDDR5, IIRC, but obviously it must still be using a fair amount of power.
Rudde - Tuesday, June 26, 2018 - link
An interesting tidbit is that DDR4 power consumption scales linearly when adding sticks (3W per stick), while GDDR5 uses approximately the same power regardless of the number of chips: 4GB across 16 chips uses 30W, while 12GB across 24 chips uses 31.7W. GDDR6 uses 20W.
8 sticks of DDR4 memory (64GB) would consume a little under 24W. 32 chips of 16Gb each (64GB) would theoretically use a little over 20W.
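A quick sketch of that scaling comparison, using the figures quoted in the comment above (these are the commenter's estimates, not datasheet values):

```python
# Compare DDR4's linear per-stick scaling against a roughly flat
# GDDR power envelope, using the figures quoted above (assumed,
# not datasheet values).
WATTS_PER_DDR4_STICK = 3.0   # ~3 W per DDR4 DIMM
GDDR5_SYSTEM_WATTS = 30.0    # roughly flat regardless of chip count
GDDR6_SYSTEM_WATTS = 20.0

for sticks in (2, 4, 8):
    ddr4_w = sticks * WATTS_PER_DDR4_STICK
    print(f"{sticks} DDR4 sticks: ~{ddr4_w:.0f} W "
          f"(vs ~{GDDR5_SYSTEM_WATTS:.0f} W GDDR5, "
          f"~{GDDR6_SYSTEM_WATTS:.0f} W GDDR6)")
# 8 sticks (64GB) -> ~24 W, in line with the estimate above.
```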
FullmetalTitan - Thursday, June 28, 2018 - link
That implies the bulk of the power cost of implementing a GDDR solution is in the controller itself, as that appears to dominate the power usage numbers you present.
Rudde - Tuesday, June 26, 2018 - link
DDR4 (2x8GB) is 6W, and GDDR5 is 30W (4-12GB). GDDR6 uses 20W.
It would indeed be nice to see a HEDT CPU paired with GDDR6; I do worry about latencies, though.
Rudde - Tuesday, June 26, 2018 - link
I did some searching around; apparently DDR4 and GDDR6 have similar latencies.
Power consumption seems to be the only downside of using GDDR5/6 over DDR4.
ajhix36 - Tuesday, June 26, 2018 - link
What about 12Gb chips? I'd much rather see 12GB of VRAM on the next GPUs, and it is a valid option in the GDDR6 spec.
wizfactor - Tuesday, June 26, 2018 - link
I know people are fixated more on the performance of these relatively expensive, super-high-bandwidth memory technologies, but the breakthrough I'd like to see is dirt-cheap high-bandwidth memory for low-power devices.
As someone who always looks forward to iGPU advancements (especially Raven Ridge in recent years), it always pains me to see these small but capable GPUs so starved for memory bandwidth. It'd be amazing if humanity could invent an affordable DRAM technology that blew past DDR4 and finally solved the bandwidth bottleneck for embedded GPUs, so that we could have more affordable, high-performance SoCs like Crystalwell or the Xbox One X's Scorpio Engine.
PeachNCream - Tuesday, June 26, 2018 - link
I absolutely agree with you. We need better iGPUs and much faster memory for them. HBM may solve that problem to an extent, but as long as there's a market for large, expensive, and power-hungry dGPUs, there'll always be another 1080 or Vega 64 out there eager to gobble up 250W+ while lobbing a money-destroying torpedo at your account balance. However, as desktop PC markets continue to shrink and only show growth in SFF and gaming segments, I think you'll continue to see incremental iGPU gains that are never sufficient to satisfy in a largely 2-way stratified desktop retail space.
IndianaKrom - Tuesday, June 26, 2018 - link
No amount of memory bandwidth is going to make up for the power budget limitation of iGPUs. They simply cannot compete with dGPUs of similar efficiency, which are often many times larger/wider and have 2-3 times the total power budget of the whole iGPU/CPU combined.
wizfactor - Wednesday, June 27, 2018 - link
I do agree with you that iGPUs have a power disadvantage that is impossible to overcome compared to discrete cards; it's more about balance. Just as it makes sense to buy a balanced CPU/GPU setup, it also makes sense for iGPUs to come with memory and bandwidth appropriate to their maximum compute potential.
The problem is that in the iGPU space, compute improvements are far outpacing SDRAM improvements. While the introduction of GDDR6 and HBM2 is in lockstep with GPUs' rate of improvement, no such thing exists for embedded graphics outside of the expensive Crystalwell and the Xbox One's eSRAM. For future AMD APUs and Intel Gen10 GPUs, DDR4 is a glass ceiling.
What I want is cheap VRAM tech that is low-power enough to ship with most SoCs in the 5W/15W/28W TDP classes. It doesn't have to be as fast or as big as GDDR5, just enough that the bottleneck in games is compute rather than memory.
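To make the compute-bound vs. memory-bound point concrete, here's a rough bytes-per-FLOP comparison between an iGPU and a similarly specced budget dGPU (a sketch using approximate public figures for a Ryzen 5 2400G and a GTX 1050; treat the exact numbers as illustrative):

```python
# Rough "balance" check: bytes of memory bandwidth available per
# FLOP of compute. Figures are approximate public specs, used here
# for illustration only.
parts = {
    # name: (peak GFLOPS, peak memory bandwidth in GB/s)
    "Ryzen 2400G iGPU (DDR4-2933, shared)": (1760, 46.9),
    "GTX 1050 (GDDR5, dedicated)":          (1860, 112.0),
}

for name, (gflops, gbs) in parts.items():
    bytes_per_flop = gbs / gflops  # GB/s per GFLOP/s == bytes/FLOP
    print(f"{name}: {bytes_per_flop:.3f} bytes/FLOP")
# The iGPU gets ~0.027 bytes/FLOP versus ~0.060 for the dGPU, and the
# iGPU's bandwidth is also shared with the CPU, so games on it tend
# to hit the memory wall well before the compute wall.
```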
wizfactor - Wednesday, June 27, 2018 - link
Could you imagine a Ryzen APU that came with 256MB/512MB of HBM2 for the GPU? That would be soooo lit!
FullmetalTitan - Thursday, June 28, 2018 - link
I regret to inform you that the word "lit" died earlier this week. Funeral will be July 7th. The legal case against Don Jr. is still pending in the murder of "lit".