8 Comments
QChronoD - Monday, February 26, 2024
One of these stacks has more bandwidth than a 4090. It probably won't be until the 6000 series that we start seeing older-gen HBM used on consumer cards.
PeachNCream - Monday, February 26, 2024
Unlikely. Production will shift to newer generations of HBM, leaving no plant space or production capacity for prior generations. HBM will have to be cost-effective for consumers in whatever generation is current before it makes its way into the mainstream.
HideOut - Monday, February 26, 2024
Correct, in fact AMD DID release a disastrous card a few years ago using HBM. It's the only consumer card with it. We'll probably never see it again.
Threska - Monday, February 26, 2024
Still rockin' my Vega.
Orfosaurio - Monday, February 26, 2024
More like "cost-effective enough"...
zepi - Tuesday, February 27, 2024
So far, Nvidia's AI hype has been eating all of the interposer packaging capacity at TSMC, so there has been no spare capacity to sell any GPU + HBM/interposer combinations to consumers, irrespective of HBM generation.
Maybe one day this will change, but Nvidia will still need to make sure they don't give gamers so much memory capacity or bandwidth that their cards end up being used as replacements for the most expensive datacenter GPUs.
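For scale, a quick back-of-the-envelope comparison of the per-stack claim earlier in the thread, assuming Micron's published HBM3E figures (1024-bit interface, 9.2 Gb/s per pin) and the RTX 4090's GDDR6X configuration (384-bit bus, 21 Gb/s per pin); treat the numbers as approximate peak figures, not measured throughput:

```python
# Peak memory bandwidth from interface width and per-pin data rate.
def bandwidth_gbs(pins: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: total bits per second divided by 8."""
    return pins * gbps_per_pin / 8

# One Micron HBM3E stack: 1024-bit interface at 9.2 Gb/s per pin.
hbm3e_stack = bandwidth_gbs(1024, 9.2)   # ~1178 GB/s

# RTX 4090: 384-bit GDDR6X bus at 21 Gb/s per pin.
rtx_4090 = bandwidth_gbs(384, 21.0)      # ~1008 GB/s

print(f"HBM3E stack: {hbm3e_stack:.0f} GB/s")
print(f"RTX 4090:    {rtx_4090:.0f} GB/s")
```

So a single HBM3E stack does edge out the 4090's entire memory subsystem, which is why datacenter parts carrying six or eight such stacks sit in a different bandwidth class altogether.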
Papaspud - Tuesday, February 27, 2024
I live near where Micron is headquartered; they are building a massive new $15 billion chip plant that should be online next year... not sure if it will be making this memory, though. Maybe memory prices will drop.
Sudionew - Monday, April 8, 2024
Curious to see the performance improvements in real-world applications. Hopefully, this translates to faster training times and better accuracy for AI models.