12 Comments
FreckledTrout - Thursday, March 5, 2020 - link
It's pretty obvious where things are headed, and Nvidia should be scared. Honestly, Intel and AMD are both going to have some very serious APUs in five years. Once AMD is on TSMC's 3nm, I can really see an 8-core chiplet, a GPU chiplet, and 32GB of HBM all in one package. The densities that 3nm is slated to bring should make for iGPUs that rival even today's highest-end dGPUs.
IntelUser2000 - Thursday, March 5, 2020 - link
That's been said for years. Two things that won't change are the cost of using stacked memory such as HBM2 and the power/thermal advantages of being a discrete GPU.
All in all, it'll be like today, with products largely differentiated by price and thermal envelope.
ChaosFenix - Friday, March 6, 2020 - link
So yes, the costs of an interposer are relatively high for the consumer market, but a design like this could shred data center loads. I'll need to see it first, though; I think I've seen this rumor for a while.
cb88 - Sunday, March 8, 2020 - link
And what costs would those be? Don't quote me the only numbers on the net, because those are over three years old and were from an unreliable source even then.
haukionkannel - Friday, March 6, 2020 - link
New games become more and more demanding, so for some people there will always be room for a discrete GPU. But for most common people, there's no need; an APU is enough.
Smell This - Friday, March 6, 2020 - link
**We asked AMD more information about it, as to whether it correlates to any of TSMC’s latest packaging developments ...**
______________________________________________
Well . . . . . DUH!
("Earth to Doc Ian . . . . Earth to Doc Ian")
;-)
Someguyperson - Thursday, March 5, 2020 - link
This looks pretty clearly like an enterprise compute card. It makes a ton of sense to produce cards like this for servers, but I'm not sure something this costly (and power-dense) will make it into the consumer space for a while.
Dragonstongue - Thursday, March 5, 2020 - link
Probably not any time soon. That being said, AMD was clear that chiplets made it possible: far, far less expensive for them to build, still turning a solid profit, and much easier to get pricing down far more quickly.
I've read this on a few different review sites thus far.
I suppose it depends on what the chiplets are made for and what they will be doing: some losses, but potentially massive gains (in terms of latency from one chiplet to the others).
With how "intelligent" these chips are these days, it might be sooner, and for a much smaller amount of $$$ to get into our hands. Maybe somewhere along the line, SOMEONE will be "wise" and start a "buy back" program: salvage the costly materials, save the buyer some funds toward the latest and greatest, and get "cream of the crop" recycled materials back into use. Why buy anew what we have already "consumed", so to speak?
It would probably at least help keep "tech wastelands" around the world from being toxified.
schujj07 - Friday, March 6, 2020 - link
I could see this as a crazy enterprise CPU. Each chip would have access to 32GB of HBM3 at ~1TB/sec bandwidth for an L4-style cache. The entire CPU package would be able to house an Oracle or MSSQL DB all in cache; SAP HANA, not so much.
Diogene7 - Friday, March 6, 2020 - link
I have some difficulty understanding why, as of 2020, Intel, Qualcomm, AMD, Apple, etc. are still not integrating the compute logic, the memory (e.g. 12GB of LPDDR5 DRAM), and the storage (512GB of NAND) all in one package, with an option to bypass any of those if a manufacturer wants to integrate more DRAM or storage.
Ideally, the wireless communication parts (cellular (5G), WiGig 802.11ay, WiFi, Bluetooth, GPS, ...) would also be integrated in the same package.
All this would provide the essential building blocks to build ANY computer from one package, and hopefully, with scale and volume, ultimately drive the cost down...
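A quick sanity check on the in-cache database idea upthread, as a sketch; the chiplet count here is a hypothetical assumption (the article does not specify one), with each chiplet paired to one 32GB HBM3 stack at ~1TB/s as the comment suggests:

```python
# Back-of-envelope math for the "whole DB in cache" idea.
# Assumptions (hypothetical): 8 compute chiplets per package,
# each with one 32 GB HBM3 stack at ~1 TB/s.
chiplets = 8
gb_per_stack = 32
tbps_per_stack = 1.0

total_capacity_gb = chiplets * gb_per_stack    # total L4-style cache capacity
aggregate_bw_tbps = chiplets * tbps_per_stack  # if all stacks stream in parallel

print(f"{total_capacity_gb} GB total, ~{aggregate_bw_tbps:.0f} TB/s aggregate")
```

Under those assumptions a package holds 256GB, which comfortably fits many Oracle/MSSQL working sets but not a typical multi-terabyte SAP HANA deployment, matching the comment's caveat.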
hakabakkjanu - Thursday, March 12, 2020 - link
Attend any major tech exhibition and you’ll find Intel announcing or reannouncing mildly improved processors. Whether you’re at IFA in Berlin, CES in Las Vegas, or Computex in Taipei, the spiel is always the same: the future is wireless, battery life matters to everyone, and there are a lot of people with five-year-old PCs who might notice a difference if they buy a new Intel-powered computer. It’s all painfully incremental and out of sync with Apple’s product cadence. Apple will give you, at most, two years with an iPhone before enticing you into upgrading, whereas Intel is trying to convince people with PCs that are half a decade old to do the same. https://rentacar.guru https://pakistani.guru https://realproperty.onl
Tomatotech - Saturday, August 15, 2020 - link
That's the best spam comment I've seen in a long while. You had me believing you were genuine... right up to the spam at the end. Have a fake AnandTech Award.