6 Comments
James5mith - Thursday, September 7, 2023 - link
I love how in the article you explain the AI and HPC acronyms, but not CoWoS. Ever. Not even after you sidetrack to explain OSAT.

megapleb - Thursday, September 7, 2023 - link
It's there, but it should be on first use. Fourth paragraph: "It is noteworthy that compute GPUs, FPGAs, and accelerators from CSPs all use HBM memory to get the highest bandwidth possible and use TSMC's interposer-based chip-on-wafer-on-substrate packaging."

Ryan Smith - Thursday, September 7, 2023 - link
Yeah, that's an unforced error on our part. I've gone ahead and updated the lead paragraph. Thanks!

AndrewJacksonZA - Thursday, September 7, 2023 - link
Also, what is "CSP," please?

Yojimbo - Thursday, September 7, 2023 - link
Cloud Service Provider

Dr_b_ - Thursday, September 7, 2023 - link
Reading is hardddddd.