2 Comments
Threska - Tuesday, May 9, 2023 - link
Be nice to see what they do with their Xilinx acquisition.

brucethemoose - Wednesday, May 10, 2023 - link
Does this mean the MI300 is now a high-volume part instead of an HPC niche chip?

The way training and inference are so consolidated around A100s now is almost dystopian. Sometimes you find a JAX repo that works on Google TPUs (which are overpriced anyway), but other than that, from the perspective of a hobbyist, it's like vendors outside Nvidia don't exist. So I hope AMD (and Intel) can break that up a little by repurposing these things.