4 Comments
Threska - Thursday, August 17, 2023 - link
Tesla and Dojo. https://en.wikipedia.org/wiki/Tesla_Dojo
Diogene7 - Thursday, August 17, 2023 - link
For sure, AI in many forms seems likely to be a big growth driver for at least, let's say, the 2025–2030 period.

For AI datacenters, does anyone have an idea of how much of that growth is for AI training servers (to train the AI models) and how much is for AI inference servers (to apply the learned models)?

Is it, for example, 80% AI training servers and 20% AI inference servers? I am completely clueless about the numbers here.

Another issue emerging from this is cost (as of 2023, Nvidia GPU cards for AI training servers could cost $30,000–$40,000) and the rapidly rising power consumption allocated to those AI servers: this will create huge incentives and opportunities to find more power- and cost-efficient solutions and paradigms.

The current way of doing this is likely vastly energy-inefficient, and I am a firm believer that spintronics and/or non-volatile memory like MRAM (perhaps exploiting stochastic effects, and/or non-volatile High Bandwidth Memory (HBM) built on MRAM), used in a different paradigm, may be much better suited for AI and may enable 100x/1000x (or much greater) improvements in energy efficiency and cost.
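To put rough numbers behind the cost/power argument above, here is a back-of-envelope sketch in Python. Every figure (cluster size, per-GPU wattage, electricity rate) is an illustrative assumption, not sourced data; only the $30,000–$40,000 card price comes from the comment itself.

```python
# Back-of-envelope estimate of the capex and energy stakes for a
# hypothetical AI training cluster. All inputs are assumptions.

GPU_PRICE_USD = 35_000            # midpoint of the $30k-$40k range quoted above
GPU_POWER_W = 700                 # assumed draw per high-end 2023-era accelerator
CLUSTER_GPUS = 10_000             # hypothetical cluster size
ELECTRICITY_USD_PER_KWH = 0.10    # assumed industrial electricity rate
HOURS_PER_YEAR = 24 * 365

capex = GPU_PRICE_USD * CLUSTER_GPUS
annual_kwh = GPU_POWER_W / 1000 * CLUSTER_GPUS * HOURS_PER_YEAR
annual_energy_cost = annual_kwh * ELECTRICITY_USD_PER_KWH

print(f"GPU capex:           ${capex / 1e6:,.0f}M")
print(f"Annual GPU energy:   {annual_kwh / 1e6:,.1f} GWh")
print(f"Annual energy cost:  ${annual_energy_cost / 1e6:,.2f}M")

# What a 100x efficiency gain would mean for the energy bill alone:
print(f"At 100x efficiency:  ${annual_energy_cost / 100 / 1e6:,.2f}M/year")
```

Under these assumptions the hardware capex (~$350M) dwarfs the annual energy bill (~$6M), which is why a 100x energy-efficiency gain matters most if it also cuts hardware cost or enables far larger deployments.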
Threska - Thursday, August 17, 2023 - link
More efficient training of AI models rather than pie-in-the-sky hardware.
e_sandrs - Friday, August 18, 2023 - link
Wow: "Since Foxconn has production facilities in the U.S., including the well-known factory in Wisconsin"?
ICYMI: Those production facilities pretty much never happened.
"...it's unclear what is being manufactured at the site.
The facility has changed from a Generation 10.5 to a Generation 6, which normally makes screens for phones, tablets, and TVs. But so far, no screens have been made.
The job goal number is also down from 13,000 statewide to 1,454.
The capital investment has also gone down from $10 billion to $672.8 million."