14 Comments
ZolaIII - Thursday, March 15, 2018
Well, not a really interesting article... How about writing more about 14HP FinFET-SOI and beyond?
III-V - Thursday, March 15, 2018
Ah yes, FDSOI... the transistor vaporware. Real interesting stuff, there...
ZolaIII - Thursday, March 15, 2018
I said 14HP, which is a 14nm FinFET structure on an SOI wafer, and beyond 14nm... FD-SOI is here to stay and badly needed for 5G, 4x4 MIMO in mobile, IoT, et cetera; basically all RF blocks, and CMOS on it, are coming too.
BestGuess - Thursday, March 15, 2018
The article states that 14HP is a descendant of 14LP from Samsung, but more likely it is a marriage of IBM's internally developed 14nm SOI process and 14LP. See IBM's last publication on it prior to the acquisition: http://ieeexplore.ieee.org/abstract/document/70469...
The descriptions of IBM's Power9 chips, which indicate they are SOI and feature embedded DRAM, sure look a lot more like the IBM technology than the bulk 14LP technology...
As for RFCMOS applications, FDSOI could be a game changer, since, as a planar technology, it doesn't suffer from the high gate capacitance of FinFETs. This capacitance is part of the FinFET advantage for logic applications, but it is a real handicap for RF, which has to deal with much higher frequencies.
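To put a rough relation behind that (a textbook first-order figure of merit, not data for any specific process): a transistor's cutoff frequency falls as its gate capacitance rises,

\[
f_T \approx \frac{g_m}{2\pi\,(C_{gs} + C_{gd})}
\]

so at the same transconductance \(g_m\), the extra wrap-around gate capacitance of a fin directly eats into the frequency headroom an RF design has to work with, while a planar FD-SOI device keeps \(C_{gs} + C_{gd}\) lower.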
ZolaIII - Thursday, March 15, 2018
Yes, I read that article a long time ago, so do I really need to quote myself again? 14HP FinFET-SOI and beyond.

GF's 22nm FD-SOI isn't a game changer, but it is a winning design. For digital, it has proven it can match the best 10nm FinFET efforts on power consumption when back biasing is used, that is at 0.4V (for small to medium sized SoCs), while analogue, where it is obviously the only logical choice, runs at 0.8V. For most things, and especially IoT, you need to combine multiple RF and digital components in the same package, and here both the development and the final product price are cheap, so it's an absolute design win for both RF and IoT and good enough for mainstream. It's also a win for DSPs; not so much for FPGAs and GPUs, as those tend to be huge, but for smaller or mobile ones it still wins because of its price. The only place it loses is HPC and desktop CPUs, which tend to push operating frequencies as high as possible, and FinFET is better at that...
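To see why that back-biased 0.4V operating point matters so much: dynamic switching power scales with the square of the supply voltage, the classic first-order relation P ≈ α·C·V²·f. A quick sketch, where the activity factor, capacitance, and clock numbers are invented placeholders, not measurements of any real 22FDX or FinFET chip:

```python
# Dynamic power P = alpha * C * Vdd^2 * f (first-order CMOS model).
# All numbers below are illustrative placeholders, not real chip data.
alpha = 0.1   # activity factor: fraction of capacitance switching per cycle
C = 1e-9      # total switched capacitance in farads (placeholder)
f = 1e9       # clock frequency in hertz (placeholder)

for vdd in (0.8, 0.4):
    power = alpha * C * vdd**2 * f
    print(f"Vdd = {vdd:.1f} V -> dynamic power = {power * 1e3:.1f} mW")

# Prints 64.0 mW at 0.8 V vs 16.0 mW at 0.4 V: halving Vdd cuts
# dynamic power 4x, which is the lever back biasing gives FD-SOI.
```

The catch, as noted above, is peak frequency: running that low costs clock speed, which is exactly why it doesn't suit HPC and desktop CPUs.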
Cooe - Thursday, March 15, 2018
I'm happy with this choice. IBM produces great tech executives. Just look at Lisa Su.
TheJian - Saturday, March 17, 2018
You must be kidding. They should have kept Dirk Meyer, who said in 2011 that we need to make a KING product in CPU/GPU FIRST, then do this APU/console crap that has junk margins that are merely additive, rather than the bulk of your net income. You can use junk like that to milk your tech for the last few dollars, but like NV/Intel, AMD should be concentrating on the enterprise stuff or rich people, PERIOD. After that leaves cards or chips on the shelf, THEN and ONLY then do you chase the low-margin products.

If you are limited in supply (can't produce them as fast as you sell them, I mean), which is the case for both AMD and NV currently, you shouldn't be wasting time on 10-20% margins (which for AMD just means barely breaking even in a good quarter); you should be chasing 50%+ margins.
For anyone who doesn't believe this is how things work (they fired Dirk for saying it... LOL, morons), just compare the last decade of NV and Intel quarterly reports to AMD's. Note that Intel plops chips on TOP of their stack when adding new models (HEDT, anyone?), and NV simply put out the 1080/1070 (for example) and took a year to put out a 1060. No point in wasting silicon on poor people (sorry, just the facts) when you can nab a RICH person who has throwaway cash sitting all around him and upgrades yearly :)

At anything under 30% margins, AMD just ends up barely paying interest on their debt, wafer agreement penalties, etc. They need to hit 40%+ before they start rolling in cash. They should have hit servers first, then home users. I.e., I'd rather be chasing $1000-7000 CPUs if I were AMD, instead of chasing <$1000. They just put out an APU with a current max price of $169... ROFLMAO. Lisa should be fired (and most of their management; they are clueless), and Dirk should be sought out ASAP. He was right. Well, DUH.
Now, worse IMO, they are giving Intel IP instead of MAKING THAT CHIP THEMSELVES! Intel will probably charge $400+ for the 8809G. That should have been AMD's. AMD is probably making peanuts on that while Intel takes the bulk of the profits. If I were AMD's CEO, I'd fire anyone who brought me an idea that didn't make 40%+ margins out of the gate, at least. Intel/NV are chasing 60% stuff and higher. They'd be at 70% margins overall if they axed the bottom... LOL.
Not a knock on IBM people, just on this person. Letting Intel/NV have the high end for basically most of the last 7 years is ridiculous. Consoles killed R&D that should have kept them in the CPU race, dropped watts/heat on products, and prevented crap launches with problem after problem (motherboards not having BIOSes finished, memory issues, video cards launching and having to speed up fans to mitigate issues, etc.). With anyone but a fanboy, you probably only get ONE chance to impress your customers. You can't screw up a launch if you're the little guy; you need to get it right out of the gate.

Whose idea was it to launch a month or two early, damaging the CPU launch? Not the motherboard makers'... AMD had ~$800 million, enough to survive one more month (one more year, easily), so why push the launch too early? You're fired. Who chose SINGLE DIGIT console margins for 2 years (they finally said mid double digits, meaning 15% or less, or you'd say 16+, right?) instead of CPU/GPU/driver R&D? You're fired too. Who chose HBM not once but TWICE, when it was worth nothing more than a buzzword, since that bandwidth was not needed at all (see NV cards)? It still isn't needed now, which is why all NV home cards have GDDR5X or less, and why their margins are GREAT and they are setting record profits/margins/revenue quarter after quarter. Who decided a buzzword was worth the risk of massive shortages and super-high BOM costs, which kill margins (and card supply)? If you can't get it on a shelf, your tech is useless.

NV chose the KISS principle. For AMD management, who I'm sure haven't heard of it (LOL), it stands for KEEP IT SIMPLE, STUPID. AMD went hard to produce, wasteful (overkill bandwidth when only unicorns were using 4K... LOL) and expensive. NV chose easy to make, cheap, and adequate even for the Titan. Again, AMD management, do you read quarterly reports (your own, and your competition's)? No? You're fired. Come back when you understand basic economics. Price wars as the only broke guy in the room? LOL. I digress.
zodiacfml - Saturday, March 17, 2018
If it were that easy, they would have done just what you said, with pleasure.

However, AMD's technical ability can't keep up with Intel and Nvidia. Therefore, choosing better memory for a performance gain has always been AMD's last resort to keep up. This is why AMD has been first with integrated memory controllers and HBM, and why it chooses higher-bandwidth memory specs for some cards versus the competing Nvidia cards.
They're pretty lucky with this, as the cryptocurrency Ethash/Ethereum algorithm is memory intensive, making their cards superior for mining despite being behind in gaming performance and on other crypto algorithms. It should have been a quick fix for their profitability to produce more cards, but they are limited by the memory shortage.
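For anyone wondering what "memory intensive" means here: Ethash forces miners to make pseudo-random reads from a multi-gigabyte DAG, so throughput is bounded by memory bandwidth rather than compute. A toy sketch of that access pattern (illustrative only; the table size, round count, and mixing step are placeholders, not the real Ethash spec):

```python
import random

# Toy model of a memory-hard workload: performance is dominated by
# pseudo-random lookups into a large table, not by arithmetic.
TABLE_WORDS = 1 << 20  # stand-in for the multi-GB DAG (placeholder size)
table = [random.getrandbits(64) for _ in range(TABLE_WORDS)]

def toy_hash(nonce: int, rounds: int = 64) -> int:
    acc = nonce
    for _ in range(rounds):
        # Each round's read address depends on the previous read -
        # the data-dependent pattern that makes real miners
        # bandwidth-bound rather than compute-bound.
        acc = table[acc % TABLE_WORDS] ^ (acc * 2654435761 % (1 << 64))
    return acc

print(hex(toy_hash(42)))
```

Cards with more memory bandwidth per dollar win at this kind of loop, which is exactly the advantage AMD's cards had for mining.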
Thermalzeal - Friday, April 6, 2018
AMD can't keep up? That is the most eyebrow-raising thing that has been said in a long time :P It's a full stack of technical implementation, and AMD has led the market in pushing the tech forward, while Intel has bunny suits and disparate architectures for rich people.

I honestly can't stand Intel fanboys. Sure, if you actually flex your computational power in a single socket you can drain your capex like a Cray (see what I did there), but truly, if you want to zen out and achieve more sustainable opex, AMD has been a leader.
Intel has only achieved its success with pockets 25x as deep. So I really don't think Intel's engineers are due any respect for market share; that is complete hogwash. AMD has always done more with less, and unfortunately had completely useless humans like Hector Ruinz destroy the company and GlobalFoundries.
Galcobar - Thursday, March 15, 2018
"Sanjay Jha, who lead the world’s..."Led.
The past tense of the verb lead is led. Not that English makes it easy, since the noun lead is pronounced the same as the past-tense verb led.
Ian Cutress - Thursday, March 15, 2018
Or, it's a typo.
drexnx - Friday, March 16, 2018
Zep?
MadManMark - Friday, March 16, 2018
It was probably only a typo. The writer clearly has a strong command of the language, so your presumption that he doesn't seems almost insulting.
FreckledTrout - Friday, March 16, 2018
Seems like a rational move to focus more on the upcoming process innovation like 7nm. I think it's safe to assume Sanjay Jha got a nice fat golden parachute seeing as how nicely they talk about the parting of ways.