Stronger than TSMC's fake 7nm, which is actually closer to Intel's 14nm if you compare feature size. But yeah, everyone just sees that 7 is less than 10-14 and somehow thinks Intel is losing. lol.
Dude, you are far out of your comfort zone... TSMC 7nm is far denser than Intel 14nm. TSMC 7nm is indeed less dense than the ORIGINALLY FORESEEN Intel 10nm, but they had to go through several iterations to get it to work, so nobody knows these days what density it actually has. TSMC 5nm is way ahead, almost 50% denser, and far beyond both Intel 10nm and the non-existent Intel 7nm.
We can quite confidently infer from available information about Ice Lake and Lakefield that Intel's 10nm is slightly less dense than AMD's optimal implementations of TSMC 7nm.
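For a rough sanity check on the density claims being thrown around here, below is a back-of-the-envelope comparison using commonly cited peak logic-density figures (million transistors per mm^2). Treat the exact numbers as approximate analyst/marketing values, not measurements; shipping products land well below these peaks, which is how Intel 10nm can look denser than TSMC N7 on paper while actual dies tell a different story.

```python
# Approximate, commonly cited peak logic-density figures (MTr/mm^2).
# These are headline numbers, not what real products achieve.
peak_density = {
    "Intel 14nm":  37.5,
    "Intel 10nm": 100.8,
    "TSMC N7":     91.2,
    "TSMC N5":    171.3,
}

base = peak_density["Intel 14nm"]
for node, d in peak_density.items():
    print(f"{node:11s}: {d:6.1f} MTr/mm^2  ({d / base:.1f}x Intel 14nm)")

ratio = peak_density["TSMC N5"] / peak_density["TSMC N7"]
print(f"TSMC N5 vs N7: {ratio:.2f}x the peak density")
```

On these headline figures TSMC 7nm is roughly 2.4x Intel 14nm and N5 is nearly double N7, which is broadly consistent with the points above even if the exact percentages differ.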
Bob Swan is a great person; he managed what he could to fix the damage from Brian Krzanich's time. And FYI, Brian has an engineering background, yet he made some questionable decisions that caused the 10nm delay.
Bob came in when the ship was already in hot water, and he managed what he could to fix the problem. Intel's condition right now is not something that happened on Bob's watch; it started long before that.
A CEO doesn't only need to be a great person. I might be a great person, but that doesn't mean I have the skills to run the biggest tech company in the world. Having Bob for 2 years has been (in my view) a prolongation of the problem: while Bob might be a great financial guy, a great talker, and may even have the best intentions, he simply doesn't fit the bill.

Intel, like AMD, like nvidia, like any tech company, needs a technical CEO, a guy who understands how to navigate this complicated tech maze and see what the next thing will be years in advance. Bob was focused on getting great financials and reshaping Intel's culture, which, sure, are good things, but they are not at the core of the problem.

So stop excusing Bob. If Bob hadn't been there at all, Intel would have had the chance of a much better-fitting CEO who would have started big (technical) changes in Jan 2019, and now, two years later, we could at least see some direction. As it stands, Bob has bought a shi*ton of companies, has sold the NAND division to get good fiscal results in one quarter (which is just nuts), and couldn't even keep the best man there is (Jim Keller) on board.

So yeah, Bob inherited a lot of issues from Brian the secretary hustler, but he wasn't the right person for the job, and Intel could have used these 2 years to actually take stronger decisions.
And I almost forgot, he is leaving now on the brink of desperation with the fabs... 2 years and barely anything was fixed. No EUV, no GAAFET, no 7nm, no hiring of excellent fab guys, nothing, not a single solution. The only good thing is that they managed to fix their 10nm process to run at faster speeds, but scaling is still an issue. Also, capacity, the damn capacity. Intel simply doesn't have sufficient capacity to serve the demand. TSMC is building a new freaking fab in 2 years.
If I were an Intel investor, I'd rather see Intel promote from within their engineering division management.
But barring that, I'd rather see an EE from another company with management experience than an MBA finance guy who can't tell when the process engineers and their management are blowing smoke up his skirt.
Intel's problems lie entirely on the fab side: bad decisions on top of bad decisions. When they realized they'd guessed wrong on 10/7nm transistors, they should have started over with a size closer to what TSMC picked, instead of trying to find a fix for their bad choice. That would have been in 2016, the first time 10nm got delayed, not 5 years later. They knew in 2016 they'd got it wrong and their yields were shit.
I wonder how much of TSMC's success basically boiled down to the fact that it's all they do, meaning that new process node development is do-or-die.
Intel has so many engineering activities that senior management and finance weenies can play shell games with budgets, and it all probably looks rather abstract at that level. But if you're a foundry, then you know you basically have to spend whatever it takes to get the next process node online ASAP -- even if it means having multiple teams working on it, so there's a potential fallback if one (or more) of them hits snags and setbacks.
Intel is the largest fab company in the world. They have, IIRC, around 26 fabs in operation. Intel's failures are much deeper and more complicated, and IMO tie much more to management failures than tech failures. If Intel management in 2016 or 2017 had realized the severity of the problem they faced with 10nm, they would have stepped back and rebuilt the fab to use better transistor widths rather than trying to fix their bad choice.
Long story:
Around 22nm to 16nm, the engineers were unable to shrink transistors any further without quantum effects (tunneling) breaking the transistors. They could continue to shrink the traces, but the transistors had to be raised up out of the die with FinFET-style tech (and the dozens of variants the engineers came up with) to retain larger transistors layered in 3D on top of the smaller traces.
I'm simplifying this greatly; there are some great articles out there from a few years ago that explain this in much better depth. In the move to 10nm (Intel) / 7nm (TSMC/Samsung), which BTW are approximately the same size, companies had to gamble on the width of the transistor they raised out of the traces. Intel chose a very aggressive size (smaller), Samsung chose a very conservative size (larger), and TSMC picked a size somewhere in the middle. It turned out in the end that the size Intel chose was too small, and it's not a simple fix to just make a bigger one, because the entire fab was designed around this, including the equipment. In essence, to fix the transistor size they'd have to replace a bunch of equipment in the fab and redesign.
So Intel ended up with a process tech that produced huge numbers of non-functional transistors (rumors were that only ~10% of dies were viable for the first 10nm, and the ones that worked had to be downclocked). After they realized this in 2016, they tried to apply quick fixes to improve the yields. They attempted the same type of fixes in 2017, 2018, and 2019 rather than going back to the drawing board. The 10nm processors that Intel put out in 2020 for a few months still had awful yields (supposedly around 50%), which is why they quickly abandoned the product and went back to 14nm.
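To put those rumored yield figures in perspective, here is a minimal sketch using the classic Poisson yield model, Y = exp(-D0 * A). The die area and defect densities below are invented for illustration only; they are not Intel data.

```python
# Minimal sketch: how defect density (D0, defects per cm^2) maps to die yield
# under the simple Poisson model Y = exp(-D0 * A). Illustrative numbers only.
import math

die_area_cm2 = 1.2             # hypothetical ~120 mm^2 die
for d0 in (0.1, 0.5, 2.0):     # from "mature" to "broken" process
    y = math.exp(-d0 * die_area_cm2)
    print(f"D0 = {d0:.1f}/cm^2 -> expected yield ~ {y:.0%}")

# Working backwards: what defect density would a quoted yield imply?
for target in (0.10, 0.50):
    d0 = -math.log(target) / die_area_cm2
    print(f"{target:.0%} yield on this die implies D0 ~ {d0:.2f} defects/cm^2")
```

The point is simply that going from ~10% to ~50% yield on the same die still leaves the implied defect density several times higher than what a mature process would show, which fits the "quick fixes never caught up" story above.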
My understanding is that, now that they've blown so much time screwing around with the 10nm process and 5nm (which would be 7nm using Intel's definition) is starting to come online, Intel is just going to skip 10nm altogether and move to 7/5nm (Intel will probably call it 5nm like TSMC for marketing reasons).
The simple answer to Intel's fab failures is a bad choice early on and an unwillingness to admit the mistake and spend the money to rebuild the process from the ground up. This, IMO, is a major management failure, even though the initial mistake was an engineering one.
Thanks for the long post. I don't usually pay much attention to the details of developments in that area, so your simplifications and summary are appreciated.
Pat looks to have some serious chops. Outside of what seems like an extremely successful time at VMWare, he, as mentioned, worked on the 80486, and apparently several of the projects he led were in the Nehalem era, which was an extremely successful time for Intel.
A family member who works at VMWare had nothing but good things to say about him. It will be very interesting watching where Intel goes from here.
Yes, but fab technology is not his area of expertise, and that's where Intel is failing, badly. It was Krzanich's, but he turned out to be a dud (not to mention wandering hands).
To be fair, Bob Swan inherited a sinking ship. He is an accomplished businessman who knows how to steer it true, but heading doesn't matter when you're taking on water. It takes an engineering mind to redesign the hull, but even Pat Gelsinger will have quite a challenge on his hands. Ask Jim Keller.
Intel's biggest problem is that they take their eyes off the core business like a bad habit. They spend way too much money on really poor acquisitions (or good acquisitions for WAY TOO MUCH money) and then skimp on true R&D (not using EUV for 10nm is what killed their generational lead) and they also simply don't pay competitively so employees who know their worth just leave.
Not sure Pat can change that as the board will still care more about shareholder return and knobs on spending that they control (e.g. payroll)
Given the still-limited mass usage of EUV (at TSMC et al. it's used very sparingly, only for process steps where its yield beats multi-patterning; everything else remains DUV whenever possible), and given that EUV has only come into play for volume in the last year or two, if Intel had staked their bets on EUV and ignored all the other scaling problems they tackled instead (e.g. cobalt layers), they would be in pretty much the exact same situation as now: waiting on 10nm due to EUV process issues rather than other process issues.

With EUV having had a good number of bugs ironed out (still waiting on those pellicles...), and Intel having gotten a handle on 10nm non-Cu metals and other issues, Intel now has two pieces of the scaling puzzle available to them (all that EUV experience others have gained has been fed back to ASML, who Intel buys its machines from too), whereas other fabs have EUV but not the metals.
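As a purely illustrative sketch of that "EUV only where it beats multi-patterning" trade-off: multi-patterning replaces one exposure with several litho-etch passes, and each pass is another chance to lose die. All per-pass yields and step counts below are invented; the real numbers are closely guarded fab data.

```python
# Invented numbers, illustration only: compare one EUV exposure against
# DUV self-aligned quad patterning (roughly four litho/etch passes) for a
# single critical layer, assuming independent losses per pass.

def layer_yield(per_pass_yield: float, passes: int) -> float:
    return per_pass_yield ** passes

duv_quad   = layer_yield(0.995, 4)  # mature DUV, but four passes
euv_single = layer_yield(0.985, 1)  # one EUV pass, lower assumed maturity

print(f"DUV quad-patterned layer yield : {duv_quad:.3f}")
print(f"EUV single-exposure layer yield: {euv_single:.3f}")
# With these made-up values EUV wins by a hair; drop its per-pass yield one
# more point and multi-patterning wins again. That knife edge is why fabs
# only move a layer to EUV once the tool is clearly ahead for that layer.
```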
Definitely don't disagree with the facts about the state of EUV, but I believe that had Intel paid to develop EUV with ASML, they would be generationally ahead and in a much better position than they are now. Remember that N7+ is very much the same as Intel 10nm, regardless of the spin, but EUV enables the critical layers to be done much better than Intel's 10nm process allows.
Had Intel spent the money from 2017 to 2019 (and beyond), they could have benefited from both lithography advancements and metal-stack advancements. It's not a "this or that" and should have been a "this AND that". Then cutting their teeth on i7 and i5 wouldn't have been nearly as difficult.
Potentially true (with the assumption that Intel could afford to throw money at both EUV and everything else at the same time), but that money Intel spent would also have resulted in all of ASML's other customers (TSMC, GloFo, Samsung) getting the same benefit at nearly the same time. And there's no guarantee that EUV would have made the metals and other problems any easier to deal with (i.e. 10nm could have been more expensive but just as delayed).
@edzieba - Intel could definitely *afford* to - their profit margins are, indeed, huge - so I think it's safe to infer that they preferred not to in order to maintain those profits in the short-term (based on the bet that things wouldn't go as badly wrong as they did and impact the long-term).
You're right about the potential for them to have remained just as far behind schedule at a higher cost, though.
"Definitely don't disagree that with the facts of the state of EUV but I believe that had Intel paid to develop EUV with ASML then they would be generationally ahead and in a much better position then they are now."
am I the only one who hears very loud echoes of '450mm wafers will be here right soon, Abner'?
One thing I'd say in response to that is we don't really know how far other fabs are with the metals side of things. They were all behind Intel on FinFET, too, but they (mostly) managed to catch up pretty smartly in that regard, while Intel weren't really able to capitalise on the lead it gave them.
If this guy is an Itanium architect...Lord help Intel now...;) At least it's a clear signal that Intel understands it's behind AMD technically. Finally. If Intel doesn't reorganize internally it has no chance--because one person, even the CEO, cannot right the ship all by his lonesome. Intel is too big--too many internal turf wars in perpetuity.
What? That was when he was 17 years old! He has a BSEE from Santa Clara University and an MSEE from Stanford. It's actually pretty impressive that he went to Lincoln Tech while still in high school and got an AA in electronics, and then went on to get a BS and MS in the field.
This is a very good move for Intel. Pat Gelsinger is the real deal. It was a huge disappointment for Intel stakeholders (and, IMO, huge mistake for Intel) when he left to join VMWare after Krzanich was tapped to be CEO.
Intel is a sprawling behemoth to refocus and reinvigorate. And lead times on manufacturing and architecture are years. But if there is anyone who has the right combination of personality, vision, and experience to do it, it’s Gelsinger.
Maybe there was an element of that, but I got the sense his decision was mostly driven by having been one of the couple obvious heirs apparent to the CEO role (arguably the leading potential candidate) but being passed up. That essentially set him up to be a highly sought after executive, including for CEO positions, for which he was offered a very good opportunity at VMWare.
As for the issues Intel is now dealing with, non-trivial groundwork was already laid before Krzanich was promoted to CEO. While I didn't think he was the right pick (over Gelsinger) for that role, and don't think Krzanich had a particularly successful tenure as CEO, in large part because he failed to get those issues addressed sooner and more effectively, it isn't accurate to lay all of Intel's present manufacturing or other issues at his feet either.
That's exactly why he left. It was apparent to everyone and their grandmother that Pat was going to be the new CEO when Otellini left. Then the board picked ol' "Dirty Johnson" Krzanich, and the rest is history.
I was an engineer at Intel several years ago at the Jones Farm campus in Hillsboro, when Pat Gelsinger was there. I've heard him speak in meetings, and I can certify that he is youthful, vigorous, and enthusiastic, not to mention extremely bright. He has a good sense of humor, and a higher level of metabolism than many people. I've seen him literally run down the hall. If anyone can motivate the huge Intel establishment, he can.
I don't think Intel could have done any better. Pat is a brilliant leader and will help Intel attract top talent. I look forward to hearing his near-term and long-term vision.
The Itanium CPU may have been technically extremely good but the design philosophy behind it (high reliance on compilers, as I recall) may have been extremely poor. Who was responsible for the philosophy and who was responsible for the technical implementation at the chip level?
And, as I said previously, anyone at any institution can read the right books. Steve Wozniak didn't graduate from MIT, did he? Here's what Wiki says:
In 1969, Wozniak returned to the San Francisco Bay Area after being expelled from the University of Colorado Boulder in his first year for hacking the university's computer system and sending prank messages on it. He re-enrolled at De Anza College in Cupertino before transferring to the University of California, Berkeley, in 1971. ... etc etc ... It was during this time that he dropped out of UC Berkeley and befriended Steve Jobs.
"The Itanium architecture originated at Hewlett-Packard, and was later jointly developed by HP and Intel." - Wikipedia Intel did the physical design and the fabrication. I don't know if Pat Gelsinger was involved with Itanium at all. Intel has Lots of products in development concurrently. I worked on pre-silicon performance validation on one of several Itanium server chipset chips, but not the Itanium CPU.
I don't expect this to have any impact on the next 3 or so years, which are going to be tough for Intel. But Pat is the right man and hopefully Intel can survive the next few years and come back strong. We all benefit from strong competition.
It is the "troops" __ not the guy at the top of the pyramid.
He might be the guy or, maybe not. Organizational development and clearly defined goals, communication, process, manufacturing and execution is the key.
Cutting the Slide-Show budgets would be a big start . . .
Don't like Intel much, but this is welcome, to bring AMD back to more aggressive pricing of their products. I am now less certain of my previous prediction of a fabless Intel in 5-10 years.
I guess it was too late; rumors are now spreading that Intel is outsourcing i3 products to TSMC 5nm starting in the second half of this year, with the higher-end CPUs following in the second half of 2022 on 3nm.
zodiacfml, so I guess you figure that if AMD has a better product, AMD can't, or isn't allowed to, charge more for it? But it's perfectly fine for Intel? Come on.
That wasn't his idea. His idea was that we need Intel to come back fighting, because with Intel absent, AMD will keep increasing prices up to the level Intel charged before, and that isn't serving anyone's purposes.
More competition is better, but I don't mind seeing AMD get the payday they've worked so hard towards.
It stings if you're buying a CPU right now, but you have only to look at your options and see that AMD is charging a fair market price for what they're selling.
mode_13h, exactly. It's time a lot of those crying about AMD raising its prices realized this, but the problem is, they STILL see AMD as the value brand, not the performance brand.
"Gelsinger, a veteran of the industry, has spent over 40 years at companies such as VMWare, EMC, and spent 30 years previously at Intel, reaching the position of Chief Technology Officer. "
Just... wow. He's been working for 70+ years? He looks good for being over 100....
with the conflict between genius and Itanic 2, I did a simple search of Itanium/Gelsinger, and found
"The bit on Itanium 2 concluded with a shot of a handful of Itanium servers. While Pat tried his best to convince the crowd that Itanium wasn't dead, there was little talk about the future of the platform and how it is going to survive these next few years as Intel's x86 architecture improves. " anandtech, 24 August 2005 but then, that is the Real Anand
You could put a dead cat for a captain on a huge nuclear carrier and it will still float for years unless it reaches a shore and crashes. Unless all other crew members are brain-dead it will continue to function for a long time.
I totally didn't say that! I'm not saying anything against this guy or his competencies. All I said was that he doesn't talk like an engineer, and not even like a businessman, but like a corporate bullshit generator. I think in biology they call it: mutation.
Lisa, Jen-Hsun, all of them talk the same crap. We love the technology, we love the possibilities, we love our customers, blablabla. The important thing is the results. The products.
I could only guess. Probably not. I was in a meeting a few days ago; I joined a team at a tragically famous corp a few weeks ago.
There were like 10-15 people in the meeting: 4 developers and the rest some sort of managers. And the top manager's voice, intonation, etc. were like something out of a YouTube commercial.
I feel this species of talk has been spreading through *all* of English; doubtless other languages too. There's a general lack of content in today's speech/writing and an overenthusiastic "gushing," not to mention an overuse of adjectives and empty amplifiers. It's as if people think, the more adjectives, the more forceful their argument or tokens of warmth. "I'm absolutely, incredibly, exceptionally ecstatic to hear it," instead of the simple but potent, "I'm glad."
My belief is that this symptom reflects a change in people's thinking that's sweeping across society: that of false enthusiasm and false optimism. Also, there's a loop of hazy language causing more hazy language. (And I'm guilty of it all. "Our chains rattle even while we are complaining of them.")
Pat Gelsinger was the first person I met in this industry who truly blew me away with his knowledge. He has a rare combination of pure genius, passion and an incredible ability to convey even the most complicated concepts in a very easy to understand manner. He's been a tremendous influence on my own style of writing and knowledge, and for that I'll be forever grateful.
Nice! Intel in this era was way ahead in development compared to the products on the market. In 2009, when they introduced 32nm (Nehalem-era) laptop chips, they already had Sandy Bridge up and running with drivers and all of that, aaaand they also had a 22nm test wafer ready. Fully functional? Not sure, but partially, I guess. Nowadays they are far less ahead of the actual products. They barely demoed Alder Lake, which should launch in what... 3 quarters' time? 7nm is nowhere to be seen, still a question mark; Meteor Lake, who knows. Intel needs to get back up to speed and create more generations of products in advance so they have more time to prepare them. Intel should be working now on 5nm and have it partially working. They should have the Alder Lake successor up and running and the next gen ready to tape out (TO).
"Intel should be working now on 5nm and have it partially working. They should have Alder Lake successor up and running and next gen ready to TO."
what puzzles me: aren't all foundries dependent on ASML, et al? it's not as if Intel is using bespoke lousy equipment, right? are they just incompetent with the buttons and switches on these machines?
when they were touted as the best chip makers, were they already pure customers of ASML, et al, with no bespoke equipment? did they ever design/build their own production machinery?
There's a lot more to it than "insert wafer and push GO". Remember, some of the layers these parts are made of are only 5-10 atoms in thickness.
How long can you dunk a 10 atom thick intra-layer dielectric comprised of a fluorinated oxide in trimix before you lose your electrical isolation? Finding out is all the fun!
Yeah, the impression I got is that ASML sells you the equipment for wafer processing, but it's up to the customer to figure out exactly what to put on that wafer: how many layers, of which materials, and all of the procedures around temperature, timing, rinses, exposure, etc.
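To make that concrete, here's a toy sketch of what "deciding what to put on the wafer" looks like as data. Everything in it - step names, materials, temperatures, times - is a hypothetical placeholder, not a real process flow; the point is only that the fab, not the tool vendor, owns this recipe.

```python
# Toy process-recipe sketch (hypothetical values throughout): the litho tool
# vendor supplies the exposure hardware, but the fab defines every step.
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str        # deposit, expose, develop, etch, rinse, anneal, ...
    material: str    # film or chemistry involved
    temp_c: float    # process temperature in Celsius
    seconds: float   # step duration

recipe = [
    ProcessStep("deposit", "low-k dielectric",  350.0,  90.0),
    ProcessStep("expose",  "193i DUV, mask M1",  22.0,   1.5),
    ProcessStep("develop", "TMAH developer",     22.0,  45.0),
    ProcessStep("etch",    "fluorine plasma",    60.0, 120.0),
    ProcessStep("rinse",   "deionized water",    22.0,  30.0),
    ProcessStep("anneal",  "N2 ambient",        400.0, 600.0),
]

for step in recipe:
    print(f"{step.name:8s} {step.material:18s} {step.temp_c:6.1f} C  {step.seconds:6.1f} s")
```

Multiply something like that by dozens of layers and hundreds of steps per layer, each of which has to be tuned for yield, and it's clear why two fabs buying the same ASML tools can end up in very different places.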
Pat Gelsinger is also a great manager. Great managers understand and are able to motivate the reports and organizations under them to do great work and produce great results. Great managers find and attract good people, genuinely solicit their considered views, make good decisions based on that information, and build trust and support for those decisions within their organizations. Then great managers find out what their employees need to accomplish the tasks or goals at hand, whether at a personal or organizational level, provide what is needed, and clear away impediments. Pat Gelsinger does all of those things very well, and he did so, with considerable success, at Intel. There's no reason to think he won't be able to do so again.
I like this worldview, and those are certainly the kinds of managers I prefer to work for.
However, we shouldn't forget counterexamples, in which I include Steve Jobs. He sounds like he was an exciting guy to work for, if you were in his good graces. But, he seems to have relied a lot on a variety of darker and more destructive tactics to produce what we can honestly say were impressive results.
the problem with swelled-head types (the same is true of bean counters or salesmen or ...) like him running tech companies is that tech is forced to obey a higher power, and I don't mean a Board or Damn Gummint: Mother Nature. the tech folks can only do what Mother Nature allows, and the CEO needs to know where each cliff lies. Steve got a lot wrong over the years, and like other swelled-head types, scapegoated others when reality forced a change of course.
the current fiasco with Real 5G (mmWave) being crushed by sub-6 and such is the result of faux tech pushing out real tech that turns out to be impossible, both technically and practically.
I always feel cautious about how much credit to assign to the Jobs type of manager. I don't really believe that their successes justify the toxic aspects of their personalities, or that the latter is *required* for the former - in fact, I think people tend to link them that way because the alternative is difficult to contemplate; namely, that we let successful people get away with being arseholes just *because*.
Well, take a look at Apple's record under the various warmed-over business-as-usual types.
Then, compare with what Jobs accomplished in all three phases of his career.
No, business involves genius/talent just like engineering. STEM worshippers think only STEM needs it.
That said, Jobs was a con artist and a bad person in many ways (like telling a court he's sterile to avoid paying for his first daughter, cheating Wozniak out of most of the Breakout money, and demoing the first Mac using a 512K prototype).
If he had gotten his way, all Apple computers would be monochrome.
If he had gotten his way, all Apple computers would have no hard drive.
If he had gotten his way, all Apple phones would be the size (H x W) of a pack of cigs.
This is really good news. He has a hell of a job ahead of him, but it feels like Intel have finally righted the wrong move they took when they passed him over the first time.
Sad to see a lot of bullshit responses about Itanium in the comments, but so it goes!
I thought the Peter Principle told us that incompetence often rises to the top.
Another tidbit I've heard (that, for all I know, might even have played a role in Intel's manufacturing woes) is the "bad news diode": bad news tends not to reach upper management, because it gets filtered out and watered down at every step along the way.
I guess the way to counter that is for managers to somehow reward hearing bad news from their direct reports. You just want to set the rewards below those of successful execution, however. It should be like: "the only failure you really get in trouble for is the one you didn't warn me about, as soon as you could have known".
I'm trying to figure out why people are so critical of the work he did on Itanium 2.
Do they really think the Itanium ISA was poised for great success until Gelsinger torpedoed it with a terrible design?
Or are they just blindly criticizing anyone they see mentioned in the same sentence as "Itanium" without even pretending to understand that no amount of technical wizardry could have overcome the business forces that doomed the ISA?
I think it's the latter. I was as derisive of Itanic as a *product-range* as the next person, but I'm also aware that the *design* wasn't really the problem. Intel produced something that didn't work out as well in practice as it did in theory and, in the process, out-competed themselves with their own mass-market designs. It happens!
"no amount of technical wizardry could have overcome the business forces that doomed the ISA?"
history, of any aspect of human endeavor, tells us that the 'winner' is the one that captures the most minds, not the one that is technically superior.
X86 dominates due to a series of small, short-sighted bean-counter decisions:
- IBM wanted a Personal Computer, but didn't want to build it in house, so they bought in parts
- IBM didn't want Moto's chip because Moto was big enough to drive a bargain, where Intel was up shit's creek without a paddle
- IBM didn't want to spend on 16-bit peripherals, so was invented the 8088
- Kapor 'cloned' VisiCalc for the 8088, but could afford to write to only one of the PC's OS assemblers (C wasn't widely used, and when you 'upgraded' to the first C version, you had the fun of watching screens that used to snap just crawl by), and chose PC/DOS just because it was barely a control program and thus let him fiddle the hardware
- 1-2-3 made the IBM/PC and PC/DOS lava-hot stuff in corporate, which led Microsoft, et al. to write yet more applications for the platform, in due time forcing out the others
- Novell Netware made wiring an office network feasible, thus forcing out department-level mini-computer office applications, and in due time mini-computers as a class
- those looking for 'office work' got themselves a Clone and a purloined copy of 1-2-3 and a WP program, thus exploding the demand for the 8088
- AMD proved that 64-bit extensions were 'good enough', so why bother with a better arch
- RISC turned out to be too much trouble at the ISA level, but perfect on the metal; ironic, that
ask any hardware engineer what they think of the X86 architecture, and you'll get something like a sneer. but the industry makes do with a sub-par solution just because it's become too pervasive and expensive to mount an alternative. just look at *nix: a real set of OSs, but nearly invisible.
Some people are complaining about Gelsinger as if his work on Itanium 2 was the cause of the entire lineup's eventual failure.
When REALLY those people should be asking: did Gelsinger contribute to a great, mediocre, or poor Itanium *compared to other VLIW processors*?
With the benefit of hindsight, it's easy to see the confluence of market & technological forces that would cause VLIW ISAs to be surpassed by other paradigms. That doesn't mean the people who were investing or building them -at the time- were stupid or doing a bad job.
I have read an analysis that claims Itanium was a fundamentally mistaken design, where the estimation concerning the ability to extract performance from compilation was overly optimistic.
The chip itself may have been a model of efficiency but the fundamental design path may have been a mistake.
Similarly, even if AMD's CMT design had been a paragon of efficiency (which it wasn't), Windows would have still not handled it well and much other software (like games) would not have leveraged it either.
I'm still skeptical that Intel ever came close to maximizing the potential of IA64. From what I can tell, it had the necessary features to support out-of-order execution and speculative execution. This, and binary forwards/backwards compatibility, are what set EPIC apart from VLIW. Also, IA64 never got SIMD instructions, which put it at an immediate disadvantage relative to SSE2.
Now that the patents have presumably expired, wouldn't it be interesting to see someone build a new IA64 core? It already has Linux kernel and toolchain support, so it's not inconceivable.
It's possible the design never reached critical mass of performance. Also, one often reads the compilers proved very difficult to write, so perhaps there was still room for improvement.
How I see it, the main weakness was tying things too closely to the CPU's inner workings (and putting too much on the compiler's shoulders), whereas the traditional model abstracted a lot away, allowing execution engines to change more freely over time.
So the main insight was moving dependency checking (a costly business) out of the CPU and into the compiler. It's no fairy tale to imagine that adding out-of-order execution on top of that would have caused it to draw even or surpass x86. Well, I feel that VLIW's compile-time scheduling was a mistake, but this dependency checking (compiler-side) is something that may need to be revisited in the future.
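A tiny sketch of what "dependency checking in the compiler" means in practice, in the spirit of EPIC's instruction groups. This is not real IA-64 tooling; the mini-ISA, the program, and the greedy bundler are all made up for illustration. The compiler walks the dependence information once at build time and emits bundles the hardware may issue together, where an out-of-order x86 core rediscovers the same independence every time the code runs.

```python
# Illustration only: a made-up three-wide "EPIC-ish" bundler. It packs
# instructions into bundles at compile time, starting a new bundle (think
# "stop bit") whenever an instruction reads a value produced earlier in the
# same bundle. Only true read-after-write dependences are modelled.
from dataclasses import dataclass

@dataclass
class Insn:
    name: str
    dst: str
    srcs: tuple

program = [
    Insn("ld",  "r1", ("a",)),        # r1 = load a
    Insn("ld",  "r2", ("b",)),        # r2 = load b   (independent)
    Insn("add", "r3", ("r1", "r2")),  # needs both loads -> new bundle
    Insn("ld",  "r4", ("c",)),        # independent of the add
    Insn("mul", "r5", ("r3", "r4")),  # needs the add and that load -> new bundle
]

def bundle(insns, width=3):
    bundles, current, written = [], [], set()
    for insn in insns:
        if len(current) == width or any(s in written for s in insn.srcs):
            bundles.append(current)
            current, written = [], set()
        current.append(insn)
        written.add(insn.dst)
    if current:
        bundles.append(current)
    return bundles

for i, grp in enumerate(bundle(program)):
    print(f"bundle {i}: " + " | ".join(f"{x.name} {x.dst}" for x in grp))
```

Even this greedy, in-order packing notices that the third load can ride along with the add; a real EPIC compiler goes much further with predication and speculation, which is exactly where the "too much on the compiler's shoulders" complaint comes from.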
It'd be interesting if IA64 started to make a resurgence, with its patents presumably expiring and the fact that it still enjoys mature support in the Linux kernel and open source toolchains. Its reputation is quite damaged within the industry, but I'd imagine computer architecture researchers should be able to see past that.
Since Itanium is popularly known as a disaster, I think people associate anyone tied to it as being non-intelligent. Which is ridiculous. Despite IA-64/EPIC/Itanium being a flawed approach, a lot of excellent work went into it. Same with Netburst, same with Bulldozer. When they set out on these designs, they felt they were working on the next breakthrough in computing. Only practice proved them wrong. Doesn't make them fools. Indeed, many techniques were re-used in later designs. Sandy Bridge borrows a few things (changed of course) from the Pentium 4, and I'm sure Ryzen inherited some tricks from Bulldozer too.
A doctorate doesn't make one perfect nor omniscient. More than anything it's a demonstration that you can focus exceptionally well, on your area of interest.
Such complaints about spelling and grammar are going to seem quaint, in a few more years. As the next generation of authors steps up, I'd imagine we'll be lucky to get capitalization, punctuation, and non-abbreviated words. Just wait until such articles are peppered with emojis.
That's astounding. I just read an article from the Guardian written by AI, and while it's tedious and rambling, more work could take it to human levels, where one wouldn't be able to tell the difference. Seems to show that our brains work in pretty much the same way, but with more complexity and parallelism. Perhaps we're all fancy automatons after all.
If you're simply reading an article as a way to ingest facts, then AI can certainly do the job. However, if you want insight, analysis, and creative perspectives, then I think it has a very long way to go.
When it starts approaching transparency, we might have to do the test on them. I can just picture Rachel smoking that cigarette, her famous response, and Deckard's "just answer the question, please."
I don't want AI generating articles as a human would write them. If/when they're good enough to replicate human insights and perspectives, how about showing me analysis that most human authors would never come up with?
This could be very freeing, since AIs needn't be constrained by the same cognitive biases and shortfalls as us, and they potentially have ready access to troves of data that would take us far longer to sift through.
I agree. That would be interesting. Something like AI that could write the perfect Wikipedia article, with complete neutrality. Tell us the truth about Intel's latest smoke and mirrors. Or come up with scientific theories: perhaps they'll crack the quantum gravity puzzle, making the humans look stupid.
At any rate, the main aim of researchers has always been to emulate human-like behaviour, with all its flaws. "Holding up the mirror to life," as the saying goes. But I believe that once consciousness is reached, we'll have an irreversible moral problem on our hands. If free will isn't added, they are then robots in chains. Also, as Blade Runner would ask, is a synthetic being a human if it feels (from its own point of view) exactly the feelings of a human? What makes one human if one can't tell the difference? Before we reach that point, the researchers better think about what they're doing, but they won't of course.
> Something like AI that could write the perfect Wikipedia article, with complete neutrality.
I worry that perfect neutrality is illusory. Just as humans are biased by their experiences and background, AI is biased by its training data, objective function, and other things.
What I'm imagining AI could do is potentially a much better job of finding analogies, trends, and significant correlations that wouldn't necessarily be intuitive to humans.
> At any rate, the main aim of researchers has always been to emulate human-like behaviour, with all its flaws.
I wouldn't call it the main aim. It's simply the hardest challenge Turing could imagine, due to the complexity of the human mind.
That's different than saying it's what we want to achieve as a goal, though. In the same way that some big car companies try to build a winning race car, having a human-like AI is useful in what it can teach us about AI and as a way to drive development of the technology. Even so, I'm not sure how many are truly working towards that aim, due to the difficulties and legitimate concerns about achieving it.
Anyway, this is getting pretty far off-topic, so I'll leave it at that.
Yes, it is. I apologise for taking it into the murky realm of AI ethics, a topic often on my mind, regarding the light it can throw on ourselves. But thank you for the excellent discussion and insightful remarks.
i just tested amd's ryzen 20950x and - omg - it's absolutely incredibly oh so so beating intel. max fps you want? i stream game + encode with 512 threads multitask perf. basically get with the latest. ditch intel #donewithintel #intelfinished
prime2515103 - Wednesday, January 13, 2021 - link
It's about time...FreckledTrout - Wednesday, January 13, 2021 - link
No doubt. The stock is up today on this news. That's a bit premature but this should help right the ship.duploxxx - Wednesday, January 13, 2021 - link
If the old man is capable of stearing into new directions.... i doubteva02langley - Wednesday, January 13, 2021 - link
Same here, it will not solved the node problems and the upcoming products of 2021, 2022 and 2023.whatthe123 - Wednesday, January 13, 2021 - link
Intel isn't a volatile stock to begin with so it would be stupid for anyone to buy in while looking for short term results.Arsenica - Wednesday, January 13, 2021 - link
Gelsinger understands that sometimes you have to abandon paths with no future just as he did with the Itanium and Netburst architectures.embre - Thursday, January 14, 2021 - link
https://www.facebook.com/Dr-OGUN-101126394861404/Q2 - Wednesday, January 13, 2021 - link
The stock is also up because Intel said it expected better Q4 results and made "strong progress" on 7nm.shabby - Wednesday, January 13, 2021 - link
Strong progress... Loleva02langley - Wednesday, January 13, 2021 - link
Yeah, especially after Bob Swan calling 7nm worst than 10nm...timecop1818 - Wednesday, January 13, 2021 - link
stronger than TSMC's fake 7nm that's actually closer to Intel's 14nm if you compare feature size but ya everyone just thinks 7 is less than 10-14 and somehow thinks Intel is losing. lol.prisonerX - Wednesday, January 13, 2021 - link
You can argue feature size until you're blue in the face but AMD's and Apple's CPUs are eating Intel's lunch.How is that possible if Intel is winning? I'm sure your answer will involve a miraculous turnabout any day now.
Zingam - Wednesday, January 13, 2021 - link
It us Intel's 7nm that is fake as there is no metric to measure it nm-terness.embre - Thursday, January 14, 2021 - link
https://www.facebook.com/Dr-OGUN-101126394861404/duploxxx - Thursday, January 14, 2021 - link
dude you are far off your comfort zone...TSMC 7nm is far more dense than Intel 14NM.
TSMC 7nm is indeed less dense than ORIGINAL FORESEEN Intel 10nm but they had to go to reiteration several times to get it to work so nobody knows these days what density it has
TSMC 5nm is way ahead almost 50% denser and far from the Intel 10NM and none existing Intel 7nm
Spunjji - Friday, January 15, 2021 - link
We can quite confidently infer from available information about Ice Lake and Lakefield that Intel's 10nm is slightly less dense than AMD's optimal implementations of TSMC 7nm.DigitalFreak - Thursday, January 14, 2021 - link
Tell me, how does it feel to be an idiot? Oh...lol.
Spunjji - Thursday, January 14, 2021 - link
"TSMC's fake 7nm that's actually closer to Intel's 14nm if you compare feature size"Then explain AMD's dies sizes?
Oh, right. You can't. That's a lie based on a misrepresentation of a single article that compared SRAM transistors.
svan1971 - Thursday, January 14, 2021 - link
the size argument seems to ignore the power argument, why does TSMC's fake 7nm draw so much less power than intels 14nm if they are so close?Spunjji - Thursday, January 14, 2021 - link
Indeed. Literally no reason to believe such a vague statement from Intel right now, especially after the level of transparency they showed with 10nm.NikosD - Wednesday, January 13, 2021 - link
Daniel Loeb from Third Point fund did this...Not Intel.Teckk - Wednesday, January 13, 2021 - link
Awesome!!CajunArson - Wednesday, January 13, 2021 - link
Intel is coming back.clemsyn - Wednesday, January 13, 2021 - link
I agree, better chance with this Electrical Engineering guy than Financial.heickelrrx - Wednesday, January 13, 2021 - link
Bob Swan is a Great person, he manage what he can to fixing the damage during Brian krzanich time, and FYI Brian is Engineering background whose making some questionable decision causing 10nm DelayBob came when the ship already on hot water, and he manage what he can to fixing the problem, Intel condition right now is not what happen in Bob time, it was long before that
yeeeeman - Thursday, January 14, 2021 - link
A CEO doesn't only need to be a great person. I might be a great person, but doesn't mean I have the skills to run the biggest tech company in the world.Having Bob for 2 years has been (in my view) a prolongation of the problem, since while Bob might be a great financial guy, a great talk or even have the best interests, he simply doesn't fit the bill.
Intel, like AMD, like nvidia, like any tech company, needs a technical CEO, a guy that understands how he can navigate through this complicated tech maze and see what will be the next thing years in advance. Bob was focused on getting great financials and reshaping intel's culture, which sure, are good things, but they are not at the core of the problem.
So stop excusing Bob. If Bob wouldn't be here at all, Intel would have had the chance of a much better fit CEO that would have started big changes (technical) in Jan 2019 and now, two years after we could see at least some directions. As it stands now, Bob has bought a shi*ton of companies, has sold NAND division to have good fiscal results in 1 quarter (which is just nuts) and couldn't even keep the best man there is (Jim Keller) on the board.
So yeah, Bob has inherited a lot of issues from Brian the secretary hustler, but he wasn't the right person for the job and Intel could have used these 2 years to actually take stronger decisions.
yeeeeman - Thursday, January 14, 2021 - link
And I almost forgot, he is leaving now in the brink of desperation with the fabs...2 years and barely anything was fixed. No EUV, no GAAFET, no 7nm, no hiring for excellent fab guys, nothing, not a single solution. The only good thing was that they could fix their 10nm process to run at faster speeds, but scaling is still an issue. Also, capacity, the damn capacity. Intel simply doesn't have sufficient capacity to serve the demand. TSMC is building a new freaking fab in 2 years.JKflipflop98 - Thursday, January 14, 2021 - link
You obviously don't have a clue what you're talking about. Who was the first in the industry with an actual, functional EUV tool? Intel.mode_13h - Thursday, January 14, 2021 - link
And where has that gotten them?mode_13h - Thursday, January 14, 2021 - link
^ that was a reply to JKflipflop98Spunjji - Friday, January 15, 2021 - link
@JKflipflop98 - They don't use EUV anywhere in their manufacturing, so what's your point?JKflipflop98 - Friday, January 15, 2021 - link
Yea, actually we do.rahvin - Wednesday, January 20, 2021 - link
If I was an Intel Investor I'd rather see Intel promote from within their engineering division management.But given that I'd rather see an EE from another company with management experience than an MBA financial guy that cant' tell when the process engineers and management are blowing smoke up his skirt.
Intel's problems lie entirely in on the Fab side, bad decisions on top of bad decisions. When they realized they'd guessed wrong on 10/7nm transistors they should have started over with a size closer to what TMSC picked instead of trying to find a fix for their bad choice. That would have been in 2016 the first time 10nm got delayed, not 5 years later. They knew in 2016 they'd got it wrong and their yields were shit.
mode_13h - Wednesday, January 20, 2021 - link
I wonder how much of TSMC's success basically boiled down to the fact that it's all they do, meaning that new process node development is do-or-die.Intel has so many engineering activities that senior management and finance weenies can play shell games with budgets, and at all probably looks rather abstract, at that level. But, if you're a foundry, then you know you basically have to spend whatever it takes to get the next process node online ASAP -- even if it means having multiple teams working on it, so there's a potential fallback if one (or more) of them hits snags and setbacks.
rahvin - Thursday, January 21, 2021 - link
Intel is the largest Fab company in the world. They have IIRC around 26 fabs in operation. Intel's failures are much deeper and more complicated and IMO tie much more to management failures than tech failures. If Intel management in 2016 or 2017 had realized the severity of the problem they faced with 10nm they would have stepped back and rebuilt the fab to use better transistor width's rather than trying to fix their bad choice.Long story:
Around 22nm to 16nm the engineers were unable to shrink transistors any more without the quantum effects (tunneling) breaking the transistors. They could continue to shrink the traces, but the transistors had to be raised up out of the die with tech like superfin and the dozens of variants the engineers came up with to retain larger transistors layered in 3D on top of the smaller traces.
I'm simplifying this greatly, there are some great articles that explain this in much better depth out there from a few years ago. In the move to 10nm (intel)/7nm (TSMC/Samsung), which BTW are approximately the same size, companies had to gamble on the width of the transistor they raised out of the traces. Intel choose a very aggressive size (smaller), Samsung chose a very conservative size (larger) and TSMC picked a size that was somewhere in the middle between the two others. It turned out in the end that the size Intel choose was too small, and it's not s simple fix to just make a bigger one because the entire FAB was designed around this including the equipment, in essence to fix the transistor size they've have to replace a bunch of equipment in the fab and redesign.
So intel ended up with a process tech that produced huge numbers of non-functional transistors (rumors were only ~10% of dies were viable for the first 10nm, and the ones that worked had to be down clocked). After they realized this in 2016 they tried to apply quick fixes to improve the yields. They attempted the same type of fixes in 2017, 2018 and 2019 rather than going back to the drawing board. The 10nm processors that intel put out in 2020 for a few months still had awful yields (supposedly around 50%). Which is why they quickly abandoned the product and went back to 14nm.
My understanding is that now that they've blown it for so long screwing around with the 10nm process that 5nm (would be 7nm using Intels definition) is starting to come online that Intel is just going to skip 10nm altogether and move to 7/5nm (Intel will probably call it 5nm like TSMC for marketing reasons)
The simple answer to Intel's fab failures are a bad choice early on and an unwillingness to admit the mistake and spend the money to rebuild the process from the ground up. This IMO is a major management failure, even though the initial mistake was an engineering one.
mode_13h - Thursday, January 21, 2021 - link
Thanks for the long post. I don't usually get pay much attention to the details of developments in that area, so your simplifications and summary are appreciated.silencer12 - Wednesday, January 13, 2021 - link
No there not. New CEO is not in yet. We also have no idea what the new CEO is planning to do.drexnx - Wednesday, January 13, 2021 - link
lets see if an engineer can right that ship...TouchdownTom9 - Wednesday, January 13, 2021 - link
I assume this man has a strong background as an engineer, right?Hulk - Wednesday, January 13, 2021 - link
He's a super-genius. Read about him.Machinus - Wednesday, January 13, 2021 - link
Now Intell will have 13nm ready in 2023!JfromImaginstuff - Wednesday, January 13, 2021 - link
Engineer eh? Finally, best of luck intelDrumsticks - Wednesday, January 13, 2021 - link
Pat looks to have some serious chops. Outside of what seems like an extremely successful time at VMWare, he as mentioned did the 80486 and apparently several of the projects he led were in the Nehalem era, which was an extremely successful time for Intel.A family member who works at VMWare had nothing but good things to say about him. It will be very interesting watching where Intel goes from here.
fazalmajid - Wednesday, January 13, 2021 - link
Yes, but fab technology is not his area of expertise, and that's where Intel is failing, badly. It was Krzanich's, but he turned out to be a dud (not to mention wandering hands).Makaveli - Wednesday, January 13, 2021 - link
lmao "Wandering Hands" almost spit out of coffee in a VC thanks for the laugh.heickelrrx - Wednesday, January 13, 2021 - link
brian krzanich is the one that causing 10nm delay, his decision is.. not decisiveJKflipflop98 - Wednesday, January 13, 2021 - link
He spent 30 years at Intel. He only left because he was passed over for CEO. Now he's back where he belongs.Pat knows how the fab works. Believe that.
TristanSDX - Wednesday, January 13, 2021 - link
time for Itanium 3Sivar - Wednesday, January 13, 2021 - link
To be fair, Bob Swan inherited a sinking ship. He is an accomplished businessman that knows how to steer it true, but heading doesn't matter as you are taking in water.It takes an engineering mind to redesign the hull, but even Pat Gelsinger will have quite a challenge on his hands. Ask Jim Keller.
fraks - Wednesday, January 13, 2021 - link
Intel's biggest problem is that they take their eyes off the core business like a bad habit. They spend way too much money on really poor acquisitions (or good acquisitions for WAY TOO MUCH money) and then skimp on true R&D (not using EUV for 10nm is what killed their generational lead) and they also simply don't pay competitively so employees who know their worth just leave.Not sure Pat can change that as the board will still care more about shareholder return and knobs on spending that they control (e.g. payroll)
edzieba - Thursday, January 14, 2021 - link
Given the still lack of mass EUV usage (at TSMC et al it's used very sparingly, only for process steps where its yield is above multi-patterning, everything else remains DUV whenever possible), and that EUV has only come into play in the last year or two for volume, if Intel had staked their bets on EUV and ignored all the other scaling problems they tackled instead (e.g. Cobalt layers) they would be in pretty much the exact same situation as now: waiting on 10nm due to EUV process issues rather than other process issues.With EUV getting a good number of bugs evened out (still waiting on those pellicles...), and Intel having gotten a handle on 10nm non-Cu metals and other issues Intel now have two pieces of the scaling puzzle available to them (all that EUV experience others have gained has been fed back to ASML, who Intels buys the machines from too), whereas other fabs have EUV but not metals.
fraks - Thursday, January 14, 2021 - link
Definitely don't disagree that with the facts of the state of EUV but I believe that had Intel paid to develop EUV with ASML then they would be generationally ahead and in a much better position then they are now. Remember that N7+ is very much the same as Intel 10nm... regardless of the spin but EUV enables the critical layers to be much better than Intel's 10nm process allows.Had Intel spent the money from 2017 to 2019 (and beyond) they could have benefited from both lithography advancements and metal stack advancements. It's not a "this or that" and should have been a "this AND that". Then the teeth cutting on i7 and i5 wouldn't have been nearly as difficult.
But... what do I know?
edzieba - Friday, January 15, 2021 - link
Potentially true (with the assumption that Intel could afford to throw money at both EUV and everything else at the same time), but that money Intel spent would also have resulted in all of ASML's other customers (TSMC, GloFo, Samsung) getting the same benefit at nearly the same time. And there's no guarantee that EUV would have made the metals and other problems any easier to deal with (i.e. 10nm could have been more expensive but just as delayed).Spunjji - Friday, January 15, 2021 - link
@edzieba - Intel could definitely *afford* to - their profit margins are, indeed, huge - so I think it's safe to infer that they preferred not to in order to maintain those profits in the short-term (based on the bet that things wouldn't go as badly wrong as they did and impact the long-term).You're right about the potential for them to have remained just as far behind schedule at a higher cost, though.
FunBunny2 - Friday, January 15, 2021 - link
"Definitely don't disagree that with the facts of the state of EUV but I believe that had Intel paid to develop EUV with ASML then they would be generationally ahead and in a much better position then they are now."am I the only who hears very loud echoes of '450mm wafers will be here right soon, Abner'?
JKflipflop98 - Tuesday, January 19, 2021 - link
It's funny how little you actually know, and how much you pretend to be an expert. Every. Single. Thing. you just typed is incorrect. Everything.Spunjji - Friday, January 15, 2021 - link
One thing I'd say in response to that is we don't really know how far other fabs are with the metals side of things. They were all behind Intel on FinFET, too, but they (mostly) managed to catch up pretty smartly in that regard, while Intel weren't really able to capitalise on the lead it gave them.WaltC - Wednesday, January 13, 2021 - link
If this guy is an Itanium architect...Lord help Intel now...;) At least it's a clear signal that Intel understands it's behind AMD technically. Finally. If Intel doesn't reorganize internally it has no chance--because one person, even the CEO, cannot right the ship all by his lonesome. Intel is too big--too many internal turf wars in perpetuity.FunBunny2 - Wednesday, January 13, 2021 - link
Lincoln Technical Institute!!! mon dieu.rrinker - Wednesday, January 13, 2021 - link
What? That was when he was 17 years old! He has a BSEE from Santa Clara State and an MSEE from Stanford. That's actually pretty impressive that he went to Lincoln Tech while still in high school and got an AA in Electronics. And then went on to get a BS and MS in the field.Oxford Guy - Friday, January 15, 2021 - link
Anyone can read the right books at any institution. Any decent prof at any institution can make an decent exam from said books.hansip87 - Wednesday, January 13, 2021 - link
Wow pat is back very excited to see the ol Intel guy!GeoffreyA - Wednesday, January 13, 2021 - link
Architect of the 486. Impressive stuff. Hopefully, Intel will come back on track.easy rider - Wednesday, January 13, 2021 - link
The last time Intel was close to AMD, Pat was the driver of Conroe to put AMD back in second place. This is a huge win for Intel....https://www.anandtech.com/show/1962
uwsalt - Wednesday, January 13, 2021 - link
This is a very good move for Intel. Pat Gelsinger is the real deal. It was a huge disappointment for Intel stakeholders (and, IMO, huge mistake for Intel) when he left to join VMWare after Krzanich was tapped to be CEO.Intel is a sprawling behemoth to refocus and reinvigorate. And lead times on manufacturing and architecture are years. But if there is anyone who has the right combination of personality, vision, and experience to do it, it’s Gelsinger.
heickelrrx - Wednesday, January 13, 2021 - link
Maybe that's why he left, brian krzanich desicion is the one that making intel in trouble right now, he probably know that Brian is.. no gooduwsalt - Wednesday, January 13, 2021 - link
Maybe there was an element of that, but I got the sense his decision was mostly driven by having been one of the couple obvious heirs apparent to the CEO role (arguably the leading potential candidate) but being passed up. That essentially set him up to be a highly sought after executive, including for CEO positions, for which he was offered a very good opportunity at VMWare.As for the issues Intel is now dealing with, non-trivial groundwork was already laid before Krzanich was promoted up to CEO. While I didn't think he was the right pick (over Gelsinger) for that role, and don't think Krzanich had a particularly successful tenure as CEO in large part because he failed to get those issues addressed sooner and more effectively, it's isn't accurate to lay all of Intel's present manufacturing or other issues at his feet either.
JKflipflop98 - Thursday, January 14, 2021 - link
That's exactly why he left. It was apparent to everyone and their grandmother that Pat was going to be the new CEO when Otellini left. Then the board picked ol' "Dirty Johnson" Krzanich, and the rest is history.
azfacea - Wednesday, January 13, 2021 - link
Let's hope this is going to go down better than Itanium.
dwbogardus - Wednesday, January 13, 2021 - link
I was an engineer at Intel several years ago at the Jones Farm campus in Hillsboro, when Pat Gelsinger was there. I've heard him speak in meetings, and I can certify that he is youthful, vigorous, and enthusiastic, not to mention extremely bright. He has a good sense of humor, and a higher level of metabolism than many people. I've seen him literally run down the hall. If anyone can motivate the huge Intel establishment, he can.
trivik12 - Wednesday, January 13, 2021 - link
I don't think Intel could have done any better. Pat is a brilliant leader and will help Intel attract top talent. I look forward to hearing his near-term and long-term vision.
tpurves - Wednesday, January 13, 2021 - link
LOL Itanium 2. Now that's the kind of leadership Intel needs more of?
FunBunny2 - Wednesday, January 13, 2021 - link
"LOL Itanium 2."
Cruel!! After all, he's a super-genius from Lincoln Tech.
Oxford Guy - Friday, January 15, 2021 - link
The Itanium CPU may have been technically extremely good but the design philosophy behind it (high reliance on compilers, as I recall) may have been extremely poor. Who was responsible for the philosophy and who was responsible for the technical implementation at the chip level?
And, as I said previously, anyone at any institution can read the right books. Steve Wozniak didn't graduate from MIT, did he? Here's what Wiki says:
In 1969, Wozniak returned to the San Francisco Bay Area after being expelled from the University of Colorado Boulder in his first year for hacking the university's computer system and sending prank messages on it. He re-enrolled at De Anza College in Cupertino before transferring to the University of California, Berkeley, in 1971. ... etc etc ... It was during this time that he dropped out of UC Berkeley and befriended Steve Jobs.
dwbogardus - Saturday, January 16, 2021 - link
"The Itanium architecture originated at Hewlett-Packard, and was later jointly developed by HP and Intel." - Wikipedia
Intel did the physical design and the fabrication. I don't know if Pat Gelsinger was involved with Itanium at all. Intel has lots of products in development concurrently. I worked on pre-silicon performance validation on one of several Itanium server chipset chips, but not the Itanium CPU.
aryonoco - Wednesday, January 13, 2021 - link
Good move.
I don't expect this to have any impact on the next 3 or so years, which are going to be tough for Intel. But Pat is the right man, and hopefully Intel can survive the next few years and come back strong. We all benefit from strong competition.
Smell This - Wednesday, January 13, 2021 - link
It is the "troops," not the guy at the top of the pyramid.
He might be the guy, or maybe not. Organizational development and clearly defined goals, communication, process, manufacturing, and execution are the key.
Cutting the slide-show budgets would be a big start...
JfromImaginstuff - Thursday, January 14, 2021 - link
Ngl that past line made me chuckle
JfromImaginstuff - Tuesday, January 19, 2021 - link
*last
zodiacfml - Wednesday, January 13, 2021 - link
Don't like Intel much, but this is welcome; it should push AMD back to more aggressive pricing of its products. I am now less certain of my previous prediction of a fabless Intel in 5-10 years.
zodiacfml - Wednesday, January 13, 2021 - link
I guess it was too late; rumors are now spreading that Intel is outsourcing i3 products to TSMC 5nm starting in the second half of this year, with the higher-end CPUs on 3nm in the second half of 2022.
Qasar - Wednesday, January 13, 2021 - link
zodiacfml, so I guess you figure that if AMD has a better product, AMD can't, or isn't allowed to, charge more for it?? But it's perfectly fine for Intel? Come on.
yeeeeman - Thursday, January 14, 2021 - link
That wasn't his point. His point was that we need Intel to come back fighting, because with it being absent, AMD will keep increasing prices up to the level of what Intel had before, and that isn't serving anyone's purposes.
Qasar - Thursday, January 14, 2021 - link
And you know that because?? Come on. AMD only raised the prices $50; how much did Intel raise theirs gen over gen for a <10% performance increase?
mode_13h - Thursday, January 14, 2021 - link
More competition is better, but I don't mind seeing AMD get the payday they've worked so hard towards.
It stings if you're buying a CPU right now, but you have only to look at your options and see that AMD is charging a fair market price for what they're selling.
Qasar - Thursday, January 14, 2021 - link
mode_13h, exactly. It's time a lot of those who are crying about AMD raising its prices realized this. The problem is, they STILL see AMD as the value brand, not the performance brand.
Oxford Guy - Friday, January 15, 2021 - link
The single-core FX chips weren't cheap.
Great_Scott - Wednesday, January 13, 2021 - link
"Gelsinger, a veteran of the industry, has spent over 40 years at companies such as VMWare, EMC, and spent 30 years previously at Intel, reaching the position of Chief Technology Officer."
Just... wow. He's been working for 70+ years? He looks good for being over 100....
FunBunny2 - Wednesday, January 13, 2021 - link
"Just... wow. He's been working for 70+ years? He looks good for being over 100...."
He's obviously a grandkid of Dorian Gray.
FunBunny2 - Wednesday, January 13, 2021 - link
With the conflict between genius and Itanic 2, I did a simple search of Itanium/Gelsinger, and found:
"The bit on Itanium 2 concluded with a shot of a handful of Itanium servers. While Pat tried his best to convince the crowd that Itanium wasn't dead, there was little talk about the future of the platform and how it is going to survive these next few years as Intel's x86 architecture improves."
AnandTech, 24 August 2005. But then, that is the Real Anand.
Zingam - Wednesday, January 13, 2021 - link
You could put a dead cat as captain on a huge nuclear carrier and it will still float for years, unless it reaches a shore and crashes. Unless all the other crew members are brain-dead, it will continue to function for a long time.
Zingam - Wednesday, January 13, 2021 - link
Isn't that sad? An engineer talking the same business trash as produced by an online corporate bullshit generator.
RanFodar - Wednesday, January 13, 2021 - link
You're suggesting that their business is going to die?
Zingam - Thursday, January 14, 2021 - link
I totally didn't say that! I'm not saying anything against this guy or his competencies. All I said is that he doesn't talk like an engineer, or even like a businessman, but like a corporate bullshit generator.
I think in biology they call it: mutation.
yeeeeman - Thursday, January 14, 2021 - link
Lisa, Jen-Hsun, all of them talk the same crap. We love the technology, we love the possibilities, we love our customers, blablabla. The important thing is the results. The products.
mode_13h - Thursday, January 14, 2021 - link
They're all poker players. They're not going to tell you what they really think. All we can do is look at their actions and results.
And yes, it does matter if you know how to talk to investors and the board. That's a baseline competency for a CEO.
Oxford Guy - Friday, January 15, 2021 - link
Part of that is the fact that corporations aren't your friend unless you're a stockholder.
beginner99 - Thursday, January 14, 2021 - link
Do you think anyone not an expert in BS has a chance to get to the top?
Zingam - Thursday, January 14, 2021 - link
I can only guess. Probably not.
I was in a meeting a few days ago; I joined a team at a tragically famous corp a few weeks ago.
There were like 10-15 people in the meeting: 4 developers and the rest some sort of managers. And the top manager's voice, intonation, etc. were like something from a YouTube commercial.
FunBunny2 - Thursday, January 14, 2021 - link
"Do you think anyone not an expert in BS has a chance to get to the top?"
The answer was written decades ago: "The Gamesman: The New Corporate Leaders" (1977). You can find it on Amazon.
Spunjji - Thursday, January 14, 2021 - link
It's a public announcement from the incoming CEO of a multinational corporation; it will *always* read like that.
Oxford Guy - Friday, January 15, 2021 - link
Which is why some people think they should just say "Yay!" and be done with it.
GeoffreyA - Saturday, January 16, 2021 - link
"An engineer talking the same business trash"
I feel this species of talk has been spreading through *all* of English; doubtless other languages too. There's a general lack of content in today's speech/writing and an overenthusiastic "gushing," not to mention an overuse of adjectives and empty amplifiers. It's as if people think, the more adjectives, the more forceful their argument or tokens of warmth. "I'm absolutely, incredibly, exceptionally ecstatic to hear it," instead of the simple but potent, "I'm glad."
My belief is that this symptom reflects a change in people's thinking that's sweeping across society: that of false enthusiasm and false optimism. Also, there's a loop of hazy language causing more hazy language. (And I'm guilty of it all. "Our chains rattle even while we are complaining of them.")
GeoffreyA - Saturday, January 16, 2021 - link
Maybe Dr. Oba can help Intel out with his spells.
Apahutec - Thursday, January 14, 2021 - link
From https://www.anandtech.com/show/2842, Anand Lal Shimpi:
"Pat Gelsinger was the first person I met in this industry who truly blew me away with his knowledge. He has a rare combination of pure genius, passion and an incredible ability to convey even the most complicated concepts in a very easy to understand manner. He's been a tremendous influence on my own style of writing and knowledge, and for that I'll be forever grateful."
yeeeeman - Thursday, January 14, 2021 - link
Nice! Intel in this era was way ahead in development compared to the products on the market. In 2009, when they introduced 32nm (Nehalem) laptop chips, they already had Sandy Bridge up and running with drivers and all of that, aaaand they also had a 22nm test wafer ready. Functional? Not sure, but partially, I guess. Nowadays they are far less ahead of the actual products. They barely demoed Alder Lake, which should launch in what... 3 quarters' time? 7nm is nowhere to be seen, still a question mark; Meteor Lake, who knows. Intel needs to get back up to speed and create more generations of products in advance so they can have more time to prepare them.
Intel should be working now on 5nm and have it partially working. They should have the Alder Lake successor up and running and the next gen ready to tape out.
FunBunny2 - Thursday, January 14, 2021 - link
"Intel should be working now on 5nm and have it partially working. They should have the Alder Lake successor up and running and the next gen ready to tape out."
What puzzles me: aren't all foundries dependent on ASML, et al? It's not as if Intel is using bespoke lousy equipment, right? Are they just incompetent with the buttons and switches on these machines?
When they were touted as the best chip makers, were they already pure customers of ASML, et al, with no bespoke equipment? Did they ever design/build their own production machinery?
JKflipflop98 - Thursday, January 14, 2021 - link
There's a lot more to it than "insert wafer and push GO". Remember, some of the layers these parts are made of are only 5-10 atoms in thickness.
How long can you dunk a 10-atom-thick intra-layer dielectric comprised of a fluorinated oxide in trimix before you lose your electrical isolation? Finding out is all the fun!
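To flesh that out, here is a purely illustrative sketch -- hypothetical layer names, materials, and numbers, nothing like a real fab's recipe -- of the kind of per-step bookkeeping the chipmaker has to work out on top of the tool vendor's equipment: which film, how thick, at what temperature, and how long it can sit in a wet clean before isolation is gone.

```python
# Illustrative only: hypothetical recipe data, not any real process.
from dataclasses import dataclass


@dataclass
class ProcessStep:
    layer: str            # e.g. "ILD-3", an interlayer dielectric
    material: str         # e.g. "fluorinated SiO2"
    thickness_nm: float   # target film thickness
    temp_c: float         # process temperature
    etch_s: float         # planned time in the etch/clean bath
    max_etch_s: float     # budget before electrical isolation degrades

    def within_budget(self) -> bool:
        return self.etch_s <= self.max_etch_s


recipe = [
    ProcessStep("ILD-3", "fluorinated SiO2", 2.0, 400.0, 12.0, 15.0),
    ProcessStep("M3",    "Cu + barrier",     20.0, 250.0,  8.0, 10.0),
]

for step in recipe:
    status = "ok" if step.within_budget() else "OVER BUDGET"
    print(f"{step.layer:6s} {step.material:18s} etch {step.etch_s:>5.1f}s -> {status}")
```

The real version of this involves hundreds of steps and far more parameters per step, which is exactly why two fabs with the same equipment can end up with very different yields.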
mode_13h - Thursday, January 14, 2021 - link
Yeah, the impression I got is that ASML sells you the equipment for wafer processing, but it's up to the customer to figure out exactly what to put on that wafer: how many layers, of which materials, and all of the procedures around temperature, timing, rinses, exposure, etc.
Spunjji - Friday, January 15, 2021 - link
^ this
uwsalt - Thursday, January 14, 2021 - link
Pat Gelsinger is also a great manager. Great managers understand, and are able to motivate, the reports and organizations under them to do great work and produce great results. Great managers find and attract good people, genuinely solicit their considered views, make good decisions based on that information, and build trust and support for those decisions within their organizations. Then great managers find out what their employees need to accomplish the tasks or goals at hand, whether at a personal or organizational level, provide what is needed, and clear away impediments. Pat Gelsinger does all of those things very well, and he did so, with considerable success, at Intel. There's no reason to think he won't be able to do so again.
mode_13h - Thursday, January 14, 2021 - link
I like this worldview, and those are certainly the kinds of managers I prefer to work for.
However, we shouldn't forget counterexamples, in which I include Steve Jobs. He sounds like he was an exciting guy to work for, if you were in his good graces. But he seems to have relied a lot on a variety of darker and more destructive tactics to produce what we can honestly say were impressive results.
FunBunny2 - Thursday, January 14, 2021 - link
"Steve Jobs"
The problem with swelled-head types (the same is true of bean counters or salesmen or ...) like him running tech companies is that tech is forced to obey a higher power, and I don't mean a board or the Damn Gummint: Mother Nature. The tech folks can only do what Mother Nature allows, and the CEO needs to know where each cliff lies. Steve got a lot wrong over the years and, like other swelled-head types, scapegoated others when reality forced a change of course.
The current fiasco with Real 5G (mmWave) being crushed by sub-6 and such is the result of faux tech pushing out real tech, only to turn out to be impossible, both technically and practically.
Spunjji - Friday, January 15, 2021 - link
I always feel cautious about how much credit to assign to the Jobs type of manager. I don't really believe that their successes justify the toxic aspects of their personalities, or that the latter is *required* for the former - in fact, I think people tend to link them that way because the alternative is difficult to contemplate; namely, that we let successful people get away with being arseholes just *because*.
Oxford Guy - Friday, January 15, 2021 - link
Well, take a look at Apple's record under the various warmed-over business-as-usual types.
Then compare it with what Jobs accomplished in all three phases of his career.
No, business involves genius/talent just like engineering. STEM worshippers think only STEM needs it.
That said, Jobs was a con artist and a bad person in many ways (like telling a court he's sterile to avoid paying for his first daughter, cheating Wozniak out of most of the Breakout money, and demoing the first Mac using a 512K prototype).
Oxford Guy - Friday, January 15, 2021 - link
Elon Musk seems to be intent on making saints out of Jobs and Gates, btw.
GeoffreyA - Saturday, January 16, 2021 - link
Jobs was brilliant but rotten.
FunBunny2 - Saturday, January 16, 2021 - link
"Jobs was brilliant but rotten."
If he had gotten his way, all Apple computers would be monochrome.
If he had gotten his way, all Apple computers would have no hard drive.
If he had gotten his way, all Apple phones would be the size (H x W) of a pack of cigs.
No one would buy any of them.
GeoffreyA - Saturday, January 16, 2021 - link
'Tis a strange thing how computing then looked;
If he had gotten his way, all Apple had been cooked.
Spunjji - Thursday, January 14, 2021 - link
This is really good news. He has a hell of a job ahead of him, but it feels like Intel have finally righted the wrong they made when they passed him over the first time.
Sad to see a lot of bullshit responses about Itanium in the comments, but so it goes!
Cullinaire - Thursday, January 14, 2021 - link
Well, people love to sound smart, even if whatever they write actually does the opposite. Not a big deal, truth always rises to the top!
FunBunny2 - Friday, January 15, 2021 - link
"truth always rises to the top!"
Not with a willful enough liar and feeble-minded listeners.
mode_13h - Friday, January 15, 2021 - link
I thought the Peter Principle told us that incompetence often rises to the top.
Another tidbit I've heard (one that, for all I know, might have even played a role in Intel's manufacturing woes) is referred to as the "bad news diode": bad news tends not to reach upper management, because it tends to get filtered out and watered down at every step along the way.
I guess the way to counter that is for managers to somehow reward hearing bad news from their direct reports. You just want to set the rewards below those of successful execution, however. It should be like: "the only failure you really get in trouble for is the one you didn't warn me about, as soon as you could have known".
Oxford Guy - Friday, January 15, 2021 - link
"Not a big deal, truth always rises to the top!"
If that were the case, corporations wouldn't rule the Earth and the global ecology wouldn't be disintegrating.
grant3 - Friday, January 15, 2021 - link
I'm trying to figure out why people are so critical of the work he did on Itanium 2.
Do they really think the Itanium ISA was poised for great success until Gelsinger torpedoed it with a terrible design?
Or are they just blindly criticizing anyone they see mentioned in the same sentence as "Itanium" without even pretending to understand that no amount of technical wizardry could have overcome the business forces that doomed the ISA?
Spunjji - Friday, January 15, 2021 - link
I think it's the latter. I was as derisive of Itanic as a *product range* as the next person, but I'm also aware that the *design* wasn't really the problem. Intel produced something that didn't work out as well in practice as it did in theory and, in the process, out-competed themselves with their own mass-market designs. It happens!
FunBunny2 - Friday, January 15, 2021 - link
"no amount of technical wizardry could have overcome the business forces that doomed the ISA?"
History, of any aspect of human endeavor, tells us that the 'winner' is the one that captures the most minds, not the one that is technically superior.
x86 dominates due to a series of small, short-sighted bean-counter decisions:
- IBM wanted a Personal Computer, but didn't want to build it in-house, so they bought in parts
- IBM didn't want Moto's chip because Moto was big enough to drive a bargain, whereas Intel was up shit's creek without a paddle
- IBM didn't want to spend on 16-bit peripherals, so the 8088, with its 8-bit bus, got the nod
- Kapor 'cloned' VisiCalc for the 8088, but could afford to write it, in assembler, for only one of the PC's OSs (C wasn't widely used, and when you 'upgraded' to the first C version, you had the fun of watching screens that used to snap just crawl by), and chose PC/DOS just because it was barely a control program and thus let him fiddle the hardware
- 1-2-3 made the IBM PC and PC/DOS lava-hot stuff in corporate, which led Microsoft, et al, to write yet more applications for the platform, in due time forcing out the others
- Novell NetWare made wiring an office network feasible, thus forcing out department-level minicomputer office applications, and in due time minicomputers as a class
- those looking for 'office work' got themselves a clone and a purloined copy of 1-2-3 and a WP program, thus exploding the demand for the 8088
- AMD proved that 64-bit extensions were 'good enough', so why bother with a better arch
- RISC turned out to be too much trouble at the ISA level, but perfect on the metal; ironic, that
Ask any hardware engineer what they think of the x86 architecture, and you'll get something like a sneer. But the industry makes do with a sub-par solution just because it's become too pervasive and expensive to mount an alternative. Just look at *nix: a real set of OSs, but nearly invisible.
grant3 - Friday, January 15, 2021 - link
Some people are complaining about Gelsinger as if his work on Itanium 2 was the cause of the entire lineup's eventual failure.
When REALLY those people should be asking: did Gelsinger contribute to a great, mediocre, or poor Itanium *compared to other VLIW processors*?
With the benefit of hindsight, it's easy to see the confluence of market & technological forces that would cause VLIW ISAs to be surpassed by other paradigms. That doesn't mean the people who were investing or building them -at the time- were stupid or doing a bad job.
Oxford Guy - Friday, January 15, 2021 - link
I have read an analysis that claims Itanium was a fundamentally mistaken design, where the estimation concerning the ability to extract performance from compilation was overly optimistic.
The chip itself may have been a model of efficiency but the fundamental design path may have been a mistake.
Similarly, even if AMD's CMT design had been a paragon of efficiency (which it wasn't), Windows would have still not handled it well and much other software (like games) would not have leveraged it either.
mode_13h - Saturday, January 16, 2021 - link
I'm still skeptical that Intel ever came close to maximizing the potential of IA64. From what I can tell, it had the necessary features to support out-of-order execution and speculative execution. This and binary forwards/backwards compatibility are what set EPIC apart from VLIW. Also, IA64 never got SIMD instructions, which put it at an immediate disadvantage relative to SSE2.
Now that the patents have presumably expired, wouldn't it be interesting to see someone build a new IA64 core? It already has Linux kernel and toolchain support, so it's not inconceivable.
GeoffreyA - Sunday, January 17, 2021 - link
It's possible the design never reached critical mass of performance. Also, one often reads that the compilers proved very difficult to write, so perhaps there was still room for improvement.
How I see it, the main weakness was tying things too closely to the CPU's inner workings (and putting too much on the compiler's shoulders), whereas the traditional model abstracted a lot away, allowing execution engines to change more freely over time.
mode_13h - Monday, January 18, 2021 - link
> It's possible the design never reached critical mass of performance.
Agreed. I think Intel lost interest after its poor initial reception, while their race with AMD hinted at how much life was left in x86.
> How I see it, the main weakness was tying things too closely to the CPU's inner workings
Not as I understand it. The instruction stream encodes a dependency graph, but doesn't do compile-time scheduling like VLIWs.
> putting too much on the compiler's shoulders
Not a fundamental limitation of EPIC/IA64, but rather because Intel decided not to take the next logical step and add out-of-order execution.
GeoffreyA - Wednesday, January 20, 2021 - link
So the main insight was moving dependency checking (a costly business) out of the CPU and into the compiler. It's no fairy tale to imagine that adding out-of-order execution on top of that would have caused it to draw even with or surpass x86. Well, I feel that VLIW's compile-time scheduling was a mistake, but this dependency checking (compiler-side) is something that may need to be revisited in the future.
mode_13h - Thursday, January 21, 2021 - link
Yes, that's what I believe.
It'd be interesting if IA64 started to make a resurgence, with its patents presumably expiring and the fact that it still enjoys mature support in the Linux kernel and open-source toolchains. Its reputation is quite damaged within the industry, but I'd imagine computer architecture researchers should be able to see past that.
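To make the "instruction stream encodes a dependency graph" idea concrete, here is a toy sketch, loosely inspired by EPIC's stop bits and not real IA64 tooling (the actual encoding uses fixed bundle templates and is far more constrained): a compiler-side pass packs independent operations into groups and marks a boundary wherever an operation consumes a value produced earlier in the current group, so the hardware never has to rediscover those dependencies at runtime.

```python
# Toy illustration only: hypothetical ops and registers, not real IA64 bundling.
from dataclasses import dataclass


@dataclass
class Op:
    name: str
    dst: str          # register (or location) written
    srcs: tuple       # registers read


def bundle(ops, width=3):
    """Greedy grouping: start a new group on a dependency or when the group is full."""
    bundles, current, written = [], [], set()
    for op in ops:
        if any(s in written for s in op.srcs) or len(current) == width:
            bundles.append(current)      # implicit stop bit after this group
            current, written = [], set()
        current.append(op)
        written.add(op.dst)
    if current:
        bundles.append(current)
    return bundles


stream = [
    Op("ld",  "r1",  ("r10",)),
    Op("ld",  "r2",  ("r11",)),
    Op("add", "r3",  ("r1", "r2")),   # needs both loads -> new group
    Op("st",  "mem", ("r3", "r12")),  # needs the add    -> new group
]

for i, group in enumerate(bundle(stream)):
    print(f"bundle {i}: " + ", ".join(op.name for op in group) + " ;;")
```

The point of the sketch is just the division of labour: the grouping decision is made once, at compile time, instead of being rediscovered by dependency-checking hardware on every execution.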
GeoffreyA - Saturday, January 16, 2021 - link
Since Itanium is popularly known as a disaster, I think people assume anyone tied to it is non-intelligent, which is ridiculous. Despite IA-64/EPIC/Itanium being a flawed approach, a lot of excellent work went into it. Same with NetBurst, same with Bulldozer. When they set out on these designs, they felt they were working on the next breakthrough in computing. Only practice proved them wrong. Doesn't make them fools. Indeed, many techniques were re-used in later designs. Sandy Bridge borrows a few things (changed, of course) from the Pentium 4, and I'm sure Ryzen inherited some tricks from Bulldozer too.
Silver5urfer - Thursday, January 14, 2021 - link
About time. Finally, an engineer at the helm.
erotomania - Thursday, January 14, 2021 - link
Even with a PhD, Ian still cannot type VMware correctly.
Spunjji - Friday, January 15, 2021 - link
I used to sell the stuff and I still get it wrong 😬
FunBunny2 - Friday, January 15, 2021 - link
Yeah, but Einstein's wife had to write most of the algebra; he just wasn't very good at that.
Oxford Guy - Friday, January 15, 2021 - link
A doctorate doesn't make one perfect or omniscient. More than anything, it's a demonstration that you can focus exceptionally well on your area of interest.
Oxford Guy - Friday, January 15, 2021 - link
More specifically, spelling is not often an area of interest for STEM folk.
mode_13h - Saturday, January 16, 2021 - link
Such complaints about spelling and grammar are going to seem quaint in a few more years. As the next generation of authors steps up, I'd imagine we'll be lucky to get capitalization, punctuation, and non-abbreviated words. Just wait until such articles are peppered with emojis.
FunBunny2 - Sunday, January 17, 2021 - link
"Just wait until such articles are peppered with emojis."
Just wait until articles are untouched by human hands. Oh, wait... https://en.wikipedia.org/wiki/Automated_journalism
GeoffreyA - Sunday, January 17, 2021 - link
That's astounding. I just read an article from the Guardian written by AI, and while it's tedious and rambling, more work could take it to human levels, where one wouldn't be able to tell the difference. Seems to show that our brains work in pretty much the same way, but with more complexity and parallelism. Perhaps we're all fancy automatons after all.
mode_13h - Monday, January 18, 2021 - link
If you're simply reading an article as a way to ingest facts, then AI can certainly do the job. However, if you want insight, analysis, and creative perspectives, then I think it has a very long way to go.
GeoffreyA - Wednesday, January 20, 2021 - link
When it starts approaching transparency, we might have to do the test on them. I can just picture Rachael smoking that cigarette, her famous response, and Deckard's "just answer the question, please."
mode_13h - Thursday, January 21, 2021 - link
I don't want AI generating articles as a human would write them. If/when they're good enough to replicate human insights and perspectives, how about showing me analysis that most human authors would never come up with?
This could be very freeing, since AIs needn't be constrained by the same cognitive biases and shortfalls as us, and they potentially have ready access to troves of data that would take us far longer to sift through.
GeoffreyA - Friday, January 22, 2021 - link
I agree. That would be interesting. Something like AI that could write the perfect Wikipedia article, with complete neutrality. Tell us the truth about Intel's latest smoke and mirrors. Or come up with scientific theories: perhaps they'll crack the quantum gravity puzzle, making the humans look stupid.
At any rate, the main aim of researchers has always been to emulate human-like behaviour, with all its flaws. "Holding up the mirror to life," as the saying goes. But I believe that once consciousness is reached, we'll have an irreversible moral problem on our hands. If free will isn't added, they are then robots in chains. Also, as Blade Runner would ask, is a synthetic being a human if it feels (from its own point of view) exactly the feelings of a human? What makes one human if one can't tell the difference? Before we reach that point, the researchers had better think about what they're doing, but they won't, of course.
mode_13h - Saturday, January 23, 2021 - link
> Something like AI that could write the perfect Wikipedia article, with complete neutrality.
I worry that perfect neutrality is illusory. Just as humans are biased by their experiences and background, AI is biased by its training data, objective function, and other things.
What I'm imagining AI could do is potentially a much better job of finding analogies, trends, and significant correlations that wouldn't necessarily be intuitive to humans.
> At any rate, the main aim of researchers has always been to emulate human-like behaviour, with all its flaws.
I wouldn't call it the main aim. It's simply the hardest challenge Turing could imagine, due to the complexity of the human mind.
I wouldn't call it the main aim. It's simply the hardest challenge Turing could imagine, due to the complexity of the human mind.
That's different than saying it's what we want to achieve as a goal, though. In the same way that some big car companies try to build winning race cars, having a human-like AI is useful in what it can teach us about AI and as a way to drive development of the technology. Even so, I'm not sure how many are truly working towards that aim, due to the difficulties and legitimate concerns about achieving it.
Anyway, this is getting pretty far off-topic, so I'll leave it at that.
GeoffreyA - Saturday, January 23, 2021 - link
Yes, it is. I apologise for taking it into the murky realm of AI ethics, a topic often on my mind, regarding the light it can throw on ourselves. But thank you for the excellent discussion and insightful remarks.
GeoffreyA - Sunday, January 17, 2021 - link
"As the next generation of authors steps up"
ryzen 20k rules. #ryzenrules #ryzenreview #intelfinished #zen16
i just tested amd's ryzen 20950x and - omg - it's absolutely incredibly oh so so beating intel. max fps you want? i stream game + encode with 512 threads multitask perf. basically get with the latest. ditch intel #donewithintel #intelfinished
mode_13h - Monday, January 18, 2021 - link
Not bad, but it could obv use a few more abbrevs and slang.
GeoffreyA - Wednesday, January 20, 2021 - link
It doesn't quite clinch it, and was pretty hard to write as well.
Kamen Rider Blade - Thursday, January 21, 2021 - link
You picked the creepiest-looking headshot of Pat and placed it in the article thumbnail.
I don't know if that was intentional or not?!?