(hope this doesn't dupe. Previous post failed to show up.)
Jim wasn't fired. He left earlier than planned, for "personal reasons".
Jim likes to move the needle. Intel was very much mired in pride and insistent on finishing its existing projects, even if those projects were going to be so late to market that they were obsolete on launch. Jim couldn't move the needle under those circumstances.
It doesn't matter how much talent Intel has if management won't kill projects that are failing.
Not surprising. I previously wrote that Qualcomm is expected to become a serious player in laptops. It seems like they're setting their sights on laptops, desktops, and the cloud with the purchase of Nuvia.
It's quite clear that the future of computing is ARM. PC makers like Dell/HP/Lenovo need an answer to Apple Silicon or Macs are going to start eating up market share. No, AMD/Intel isn't going to be able to match the efficiency of Apple Silicon. It will have to be another ARM competitor that challenges Apple Silicon. Qualcomm is trying to position itself as that competitor, as it has been in mobile.
I personally thought that Intel was going to buy Nuvia. I guess Qualcomm makes a lot of sense too.
I think Qualcomm's stock is a better buy than AMD's now that they bought Nuvia. Qualcomm is already generating massive revenue and profit compared to AMD and now they're on the right side of the x86/ARM battle. Qualcomm makes AMD's stock look very overpriced.
I like this deal even more than Nvidia buying ARM. This deal is very likely to go through regulatory review while it's hard to imagine the UK/China approving the Nvidia/ARM deal.
The future of computing isn't ARM. That's the present. RISC-V is the future and x86 is the past.
It's always been about the same things: power, performance and price. Intel broke the functioning market for CPUs for a while, but now it's returning to normal, and competition and innovation are returning because ARM is more open. RISC-V is even more open and will take over for that reason unless ARM stops requiring architecture licences for the ARM ISA. Even then, RISC-V's simplicity and cleanliness will likely give it an edge.
Why does a semiconductor architecture have to be open and free? Is there any evidence that this results in a better product? Arm doesn't seem to have hampered the efforts of the hundreds of semi players using their architecture or designs.
I think you don't even understand what the Meltdown & Spectre vulnerabilities actually are, and you're mindlessly parroting the "Open source means no exploits!" meme.
1. Meltdown / Spectre are a result of how speculative execution was implemented in the hardware, not because the x86/x64 ISAs are inherently vulnerable.
2. Open source software is just as vulnerable to being exploited as closed source software. Quite often the vulnerabilities remain open even longer in open source because, despite some wishful thinking by open-source free-riders, there's not a vast army of open source programmers working for free with no higher purpose than to patch security flaws.
Look at, for example, QEMU's VENOM exploit, or OpenSSL's injection vulnerability. Those bugs lurked for well over a decade before discovery.
Open source is awesome, and I'm excited by the prospect of an open-source ISA fostering an even more competitive CPU environment than we have now, but citing irrelevant exploits as some kind of benefit of open source in general is asinine. RISC-V could easily have fallen prey to the same Meltdown/Spectre vulnerabilities if it had been mature enough that, five years ago, the most advanced chip designers in the world were making the same leading-edge design decisions for RISC-V CPUs as they had for x86.
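To make the point above concrete, here's a toy Python simulation of a Spectre-v1-style bounds-check bypass. It is purely illustrative, not an exploit: real Spectre attacks rely on actual CPU speculation and cache timing, and every name and value below is made up for the sketch. What it shows is that the flaw lives in the speculative-execution machinery, not in any ISA.

```python
# Toy model: a speculating core may touch a cache line derived from
# out-of-bounds data BEFORE the bounds check resolves. The architectural
# result is squashed, but the microarchitectural trace survives and can
# be read back via a timing side channel (Flush+Reload style).

SECRET = ord("K")       # a byte living outside the public array's bounds
public = [10, 20, 30]   # the only data the victim may architecturally read

touched_lines = set()   # stand-in for "which cache lines got warmed"

def victim(index: int):
    """Architecturally safe: out-of-bounds reads return nothing."""
    if index < len(public):
        touched_lines.add(public[index])
        return public[index]
    # ...but speculation may already have fetched the secret byte and
    # warmed a cache line derived from it before the check retires:
    touched_lines.add(SECRET)   # microarchitectural side effect
    return None                 # architectural result is discarded

def attacker() -> int:
    victim(99)  # out-of-bounds index steers the speculative path
    # Probe which unexpected "line" is warm instead of reading memory:
    for line in range(256):
        if line in touched_lines and line not in public:
            return line
    return -1

assert chr(attacker()) == "K"   # secret recovered via the side channel only
```

The real-world mitigations (fences, retpolines, index masking) change the implementation, not the ISA, which is exactly the parent comment's point: a RISC-V core of the same maturity could have shipped the same class of bug.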
There are plenty of CVEs found every year that affect FOSS products; the difference is that the source is available for an external audit afterwards.
Most of Meltdown and Spectre were the result of Intel assuming they were untouchable and taking shortcuts in their architecture, not because the ISA is inherently flawed (at least, not in the way you mean - x86, AMD64, and x87 have plenty of issues). This is evident when you compare the number of SKUs affected at AMD versus Intel. AMD's implementations of SMT and speculative execution are much more secure than Intel's.
The ISA being free and open would hopefully prevent somebody from abusing their dominant market position. NVIDIA's potential purchase of ARM could either lead towards ARM becoming the mainstream computing architecture or NVIDIA could strangle licensees, making Qualcomm look like Santa Claus.
RISC-V will find its place in embedded. That's it. It cannot possibly close the ecosystem gap with ARM anytime soon, while not having any technical advantages. Also, the only thing free about RISC-V is the ISA specification - if you want to license a particular core, you have to pay the IP owner. This is very unlike open source software, where not only the specification is free but also the implementation.
The problem with ARM is that now that Nvidia owns it, who knows what they'll do with licensing fees, or what they might do to the ISA to make it more complementary with their other products.
Anyway, if you think AMD and Intel couldn't design a competitive ARM core, I certainly don't join you in that belief. At a u-arch level, all of these cores have most of the same structures. It's not simply a matter of slapping on a different frontend, but it's also not like they'd be starting from absolute zero.
We do, though. M1 is already competitive with other designs in its current ~15W TDP space, and a simple doubling or tripling of the large cores would make it equally competitive with designs in the 30-65W TDP space (albeit at lower power, in small part thanks to that 5nm process).
>For consumers, there are no higher TDP RISC CPUs out there.
Because the consumers are always the *last* to see bleeding-edge performance at higher TDPs.
The A64FX, Graviton2 (or any Neoverse design), etc. The server space has always been first in cutting-edge technology at high TDP and RISC vs CISC is no different.
Less arrogant and more matter-of-fact: newsflash, the consumer world doesn't like 40W+ TDP systems anymore. In fact, <20W is where all the innovation happens.
My guess is they still want to be in play w.r.t. datacenters (where they never were) and 5G basestations/networks (where they are, but need Intel or other processors to handle a lot of the dataflow and processing their chips cannot).
However, Centriq's demise saw the end of their old custom uarch team (Qualcomm used to design its own ARM uarch, before just customizing Cortex-A IP). So I feel this is more of a realization that they need that team back, in order to compete in those spaces.
This purchase probably assures Qualcomm's ability to make SoCs as much or more if NVIDIA doesn't buy ARM than if it does. If NVIDIA doesn't buy ARM, there isn't going to be as much R&D pumped into ARM to create new cores as there has been the past few years. It's Softbank that pumped that R&D into ARM, and it's Softbank that is now looking to unload ARM. An independent ARM won't have the deep pockets to do it, and Softbank apparently is not happy with the results it has been getting. What happens to ARM if NVIDIA doesn't buy it isn't going to be the same as what's happened to ARM over the past three years.
Pretty happy about this. I feel like it's much more likely we'll see actual consumer products using Nuvia's tech if it's owned by Qualcomm.
Apple doesn't seem interested in selling their ARM chips to third parties, but I imagine Qualcomm would be happy to. Assuming Nuvia delivers on their promises, this is pretty good news, I think.
Interestingly, however, while they don't sell their CPU designs to third parties, they allowed this key contributor to be acquired by a rival. I wonder why? Is Apple's dominance so great, and monopoly status so close, that they decided to throw Qualcomm a bone? It's certainly not for lack of money. And they've always been careful defending their IP advantages until now.
While it looks like a great move for Qualcomm, they should wonder why Apple let it happen.
> they allowed this key contributor to be acquired by a rival, I wonder why?
They didn't "allow" this to happen. Apple is not God. They had two options: either play into extortion by their ex-employee and pay $Billions to buy back what they believe is rightfully theirs, or seek a remedy through the courts, which has been ongoing for many months. I suppose there's also a possibility that Apple will seek an injunction to block the acquisition from completing, while their current legal challenge plays out.
Anyone who bought Nuvia clearly knew they were going to have to fight off Apple's legal challenges, or broker some kind of licensing arrangement with them. It makes sense that Qualcomm would step up to the challenge, since their lawyers must know Apple's lawyers so well, by now.
That's too simplistic. In most states, employees are under non-compete agreements, preventing them from going to a competitor for usually about two years after they leave, although California doesn't allow them. Also, I think his defense is that Nuvia wasn't a competitor, since Apple didn't want to build server CPUs.
The next issue is the potential theft of IP. Most companies assert ownership of relevant IP you devise, while under their employment. Even if Apple didn't use it in a CPU, if they can prove that he used some IP he devised while at Apple, they can lay claim to it. This is what I understand they're suing him for.
Finally, companies tend to place their employees under non-solicit agreements, where you can't leave and then lure your ex-colleagues to join you. I don't know if he brought any of his team with him, but that would be another potential point of exposure.
> In most states, employees are under a non-compete agreement
Oops, I didn't mean to imply it's automatic. It's something employers usually do. It's one of the documents they typically have you sign, when you're hired.
Everything you described relating to restricting ex-employees is illegal in CA. Apple's claims are merely an attempt to work around those laws. A key group of people left Apple and went to compete against them, and Apple is upset. Filing a suit was just a way of trying to sabotage and punish them. Qualcomm paid so much because they know the legal claims are trash and the team is pure gold.
> Everything you described relating to restricting ex-employees is illegal in CA.
Thanks for letting me know about non-solicit agreements, but I do find it a little hard to believe that a company couldn't assert ownership of CPU IP that a CPU designer originated while working for them. As long as they could prove that he devised it under their employment (which is probably the hard part), it seems like common sense that it would be theirs.
Now, I've heard of companies trying to assert ownership of all IP generated by employees, whether or not it was related to their business, and that would seem like a clear overreach.
Potentially good news! The more players in the CPU space, the merrier. The next big question is when QC/Nuvia will be able to show an actual in-silicon sample of their offering. Also, a key reason why there is even enough space for another ARM-derived server CPU is that Apple has, so far, not shown any interest in designing and selling large, many-core CPUs to third parties. That being said, does anyone here know what Apple runs its iCloud services on? What OS and which CPUs do they use?
I think Qualcomm actually got this done on the cheap. $1.4B to acquire this team seems very smart to me. But we don't know if Nuvia was running out of funding, and perhaps selling right now was better than doing another round of capital raising.
I just hope Qualcomm sticks to the plan and gives this team the resources they need to bring their product to market. And I do hope they eventually use this to get back into the server and enterprise space.
Sounds like Qualcomm paid a lot just to get those few people working for them. I would do the same, though, as Qualcomm has to be able to supply Microsoft and other vendors with Arm SoCs capable of matching Apple's M1-like products.
> Qualcomm has to be able to supply Microsoft and other vendors the Arm SoCs capable of matching Apple's M1 like products.
This argument always seemed weird to me. As long as the performance differential isn't like orders of magnitude, I don't think it really matters if Apple has faster CPUs. People who buy Apple products do so for all sorts of warm-fuzzy reasons or because it's the only way to run the software they need, rather than on the basis of benchmarks and tech specs.
I know a lot of people with 2012 Apple machines who then switched to Windows for better performance. If Windows machines lose their performance advantage (which they kind of have), then any enthusiast who prefers macOS is switching back to Apple. Furthermore, if Apple puts more powerful GPUs in their machines, you may see game developers focus more on macOS, and naturally some gamers would follow.
> I know a lot of people with 2012 apple machines that then switched to windows for better performance.
They ditched an expensive niche platform for a cheaper mainstream one. It's harder to go the other way.
> if apple puts more powerful gpus in their machines, you may see game developers focus more on mac os, and naturally some gamers would follow.
I don't see it. Game devs are going to use the same platform as the overwhelming majority of gamers, especially since Windows and XBox are cousins. Sure, a few fringe devs might use Macs, but probably not much more than before (and some of those only because Apple forces it, for iOS app development).
Macs will always be too expensive to make sense, as a gaming platform. Consoles and PCs will always have a significant advantage in perf/$ and the vast majority of the gamer market is at least somewhat price-sensitive.
Furthermore, GPUs are the main bottleneck for most games, and it's unlikely Apple can pull such a coup in GPU performance as they did with their CPU. GPUs aren't constrained by ISA or even API (as Mantle/DX12/Vulkan showed).
As someone who's owned only Windows PCs for 15 years, I think I'll be moving to an Apple Silicon desktop in the next few years.
>They ditched an expensive niche platform for a cheaper mainstream one. It's harder to go the other way.
Absolutely incorrect. You seem misinformed about macOS growth: 17% of the world's desktops & laptops run macOS now, and it's been growing 20% each year for a decade. And that was *before* M1 / Apple Silicon.
A small note: Qualcomm's old custom CPU cores were more power-efficient than ARM's in general, as detailed benchmarks bore out. ARM clocked their CPUs higher, increasing power drain and lowering battery life, but impressing in benchmarks - benchmarks Qualcomm investors then used to justify cutting the CPU division, despite its superior technology. Because, well, that's what happens when investors think they understand engineering.
Qualcomm needs to adjust its own attitude towards know it all investors as much as it needs more differentiated products. Or else the same thing could happen all over again.
The Firestorm cores on the M1 clock higher and use more power than the Cortex-A78 and X1. That's how you get high performance on devices like laptops, where power drain isn't as much of a concern as on phones. I have a feeling a Qualcomm-Nuvia chip could end up using M1-like concepts such as a wide and deep pipeline and a fast memory controller to beat x86 chips.
This is just wrong. Qualcomm stopped using their own cores in mobile because they were less efficient than Cortex cores - Qualcomm openly admitted this.
From what I remember, their custom cores had better floating point performance, but that didn't help much with most everyday tasks. Still, it would be nice if another custom-core player joined the ARM space in mobile as well.
This wasn't precisely true before ARMv8, but Qualcomm never really nailed the transition. The only released product was the Snapdragon 820, which outperformed the A57 but not even the A72, IIRC.
My sense is that they scrapped their proprietary cores for much the same reasons as Samsung: ARM was improving so fast that (unlike Apple) they were unwilling to put the kind of resources into their design teams that would've been needed to stay ahead. They saw the writing on the wall and decided to cut their losses.
The next question I would've had is about GPUs, but the Nuvia purchase signifies that Qualcomm still sees value in developing its own proprietary IP blocks.
They put all that into Centriq. They put more effort into that CPU than anything, and it was supposed to spearhead the ARM server DC market - as AT proposed, and as the ARM fans here who keep saying x86 is dead would have it.
Guess what? It was shut down. Qualcomm, the corp that pushes R&D for ROI, closed its entire custom server design effort, and Anand Chandrasekher, who led it, also left the company.
There's one thing that bothers me. If Nuvia's greatest asset is its talented people, what's stopping these people from leaving the company once the takeover is done and founding another new company?
"Golden handcuffs". Knowing this, the company doing the acquisition typically places conditions on the stock options of the company being acquired, in order to keep around key employees for a few years. So, they can certainly leave early, but they won't get their big payday.
Ultimately, if they're in California, nothing is preventing them from going to a competitor (or creating one), but they can't take any IP with them (as Apple alleges they did, when founding Nuvia - see my above comment about Qualcomm's legal team).
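The "golden handcuffs" arithmetic can be sketched with made-up numbers. Everything here is hypothetical - a standard one-year-cliff, four-year schedule and an invented grant size, nothing reflecting Nuvia's or Qualcomm's actual terms - but it shows why leaving early costs a key employee most of the payout:

```python
# Sketch of retention ("golden handcuffs") vesting after an acquisition.
# The schedule and dollar amounts are illustrative assumptions only.

def vested_fraction(months_stayed: int, cliff_months: int = 12,
                    total_months: int = 48) -> float:
    """Standard cliff-then-linear vesting: nothing before the cliff,
    then proportional to time served, capped at the full term."""
    if months_stayed < cliff_months:
        return 0.0
    return min(months_stayed, total_months) / total_months

grant_value = 4_000_000  # hypothetical retention package, USD

# Leave before the one-year cliff: forfeit the entire new grant.
assert vested_fraction(11) == 0.0

# Stay two of four years: only half the package vests.
assert vested_fraction(24) == 0.5
print(f"2-year payout: ${grant_value * vested_fraction(24):,.0f}")

# Serve the full term: the whole payout.
assert vested_fraction(48) == 1.0
```

So the team can certainly leave early, as the comment says - they just walk away from the unvested remainder, which is usually most of it.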
Since the company was so cheap and so important, why didn't Apple acquire them? I have to think that Apple already has the key people, or has another path charted out to stay ahead of Qualcomm/Nuvia. Letting Nuvia go doesn't make much sense otherwise - though of course Apple and Tim Cook aren't perfect, either. While they often blunder on certain hardware (input devices come to mind), I can't imagine Tim blundering on a key piece of IP that represents a huge advantage for multiple years, when all he has to do is write a tiny check (tiny for Apple).
He stated that he left due to Apple's unwillingness to allow him to do a server oriented part:
"In his lawsuit, Williams said that he raised the prospect of Apple doing server chips as far back as 2010 and the company rejected the idea then and subsequently. He said his co-founders recruited him to start Nuvia with them and that he avoided being involved with the company until after he left Apple."
The issue may not be in the specific direction of this initiative so much as the general culture of Apple given his willingness to be bought out by Qualcomm. I doubt he'd be rebuffed by Qualcomm if he leads equally ambitious initiatives in the HPC datacenter direction even if the first parts they are collaborating on are more aligned with mobile. With Qualcomm, he now has access to leading edge fabs, world class support, and will likely have a lot more leeway to steer hardware initiatives while surrounded by other top talent.
I suspect Qualcomm has already been partnering with NUVIA on a project close to tapeout (maybe a laptop oriented 9cx SoC?) and an agreement was reached because of the mutually compatible feelings over the collaboration before another round of fund raising...
Like Raqiq said, and as I posted above, Apple is already seeking a legal remedy to this.
Another point is that if Apple would've paid $Billions to acquire IP they think was already theirs, it would set a dangerous precedent for other Apple employees to leave and start business with Apple IP that the company would be forced to re-acquire.
But they would have stopped a competitor in its tracks, and would still retain the option to claim it was theirs in the first place? Plus, it's not like you HAVE to follow internal precedents.
It looks as if they just decided they already have something in the pipeline to stay ahead anyway.
Apple isn't competing with others much, though. They make plenty of money selling to their niche ecosystem. So, as long as others aren't orders of magnitude better (hence enticing those in their ecosystem to leave), they don't have much to worry about.
Interesting! Not exactly great from a competition perspective, but probably a very good thing for the Nuvia tech itself to be attached to a company that will have no trouble at all bringing it to market.
Exactly that - plus all the other stuff further down the line once they actually have a product fabbed, like developer and customer relations. A great product is no good if you can't persuade anyone to implement it in an OEM design!
I think it's a positive development, especially since Apple's legal challenges might've rendered Nuvia too toxic for anyone else.
Seriously, who else is doing custom ARM cores, at this point? They've been dropping like flies! I'm gladdened by the prospect that there will be at least one set of SoCs on the market with a non-Cortex ARM core (and I'm not counting Apple's, since those will surely remain locked in their walled garden). Especially now that Nvidia owns ARM.
Apple did it because they control most of their own supply chain and they can set the prices accordingly. A chip vendor like Qualcomm can't sell many expensive high-end ARM chips to customers who won't take the risk of using those chips. Using ARM Cortex IP ends up being the lower cost, lower risk option for everyone from Qualcomm to Samsung to Huawei. Heck, even Ampere and Amazon use ARM cores because that's the lower cost and fastest method to get to market.
Apple having its own CPU design arm allows it to undercut Intel pricing while prioritizing certain features for MacOS. Other ARM vendors have to focus on more general purpose platforms like Android and Windows.
That's definitely one way to look at it! I'm mainly apprehensive because of how much of a lock Qualcomm already have on the ARM SoC market - if the Nuvia IP is as good as they claim then the remaining players may have trouble keeping up in anything other than a second-tier role.
I still have my fingers crossed that the Nvidia/ARM deal gets knocked down; if it doesn't, then I think I'll probably hew closer to your take on this.
I don't count them because, as far as I can tell, their cores were designed only for use in their HPC systems. That would basically put them in the same camp as Apple.
I also think they took this approach more for political reasons than purely technical or economic, not wanting to be dependent on providers in any external country for the IP of their critical HPC infrastructure.
They're custom ARM cores. You didn't specify in which class.
Fujitsu have produced the most power-efficient general purpose CPU-based HPC system to date; I would assume they chose the ARM ISA in order to realise that.
> They're custom ARM cores. You didn't specify in which class.
You're taking my point out of context. My point was that it's good for the ARM ecosystem to have the option of cores (i.e. available for use in products that you can actually buy) designed by more than just ARM, itself. The fact that some proprietary cores exist somewhere in the world that almost nobody can use is of no practical benefit to the ecosystem.
By contrast, in a world where all ARM cores (i.e. that we can actually use) were ARM-designed, well, that would be a bit like living in an x86 world without AMD (or VIA).
Qualcomm has a bad reputation for not sticking with things. A lot of ambitious projects were dropped even before their first iteration. The 8cx family seems to be on this track as well, as they pretend to have upgraded it in the so-called "gen2".
How do they compare to Ampere Computing? They're doing ARM server CPUs too, right? Seems weird - QCOM just closed their CPU division, so why not just hire a few folks rather than buy a whole new company after shutting down their old one? Smells of really bad management.
We don't know because Nuvia hasn't reached the stage of a working server CPU that can be tested.
> They're doing ARM server CPUs too right? Seems wierd, QCOM just closed their CPU division
Qualcomm specifically mentioned using them in just about everything other than servers, though. So, it doesn't necessarily mean Qualcomm is getting back into the server CPU game, or maybe they're just trying not to show too many of their cards.
As for designing their own cores, who can say why Qualcomm changed their mind, but maybe they're looking at the current competitive landscape and feel this would be a worthwhile way to differentiate their products. You also have to consider that Nuvia might have more talent than Qualcomm's old Kryo team, which can change the calculus.
> why not just hire a few folks rather than buy a whole new company and shutdown their old one. Smells of really bad management.
Because the senior talent are Nuvia's founders. They have a fiduciary responsibility to their investors to try and take Nuvia to a liquidity event, so long as that's a viable option. And they'll get paid even more by selling the company and cashing in their stock options than anything Qualcomm would pay them otherwise.
yeeeeman - Wednesday, January 13, 2021 - link
Smart of them. Mister Bob Swan should learn a thing or two from Lisa and Cristiano Amon, instead of firing Jim Keller.
Arsenica - Wednesday, January 13, 2021 - link
Not much left for Bob Swan to learn, as he's going to be replaced by good old Pat "Kicking" Gelsinger.
mode_13h - Wednesday, January 13, 2021 - link
Yeah, you got duped. It seems that you posted it as a new comment instead of a reply.
s.yu - Thursday, January 14, 2021 - link
I thought it was just me; looks like this is a server issue.
SarahKerrigan - Wednesday, January 13, 2021 - link
Custom-core Snapdragon rides again!
mode_13h - Wednesday, January 13, 2021 - link
...oh snapdragon!
xdrol - Wednesday, January 13, 2021 - link
Two words: Meltdown and Spectre.
Dolda2000 - Wednesday, January 13, 2021 - link
RISC-V doesn't imply open and free implementations. It's just the ISA that is free, and that is certainly good for everyone.serendip - Thursday, January 14, 2021 - link
The ISA being free and open would hopefully prevent somebody from abusing their dominant market position. NVIDIA's potential purchase of ARM could either lead towards ARM becoming the mainstream computing architecture or NVIDIA could strangle licensees, making Qualcomm look like Santa Claus.Thala - Tuesday, January 26, 2021 - link
RISC-V will find its place in embedded. Thats it. It can not possibly close the ecosystem gap to ARM anytime soon, while not having any technical advantages.Also the only thing free about RISC-V is the ISA specification - if you want to license a particular core, you have to pay the IP owner. This is very unlike Open Source Software, where not only the specification is free but also the implementation.
mode_13h - Thursday, January 28, 2021 - link
The problem with ARM now is that Nvidia owns it: who knows what they'll do with licensing fees, or what they might do to the ISA to make it more complementary with their other products.
mode_13h - Wednesday, January 13, 2021 - link
Why aren't you considering AMD or Intel to be viable competitors in the ARMs race?
brucethemoose - Wednesday, January 13, 2021 - link
They will be competitors when they, at the very least, show off plans for an ARM design.
Tams80 - Friday, January 15, 2021 - link
It's CISC vs. RISC, not ARM vs. x86.
It's all up in the air right now. For consumers, there are no higher-TDP RISC CPUs out there. We don't know if they will scale well.
mode_13h - Friday, January 15, 2021 - link
Huh?
Anyway, if you think AMD and Intel couldn't design a competitive ARM core, I certainly don't join you in that belief. At a u-arch level, all of these cores have most of the same structures. It's not simply a matter of slapping on a different frontend, but it's also not like they'd be starting from absolute zero.
Spunjji - Friday, January 15, 2021 - link
We do, though. M1 is already competitive with other designs in its current ~15W TDP space, and a simple doubling or tripling of the large cores would make it equally competitive with designs in the 30-65W TDP space (albeit at a lower power, in small part thanks to that 5nm process).
ikjadoon - Sunday, February 21, 2021 - link
> For consumers, there are no higher TDP RISC CPUs out there.
Because consumers are always the *last* to see bleeding-edge performance at higher TDPs.
The A64FX, Graviton2 (or any Neoverse design), etc. The server space has always been first in cutting-edge technology at high TDP and RISC vs CISC is no different.
Less arrogant and more matter-of-fact: newsflash, the consumer world doesn't like 40W+ TDP systems anymore. In fact, <20W is where all the innovation happens.
dudedud - Wednesday, January 13, 2021 - link
When was Nuvia expected to show their CPU?
jeremyshaw - Wednesday, January 13, 2021 - link
My guess is they still want to be in play w.r.t. datacenters (never were) and 5G basestations/networks (they are, but need Intel or other processors to handle a lot of the dataflow and processing their chips cannot).
However, Centriq's demise saw the end of their old custom uarch team (Qualcomm used to design its own ARM uarch, before just customizing Cortex-A IP). So I feel this is more of a realization that they need that team back, in order to compete in those spaces.
mode_13h - Wednesday, January 13, 2021 - link
Clearly, Qualcomm's most valuable contribution to the relationship is going to be its legal team.
mode_13h - Wednesday, January 13, 2021 - link
Probably a close second is their name. Nuvia and Nvidia are way too similar, for the longer term.
linuxgeex - Wednesday, January 13, 2021 - link
Jim wasn't fired. He left earlier than planned, for "personal reasons".
Jim likes to move the needle. Intel was very much mired in pride and was insistent on finishing its existing projects, even if those projects were going to be so late to market that they were obsolete on launch. Jim couldn't move the needle under those circumstances.
It doesn't matter how much talent Intel has if management won't kill projects that are failing.
Yojimbo - Wednesday, January 13, 2021 - link
This purchase probably assures Qualcomm's ability to make SoCs if NVIDIA doesn't buy ARM as much or more than if NVIDIA does. If NVIDIA doesn't buy ARM, there isn't going to be so much R&D pumped into ARM to create the cores as has been done the past few years. It's Softbank that pumped that R&D into ARM, and it's Softbank that is now looking to unload ARM. If ARM is independent, it won't have the deep pockets to do it, and Softbank apparently is not happy with the results it has been getting. What happens to ARM if NVIDIA doesn't buy it isn't going to be the same as what's happened to ARM the past 3 years.
andrewaggb - Wednesday, January 13, 2021 - link
Pretty happy about this. I feel like it's much more likely we'll see actual consumer products using NUVIA's tech if it's owned by Qualcomm.
Apple doesn't seem interested in selling their ARM chips to third parties, but I imagine Qualcomm would be happy to. Assuming NUVIA delivers on their promises, this is pretty good news, I think.
nico_mach - Thursday, January 14, 2021 - link
Interestingly, however, while they don't sell their CPU designs to third parties, they allowed this key contributor to be acquired by a rival. I wonder why? Is Qualcomm so bad and monopoly status so close that they decided to throw Qualcomm a bone? It's certainly not for lack of money. And they've always been careful defending their IP advantages until now.
While it looks like a great move for Qualcomm, they should wonder why Apple let it happen.
mode_13h - Thursday, January 14, 2021 - link
> they allowed this key contributor to be acquired by a rival, I wonder why?
They didn't "allow" this to happen. Apple is not God. They had two options: either play into extortion by their ex-employee and pay $Billions to buy back what they believe is rightfully theirs, or seek a remedy through the courts, which has been ongoing for many months. I suppose there's also a possibility that Apple will seek an injunction to block the acquisition from completing, while their current legal challenge plays out.
Anyone who bought Nuvia clearly knew they were going to have to fight off Apple's legal challenges, or broker some kind of licensing arrangement with them. It makes sense that Qualcomm would step up to the challenge, since their lawyers must know Apple's lawyers so well, by now.
Spunjji - Friday, January 15, 2021 - link
"since their lawyers must know Apple's lawyer so well, by now"
They're practically co-workers at this stage. Competing legal design teams 😅
Tams80 - Friday, January 15, 2021 - link
They don't own employees.
mode_13h - Friday, January 15, 2021 - link
> They don't own employees.
That's too simplistic. In most states, employees are under a non-compete agreement, preventing you from going to a competitor for usually about 2 years after you leave, although California doesn't allow them. Also, I think his defense is that it wasn't a competitor, since Apple didn't want to build server CPUs.
The next issue is the potential theft of IP. Most companies assert ownership of relevant IP you devise, while under their employment. Even if Apple didn't use it in a CPU, if they can prove that he used some IP he devised while at Apple, they can lay claim to it. This is what I understand they're suing him for.
Finally, companies tend to place their employees under non-solicit agreements, where you can't leave and then lure your ex-colleagues to join you. I don't know if he brought any of his team with him, but that would be another potential point of exposure.
mode_13h - Friday, January 15, 2021 - link
> In most states, employees are under a non-compete agreement
Oops, I didn't mean to imply it's automatic. It's something employers usually do. It's one of the documents they typically have you sign, when you're hired.
prisonerX - Saturday, January 16, 2021 - link
Everything you described relating to restricting ex-employees is illegal in CA. Apple's claims are merely an attempt to work around those laws. A key group of people left Apple and went to compete against them, and Apple is upset. Filing a suit was just a way of trying to sabotage and punish them. Qualcomm paid so much because they know the legal claims are trash and the team is pure gold.
mode_13h - Saturday, January 16, 2021 - link
> Everything you described relating to restricting ex-employees is illegal in CA.
Thanks for letting me know about non-solicit agreements, but I do find it a little hard to believe that a company couldn't assert ownership of CPU IP that a CPU designer originated while working for them. As long as they could prove that he devised it under their employment (which is probably the hard part), it seems like common sense that it would be theirs.
Now, I've heard of companies trying to assert ownership of all IP generated by employees, whether or not it was related to their business, and that would seem like a clear overreach.
eastcoast_pete - Wednesday, January 13, 2021 - link
Potentially good news! The more players in the CPU space, the merrier. The next big question is when QC/Nuvia will be able to show an actual in-silicon sample of their offering. Also, a key reason why there is even enough space for another ARM-derived server CPU is that Apple has, so far, not shown any interest in designing and selling large, many-core CPUs for third parties. That being said, does anyone here know what Apple runs its iCloud on? What OS and which CPUs do they use?
aryonoco - Wednesday, January 13, 2021 - link
I think Qualcomm actually got this done on the cheap. $1.4B to acquire this team seems very smart to me. But we don't know if NUVIA was running out of funding; perhaps selling right now was better than doing another round of capital raising.
I just hope Qualcomm sticks to the plan and gives this team the resources they need to bring their product to market. And I do hope they eventually use this to get back to the server and enterprise space.
zodiacfml - Wednesday, January 13, 2021 - link
Sounds like Qualcomm paid a lot just to get those few people working for them. I would do the same, though, as Qualcomm has to be able to supply Microsoft and other vendors with ARM SoCs capable of matching Apple's M1-like products.
mode_13h - Thursday, January 14, 2021 - link
> Qualcomm has to be able to supply Microsoft and other vendors the Arm SoCs capable of matching Apple's M1 like products.
This argument always seemed weird to me. As long as the performance differential isn't like orders of magnitude, I don't think it really matters if Apple has faster CPUs. People who buy Apple products do so for all sorts of warm-fuzzy reasons or because it's the only way to run the software they need, rather than on the basis of benchmarks and tech specs.
Otritus - Tuesday, January 19, 2021 - link
I know a lot of people with 2012 Apple machines that then switched to Windows for better performance. If Windows machines lose their performance advantage (which they kind of have), then any enthusiast who prefers macOS is switching back to Apple. Furthermore, if Apple puts more powerful GPUs in their machines, you may see game developers focus more on macOS, and naturally some gamers would follow.
mode_13h - Wednesday, January 20, 2021 - link
> I know a lot of people with 2012 apple machines that then switched to windows for better performance.
They ditched an expensive niche platform for a cheaper mainstream one. It's harder to go the other way.
> if apple puts more powerful gpus in their machines, you may see game developers focus more on mac os, and naturally some gamers would follow.
I don't see it. Game devs are going to use the same platform as the overwhelming majority of gamers, especially since Windows and XBox are cousins. Sure, a few fringe devs might use Macs, but probably not much more than before (and some of those only because Apple forces it, for iOS app development).
Macs will always be too expensive to make sense, as a gaming platform. Consoles and PCs will always have a significant advantage in perf/$ and the vast majority of the gamer market is at least somewhat price-sensitive.
Furthermore, GPUs are the main bottleneck for most games, and it's unlikely Apple can pull such a coup in GPU performance as they did with their CPU. GPUs aren't constrained by ISA or even API (as Mantle/DX12/Vulkan showed).
ikjadoon - Sunday, February 21, 2021 - link
As someone who's owned only Windows PCs for 15 years, I think I'll be moving to an Apple Silicon desktop in the next few years.
> They ditched an expensive niche platform for a cheaper mainstream one. It's harder to go the other way.
Absolutely incorrect. You seem misinformed on MacOS growth: 17% of the world's desktops & laptops run MacOS now and it's been growing 20% each year for a decade now. And that was *before* M1 / Apple Silicon.
https://gs.statcounter.com/os-market-share/desktop...
Frenetic Pony - Wednesday, January 13, 2021 - link
A small note: Qualcomm's old custom CPU cores were more power-efficient than ARM's in general, as detailed benchmarks bore out. But ARM clocked their CPUs higher, increasing power drain and lowering battery life, but impressing in benchmarks - benchmarks Qualcomm investors used to justify cutting the CPU division, despite its superior technology. Because, well, that's what happens when investors think they understand engineering.
Qualcomm needs to adjust its own attitude towards know-it-all investors as much as it needs more differentiated products. Or else the same thing could happen all over again.
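To put the clocks-vs-power tradeoff above in concrete terms (a back-of-envelope sketch with hypothetical operating points, not figures from any of these chips): dynamic CPU power scales roughly as C·V²·f, and since reaching a higher frequency usually requires a higher voltage, power grows super-linearly with clock speed:

```python
def dynamic_power(freq_ghz: float, volts: float, cap: float = 1.0) -> float:
    """Classic dynamic-power approximation: P ~ C * V^2 * f (arbitrary units)."""
    return cap * volts ** 2 * freq_ghz

# Hypothetical operating points: a ~17% clock bump paired with a voltage bump.
base = dynamic_power(freq_ghz=2.4, volts=0.80)
boosted = dynamic_power(freq_ghz=2.8, volts=0.95)

print(f"{boosted / base:.2f}x the power for {2.8 / 2.4:.2f}x the clock")
```

With these made-up numbers, a ~17% clock increase costs ~65% more power, which is the shape of the tradeoff being described: you can win benchmarks with higher clocks while losing badly on battery life.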
serendip - Thursday, January 14, 2021 - link
The Firestorm cores in the M1 clock higher and use more power than Cortex-A78 and X1. That's how you get high performance on devices like laptops, where low power drain isn't as much of a concern as on phones. I have a feeling a Qualcomm-Nuvia chip could end up using similar M1 concepts, like a wide and deep pipeline and a fast memory controller, to beat x86 chips.
ikjadoon - Sunday, February 21, 2021 - link
> The Firestorm cores on the M1 clock higher and use more power than Cortex A-78 and X1.
Firestorm consumes *less* energy than X1, while being much faster. The power drain is significantly in Apple's favor.
https://www.anandtech.com/show/16463/snapdragon-88...
SPECint2006:
Firestorm: 63.34 points, 8941 joules consumed
SD888 X1: 41.30 points, 9621 joules consumed (ouch)
SD888 A78: 30.32 points, 7258 joules consumed (double-ouch)
SPECfp2006_C/C++:
Firestorm: 81.23 points, 4700 joules consumed
SD888 X1: 59.29 points, 4972 joules consumed (triple-ouch)
SD888 A78: 46.65 points, 3868 joules consumed (have mercy)
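One way to read those numbers is performance per unit energy (SPEC points per kilojoule, higher is better). A quick back-of-envelope script over the figures quoted above:

```python
def pts_per_kj(score: float, joules: float) -> float:
    """Efficiency: SPEC points per kilojoule of energy consumed (higher is better)."""
    return score / (joules / 1000.0)

# (points, joules) pairs, taken from the figures quoted above.
results = {
    "SPECint2006": {
        "Firestorm": (63.34, 8941),
        "SD888 X1": (41.30, 9621),
        "SD888 A78": (30.32, 7258),
    },
    "SPECfp2006_C/C++": {
        "Firestorm": (81.23, 4700),
        "SD888 X1": (59.29, 4972),
        "SD888 A78": (46.65, 3868),
    },
}

for suite, cores in results.items():
    print(suite)
    for core, (score, joules) in cores.items():
        print(f"  {core:>10}: {pts_per_kj(score, joules):.2f} pts/kJ")
```

By this metric, Firestorm lands around 7.1 pts/kJ on SPECint2006 versus roughly 4.3 for the X1 - not just faster, but substantially more efficient. (Interestingly, the A78 edges out the X1 on the fp side.)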
NUVIA's Phoenix cores are claimed to have higher efficiency than A13, so I hope to see it ship in consumer laptops sooner than later. :(
Andrei Frumusanu - Thursday, January 14, 2021 - link
This is just wrong. Qualcomm stopped using their own cores in mobile because they were less efficient than Cortex cores - Qualcomm openly admitted to this fact.
eastcoast_pete - Thursday, January 14, 2021 - link
From what I remember, their custom cores had better floating-point performance, but that didn't help much with most everyday tasks. Still, it would be nice if another custom-core player joined the ARM space in mobile, too.
This wasn't precisely true before ARMv8, but Qualcomm never really nailed the transition. The only released product was the Snapdragon 820, which outperformed the A57 but not even the A72, IIRC.
mode_13h - Thursday, January 14, 2021 - link
My sense is that they scrapped their proprietary cores for much the same reasons as Samsung: ARM was improving so fast that (unlike Apple) they were unwilling to put the kind of resources into their design teams that would've been needed to stay ahead. They saw the writing on the wall and decided to cut their losses.
The next question I would've had is about GPUs, but the Nuvia purchase signifies that Qualcomm still sees value in developing its own proprietary IP blocks.
Silver5urfer - Friday, January 15, 2021 - link
They put all that into Centriq. They put more effort into that CPU, which was supposed to spearhead the ARM server DC market - as AT proposed, and as ARM fans here keep claiming that x86 is dead and all.
Guess what? It was closed down. Qualcomm, the corp which pushes R&D for ROI, shut down its complete custom server design, and Anand Chandrasekher, who was the lead, also left the company.
benedict - Thursday, January 14, 2021 - link
There's one thing that bothers me. If Nuvia's greatest asset is its talented people, what's stopping these people from leaving the company once the takeover is done and founding another new company?
mode_13h - Thursday, January 14, 2021 - link
"Golden handcuffs". Knowing this, the company doing the acquisition typically places conditions on the stock options of the company being acquired, in order to keep key employees around for a few years. So, they can certainly leave early, but they won't get their big payday.
Ultimately, if they're in California, nothing is preventing them from going to a competitor (or creating one), but they can't take any IP with them (as Apple alleges they did when founding Nuvia - see my above comment about Qualcomm's legal team).
nico_mach - Thursday, January 14, 2021 - link
Since the company was so cheap and so important, why didn't Apple acquire them? I have to think that Apple already has the key people, or has another path charted out to stay ahead of Qualcomm/Nuvia. Letting Nuvia go doesn't make much sense otherwise - though of course Apple and Tim Cook aren't perfect, either. While they often blunder on certain hardware (input devices come to mind), I can't imagine Tim blundering on a key piece of IP that represents a huge advantage for multiple years, when all he has to do is write a tiny check (tiny for Apple).
Raqia - Thursday, January 14, 2021 - link
GW III was personally sued by Apple:
https://www.bizjournals.com/sanjose/news/2020/02/1...
He stated that he left due to Apple's unwillingness to allow him to do a server oriented part:
"In his lawsuit, Williams said that he raised the prospect of Apple doing server chips as far back as 2010 and the company rejected the idea then and subsequently. He said his co-founders recruited him to start Nuvia with them and that he avoided being involved with the company until after he left Apple."
The issue may not be in the specific direction of this initiative so much as the general culture of Apple given his willingness to be bought out by Qualcomm. I doubt he'd be rebuffed by Qualcomm if he leads equally ambitious initiatives in the HPC datacenter direction even if the first parts they are collaborating on are more aligned with mobile. With Qualcomm, he now has access to leading edge fabs, world class support, and will likely have a lot more leeway to steer hardware initiatives while surrounded by other top talent.
I suspect Qualcomm has already been partnering with NUVIA on a project close to tapeout (maybe a laptop oriented 9cx SoC?) and an agreement was reached because of the mutually compatible feelings over the collaboration before another round of fund raising...
mode_13h - Thursday, January 14, 2021 - link
Like Raqia said, and as I just posted above, Apple is already seeking a legal remedy to this.
Another point is that if Apple paid $Billions to acquire IP they think was already theirs, it would set a dangerous precedent for other Apple employees to leave and start businesses with Apple IP that the company would be forced to re-acquire.
markiz - Tuesday, January 26, 2021 - link
But they would have stopped a competitor in its tracks, and would still retain the option to claim it was theirs in the first place?
Plus, it's not like you HAVE to follow internal precedents.
It looks as if they just said that they already have something in the pipeline to be ahead anyway.
Tams80 - Friday, January 15, 2021 - link
Apple aren't competing with others much though. They make plenty of money selling to their niche ecosystem. So, as long as others aren't orders of magnitude better (hence enticing those in their ecosystem to leave), they don't have much to worry about.
mode_13h - Friday, January 15, 2021 - link
> as long as others aren't orders of magnitude better..., they don't have much to worry about.
I don't know if they're exactly worried, but they're certainly suing him and could even seek an injunction to stop the acquisition.
Spunjji - Thursday, January 14, 2021 - link
Interesting! Not exactly great from a competition perspective, but probably a very good thing for the Nuvia tech itself to be attached to a company that will have no trouble at all bringing it to market.
lmcd - Thursday, January 14, 2021 - link
Agreeing with the implicit part of your comment. If nothing else, it gets Nuvia access to a vital asset no one has mentioned: wafer agreements!
Spunjji - Friday, January 15, 2021 - link
Exactly that - plus all the other stuff further down the line once they actually have a product fabbed, like developer and customer relations. A great product is no good if you can't persuade anyone to implement it in an OEM design!
mode_13h - Thursday, January 14, 2021 - link
I think it's a positive development, especially since Apple's legal challenges might've rendered Nuvia too toxic for anyone else.
Seriously, who else is doing custom ARM cores at this point? They've been dropping like flies! I'm gladdened by the prospect that there will be at least one set of SoCs on the market with a non-Cortex ARM core (and I'm not counting Apple's, since those will surely remain locked in their walled garden). Especially now that Nvidia owns ARM.
serendip - Thursday, January 14, 2021 - link
Apple did it because they control most of their own supply chain and they can set the prices accordingly. A chip vendor like Qualcomm can't sell many expensive high-end ARM chips to customers who won't take the risk of using those chips. Using ARM Cortex IP ends up being the lower-cost, lower-risk option for everyone from Qualcomm to Samsung to Huawei. Heck, even Ampere and Amazon use ARM cores because that's the lower-cost and fastest method to get to market.
Apple having its own CPU design arm allows it to undercut Intel pricing while prioritizing certain features for macOS. Other ARM vendors have to focus on more general-purpose platforms like Android and Windows.
Spunjji - Friday, January 15, 2021 - link
That's definitely one way to look at it! I'm mainly apprehensive because of how much of a lock Qualcomm already have on the ARM SoC market - if the Nuvia IP is as good as they claim, then the remaining players may have trouble keeping up in anything other than a second-tier role.
I still have my fingers crossed that the Nvidia/ARM deal gets knocked down; if it doesn't, then I think I'll probably hew closer to your take on this.
Meteor2 - Sunday, January 17, 2021 - link
Fujitsu.
mode_13h - Monday, January 18, 2021 - link
I don't count them because, as far as I can tell, their cores were designed only for use in their HPC systems. That would basically put them in the same camp as Apple.
I also think they took this approach more for political reasons than purely technical or economic ones, not wanting to be dependent on providers in any external country for the IP of their critical HPC infrastructure.
Meteor2 - Monday, January 18, 2021 - link
They're custom ARM cores. You didn't specify in which class.Fujitsu have produced the most power-efficient general purpose CPU-based HPC system to date; I would assume they chose the ARM ISA in order to realise that.
mode_13h - Wednesday, January 20, 2021 - link
> They're custom ARM cores. You didn't specify in which class.
You're taking my point out of context. My point was that it's good for the ARM ecosystem to have the option of cores (i.e. available for use in products that you can actually buy) designed by more than just ARM, itself. The fact that some proprietary cores exist somewhere in the world that almost nobody can use is of no practical benefit to the ecosystem.
By contrast, in a world where all ARM cores (i.e. that we can actually use) were ARM-designed, well, that would be a bit like living in an x86 world without AMD (or VIA).
darkich - Thursday, January 14, 2021 - link
THE BIGGEST BARGAIN IN TECH HISTORY!
EthiaW - Thursday, January 14, 2021 - link
Qualcomm has a bad reputation for not sticking with its projects. A lot of ambitious projects were dropped even before their first iteration. The 8cx family seems to be on this track as well, as they pretend to have upgraded it in the so-called "gen2".
vol.2 - Saturday, January 16, 2021 - link
Wait, is Nuvia making ARM-based designs, or is it a totally different architecture altogether?
Meteor2 - Monday, January 18, 2021 - link
People are assuming ARM, but Nuvia never specified anything. It sounded like they were rolling their own.
mode_13h - Wednesday, January 20, 2021 - link
I can't quote where it was ever publicly stated, but I think it's a generally-established fact that they're ARM cores.
webdoctors - Monday, January 18, 2021 - link
How do they compare to Ampere Computing? They're doing ARM server CPUs too, right? Seems weird; QCOM just closed their CPU division, so why not just hire a few folks rather than buy a whole new company after shutting down their old one? Smells of really bad management.
mode_13h - Wednesday, January 20, 2021 - link
> How do they compare to Ampere computing?
We don't know because Nuvia hasn't reached the stage of a working server CPU that can be tested.
> They're doing ARM server CPUs too right? Seems wierd, QCOM just closed their CPU division
Qualcomm specifically mentioned using them in just about everything other than servers, though. So, it doesn't necessarily mean Qualcomm is getting back into the server CPU game, or maybe they're just trying not to show too many of their cards.
As for designing their own cores, who can say why Qualcomm changed their mind, but maybe they're looking at the current competitive landscape and feel this would be a worthwhile way to differentiate their products. You also have to consider that Nuvia might have more talent than Qualcomm's old Kryo team, which can change the calculus.
> why not just hire a few folks rather than buy a whole new company and shutdown their old one. Smells of really bad management.
Because the senior talent are Nuvia's founders. They have a fiduciary responsibility to their investors to try and take Nuvia to a liquidity event, so long as that's a viable option. And they'll get paid even more by selling the company and cashing in their stock options than anything Qualcomm would pay them otherwise.