Apple could probably Boot Camp ARM-based Win 10, but I doubt they can get MS to bother somehow getting x86 Windows to work on it. They'd have to make their own program, or some 3rd-party dev does it. I don't see any kind of revival of Virtual PC for Mac, but you never know.
I think the main issue is one of performance. The Apple chips are the fastest ARM devices around - it would be awkward if a Boot Camp installation were faster than any native Windows-on-ARM device on sale. Not saying it won't happen, but it might take a little time.
I would presume that Rosetta 2 will be part of the virtualisation platform as well. Boot an x86-64 VM within it, and it will use the dynamic recompiling aspects to run it on the ARM hardware. Presumably with recompilation caches subsequent uses of the VM would get faster as well.
But on the other hand, if Windows was working on this platform already, surely they would have shown this off? Instead we got Linux and virtualised server demos showing interoperability with the host Mac, which is nice, but maybe not all that everyone wants.
I expect Windows on ARM may become a good fit. Indeed it could be the boost that this platform needs to actually take off - so I expect that Microsoft will be working with Apple on this.
They could do that, but then people are unlikely to buy a desktop Mac for anything other than developing apps for iPhones and iPads. No phone-based ARM SoC can even get close to the power of a discrete GPU, and no one else on the market has anywhere near the patent library or R&D of AMD or Nvidia in that arena. Apple might pull off switching to ARM CPUs, but if they try to squeeze in a GPU pulled from any of their other designs, they're not going to be scaling it up to a useful level for animators, VR, or heavy gaming. The lack of driver support for modern GPUs on macOS is already a sticking point.
I don't know yet, but I expect them to focus primarily on features that lock their ecosystem together and wall out the rest of the world as much as possible. They seem fine with not having performance leadership if it means you can't put other people's software or hardware in your Mac, as long as it keeps you locked in their garden.
Apple knows more about what they have planned - they may already have an in-house designed GPU that will allow them to dump both Intel and AMD. To say that a switch of architecture will render Macs nothing more than dev platforms for iPhones and iPads is premature.
I wasn't aware that Apple was a viable platform for VR and heavy gaming. The lack of drivers in MacOS would not be an issue when Apple makes the hardware, the software, the OS and everything else in the Apple Ecosystem - So I would imagine that Apple will fully support Apple.
Apple has been walling off their garden for their entire existence and I would not expect that to change. I would expect Apple to start looking to purchase a display company... The dev tools will allow fat binaries that contain both x86 and ARM programs... so I'm not sure how that would change when they fully transition to their in-house ARM derivative. Devs would be using the exact same dev tools and would only need to compile either fat binaries or pure ARM binaries... nothing changes.
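For anyone who hasn't seen what a fat binary looks like in practice, here's a minimal sketch - assuming macOS with the Xcode command line tools installed; the file and binary names are just made up for illustration - of how a single clang invocation produces a universal binary with both slices:

```c
/* main.c - toy example of a universal ("fat") binary source.
 *
 * Build both slices in one step (macOS, Xcode command line tools):
 *   clang -arch x86_64 -arch arm64 main.c -o hello
 * Inspect which slices the binary contains:
 *   lipo -archs hello        # prints: x86_64 arm64
 */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__)
    puts("Running the x86-64 slice");      /* chosen on Intel Macs */
#elif defined(__arm64__) || defined(__aarch64__)
    puts("Running the arm64 slice");       /* chosen on Apple Silicon */
#else
    puts("Running on some other architecture");
#endif
    return 0;
}
```

The loader picks the matching slice at launch, which is why nothing really changes for devs beyond ticking the extra architecture box.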
They don't even have a truly in-house GPU design in their mobile devices yet. They tried to run off with Imagination's IP and nearly got away with it - now they're back to a licencing agreement with them.
I can't see why they wouldn't continue to use AMD GPUs in their more powerful devices, at least in the near term. There are already Arm-based dev systems out there with PCIe slots and AMD GPUs in them. Using their own GPU makes sense for the lower-end MacBook devices, though.
It would make sense to use AMD GPUs if they can't scale up Imagination's IP for higher-TDP designs more effectively than AMD can. It remains to be seen.
Apple likely requires a licensing agreement with Imagination based on products still being sold that are not yet on Apple GPU. I would expect Imagination IP to be phased out over time if any still remains.
I think MS could do this with the Windows 10 on ARM code base that already exists. Not sure how heavy a lift it would be, but at least there's a starting point.
MS already has Windows on ARM, but they build it per device. So if MS decided to support Apple Silicon, they would either do a separate build for each Apple device, or a single build with support for all Apple devices. But in both cases they must have Apple's blessing first - Apple must also want this to happen - and they need device-specific drivers, like how it works with current Boot Camp Windows, where Apple provides the necessary drivers.
On the other hand, if Apple decides not to support Windows, the only other method will be virtualisation, which according to Apple is working very well.
But after seeing the current dev kit with its bootloader and the ability to boot other non-signed OSes like Linux or even Apple's own macOS (modded), Apple might actually be working with MS to support Apple Silicon. Unless they only provide this non-signed feature on the dev kit, and consumers must find a signed OS first - which will be hard for Linux.
The Linux situation on Apple Silicon is still unclear, as in all cases Apple must provide the drivers too; it's already tricky to install Ubuntu, for example, on current Intel systems. I guess things will be harder with Apple Silicon.
Unclear. AMD is known to licence their IP fairly liberally, so we may see an AMD GPU with ARM CPU (we're expecting this combination in Samsung's Exynos 1000).
Apple is already paying the 3rd major foundational graphics vendor (that survived, anyway), Imagination, for licensing relevant IP and (presumably) patents. Even if they switch to AMD, remember that Intel has long paid Nvidia and then AMD graphics IP/patent fees and royalties. None of that means the Intel UHD 620 has any practical relation to an AMD Radeon.
No, but every discrete GPU used in a Mac product in the last, what, 5 years (or more) has been an AMD GPU. There is only so much GPU horsepower you can put inside a single-chip solution - due to things like thermals, memory bandwidth, etc. I would not be surprised to see Apple attaching discrete AMD GPU products to their ARM chips, at least at first, but because they are Apple, I would also not be surprised to see them eventually build their own discrete GPU.
I expect them to keep AMD for a while - makes little sense to split focus between CPU and GPU - but at some point they will have a full replacement for AMD... Their on-SoC GPUs are not terrible - light years ahead of the general ARM catalog.
Yeah, I slip up on that one all the time, but hey, it's not like they put a big old AMD sticker on them like you see on Windows laptops, so you could say it's a discreet discrete GPU.
They could easily still use AMD as a GPU supplier, if it made sense performance-wise. Look at the Nintendo Switch with an ARM CPU and Nvidia GPU, or the Nintendo GameCube with a PPC CPU and an ATI (now AMD) GPU. It all depends. GPU performance is not that bad in phones these days; you can play quite demanding games on a very competent screen, resolution-wise, on a very small budget. The more I think about it, the more of a monopoly I smell. Corruption... Time to shake things up, I guess!
While that's pretty up in the air right now (AMD GPUs with Arm-based CPUs), I would actually be really curious to see just how well Apple's GPU designs scale up once they approach dGPU levels of transistor count and get fast VRAM to work with. Let's not forget that while the A chip line has shown great progress in CPU power, progress on GPU performance has been even greater. I am not an Apple fan or user, but I'd love to see them light a bright fire under the seats of the GPU duopoly (I'll add Intel to AMD and NVIDIA if they ever release anything worthwhile).
Even if they made a GPU that performed at 3090 levels for $100 and sipped 50 watts at full bore, they would only use it in their own systems and never allow it to be used outside the Apple ecosystem. 30 years of watching these guys - they play by the old-school monopoly rules. If they could own the entire supply chain, the entire retail chain, and the entire data services chain (your phone provider), they would.
Since you have been watching Apple for 30 years, when, in your opinion, they became a monopoly and in which product category or categories?
Apple is not obliged to sell or license their CPUs or GPUs to anyone. It’s not “playing by old school monopoly rules” when they don’t.
Do you really think that Apple should be obliged to supply their own chips to the likes of Huawei, presumably without profit even?
Owning the design of your core HW components and having your own OS etc. is not a crime either.
But of course you know all this.
(The bitterness towards Apple from people who have never owned, and never will own, a single Apple product just never ceases to amaze me. None of this affects your life in any way whatsoever.)
Definition of Monopoly: Exclusive possession or control of something.
So yes they are a monopoly.
Monopolies are not inherently illegal (in the US). Abuse of a monopoly is. If, for example, Apple said any apps sold in their App Store cannot be sold anywhere else (i.e. Google Play), that would be illegal abuse of a monopoly. Just ask Amazon - that's exactly what they did with their Kindle store, and they lost. Amazon still has a virtual monopoly on the ebook market.
That's not a monopoly, that's exclusivity, and while they might be similar in a few aspects, they're not the same thing. And to return to your example: it's up to the app developers to choose whether they want to sell on Apple-only platforms or ditch Apple and sell their apps on other platforms. Considering that the developers still have the freedom of choice regarding their apps, this is not a monopoly.
It’s called vertical integration. People calling Apple a monopoly don’t call Ford or GM monopolies for designing and making their own engines and car parts. Apple has a monopoly on doing well all the things needed for a phone or computer to work best, but anybody else in the industry has the opportunity to do the same. What are you gonna do, demand they stop making better products so someone can catch up? You’re also neglecting the fact that they ditched Intel to do this. Intel's chips alone made up about 25% of the cost of a cheap MacBook and sometimes probably as much as 40% of the bill of materials. Apple can make a competitive chip for 25-50 dollars, as opposed to over $250. Intel was the more monopolistic company in this relationship. Apple could drop Mac prices by $100 and still make $100 more profit. They should have done this years ago. The benefits were incredibly obvious since 2016 at the latest.
Undoubtedly, the end result will be that AMD GPUs are out at Apple.
But integrated graphics, even on 5nm, can only do so much. I presume the A14Z on 5nm will have double the GPU of the A12Z at least, so the silicon will be very powerful - but it won't be >5 TFLOPS powerful.
So I guess AMD GPUs in laptops are dead by next year, but AMD GPUs in iMac and Mac Pro might last a couple more years - it depends on whether or not Apple are going to create their own discrete GPU line using their IP or not.
Well, for the next year or so, I don't think Apple's iGPU will replace laptop dGPUs just yet. It'll be a beast in its range, no doubt, but there's only so much you can do with a limited power draw. The MBP 16 refresh with an Apple ARM CPU will surely still have an AMD dGPU option.
Welp, now that they've made a 10.4 TFLOPS integrated GPU, and their 13.7 TFLOPS laptop iGPU is coming this November, what do you think? If the top-end Mac Pro chip ends up being 4x M2 Max chips as rumored, that'll have 54.7 FP32 TFLOPS, about as much as the 4x Radeon Pro VII GPUs that the current Mac Pro has.
Apple may even create a dedicated GPU die with a high bandwidth connection to their SoC so that they can create products with a powerful GPU in addition to products using their integrated GPU.
I'm not sure how much they would get out of doing that vs. just using AMD dedicated GPUs so it's still an open question.
Consoles are SoCs also and we will soon have 10+TFlops GPUs on 7nm this year. Of course I doubt Apple would want such a power hungry GPU on their SoCs for MacBook laptops but I don't think it would be impossible to do for a Mac Pro or iMac.
Will probably be phased out at the same time the last Intel Mac ships. Not sure it makes sense at this moment to try and replace both the CPU and GPU - so maybe they will stick around a little longer - but then again, it's not like Apple is cash-constrained...
Apple will continue to build chips with integrated graphics, just like all their A-series chips have had. But these will function like Intel's iGPUs on current chips - meaning on the low-end products they will be the only GPU present, but for higher-end products they will be supplemented by discrete GPUs (from AMD, or possibly someday again Nvidia). There is nothing in this week's announcements to suggest Apple is going to build high-end GPUs any time soon. Nor any reason they can't use AMD (or Nvidia) discrete GPUs with their own Apple ARM chips (they just need to work with them on drivers, and my understanding is they basically write their own AMD drivers now anyway).
I don't care for Apple and its walled garden, but it's gonna be interesting to see some *real* benchmarks once these things are released. Windows *does* run on ARM, and if Apple can show that its upcoming chips have a clear performance/power advantage, we might see some pressure on Intel from Microsoft too. Interesting times!
This is going to be huge. Neither Intel nor AMD is able to keep up any more. The ISA itself, on which they have built their whole business model, is too old. Either they go full-on IA-64 and risk it all, or ARM wins by a margin. I predict that most new PCs sold five years from now will be ARM-based. AMD might still have a chance for now, with their GPUs, but still, we can see similar things with Imagination and ARM's own GPU tech. You can drive quite a demanding scene with a veeeery small power budget... I predict my next stationary build will be an ARM system with an unknown GPU brand. And it will be 4 times as powerful as the one I own today.
I'm convinced that both AMD and nVidia GPU IP isn't going to vanish alongside x86 - nVidia has already demonstrated SBCs with both ARM cores and nV GPU IP (Jetson series). It would not be a stretch to imagine higher performance ARM CPU integration along these lines.
GPU-wise, they still have a chance, AMD and Nvidia. The weakest of the bunch pretty quickly turned out to be Intel here. But as said, you still have Imagination's tech, which isn't that bad on a 7W phone, pushing FPS shooters on 3.5k screens...
Well, since the first of the new Intel Xe line is only available as a discrete card and is identical to the 96EU iGPU in the imminent Tiger Lake... it's pretty much looking like Xe LP is besting the Vega in the latest AMD APUs. Discrete graphics and compute cards with Xe HP are to be launched this year.
Nvidia is the king, no doubt - installed base of a tech they invented (GPU compute), and ecosystem they have been investing heavily in for years (hardware, software and expertise).
Intel is not aiming their Xe HP at Turing - they are aiming at Ampere - and with oneAPI and their already dominant position in the data center, they have the resources to dethrone Nvidia. Won't happen overnight, but it will happen. AMD has neither the tech nor the resources to dethrone Nvidia - and 3rd place will hasten their departure from GPUs to focus on CPUs - or may hasten a merger...
Not only demonstrated, but shipping for several years, including in the Nintendo Switch - Tegra. Jetson is more for machine vision and machine learning - and also contains ARM cores and NV graphics.
"Full on IA-64" as another atempt at a new ISA if I didn't make myself clear. They have to get the balls Apple have to succeed. There is no turn back to theese instruction sets from 1998, all of witch is done is modern as hell-approach, get used to it. Like Apple. Now, I wrote it out EILI5-Style
Lol, dude, you really don't know what you're talking about. BTW ARM is from 1985.
> I predict that most of new PC's sold from five years from now will be ARM-based.
Lol -- absolutely not. Sure, Apple machines will be but the vast majority of computers will still be x86. X86 has some complexity with the instruction encoding but honestly once you are beyond the front end there is not much difference between an ARM CPU or an X86 one.
It's in the front end we have all the problems. The A14 is an 8-wide design; Intel is far behind in any case. I see there are some more responses regarding the ISAs just down below. This isn't a competition in who's right or wrong, but I'll gladly eat my hat if by then no company has released a complete, Windows-compatible platform that kicks ass. All hail the new PC master race, ARM. Took them a while, but we are there now, both hardware- and software-wise. We don't have the same troubles with compiling as we had 15 years ago; a lot has happened, and you can see from the companies already on the Mac hype train that there is good and functioning software for Apple Silicon. The ARM X1 seems to fare quite well in the same league - worse than the A14, better than anything yet released on x86.
No, but I just presume they will when looking forward. The X1 could be seen as an 8-wide CPU in many use cases: "The fetch bandwidth out of the L1I has been bumped up 25% from 4 to 5 instructions with a corresponding increase in the decoder bandwidth, and the fetch and rename bandwidth out of the Mop-cache has seen a 33% increase from 6 to 8 instructions per cycle. In effect, the core can act as a 8-wide machine as long as it’s hitting the Mop cache." The A12 and A13 have been known for being extremely wide cores, the latter 7-wide - don't expect them to lag behind in this sense. Only time will tell, I guess!
" Intel's product marketing and industry engagement efforts were substantial and achieved design wins with the majority of enterprise server OEM's including those based on RISC processors at the time, industry analysts predicted that IA-64 would dominate in servers, workstations, and high-end desktops, and eventually supplant RISC and complex instruction set computing (CISC) architectures for all general-purpose applications "
" Beyond Kittson, there will be no more chips coming from the Itanium family, an Intel spokesman said in an email. That ends a tumultuous, 16-year journey for Itanium, which Intel once envisioned as a replacement for x86 chips in 64-bit PCs and servers." " Intel hedged its bets and planned for Itanium 64—also called IA-64—to ultimately go down the stack from servers to PCs. But that never happened, and the market shifted quickly after AMD introduced the first 64-bit x86 server chips in 2003. That gave AMD a competitive edge over Intel, which still was offering 32-bit x86 chips. " " The transition disrupted Intel’s vision of Itanium as an architecture of the future for 64-bit servers and PCs. Instead, x86 chips started moving up the stack into more powerful servers. "
" When Intel launched its first Itanium processor in 2001, it had very high hopes: the 64-bit chip was supposed to do nothing less than kill off the x86 architecture that had dominated PCs for over two decades. Things didn't quite pan out that way, however, and Intel is officially calling it quits. " " The news marks the quiet end to a tumultuous saga. Itanium was supposed to represent a clean break from x86 that put Intel firmly into the 64-bit era. It was first intended for high-end servers and workstations, but it was eventually supposed to find its way into home PCs. Needless to say, that's not how it worked out. "
Right now Intel chips still easily outperform ARM at the higher tiers, yes ARM is far more power efficient. The reason ARM has gotten so much closer is Intel has been stuck for a long time on one node, but let's pretend they stay stuck and don't do anything about it. Five years from now ARM *might* be able to match their performance of both are running native code, but there is ZERO chance ARM would be anywhere close under emulation which would be required for the PC platform to migrate.
To be clear, I'd love to see the PC market evolve into something where we could see more competition on the CPU side, but we already had decades where x86 was clearly inferior (MIPS, SPARC, Alpha, etc.) and still had an iron grip on the PC market.
The GPU side is even less likely. IT (Imagination) has been making graphics chips for a very long time now. They've always done very well in lower-power situations, and they've never been able to scale to the high end. We've seen Intel throw billions at the graphics side and utterly fail too; everyone else that dominated the market in the early days is gone now. We didn't ask for a duopoly in the graphics segment - we got there because no one else could compete.
I seriously believe we are into a paradigm shift here. An older-gen ARM chip at 3W outperforms a 125W Intel chip single-core-wise. Intel is going to have to prove their point with 7nm or whatever. AMD is already on the same node and the performance isn't stellar there either if we compare to the performance of ARM in general. Graphics-wise, it might be as bad, I'm afraid - or not afraid. Would be fun to be able to play some VR games with decent performance soon...
>AMD is already on the same node and the performance isn't stellar there either if we compare to the performance of ARM in general. Graphics-wise, it might be as bad, I'm afraid - or not afraid. Would be fun to be able to play some VR games with decent performance soon...
This comment is so dumb I can't even tell if you are being serious or trolling. You seem to have this weird fantasy that arm is much better at EVERYTHING while ignoring context and what's actually out there.
For example, the current AMD RDNA 7nm chips are not even close to efficient. Nvidia's 12nm offerings are just as, if not more, efficient. This is partly due to the clocks being pushed too high, past the power-efficiency sweet spot, to get extra performance. Remember, this is hardware not designed for power saving.
I also find it hilarious that you think AMD and Nvidia will somehow be "replaced" in 4 years. There is nothing to suggest that Apple is able to develop anything on the GPU side that comes close. Remember, Nvidia is no Intel; they are a market leader and are still constantly innovating.
Have not owned an Apple product my whole life. What I'm seriously excited about is the raw performance at extremely low energy levels. :) Look at what the ARM X1 is doing - slap it on a motherboard and turn the frequencies up a notch. Cool it with a Noctua good for 200W. I'm just tired of the stagnation in the PC industry, where higher prices for the same performance have become very common. ARM and Imagination are very much welcome to fix this, and if Apple is first out in this shift, good for them. I welcome the new PC master race, developed for mobile. The Apple GPU seems to deliver 5.7 fps per watt in 3DMark according to this site; that's friggin' amazing if we could get it to scale with the power envelope. Would gladly buy a 150W GPU with that type of performance.
Your thinking is stuck in the "Motherboard" paradigm... I think Apple is continuing to move away from these dinosaurs. Apple's best selling professional computers are the MacBooks. If they can take the performance crown among all laptops, it's going to be very disruptive for the PC world.
Except AMD and Intel are hitting those performance levels by running at 4+ GHz on all cores. As Intel so kindly showed us over a decade ago, you don't get good performance/watt by chasing GHz. Yet we also see that most of the code people write for a CPU doesn't scale well past 4-6 cores, so you need high clock speeds and IPC to run many programs well on a desktop. Then we have the multitasking conundrum - not something your typical ARM SoC has to deal with.
I expect these ARM SoC designs to be excellent at low power and single task performance with solid design tricks to boost speeds on a couple cores for running a game/etc. But I don't expect them to manage heavy hitting CPU apps that hoover up all the available cores. Compression / decompression / crypto / animation / video transcoding / etc. At least not with current designs or while maintaining those power levels.
But in the desktop world or the gaming world very few of us are complaining about a lack of CPU performance. Doubling CPU performance would give most users very little. While doubling GPU performance is hugely beneficial. A major architecture change would also likely reap more utility long term, like some unified memory controller between the dGPU and the CPU.
If you look at the last time one of the desktop GPU makers actually gave mobile a shot - the Tegra X1 - they eviscerated the top IT part; it was years before the mobile market caught up.
On the CPU side it's a stretch to think ARM can actually compete, on the GPU side it's laughable.
A 10900k has 20mb of cache 10c/20thread @5.3 max turbo. Sure it's 125W, but that's for the whole thing. Similarly the 3950x is 72mb cache, 16c/32thread and 105W TDP.
Even if single core speeds are comparable, the number of cores, threads, clocks, cache etc aren't comparable at all. The processor bus, memory bus, pci-e, etc is also all way different.
20 megs of cache across 10 cores equals 2 MB per core - that's sweet...
Anandtech: "The amount of SRAM that Apple puts on the A13 is staggering, especially on the CPU side: We’re seeing 8MB on the (2) big cores, 4MB on the small cores, and 16MB on the SLC which can serve all IP blocks on the chip."
A13 isn't an ARM design. It uses the ARM ISA, but it's Apple's own design. There are a ton of mediocre, memory bandwidth-starved ARM cores out there with poor voltage regulation, coarse frequency adjustment, that burn hot when you push them for any length of time.
https://gadgetversus.com/processor/intel-core-i9-9... The A13 only has 2 large cores; in single-threaded Geekbench 5, the A13 comes out on top, at half the frequency and with a power draw that's just amazing. Let's throw in a couple more of those larger cores and see what they can do with some serious cooling.
Where did I mention Apple? I meant Windows PCs. Someone will pick up the X1 license and develop it further, smack it on a mobo, and voila - change of platform!
Lol. 5 years? Keep a note of this. In 5 years Apple will actually have lost PC market share. Trust me, this is about Apple owning everything in their stack, not about out-CPUing AMD or Intel, or out-GPUing AMD or Nvidia.
Not talking about Apple; I'm talking about you, me, HP, and everyone else putting little ARM chips in our beast PCs instead of Intel or AMD. This is going to pick up pace. If Apple is this sure about a change of platform, they ain't doing it for a measly 20% performance increase.
What the heck are you on about? Do you seriously think Apple is going to license out the tech and design which makes their ARM cpus so great in their iPads, iPhones etc. to other companies? Have you seen how badly Apple outperforms all the other ARM chips in the smartphone space? Qualcomm ARM chips look like garbage compared to Apple.
Unless other companies can quickly develop ARM designs that can catch up to what Apple is doing and where they are going, your suggestion that everyone is going to be putting ARM CPUs in their desktops in a few years makes no sense to plenty of users.
Apple obviously does NOT just use off the shelf ARM CPUs. It's going to continue to keep the in-house design + tech which makes their ARM CPUs so great to THEMSELVES and ONLY use them in their OWN devices.
I don't think any other company that supplies for the rest of the computer market will catch up to Apple in ARM CPU performance anytime soon.
"as they have accomplished some amazing things with their A-series silicon thus far" "the PC market is about to become fractured once again."
For the first line, can I know what an iPhone can do over my Android phone? I ask two things here. One: does it allow the user to use the filesystem without any fence, such as installing apps from outside? Nope. Two: is anything stopping direct application performance comparisons of a OnePlus 8 Pro vs an iPhone 11 Pro Max? None so far, except the uber-insane SPEC scores.
And for the second line, how is the PC getting fractured here, lol? Is M$ going to bet big on Apple A-series processors and make a Surface on them, or has ARM taken over the server/data center market? The numbers show AMD is rising in server market share, chipping away at Intel's, which sits over 95%; of the rest, AMD takes most and maybe 0.5% is from ARM, lol.
Qualcomm abandoned their Centriq processor and its whole IP, Samsung is shutting down its own custom ARM core effort, and Huawei got slapped by the US for its CCP bootlicking. Ampere's Altra is still about to show its potential, and AWS Graviton 2, according to STH and Phoronix, is no match for an EPYC 7742 or Xeon - where AMD trumps Xeon by a huge margin.
Now coming to the notebook and desktop OS market share from NetMarketShare: Windows is at 86.7%, Mac at 9.7%, Linux at 3.x%, Chrome OS at 0.4% (lol). By those numbers alone, Mac is not even used by many people across the world. How is the PC market, which is mostly Windows, going to be fractured "once again"? I would like to ask when it got fractured the first time, such that it's becoming fractured now again.
Now for the x86 innovation: AMD is spearheading new Zen 3 processors on TSMC N7P with an improved cache layout; then we have Intel's 10nm or whatever 7nm, along with DDR5 and PCIe Gen 5 coming in 2022, which is supposedly bringing massive improvements in memory alone; and MCM is getting more traction on the Intel side as well, with rumors of RKL having a Foveros-based 10nm die for the GPU and 14nm++ for the CPU with improved IPC.
And the biggest of all: choice. DIY systems or even DC machines - so many sockets, so many OEMs, so many vendors, plus Linux OS support with a massive stack of software, programming, and gaming as well, with next-gen consoles and AI. And in DIY we have the ability to choose a sound card, memory, mobo, LAN, GPU, CPU, cooling, OS, display, software, and a shitton of other things, along with affordability. How is this going to change for the worse for x86? ARM is always custom BS - no choice and closed like a drum, just like iPhones and locked-bootloader phones with so little wiggle room. Don't even start on the BGA BS that Apple does to their Macs and their users, with fully soldered, locked-down trash from super-expensive RAM and storage to the damned basic repairs like LCD cables (Louis Rossmann).
So how is this going to FRACTURE, lol? SPEC scores from Andrei, I guess. The uber-fast Kryptonian-class SPEC score of the A series - so fast that it travels at the speed of light and doesn't experience time, like a photon - but in reality nothing but a benchmark.
Quantumz0d said: "One, does it allow the usage of filesytem by user without any fence or such like installing apps outside. Nope."
Yup. Are you up to date? Since iOS 13, there is a Files browser with hierarchical files and folders. Any application can read them. Plug in an external USB drive, the files are visible as is, and any iOS application can list them. It is revolutionizing how I use my iPad. (I still get most work done on a laptop, but that is changing rapidly thanks to iOS 13.)
iOS 13 also allows connection to SMB shares. So I have turned on file sharing on my computer, connected to it from my iPad, used the iPad native Files app to browse the files on my computer, open them in a compatible iPad app (they were photos), edit them, and transfer them back to the iPad. Same goes for iPhone.
You better keep up, because that was back in iOS 13, and Apple just announced iOS 14 today.
Took over a decade for a POS OS to implement a file browser with hierarchical files and folders, so Apple sheep can brag about it. But hey, you still don’t get to see your own pictures that you transferred from your PC. You can see them in the Photos app but still have no control over them, such as move, rename, delete, or view their metadata. Isn’t it great?
Brag all you can now, because the day your government can no longer act like a gangster, affordable products from around the world will flood your market and consumers will push your Apple god aside.
I can't help but get a laugh from the fact that someone described being able to access the actual file system as "revolutionary". But then I see him mention it as "new" in iOS 13 (THIRTEEN!!!) and I cry a little.
Apple does nothing for *you*, Apple is only concerned about Apple. I could think of dozens of features apple still refuses to put on any of their devices because doing so might earn them a few pennies less or give the user some semblance of choice.
Apple isn't switching to ARM to give you guys performance or lower power consumption. They are doing it so *they* are the supplier and so *they* own and rule the entire stack.
I can still play games on my PC that I purchased decades ago. Apple removed the ability for 32-bit apps to function at all on their iOS devices. Virtualization exists; it's used extensively and quite effectively all over the industry. They *could* have allowed those apps to run in sandboxed virtual machines so I could still use the software I purchased. But nope. That didn't benefit them; it didn't earn them more hardware sales or new software fees from the App Store. So they just outright removed the apps and the ability to use them.
On my PC I have my older apps backed up, I can install them myself outside of some walled store. Support for x86 machines and software will be gone (along with the ability to run that software) by 2025-2027.
At the same time, there has been access to files stored on the phone (and in the cloud) via services such as Dropbox, Google Drive and Box for a long time in iOS. There's also been the app Documents.
I haven't suffered much from the lack of a file browser, but one can sure wonder why it took Apple so long to add it. It's there now at least. Apple sometimes does take their time adding some iOS features, but I think the OS has often offered most features I've needed via apps – for me it's geeky enough. :)
Apple sure is progressive with their OS and deprecating legacy stuff. But I'm not sure it's all for the worse, as you seem to imply. I agree Microsoft is doing a good job with backwards compatibility in Windows, but many also seem to think they will eventually need to start somewhat anew with Windows too.
Exactly, it's unbelievably dumb how some people wish for a more closed hardware space with little to no choice.
Personally I don't worry about it, because there's no chance that the translation layer offers enough performance for the decades long library of x86 software.
"the ramifications of Apple building the underlying hardware down to the SoC means that they can have the OS make full use of any special features that Apple bakes into their A-series SoCs. Idle power, ISPs, video encode/decode blocks, and neural networking inference are all subjects that are potentially on the table here."
To paraphrase a certain famous programmer: to run the spyware faster. The nanosecond time stamping from APFS and other innovations do create overhead.
Apple has been steadily adding metadata/spyware to the Mac and even special chips like T2. It's not content with Intel and AMD being the only ones with black boxes in place.
I'd say it's not only SoCs but also peripherals - Thunderbolt, too, hit its cost barrier when changing from copper to fibre for consumer-level hardware. The question is: what are we using the improved computing power for? Increased resolution for the same pictures we've already seen is not really a satisfying improvement for the materials and energy invested on mass markets, especially with the challenges in sight growing.
People think everything is about faster performance. Bring on the benchmarks.
Meanwhile, Apple has a recent history of dramatically slowing performance, such as with its awful APFS filesystem which is horrible with mechanical drives and slow with SSDs.
Apple is not in business to grant wishes to geeks, by coming out of a cloud with a wand to deliver faster performance. It will provide only as much performance as it feels necessary to provide and the vast majority of the focus will be on wringing every last bit of data/IP out of its users. That is the new model. You're not the customer. You are the fodder.
Seems like rather baseless comments you are making. APFS has been well received and is a great match for the needs of Apple's products. Also, while it is a modern file system designed to be optimized with SSDs, I can assure that it works perfectly well with spinning disks also.
"Seems like rather baseless comments you are making. APFS has been well received and is a great match for the needs of Apple's products. Also, while it is a modern file system designed to be optimized with SSDs, I can assure that it works perfectly well with spinning disks also."
Rubbish.
Look at the actual tests. It's substantially slower with SSDs and ridiculously slow with mechanical hard disks. It is, though, a dream for "forensics".
There are some tests such as the enumeration of an entire disk that will be slower on spinning disks in APFS because of the way meta data is stored. On HFS, all metadata for all files are blocked together. On APFS, the meta data is stored with the actual data files themselves. Yes, for this kind of test, there would be many more seeks, etc. and likewise slower performance. In practice, all of that data is indexed via spotlight and there is no real difference on actual routine usage.
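For the curious, the kind of enumeration test being described is easy to reproduce. Here's a rough, illustrative sketch in C (not any specific published benchmark; the file name is made up) that walks a tree and touches every entry's metadata - exactly the seek-heavy access pattern that scattered metadata punishes on spinning disks:

```c
/* enumerate.c - walk a directory tree and count entries, timing the run.
 * Each fts_read() returns an entry along with its stat metadata, so on a
 * spinning disk the run time is dominated by metadata seeks.
 *
 * Build: clang enumerate.c -o enumerate
 * Usage: ./enumerate /some/path
 */
#include <fts.h>
#include <stdio.h>
#include <time.h>

int main(int argc, char *argv[]) {
    if (argc < 2) {
        fprintf(stderr, "usage: %s <dir>\n", argv[0]);
        return 1;
    }
    char *paths[] = { argv[1], NULL };
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    /* FTS_PHYSICAL: don't follow symlinks; FTS_NOCHDIR: leave cwd alone. */
    FTS *fts = fts_open(paths, FTS_PHYSICAL | FTS_NOCHDIR, NULL);
    if (!fts) {
        perror("fts_open");
        return 1;
    }
    long entries = 0;
    for (FTSENT *ent = fts_read(fts); ent != NULL; ent = fts_read(fts))
        entries++;
    fts_close(fts);

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%ld entries in %.2f s\n", entries, secs);
    return 0;
}
```

Run it against the same tree on an HFS+ and an APFS volume and you'd see the difference being discussed; on routine, indexed usage it mostly washes out.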
I wonder how long before the current US administration picks up on that and beats Apple over the head with it (no matter that they're really a UK company bought by a Japanese investment fund and hived out to the Saudis).
"Curiously, the company has carefully avoided using the word “Arm” anywhere in their announcement"
Surprise! It's actually an array of 64 in-order MIPS cores per chiplet.
"Arm is on the precipice of announcing the Arm v9 ISA, which will bring several notable additions to the ISA such as Scalable Vector Extension 2 (SVE2)."
The dev kit is clearly ARMv8 though. I agree that this is critically important, as SVE2 as a guaranteed baseline would be *huge*.
I'm recalling how the first few Intel Macs were 32-bit only, replaced later in the same year by 64-bit Core 2 processors. Apple just missed being able to establish a 64-bit baseline back then, and those 32-bit x86 Mac users were the first ones left out in the cold by rising system requirements. This might be one of the best reasons to not be a first-adopter of ARM Macs.
Actually, the kernel itself could be run in 32-bit or 64-bit mode, which allowed those first Macs to remain useful up to and including Lion - by which point those first Macs would be nearing the end of their useful lives anyway, given they were limited in hardware power.
Obviously, any 64-bit-only apps wouldn't run on a Mac with a 32-bit-only processor (like a Core Solo/Duo), but amusingly, thanks to the way macOS is designed, you could run 64-bit apps with a 32-bit kernel (if you had a 64-bit processor like a Core 2 or better). Can't do that on Windows.
I waited for the 64-bit processors before buying my iMac, so I ran it for years with the 32-bit firmware/kernel and 64-bit applications. By the time the GPU started dying, Linux had gained support for booting a 64-bit kernel from 32-bit EFI (rather than relying on the Bootcamp BIOS CSM), and I ran it as a mostly-headless Linux machine for a while longer.
In 16-bit Windows 3.11 you could run 32-bit apps using Microsoft's Win32s extension (mainly meaning a flat memory model, not one split into 64 kB segments). Of course it required a 32-bit CPU (a 386 - but not the first version with those bugs).
Gotta wonder if that's why it's called "Apple Silicon" in the build process. It's very generic, and doesn't commit to an ISA - although it's confirmed it's still ARM as expected, not RISC-V or something exotic like that.
I wonder if the Universal 2 binary currently has: (1) Intel x86-64 (2) ARMv8.2 and (3) ARMv9 in it. (2) will be dropped prior to release. Maybe (3) will come in a couple of months.
Apple was early to 64-bit ARMv8, so I don't see why they wouldn't be early to ARMv9 as well.
But let's be honest, if ARMv9 silicon can run ARMv8.2 binaries, then there's no issue right now. Apple can add ARMv9 to universal binary when they need to and not before.
What I don't understand is why Apple refuses to use AMD CPUs. When IBM/Motorola dropped the ball with PPC, Apple ditched them and went for x86 Intel. Then, when Intel dropped the ball, they decided to switch... again.
If Apple just wanted out of Intel, they would have gone AMD. But the difference with this CPU transition, unlike PowerPC to Intel, is that this time Apple has achieved mastery of designing and building SoCs that are fast, power-efficient, and - unlike what AMD has - fully customized for Apple's needs. From integration with macOS/iOS to security/privacy and many other things, Apple can design no-compromise processors that do everything they need and nothing they don't. ARM can only offer what ARM designed for a more general market.
It's actually the exact opposite. Apple left PPC because IBM couldn't deliver a G5 at power/heat levels suitable for notebooks.
This would seem to be a move to more power efficiency, and of course, more control over their products. It makes sense; the iPhone/iPad/whatever isn't going anywhere, and they are all ARM, so really, why support two ecosystems when you can get it done with one?
This isn't about leaving Intel for anyone else. If it was, AMD would be a good choice, even this year. The Mac Pro could have been a 64 core EPYC machine with PCIe4, for example. Renoir could have brought 8C to Apple's 13" designs instead of flawed 4C IceLake.
It's about having the full platform, not just the CPU cores, available on the desktop platform.
Fabbing a ~150mm^2 5nm Mac ARM SoC will save Apple money in the long term compared to buying from either AMD or Intel. And they can include all their accelerators as well, bringing them to the Mac for the first time. They can have far deeper integration between their OS frameworks and their hardware as a result.
Ice Lake is not flawed - it works perfectly, and only a moron thinks that 8 cores in a lightweight machine is an advantage - all the while the GPU in that APU is getting owned hardcore by the follow-up to Ice Lake: Tiger Lake.
Almost like Renoir spent its die area on twice as many cores for nearly twice the aggregate performance. Tiger Lake will have the per-core advantage and IGP, though Intel IGP 3DMark scores haven't translated that well to real titles in the past. But for multicore workloads Renoir is still going to be the champ; some "morons" actually use their cores for work.
Going with AMD wouldn't solve any of their current problems. People need to understand that Apple has also mentioned the importance of the SoC and the types of capabilities that will bring beyond basic CPU and GPU components.
Well - after the Opteron fiasco, and Intel Core dropping and putting AMD out to pasture for close to 12 years with no competitive products... makes sense. Same reason there are very few AMD systems in the data center...
Apple is interested in vertical integration - not pro- or anti-Intel, just pro-Apple.
I don't think this is going to work out well. Of course, if all you want is a web browser or "creative" applications, then okay. But many, many professionals use Windows, and the software legacy of x86 there is just too great. Say goodbye to any technical engineering or CAD tool use on Apple. Software dev will be taken care of, but I think this is a losing move. If it were a winning move, we wouldn't still be using AX/BX/CX. Apple does have the best shot at doing it, given their walled-garden world, but they are mistaken if they think their hardship in the PowerPC days was just due to not having control over their own supply chain. Intel has stumbled, but this is their bread and butter, while Apple's bread and butter is selling UI and industrial design, not computing power. But like in the good old days, the fans will just say how awesome it is, I suspect. For those of you old enough to remember.
Just as an example, Maya was primarily an SGI/IRIX program, but moved to x86/PowerPC when SGI was losing ground. If Apple's workstation-class chips are as powerful as everyone hopes, I don't see a reason for CAD/engineering software not to release Apple/ARM builds.
I'm not too hopeful, since Apple still holds such low market share, but seeing the entire industry being pushed to shed decades of legacy x86 support would be amazing.
Some software like AutoCAD, Maya, Fusion, Unity, Unreal Engine, Resolve, Redshift, etc. ported their rendering engines to Metal recently, so they should run half-decently on ARM.
It's too bad Solidworks and Inventor are still stuck as Windows-only though.
Agreed, there's no chance that Rosetta 2 will be fast enough to run the huge library of x86 legacy software so these news will have no impact on majority of the PC world.
Legacy software by definition doesn't need the latest and greatest. Install-time binary translation may not result in perfect performance, but as long as Apple's CPU cores are decent, the apps will still run fine. And many many applications actually spend most of their time in system functions (that's why the games ran well - it's in the OS, Metal, GPU drivers most of the time) and they're native.
The issue will be modern x86-64 Mac software that makes heavy use of assembler to speed up core functions. On ARM, that will result in the slowest default code path running. AVX will be the big issue (Rosetta 2 can do SSE, it appears, and sse2neon already covers that migration path). And that's why the Mac Pros will be the last devices to be migrated. Maybe 16" MacBook Pros will be available in Intel for a while alongside ARM versions as well, for this reason.
Or maybe Apple will simply strongarm the software vendor to put some backbone into their port.
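To make the sse2neon point concrete, here's a hedged little sketch (the function is invented for illustration; the _mm_* intrinsics and the open-source sse2neon.h header are real) of how hand-written SSE code can be rebuilt natively for arm64 without rewriting the intrinsics:

```c
/* vec_add.c - hand-vectorized SSE code of the kind discussed above.
 * On Intel builds the intrinsics compile to SSE instructions; on arm64,
 * sse2neon.h re-implements each _mm_* call using NEON equivalents.
 */
#if defined(__x86_64__)
  #include <xmmintrin.h>   /* native SSE on Intel Macs */
#else
  #include "sse2neon.h"    /* SSE intrinsics mapped onto NEON */
#endif

/* Add two float arrays, 4 lanes at a time; n must be a multiple of 4. */
void vec_add(float *dst, const float *a, const float *b, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);            /* unaligned 4-float load */
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(va, vb)); /* lane-wise add + store */
    }
}
```

AVX is the much heavier lift because its 256-bit vectors are wider than NEON's 128-bit registers, which is why it's the sticking point for the ports.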
They are a significantly greater proportion of certain sections of the PC market, such as artists/animators, designers, marketing, and hipster web developers.
FWIW, wikipedia seems to indicate that the Apple Lightning/Thunder cores are ARMv8.4 and Vortex/Tempest are ARMv8.3 -- but then they don't list what the ISA revisions bring. If anyone knows I'm really curious if there's anything interesting in there. I know that v8.2 brought some nifty changes.
Everything I've read seem to paint the picture that because of the superior instruction set, ARM is just better in every way when compared to x86: iPad Pro compares favorably to Macbook Airs; a new supercomputer is out with ARM cores; etc, etc. They seem to suggest ARM has better perf/watt and scales just fine. What gives?
Did the entire industry not move to ARM solely because no one company had the power of vertical integration that Apple has? What am I missing here?
The ARM architecture isn't much of an advantage. Aarch64 is a cleaner instruction set than x86_64 and the more relaxed memory model makes things a little bit easier for hardware designers (and harder for programmers), but those are minor factors.
The important thing is that Apple hired a lot of chip design talent and gave them a lot of resources. They went with ARM architecture for the iPhone largely for historical reasons that can be traced back to the 1990s, but ARM wasn't and isn't the only instruction set architecture that's suitable for mobile processors. Apple's financial situation means their CPU cores tend to have much larger transistor budgets than ARM's own Cortex cores, and the concentration of CPU design talent at Apple means those transistors are well-spent. Apple processors stand out because they have more resources than other mobile processor designers, and a different target market from Intel's mainstream microarchitectures. (Intel has never been good at developing two CPU microarchitectures in parallel, which is why the Atom family always disappoints.)
The ARM supercomputer win can be mostly chalked up to the fact that instruction set matters little when you're being scored on the basis of vector arithmetic on highly-optimized code—in many ways, it's a derivative of Fujitsu's SPARC supercomputer processors. And minimizing the die area spent on traditional CPU core functionality means more room for more and wider SIMD units.
So, if I understood correctly, the A-series chips are performing outstandingly simply because Apple is, under the surface, a very competitive chip-design company? Also owing to ARM licenses, of course.
It almost feels like Apple caught up to other, well-known names in virtually no time. But I guess there must be a lot about patents and other IP stuff floating around in the back.
Don’t forget that a huge driver in this change is not the switch to the ARM ISA, but rather enabling tighter integration between hardware and software. So if you see Apples move to ARM as discrediting x86, be a bit cautious, because there are a lot of other factors at play as well.
This. I haven't seen a good analysis yet of what Apple can now implement in hardware for the Mac, since it no longer is relying on a generalized third-party processor. Industry-specific Mac processors, for instance.
It's not so much a matter of discrediting x86 as it is discrediting Intel. Intel has been stagnant with regard to development in general and they've fallen way behind in their manufacturing process.
Yes, there is some ISA advantage to going ARM, but mostly for Apple, the advantage is being able to control their own roadmap.
The iPhone 4 came out in 2010 with the A4, the first time Apple made custom silicon, which means they had been developing in-house silicon for at least a couple of years before that. So it’s been more than a decade of experience already, plus part of the pedigree came from acquiring P.A. Semi, which was around for longer. So while it’s impressive, it’s not like they just started doing this yesterday. It’s been a long time coming.
As for the whole ARM thing, simply put it’s just the most mature instruction set that you can actually license easily and make chips out of. Even if Apple wanted to make x86 chips they won’t be able to because Intel probably won’t license it to them. The ubiquitous nature of ARM chips owe largely to the flexible licensing model, not just technical merits.
Pardon me, but can anyone tell me how to correctly read the SPECint/SPECfp graph? Is that single core performance and total SoC power draw/energy consumption? Thanks! I'm trying to understand how to correctly interpret that.
I absolutely detest that double-sided graph layout - it's too information dense, I can never parse it, and strange stipples and colours are applied instead of something clear.
Same here, I've never understood what the motivation for this format is, instead of just a simple bar graph of perf/watt, except to hide some conclusion they don't want you to see.
It’s not being skipped, it’s right there in the article: “... Apple isn’t saying anything about BootCamp support at this time, despite the fact that dual-booting macOS and Windows has long been a draw for Apple’s Mac machines.”
Are there any numbers on boot camp usage? I suppose Apple has those so they might well have decided boot camp just isn't so important to their customers.
Exactly. In practice, nobody cares about boot camp. I'm a multi-platform user. I've experimented with Boot Camp years back... just because I could. In practice, I never really used it. My current Macs don't even have it installed.
Personally, I’m not sure Boot Camp was ever used for dual booting as much as it was used by Windows users who simply preferred Apple hardware. Unfortunately, though, Apple has shown very little interest in actually developing proper software support for their hardware in Windows in recent years. Power management is atrocious, and don’t get me started on the trackpad driver situation.
This also highlights the way Apple has been headed over the last couple of years, where their computers have an increasing amount of custom hardware with deep ties to complex custom software. Trying to replicate that software on another platform is a gargantuan task. The move to custom silicon is the ultimate step on this journey, and the one that puts the final nail in Boot Camp's coffin, but the journey has been going on for a long time.
I think what a lot of people are forgetting here is that this was never a technical exercise as much as a commercial one. I'm quite confident that, given the power budget of workstations and optimisations, it's not that difficult to push a custom ARM core to go toe-to-toe with MOST midrange x86/x64 CPUs.
What potentially allows Apple to succeed here is their mature market in the ARM platform: the iPad, and the budding prevalence of iPad apps that make the iPad a legitimate laptop replacement. They waited until the iPad/iOS market share was great enough, and obviously they see that now is a great time to jump ship to ARM - people need to realize they've spent years building this ship from the commercial side of things (I'm guessing since the days of the first iPad Pro).
We know that iOS/iPadOS share a common core with MacOS, so the transition from iPad <--> MacOS now that MacOS can run on ARM shouldn't be too difficult (a lot of the optimized codebase already exists - it's just a UI change).
However this is probably something only Apple could've pulled off at this stage - Microsoft tried this multiple times and failed (Windows RT, Windows 10 on ARM) - it's just such a compromise running Windows on ARM given the software ecosystem.
Been thinking about this for a moment. Apple has ALWAYS preferred vertical integration and the current political environment seems to be encouraging this (nary a whisper of anti-trust anything these days). We see it at Apple, Tesla, Amazon and a bunch of other giants.
Apple has seen a bunch of success on the appliance, never-upgrade side with iPhone/iPad/iMac, and it's clear from their "right to repair" stance that this is another move in the direction of sealed EVERYTHING. I'm old school and I've never liked that trajectory, but that's been the constantly reaffirmed path for a while now.
Custom silicon isn't gonna open up options, it's gonna restrict them. I would expect the future for iPhone and iPad to be only further solidified, and very bright. I think the iMac is gonna be a mixed bag, with continued TCO issues and horrible upgrade options mostly consisting of external boxes, and the Pro space, especially on the low end, is gonna be borked. But that's been the case for a really long time.
Just glad I didn't build that Hackintosh I was thinking about. It seems that to run macOS Big Sur and keep the freedom to swap drives/memory/I/O cards (network, breakout boxes), one is gonna have to go big or go home.
Sorry, to hear all of this without the Reality Distortion Field is weird. All I can see are the future Alka Seltzer moments for the IT pros who are gonna have to sort out all the nonsense that the custom silicon will yield as Apple vertically integrates everything.
But at least the iPhones/iPads will be pretty great...
There's no point in worrying; this news will have little impact outside the Mac PCs, which are less than 10% of the whole PC market. The macOS library cannot compete with the huge x86 software library.
There is one point of worry... People love laptops, and if Apple can deliver MacBook performance that Intel and AMD laptops cannot, then it could be very disruptive.
For example.. PC laptops today can play games, but they all suffer from loud fans screaming at you the whole time to dissipate the heat, and sub-two-hour battery life - it's a shitty experience. If Apple can solve this problem... and then attract game developers... it might work. Game companies like money, and Apple users like spending money. You call them 'sheep', but they must be doing something right, because they seem to have a lot more money to spend.
Your point was valid until you mentioned game developers switching due to loud fans in high performance laptops. As soon as you talk about high performance laptops you start talking about a laptop with a dGPU. Even a midrange laptop specific dGPU like the 2060 rtx max q from nvidia consumes *65* watts all by itself. So after memory, CPU, storage, etc you now have the bottom half of that clamshell needing to dissipate over 100w of power.
Apple isn't going to change that math anytime soon.
This will (like always) benefit Apple. It *won't* usher in a new gold rush in massive computing performance gains or a massive shift in client and datacenter system purchases.
Yet, look at all the "options" in say the Android market. All those options and none can beat what Apple has done with their own silicon. Seriously, the advantage for Apple is pretty obvious. Moreover, they will be in control of their own destiny rather than waiting on product delays from Intel. Great move for Apple.
Your comparison is completely asinine because smartphones, unlike PCs, don't need top performance to satisfy the habits of most smartphone users; ARM could design a competitive SoC, but it would be useless besides bragging rights.
Lol... great example of cognitive dissonance. My phone doesn't have good performance, therefore phones as a category don't need performance! Yeah, talk about asinine arguments...
Why do you think people buy new devices? High performance leads to longer practical product life. It also leads to the best overall user experience. Did you ever notice how on iPhones, when you go to take a picture, you see real-time previews of effects, while that is horribly lacking on Android? Just a small example of an everyday user-experience difference.
Microsoft will follow suit. This will be the end of Intel; serves them right for playing possum, trying to milk the cash from the same node for the past 6-7 years.
Microsoft doesn't sell laptops or PCs (only a Surface tablet, which might as well be ARM). They have no interest in favoring one CPU or another, they'll provide software for whatever their customers want to run it on.
"Follow suit"? Really? Microsoft already has ARM version of Windows and it's their second attempt. Apple is following suit here and so far Microsoft "suit" has been a failure. Also Microsoft does not design CPUs and they have no reasons to do it.
I'd be interested in finding out if the ARM Macs can boot Linux. The options so far have been mostly underpowered systems using SoCs like the RK3399, roughly comparable to 5-year-old phones, or ultra-expensive ones using server-grade processors.
Well, they've shown Linux running in the Parallels VM on Mac OS.
That indicates either an ARM Linux build running in the VM, or that the virtualisation system can run x86 VMs via Rosetta 2 just fine (so Windows will run here).
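For the curious, the "recompilation cache" idea is just the translate-once, cache, re-execute pattern every dynamic binary translator uses. A toy C++ sketch (Rosetta 2's real internals aren't public, so every name below is made up):

#include <cstdint>
#include <functional>
#include <unordered_map>

// Toy model of a dynamic binary translator's code cache. Each guest
// (x86-64) basic block is translated once, cached, then re-executed
// straight from the cache on every later visit - which is why a VM
// that keeps its caches around would warm up across repeated runs.
using HostBlock = std::function<uint64_t(uint64_t)>;  // returns next guest PC

std::unordered_map<uint64_t, HostBlock> gCodeCache;

HostBlock translateBlock(uint64_t guestPc) {
    // A real translator would decode guest instructions here and emit
    // native Arm64 code; we fake it with a block that just advances the PC.
    return [](uint64_t pc) { return pc + 4; };
}

uint64_t runOneBlock(uint64_t guestPc) {
    auto it = gCodeCache.find(guestPc);
    if (it == gCodeCache.end())  // cold path: translate and cache
        it = gCodeCache.emplace(guestPc, translateBlock(guestPc)).first;
    return it->second(guestPc);  // warm path: jump straight to cached code
}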
Boot Camp is a very valid question for natively booting Linux or Windows (on ARM), of course, if you absolutely want to have no VM involved. I fear this may take a while.
>I fear this may take a while. I fear Apple might lock it down to the point of being impossible. It would be completely in line with everything else they're doing at the moment.
To coincide with the transition: when they launch, they should announce licensing for Hackintosh. That would expand the Mac user base dramatically, many of whom would then switch to ARM later.
Also they should give everyone free ice cream, and let them come round for a go on their sister! What planet do you live on to think that Apple gives a damn about twerps pratting around with "hackintosh" systems? I am sure almost none of them ever turn into revenue generating customers.
I don't doubt that you're right in that Apple is never going to do that, but I think there are many software developers (including myself) that would greatly welcome that. I myself am indeed never going to be a revenue generating customer for Apple, but being able to compatibility-test some of my programs without having to own a Mac would be tremendously useful, and with enough people like that, even beneficial to Apple to at least some degree.
Is this Intel's fault? They haven't innovated much in the past 10 years and have been stuck on 14nm forever. Did this lead Apple to conclude that they can do things better?
This'll probably be lost in the sea of comments, but it's bugging me how many people think the ISA (that is ARM and x86) is what determines the processor's performance. It's not. It's the implementation of that ISA that determines the processor's performance. If the ISA is all that mattered, then performance jumps like from Netburst to Core or Bulldozer to Zen wouldn't have happened.
And if the ISA is all that mattered, how come Qualcomm's Snapdragons were lagging behind Apple's SoCs, even though Apple's chips have, at least on paper from the consumer point of view, similar or even worse-looking specs (i.e., lower core count or lower clock speed)?
All that matters is how the company implemented their processors. I'm certain this is why Apple chose to use terms like "Apple Silicon" as opposed to "ARM" because Apple's silicon _is_ what's giving them their performance edge, not because they're using ARM.
While you have a point, differences in ISA *can* lead to differences in power efficiency: things like the necessity of a CISC-to-RISC translation layer (where you split ops into micro-ops) and a lower number of directly addressable registers put x86 at a disadvantage. ARM, on the other hand, was conceived and optimized for power efficiency, and that has had an effect on its ISA design as well.
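To make the micro-op point concrete, here's what one and the same read-modify-write statement typically turns into on each ISA (compiler output sketched from memory, so check your own compiler):

// One read-modify-write statement:
void bump(int* p, int v) { *p += v; }

// Typical x86-64 codegen (memory operand, CISC style):
//     add dword ptr [rdi], esi
// ...which the core then cracks internally into load / add / store micro-ops.
//
// Typical AArch64 codegen (load/store architecture):
//     ldr w8, [x0]
//     add w8, w8, w1
//     str w8, [x0]
// ...here the "micro-ops" are already explicit in the instruction stream,
// so the decoder has less work to do per instruction.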
BUT, our good friends at AMD & AAPL have been well known to canoodle, zig instead of zag, psyche, deep fake, 'shens, et al., six months prior in anticipation of product announcements and major architectural design changes.
I'm not suggesting anything - much less Mac-related, but the next "Master" AMD could include big.LITTLE, AM5, Zen3, DDR5/LPDDR5 memory, two (or more!) PCIe 4.0 devices, USB4, __________________ (insert wish here).
Throw in a chiplet for some RDNA-Next action and while we're at it, some HBM as a last-level cache.
Now, that's a Mac ;-)
A much bigger update is expected with Rembrandt. This design will not only feature Zen3+ design but also RDNA graphics. Furthermore, Rembrandt is also expected to support DDR5 and LPDDR5 memory. Expreview data even mentions that the APU will support USB4 and (if translations are correct) two PCIe 4.0 devices at the same time.
The article states: "At the same time, however, even with the vast speed advantage of x86 chips over PPC chips, running PPC applications under the original Rosetta was functional, but not fast."
Huh? Vast speed advantage? Maybe compared to the G4 laptops, but not over the G5 desktops. The whole point of the move to Intel was that Motorola effectively dropped out and IBM was only interested in doing their Power-based workstation-level chips. While that would have been fine on the desktop, it wasn't suitable for a mobile laptop chip.
Incidentally, Apple has the same motivation today to move to ARM due to superior performance per watt, etc. This also benefits Apple in the form of vertical integration. Macs will also get the benefit of better power management, the Neural Engine, and a host of other specialty functions in Apple SoCs today. Seriously, nobody should be questioning whether or not this is a good move for Apple. It's a great move for Apple and even better for Apple's customers.
Given that he is at the center of a lawsuit regarding an arm-based CPU company, I'd say that's not going to happen, at least within any relevant timespan.
It's been coming ever since Jobs lopped the word "computer" from the corporate logo; Apple is phasing out of its traditional Mac business, imo. I think they'll probably concentrate on low-performance "devices" like iPads and so on. OS X will likely just merge with iOS. I guess they couldn't sell enough $1k monitor stands, etc.
No, not happening, lmao. Stop this BS. A-series chips are heralded as lightning fast, yet in real-world application performance an iPhone 11 Pro Max vs a OnePlus 8 Pro shows no noticeable differences, which says otherwise.
ARM didn't scale up; Apple is solely doing this to combat their inefficient cooling designs caused by their thin-and-light BGA-soldered trash. Their Macs always overheated: solder issues, IC reballing, errors and locks in the mobo design, and ICs not available for repair, while also charging insane amounts for basic things such as display cable replacement. Go to Louis Rossmann to know more.
Apple ain't doing jackshit to the PC market; Macs are sub-10% in market share and in Apple's own profit share. Spending millions on Intel and then engineering around them, while also spending billions on TSMC funding and A-series processors, ain't viable when their own Mac business is dwindling against the growth of their Services revenue stream (17.7% vs Mac at 9.x%). It is a purely business decision.
And the Fujitsu supercomputer is not a consumer product; IBM had that crown with SMT8-based PPC, and Tianhe is Chinese-built on the Sunway RISC design - all are non-consumer. This whiteknighting of ARM taking over the world, esp. with AnandTech SPEC numbers, is absolute rubbish. We will see when their first Mac ships; they even blocked the Developer A12Z Mac Mini from offering benchmarks - there it goes, "Revolutionize" drama.
AMD is on a roll with their Rome EPYC 7742 processor, which is the king of the DC market; AMD has a 4.5% share, Intel has 95%, and the rest (IBM, ARM, etc.) make up 0.5% - and these run consumer-class software. Apple is not going to revolutionize any of it. They do not make those products nor have any interest.
The iPad Pro is not a computer and cannot be a computer replacement. Just because it got Office tools and first-party-optimized Adobe doesn't make it a computer; it doesn't even have a Terminal to work in - the Mac has that. So there goes your high-performance Geekbench trash.
Of course, he did put a cold plate against the iPhone or something to keep it from thermal throttling... but that won't be a problem in a Mac (which will probably be actively cooled).
Ever notice how, when comparing decent-performing x86 PCs, the discussion always devolves into how well the beast is cooled? That's because you need pressure plates, heat pipes, and big fans to get them to perform as advertised.
As you can see from the chart in the article, Lightning (this year's high-performance core) is bested by the Skylake-derived 10900K on integer, and by the 10900K, Ryzen 3950X, and Cortex-X1 on floating point.
The Mac SoC will be based on the A14 Bionic and will be built on a 5nm process, rumored to be the first ARMv9 implementation and the first ARM core running at more than 3GHz. Because it's custom silicon, Apple can stick as many Firestorm and Icestorm cores in it as they want - as well as graphics cores and neural engine cores.
The 10900K is built on a 14nm process and is reaching the limits of what can be cooled - they had to shave the top of the die to get enough surface-area contact to cool the chip. It's supposedly a 125W TDP part, but under real load it can draw over 300W.
The 10900 going into this year's iMac is rumored to be a 95W TDP part, but lord knows what the maximum draw will be.
It's rumored that the first Mac SoC will have 8 Firestorm and 4 Icestorm cores, though we don't know what machine that's going in.
You make the same mistake as many x86 bigots - because A-series chips use the AArch64 instruction set, you equate A-series silicon with Qualcomm and Exynos. The latter may be Arm licensees, but they pretty much pick from a menu of big/little ARM standard cores, adjust the caching, add some IP blocks, and call it a day.
Apple is an architectural licensee who has been designing their own silicon for a decade now. They've added parallel arithmetic units, optimized their vector units, produced out-of-order execution units, widened the data paths, designed their own instruction decoders, and added hundreds of voltage domains so they can turn off parts of the SoC not in use. They have silicon engineers at least as good as that of any x86 designer, and they have only one customer so their silicon doesn't have to carry all the cruft around that other silicon designers wouldn't think of removing.
The silicon, hardware, and software teams probably have regular meetings and they all know what their objective is for the future product roadmap.
Apple silicon in Macs will offer an enormous competitive advantage on multiple fronts: efficiency, power consumption, and millions of iOS and iPad apps which will run natively because they use the same instruction set architecture and frameworks. This could easily accelerate market share increases, since all the iPhone and iPad users will find they can also run their mobile software on their laptops or desktops.
As part of the Mac SoC, each Mac will also be getting a state-of-the-art image signal processor capable of very fast contextual image processing with Smart HDR, a secure enclave capable of storing/comparing fingerprint/face geometry math, a neural engine (with A13 core count) capable of 5 trillion ops/second, a within-the-SoC-memory-bandwidth multicore graphics processor, a hardware-based work dispatcher, and an inertial motion processor.
Oh... and the A12Z was just the SoC they used to cobble together the developer transition machine. This is basically a two-year-old design which had an additional graphics core enabled for the iPad Pro. Any benchmarks you've seen are Geekbench run through a binary recompiled from Intel code to ARM - proving the efficacy of Rosetta 2's x86-to-ARM translation.
So ... we'll see how all this plays out, because the first machine running on a Mac SoC should be appearing this year.
It's not exactly rocket science.... One side shows how much energy is being used. The other side shows how much performance you are getting for that energy. What part is tripping you up?
The only thing missing is the power usage for the Intel/AMD chips.
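The arithmetic behind a chart like that is nothing more than score divided by power. A throwaway C++ illustration, with numbers invented purely for the example:

#include <cstdio>

// Invented numbers, purely to show how a perf-per-watt chart is read.
int main() {
    struct Row { const char* chip; double score; double watts; };
    Row rows[] = {
        {"hypothetical phone SoC",   1300.0,  5.0},
        {"hypothetical desktop CPU", 1500.0, 95.0},
    };
    for (const Row& r : rows)
        std::printf("%-26s %7.1f points/W\n", r.chip, r.score / r.watts);
    // The desktop part wins on raw score but loses badly on points per watt.
}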
Regarding the ARMv9 thing, methinks ARMv9 is not a real thing. It is just ARMv8.x extensions being re-organized to mandate certain features and create more "profiles" to suit the new server/desktop segment.
For example, ARMv9a could make NEON optional again if SVE2 (minus the legacy NEON support) is mandatory.
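If that happened, well-behaved code would mostly keep working, because Arm SIMD support is already probed through the ACLE feature-test macros. A minimal sketch (the macros are real; the ARMv9 profile shuffling above is speculation):

#include <cstdio>

// Compile-time probe of which Arm SIMD extension the target mandates.
int main() {
#if defined(__ARM_FEATURE_SVE2)
    std::puts("target mandates SVE2");
#elif defined(__ARM_NEON)
    std::puts("target has NEON only");
#else
    std::puts("not an Arm SIMD target");
#endif
}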
Intel/AMD will simply step up their efforts to match or beat Apple's SoCs in the future. x86 will remain faster, but Apple's will be a lot more efficient.
How exactly are Apple's clients treated so badly? By making bold moves to improve that platform above what others in the industry are doing or capable of doing?
I'll take the Apple products you don't want.. thanks!
During the 2-year transition, the sales of x86-based Macs should suffer, particularly during the second year. Who on Earth would buy a super-expensive Mac(Book) Pro that will be obsolete in a year or so? Even if Apple supports x86 for a few more years, there is no guarantee third-party companies and developers will, and besides, it will be obsolete hardware-wise. It will be like a PowerPC Mac during the first few years of the transition to x86 (back when Intel used to innovate and offer significantly more performance every year, until about the time Dennard scaling collapsed and clocks stopped rising). If ARM-based Macs have the same or higher performance at 1/2 to 1/3 the TDP, owning a power-hungry x86-based Mac will be like owning an inefficient dinosaur.
This. As I sit here with a fairly warm MacBook Pro running nothing but a web browser, knowing an iPad doing the same thing would be stone cold. Macs are going to be so much better.
I plan to buy the 2020 iMac when it comes out. The thought of a 10900 coupled with a Radeon Pro 5xxx - even if the design language remains the same - makes my mouth water.
My current primary was purchased in late 2018 because my 2014 was going into the shop - and I was forced to buy a year-old machine with a Core i7 and a Radeon Pro 580 8GB because the 2018 model was tardy. Then in 2019 Apple released the iMac I wanted - a Core i9 (8 cores, 16 threads) with a Radeon Vega 48 (I think).
I do a *lot* of intense 7-10 hour transcode sessions, and a 10 core 20 thread processor will cut my transcodes down to a reasonable timeframe.
With the 2020 iMac 5K I expect to have a fairly competent gaming machine, maybe in Parallels but certainly in boot camp. Catalina cut off a *lot* of games when it went 64 bit only.
I expect Intel support to linger on for years - there are a lot of enterprise customers Apple wants to keep happy who have to run legacy Intel software in a native hypervisor - at least until you can run a hypervisor on an ARM Mac at native speeds.
Meanwhile, I expect to skip many of the road bumps sure to be revealed by the transition, while you folks graciously spur developers into producing performant Universal 2 binaries.
After the transition, I expect that this iMac 5K will retain a lot of its value due to its position of being the best performing x64 Mac ever produced while still being boot camp-able.
At that point, if ARM hypervisors can run Win x86 AAA games at native speeds I'll purchase an ARM iMac - if not, I'll buy an ARM Mac and a cheap windows gaming machine. Or ... if AAA games start to appear on the ARM Mac natively I'll just buy an ARM Mac and forget Windows ever existed 🤣.
I'm retired now, so the only things I need Windows for are gaming and possibly the occasional firmware flasher for devices which purportedly support the Mac but whose makers never produce a Mac firmware updater 🤬.
Apple is a much more prosperous company now and they can afford the engineers required to support Intel Macs long term - and they've been leaning towards longer support times, obsoleting old machines not because they're tired of supporting them but because they lack needed OS facilities for the new release of the OS (like Metal).
Intel Macs will keep their value because they are the bridge for people existing between the Mac and PC world (requiring x86 Windows running natively - Bootcamp/dualboot, not a virtual machine). It was different for PowerPC, because those were not usable for anything else.
Just like that, Apple removed all future AAA game developer support for anything but mobile-based games. You'll be able to play the latest Fortnite mobile-version games on your system, sure, Candy Crush, and many other indie titles, but they just prevented 99% of existing and future real game titles from running in their ecosystem. Pretty much all modern titles for the PS, Xbox, and PC run on the same AMD chips using the x64 ISA, so porting between those platforms is very trivial.
Not only would you have to port your code to run on Linux/macOS, which is already very hard to get developers to do on Intel x64 chips, but your program also has to run on ARM, which at that point means you are basically writing the application from scratch again, unless you want to emulate things and take a massive game-breaking performance hit.
This effectively LOCKS Apple into their own ecosystem in regards to gaming, which has always been a problem on Macs, but now the problem is exponentially worse. Porting iPad and iPhone games to the platform won't be a problem, sure, but those aren't serious games. Unless they plan on continuing to use AMD GPUs, which I find highly, highly unlikely at this point, they will be even further behind when it comes to gaming experience outside of the Apple Reality Distortion Field(tm), as they'll most likely just try to use their onboard iGPU solution on unnecessarily thin devices with terrible cooling.
I think the move is good for Apple overall for other reasons, but make no mistake about it: all major AAA gaming on the Mac platform is officially dead. You can easily port code from PC to PlayStation to Xbox back and forth, but try doing that for ARM processors - including the graphics engines, libraries, drivers, etc. - and it just isn't going to happen on a platform that only has 5% market penetration on desktop.
Very good point, but sadly the stupid world is moving into that GaaS bullshit of no ownership and game streaming - that garbage Stadia, xCloud, PSN, and Amazon. AAA gaming is dead nowadays because of the political baggage they shove down our throats. EA, Ubisoft, Bethesda, Activision are all in the same league, except a few in Ukraine/Russia/CZ and Japan.
This isn't as difficult as you make it out to be. All modern game engines are multi-platform by nature - even 10 years ago, bespoke in-house engines tended to run on x86 (PC) and PowerPC (consoles) natively (and ARM, if you were also doing mobile / handheld), so the engineers have already been doing stuff like this for years, if not decades.
In engines like UE4, you can already make iOS builds which use desktop-class Metal shaders, so they can already look as good as their desktop counterparts - performance notwithstanding (and I wouldn't bet against Apple in getting a high-performance CPU/GPU combination out).
Epic has been updating the UE4 rendering pipeline on mobile, unifying it with the desktop/console one to bring them to parity (which is possible thanks to Metal and Vulkan), so the feature set between platforms would soon be identical. The only thing that's not standardised yet is ray tracing, which is still very much hardware dependent, but there's nothing stopping Apple from including it in their future GPUs.
Also worth pointing out that 99.9% of your game code and scripting would be platform-independent, with only thin wrappers to account for differences between platforms.
When porting an engine over to a new platform, the bulk of the work would be on the rendering pipeline and asset importers. Ditto with optimizations such as concurrency. A lot of the existing frameworks for user input, memory management, game object lifecycle and UI can be re-used or modified for new platforms so it's hardly re-writing from scratch.
Nobody rewrites games from scratch to support new platforms.
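Schematically, the pattern looks like this (not any particular engine's API, just the usual shape of a platform abstraction, sketched in C++):

// Gameplay code talks to an abstract device; only the backend that
// implements it is per-platform, which is why ports don't start over.
struct RenderDevice {
    virtual ~RenderDevice() = default;
    virtual void drawMesh(int meshId) = 0;
};

struct MetalDevice final : RenderDevice {      // macOS / iOS / Arm Mac backend
    void drawMesh(int) override { /* encode Metal commands here */ }
};

struct D3D12Device final : RenderDevice {      // Windows / Xbox backend
    void drawMesh(int) override { /* build D3D12 command lists here */ }
};

// Identical on every platform:
void renderFrame(RenderDevice& dev) { dev.drawMesh(42); }

Swap in a Vulkan or console backend and renderFrame never changes - that's the bulk of a "port".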
Fair comment. But what if, and this is a huge if, Apple ends up releasing silicon with GPUs that start to pull away from the big guys? What if they make their platform so damn compelling that more and more people make the switch, and suddenly the lowest-performing systems are the measly old Windows PCs still stuck using giant graphics cards and hot CPUs?
Apple is doing as Apple always tried to do, control everything. You might see the EU looking into it one day, but the US never would.
I think we will see ARM performance in general improve as it starts scaling up to match Intel and AMD. So, at first, if Microsoft keeps on the ball with Windows for ARM, we will start to see low-cost PCs and laptops with Windows for ARM. This also means that Microsoft will need to encourage developers to code for UWP so the software works on any platform; we may then see the reemergence of Windows tablets and possibly Windows phones again. We will also see more Linux distributions compiled for ARM as well.
Hopefully, this also means that as ARM performance improves, we will see cheaper x86 processors. I also hope we will see ARM processors we can drop into sockets, and not just integrated SoC motherboards. I like having the ability to build my own computer, but I wouldn't be surprised if one day everything is all integrated. After all, we are seeing laptops with integrated memory, so I expect SSDs to be soldered on at some point too. Of course, I wonder if the EU might go on some kind of e-waste reduction initiative and require companies to manufacture devices so things are user-replaceable.
"You might see the EU looking into it one day, but the US never would."
Huh? What law are you implying Apple is breaking by designing their own silicon? Comments like this are absolutely ridiculous.
"as ARM performance improves"
One ISA isn't inherently faster than another. It's the chip that's designed around that ISA that determines performance. Outside of Apple, nobody seems willing to step up and design a custom chip anymore. Everyone is content to use ARM reference designs. As for Apple, I believe you'll see with the A14 that they've already passed Intel in core design from both a performance and power efficiency perspective.
I think even the protectionist EU would find it hard to stomach forcing Apple to sell its in-house, locally developed A-series SoC silicon to competitors - and AFAIK they don't even have any PC or laptop makers in the EU, do they?
The EU really seems to be content sitting on their laurels and for some reason have had trouble moving into the 21st century ... not sure why.
As for the Wintel alliance - I expect the Mac to gain a huge competitive advantage in energy efficiency, performance, and the ability to run millions of iOS and iPadOS apps natively due to a common instruction set architecture and common frameworks - something I can't see anyone else cashing in on.
If they add touchscreen to the Macs, you'll have the ability to interact with the Mac just as you do on your phone - though the undo gesture may have to be changed 😏.
welehcool - Monday, June 22, 2020 - link
Does it mean Apple will not use AMD GPU anymore?
trivik12 - Monday, June 22, 2020 - link
Obviously Apple will create both their CPU and GPU, as they do with the A12Z SoC in the dev system. My question is if Apple will work with Microsoft on native Windows support. Then it would be awesome.
Eliadbu - Saturday, June 27, 2020 - link
It would make sense to use AMD GPUs if they can't scale up Imagination tech IP for higher-TDP designs more effectively than AMD can. It remains to be seen.
melgross - Monday, June 29, 2020 - link
They've had an in-house design for a while now.
techconc - Monday, June 29, 2020 - link
Apple likely requires a licensing agreement with Imagination because of products still being sold that are not yet on Apple GPUs. I would expect Imagination IP to be phased out over time if any still remains.
Apple Worshipper - Wednesday, July 8, 2020 - link
Lol no mate, current Apple GPUs in their iPhones and iPads are customized PowerVR designs. Been that way.
vol.2 - Tuesday, June 23, 2020 - link
I think MS could do this with the Windows 10 on ARM code base that already exists. Not sure how heavy a lift it would be, but at least there's a starting point.
Samus - Tuesday, June 23, 2020 - link
Speaking of... damn, I want that Mac Mini DTK!
Xajel - Monday, June 29, 2020 - link
It's a double-edged question. MS already has Windows on ARM, but they build it per device. So if MS decided to support Apple Silicon, they would either make a separate build for each Apple device, or a single build with support for all Apple devices.
But in both cases, they must have Apple's blessing first; Apple must also want this to happen, and they need device-specific drivers, like how it works with current Boot Camp Windows, where Apple provides the necessary drivers.
On the other hand, if Apple decides not to support Windows, the only other method will be virtualisation, which according to Apple is working very well.
But after seeing the current dev kit with its bootloader and the ability to boot other non-signed OSes like Linux or even Apple's own macOS (modded), Apple might actually be working with MS to support Apple Silicon. Unless they only provide this non-signed feature on the dev kit, and consumer machines require a signed OS, which will be hard for Linux.
The Linux situation on Apple Silicon is still unclear; in all cases Apple must provide the drivers too - it's already tricky to install Ubuntu, for example, on current Intel Macs. I guess things will be harder with Apple Silicon.
twnznz - Monday, June 22, 2020 - link
Unclear. AMD is known to licence their IP fairly liberally, so we may see an AMD GPU with an ARM CPU (we're expecting this combination in Samsung's Exynos 1000).
jeremyshaw - Monday, June 22, 2020 - link
Apple is already paying the 3rd major graphics IP vendor (that survived, anyway), Imagination, for licensing relevant IP and (presumably) patents. Even if they switch to AMD, remember that Intel long paid Nvidia and then AMD graphics IP/patent fees and royalties. None of that means the Intel UHD 620 has any practical relation to an AMD Radeon.
extide - Tuesday, June 23, 2020 - link
No, but every discreet GPU used in a MAC product in the last what 5 years? (or more) has been an AMD GPU. There is only so much GPU HP you can put inside of a single chip solution -- due to things like thermals, memory bandwidth, etc. I would not be surprised to see Apple attaching discreet AMD GPU products to their ARM chips, at least at first, but because they are Apple, I would also not be surprised to see they eventually build their own discreet GPU.
Deicidium369 - Tuesday, June 23, 2020 - link
I expect them to keep AMD for a while - makes little sense to split focus between CPU and GPU - but at some point they will have a full replacement for AMD... Their on-SoC GPUs are not terrible - light years ahead of the general ARM catalog.
bananaforscale - Tuesday, June 23, 2020 - link
*discrete
extide - Wednesday, June 24, 2020 - link
Yeah, I slip up on that one all the time, but hey, it's not like they put a big old AMD sticker on them like you see on Windows laptops, so you could say it's a discreet discrete GPU.
Deicidium369 - Tuesday, June 23, 2020 - link
Samsung will do Samsung things - release RDNA on 1 product, then discontinue - it's what they do
dotjaz - Wednesday, June 24, 2020 - link
And how many sh*tty Exynos Mx cores did they push out before finally calling it quits?
iphonebestgamephone - Wednesday, June 24, 2020 - link
3?
Spunjji - Friday, June 26, 2020 - link
They were doing custom Arm architecture CPUs for a long time before the M series.
iphonebestgamephone - Saturday, June 27, 2020 - link
Ok?
SarahKerrigan - Monday, June 22, 2020 - link
No reason they can't. There's nothing about AMD GPUs that's uniquely tied to x86; I use them on non-x86 systems myself.
Kurosaki - Monday, June 22, 2020 - link
They could easily still use AMD as a GPU supplier, if it made sense performance-wise. Look at the Nintendo Switch with an ARM CPU and Nvidia GPU, or the Nintendo GameCube with a PPC CPU and an AMD GPU. It all depends. GPU performance is not that bad in phones these days; you can play quite demanding games on a very competent screen, resolution-wise, on a very small budget. The more I think about it, the more of a monopoly I smell. Corruption... Time to shake things up I guess!
eastcoast_pete - Monday, June 22, 2020 - link
While that's pretty up in the air right now (AMD GPUs with an Arm-based CPU), I would actually be really curious to see just how well Apple's GPU designs scale up once they approach dGPU levels of transistor count and get fast VRAM to work with. Let's not forget that while the A-chip line has shown great progress on CPU power, progress on GPU performance was even greater. I am not an Apple fan or user, but I'd love to see them light a bright fire under the seats of the GPU duopoly (I'll add Intel to AMD and NVIDIA if they ever release anything worthwhile).
nandnandnand - Monday, June 22, 2020 - link
They should slap some HBM (article image says "high-efficiency DRAM", that should qualify) on those desktop/workstation-class ARM SoCs.
lmcd - Wednesday, June 24, 2020 - link
Oh that'd be one way to push iMac/workstation performance to parity.
brucethemoose - Monday, June 22, 2020 - link
The other side of that coin: will Apple ship big, fat in-house GPUs with their higher-power products? Is there room for a 3rd company in the Anand GPU bench?
brucethemoose - Monday, June 22, 2020 - link
*discrete PCIe GPUs, that is.
nandnandnand - Monday, June 22, 2020 - link
You mean 4th. Intel is joining the dGPU party, remember?
brucethemoose - Monday, June 22, 2020 - link
Ah, right.
Ryan Smith - Monday, June 22, 2020 - link
"Is there room for a 3rd company in the Anand GPU bench?"For more competition in the GPU market, I'll make as much room as is necessary!
brucethemoose - Monday, June 22, 2020 - link
Fantastic!
eastcoast_pete - Tuesday, June 23, 2020 - link
Thanks Ryan! My sentiments exactly. The more the merrier!
Bp_968 - Tuesday, June 23, 2020 - link
Even if they made a GPU that performed at 3090 levels for $100 and sipped 50 watts at full bore, they would only use it in their own systems and never allow it to be used outside the Apple ecosystem. After 30 years of watching these guys: they play by the old-school monopoly rules. If they could own the entire supply chain, the entire retail chain, and the entire data services chain (your phone provider), they would.
Zerrohero - Tuesday, June 23, 2020 - link
Since you have been watching Apple for 30 years: when, in your opinion, did they become a monopoly, and in which product category or categories?
Apple is not obliged to sell or license their CPUs or GPUs to anyone. It's not "playing by old school monopoly rules" when they don't.
Do you really think that Apple should be obliged to supply their own chips to the likes of Huawei, presumably without profit even?
Owning the design of your core HW components and having your own OS etc. is not a crime either.
But of course you know all this.
(The bitterness towards Apple from people who have never owned, and never will own, a single Apple product just never ceases to amaze me. None of this affects your life in any way whatsoever.)
starcrusade - Tuesday, June 23, 2020 - link
Definition of monopoly: exclusive possession or control of something. So yes, they are a monopoly.
Monopolies are not inherently illegal (in the US). Abuse of a monopoly is. If, for example, Apple said any apps sold in their App Store cannot be sold anywhere else (i.e., Google Play), that would be illegal abuse of a monopoly. Just ask Amazon; that's exactly what they did with their Kindle store, and they lost. Amazon still has a virtual monopoly on the ebook market.
liquid_c - Tuesday, June 23, 2020 - link
That's not a monopoly, that's exclusivity, and while they might be similar in a few aspects, they're not the same thing. And to return to your example, it's up to the app developers to choose if they want to sell on Apple-only platforms or ditch Apple and sell their apps on other platforms. Considering that the developers still have freedom of choice regarding their apps, this is not a monopoly.
RedGreenBlue - Wednesday, June 24, 2020 - link
It's called vertical integration. People calling Apple a monopoly don't call Ford or GM monopolies for designing and making their own engines and car parts. Apple has a monopoly on doing well all the things needed for a phone or computer to work best, but anybody else in the industry has the opportunity to do the same. What are you gonna do, demand they stop making better products so someone can catch up?
You're also neglecting the fact that they ditched Intel to do this - whose chips alone made up about 25% of the cost of a cheap MacBook, and sometimes probably as much as 40% of the bill of materials. Apple can make a competitive chip for 25-50 dollars, as opposed to over $250. Intel was the more monopolistic company in this relationship. Apple could drop Mac prices by $100 and still make $100 more profit.
They should have done this years ago. The benefits were incredibly obvious since 2016 at the latest.
lucam - Tuesday, June 23, 2020 - link
Nope, likely to be Imagination...
psychobriggsy - Tuesday, June 23, 2020 - link
Undoubtedly the end result will be AMD GPUs out at Apple. But integrated graphics, even on 5nm, can only do so much. I presume the A14Z on 5nm will have double the GPU of the A12Z at least, so the silicon will be very powerful - but it won't be >5 TFLOPS powerful.
So I guess AMD GPUs in laptops are dead by next year, but AMD GPUs in iMac and Mac Pro might last a couple more years - it depends on whether or not Apple are going to create their own discrete GPU line using their IP or not.
Fulljack - Tuesday, June 23, 2020 - link
Well, for the next year or so, I don't think an Apple iGPU will replace the laptop dGPU just yet. It'll be a beast in its range, no doubt, but there's only so much you can do with limited power draw. The MBP 16 refresh with an Apple Arm CPU will surely still have an AMD dGPU option.
caribbeanblue - Sunday, October 30, 2022 - link
Welp, now that they've made a 10.4 TFLOP integrated GPU, and their 13.7 TFLOP laptop iGPU is coming this November, what do you think?
If the top-end Mac Pro chip ends up being 4x M2 Max chips as rumored, that'll have 54.7 FP32 TFLOPS, about as much as the 4x Radeon Pro VII GPUs that the current Mac Pro has.
defferoo - Tuesday, June 23, 2020 - link
Apple may even create a dedicated GPU die with a high bandwidth connection to their SoC so that they can create products with a powerful GPU in addition to products using their integrated GPU.
I'm not sure how much they would get out of doing that vs. just using AMD dedicated GPUs so it's still an open question.
taigebu - Tuesday, June 23, 2020 - link
Consoles are SoCs also, and we will soon have 10+ TFLOPS GPUs on 7nm this year. Of course I doubt Apple would want such a power-hungry GPU on their SoCs for MacBook laptops, but I don't think it would be impossible to do for a Mac Pro or iMac.
Samus - Tuesday, June 23, 2020 - link
AMD Silicon? Probably not. AMD IP? Highly likely at some point.
Deicidium369 - Tuesday, June 23, 2020 - link
Will probably be phased out around the same time the last Intel Mac ships. Not sure it makes sense at this moment to try and replace both the CPU and GPU - so maybe they will stick around a little longer - but then again, it's not like Apple is cash-constrained...
genzai - Tuesday, June 23, 2020 - link
Apple will continue to build chips with integrated graphics, just like all their A-series chips have. But these will function like Intel's iGPUs on current chips - meaning on low-end products they will be the only GPU present, but for higher-end products they will be supplemented by discrete GPUs (from AMD, or possibly someday again Nvidia). There is nothing in this week's announcements to suggest Apple is going to build high-end GPUs any time soon. Nor any reason they can't use AMD (or Nvidia) discrete GPUs with their own Apple ARM chips (they just need to work with them for drivers, and my understanding is they basically write their own AMD drivers now anyway).
Robo Jones - Tuesday, June 23, 2020 - link
Great question. Yes they will. AMD works with Apple's Metal API, as opposed to Nvidia. AMD fits nicely as a supplier of high-end GPUs, especially for enterprise, which will be Apple's focus in the future.
osxandwindows - Monday, June 22, 2020 - link
It sounds like their new hypervisor doesn't support x86 anything. Crossover should still work, tho. So I don't see why my old games shouldn't work.
biigD - Monday, June 22, 2020 - link
I don't care for Apple and its walled garden, but it's gonna be interesting to see some *real* benchmarks once these things are released. Windows *does* run on ARM, and if Apple can show that its upcoming chips have a clear performance/power advantage, we might see some pressure on Intel from Microsoft too. Interesting times!
eastcoast_pete - Monday, June 22, 2020 - link
Agree! More competition is never a bad thing in this space - just look at how Zen woke up Intel, and let's see what the x86 side comes up with.
Kurosaki - Monday, June 22, 2020 - link
This is going to be huge. Intel and AMD aren't able to keep up any more. The ISA itself, on which they have built their whole business model, is too old. Either they go full-on IA-64 and risk it all, or ARM wins by a margin. I predict that most new PCs sold five years from now will be ARM-based. AMD might still have a chance for now, with their GPUs, but still, we can see similar things with Imagination and ARM's own GPU tech. You can drive quite a demanding scene with a veeeery small power budget... I predict my next stationary build will be an ARM system with an unknown GPU brand. And it will be 4 times as powerful as the one I own today.
twnznz - Monday, June 22, 2020 - link
I'm convinced that both AMD and nVidia GPU IP isn't going to vanish alongside x86 - nVidia has already demonstrated SBCs with both ARM cores and nV GPU IP (Jetson series). It would not be a stretch to imagine higher performance ARM CPU integration along these lines.
Kurosaki - Monday, June 22, 2020 - link
GPU-wise, they still have a chance, AMD and Nvidia. The weakest of the bunch pretty quickly turned out to be Intel here. But as said, you still have Imagination tech, which isn't that bad in a 7W phone, pushing FPS shooters on 3.5K screens...
iphonebestgamephone - Tuesday, June 23, 2020 - link
They don't run at 3.5K; PUBG Mobile runs at 720p by default, tho it supports 120fps now. And even the SD 865 can't maintain 60fps in Fortnite on the OP8P.
Deicidium369 - Tuesday, June 23, 2020 - link
Well, since the first of the new Intel Xe line is only available as a discrete card and is identical to the 96EU iGPU in the imminent Tiger Lake... it's pretty much looking like Xe LP is besting the Vega in the latest AMD APUs. Discrete graphics and compute cards with Xe HP are to be launched this year.
Nvidia is the king, no doubt - an installed base of a tech they invented (GPU compute), and an ecosystem they have been investing heavily in for years (hardware, software and expertise).
Intel is not aiming their Xe HP at Turing - they are aiming at Ampere - and with OneAPI and their already dominant position in the data center, they have the resources to dethrone Nvidia. It won't happen overnight, but it will happen. AMD has neither the tech nor the resources to dethrone Nvidia - and 3rd place will hasten their departure from GPUs and a focus on CPUs - or may hasten a merger...
lucam - Tuesday, June 23, 2020 - link
AMD may still be present in the Mac Pro. For other Macs it's gonna be Imagination...
Deicidium369 - Tuesday, June 23, 2020 - link
Not only demonstrated, but shipping for several years, including into the Nintendo Switch - Tegra. Jetson is more for machine vision and machine learning - and also contains ARM cores and NV graphics.
Quantumz0d - Monday, June 22, 2020 - link
Did you miss an /s or is it for real lol
boozed - Monday, June 22, 2020 - link
"Either they go full on IA-64 and risk it all"IA-64 is EOL.
Kurosaki - Monday, June 22, 2020 - link
"Full on IA-64" as another atempt at a new ISA if I didn't make myself clear. They have to get the balls Apple have to succeed. There is no turn back to theese instruction sets from 1998, all of witch is done is modern as hell-approach, get used to it. Like Apple. Now, I wrote it out EILI5-Styleextide - Tuesday, June 23, 2020 - link
Lol, dude, you really don't know what you're talking about. BTW, ARM is from 1985.
> I predict that most new PCs sold five years from now will be ARM-based.
Lol -- absolutely not. Sure, Apple machines will be, but the vast majority of computers will still be x86. x86 has some complexity with the instruction encoding, but honestly, once you are beyond the front end there is not much difference between an ARM CPU and an x86 one.
Kurosaki - Wednesday, June 24, 2020 - link
It's in the front end where we have all the problems. The A14 is an 8-wide design. Intel is far behind in any case. I see there are some more responses regarding the ISAs just down below. This isn't a competition in who's right or wrong, but I'll gladly eat my hat if by then no company has released a complete, Windows-compatible platform that kicks ass. All hail the new PC master race, ARM. Took them a while, but we are there now, both hardware- and software-wise. We don't have the same troubles with compiling as we had 15 years ago; a lot has happened, and you can see from the companies already on the Mac hype train that there is good and functioning software for Apple silicon. The ARM X1 seems to fare quite well in the same league - worse than the A14, better than anything yet released on x86.
tipoo - Thursday, June 25, 2020 - link
How do we know the A14's design width?
Kurosaki - Thursday, June 25, 2020 - link
Oh, shit, that isn't official yet? Never mind me. You don't know where I can find an edit function in this section?
Kurosaki - Thursday, June 25, 2020 - link
No, but I just presume they will when looking forward. The X1 could be seen as an 8-wide CPU in many use cases:
"The fetch bandwidth out of the L1I has been bumped up 25% from 4 to 5 instructions with a corresponding increase in the decoder bandwidth, and the fetch and rename bandwidth out of the Mop-cache has seen a 33% increase from 6 to 8 instructions per cycle. In effect, the core can act as a 8-wide machine as long as it's hitting the Mop cache."
The A12 and A13 have been known for being extremely wide cores - the latter 7-wide - so don't expect them to lag behind in this sense. Only time will tell I guess!
Deicidium369 - Tuesday, June 23, 2020 - link
Was a niche product created for and with HP. Was NEVER going to be mainstream.
Korguz - Tuesday, June 23, 2020 - link
" Intel's product marketing and industry engagement efforts were substantial and achieved design wins with the majority of enterprise server OEM's including those based on RISC processors at the time, industry analysts predicted that IA-64 would dominate in servers, workstations, and high-end desktops, and eventually supplant RISC and complex instruction set computing (CISC) architectures for all general-purpose applications "source : https://en.wikipedia.org/wiki/IA-64
" Beyond Kittson, there will be no more chips coming from the Itanium family, an Intel spokesman said in an email. That ends a tumultuous, 16-year journey for Itanium, which Intel once envisioned as a replacement for x86 chips in 64-bit PCs and servers." " Intel hedged its bets and planned for Itanium 64—also called IA-64—to ultimately go down the stack from servers to PCs. But that never happened, and the market shifted quickly after AMD introduced the first 64-bit x86 server chips in 2003. That gave AMD a competitive edge over Intel, which still was offering 32-bit x86 chips. " " The transition disrupted Intel’s vision of Itanium as an architecture of the future for 64-bit servers and PCs. Instead, x86 chips started moving up the stack into more powerful servers. "
source : https://www.pcworld.com/article/3196080/intels-ita...
" When Intel launched its first Itanium processor in 2001, it had very high hopes: the 64-bit chip was supposed to do nothing less than kill off the x86 architecture that had dominated PCs for over two decades. Things didn't quite pan out that way, however, and Intel is officially calling it quits. " " The news marks the quiet end to a tumultuous saga. Itanium was supposed to represent a clean break from x86 that put Intel firmly into the 64-bit era. It was first intended for high-end servers and workstations, but it was eventually supposed to find its way into home PCs. Needless to say, that's not how it worked out. "
source : https://www.engadget.com/2017-05-13-intel-ships-la...
It wasn't created just for HP, with HP; it WAS supposed to replace x86. Sorry Deicidium369, but you are not correct.
BenSkywalker - Monday, June 22, 2020 - link
Right now Intel chips still easily outperform ARM at the higher tiers; yes, ARM is far more power efficient. The reason ARM has gotten so much closer is that Intel has been stuck for a long time on one node, but let's pretend they stay stuck and don't do anything about it. Five years from now ARM *might* be able to match their performance if both are running native code, but there is ZERO chance ARM would be anywhere close under emulation, which would be required for the PC platform to migrate.
To be clear, I'd love to see the PC market evolve into something where we could see more competition on the CPU side, but we already had decades where x86 was clearly inferior (MIPS, SPARC, Alpha, etc.) and still had an iron grip on the PC market.
The GPU side is even less likely. IT (Imagination) has been making graphics chips for a very long time now. They've always done very well in lower-power situations, and they've never been able to scale to the high end. We've seen Intel throw billions at the graphics side and utterly fail too; everyone else that dominated the market in the early days is gone now. We didn't ask for a duopoly in the graphics segment; we got there because no one else could compete.
Kurosaki - Monday, June 22, 2020 - link
I seriously believe we are into a paradigm shift here. An older-gen 3W ARM chip outperforms a 125W Intel chip single-core wise. Intel is going to have to prove their point with 7nm or whatever. AMD is already on the same node and the performance isn't stellar there either if we compare to the performance of ARM in general. Graphics-wise, it might be as bad, I'm afraid, or not afraid. Would be fun to be able to play some VR games with decent performance soon...
mattbe - Monday, June 22, 2020 - link
>AMD is already on the same node and the performance isn't stellar there either if we compare to the performance of ARM in general. Graphics-wise, it might be as bad, I'm afraid, or not afraid. Would be fun to be able to play some VR games with decent performance soon...
This comment is so dumb I can't even tell if you are being serious or trolling. You seem to have this weird fantasy that ARM is much better at EVERYTHING while ignoring context and what's actually out there.
For example, the current AMD RDNA 7nm chips are not even close to efficient. Nvidia's 12nm offerings are just as, if not more, efficient. This is partly due to the clocks being pushed too high, past the power-efficiency sweet spot, to get extra performance. Remember, this is hardware not designed for power saving.
I also find it hilarious that you think AMD and Nvidia will somehow be "replaced" in 4 years. There is nothing to suggest that Apple is able to develop anything on the GPU side that comes close. Remember, Nvidia is no Intel; they are a market leader and are still constantly innovating. The gpu
mattbe - Monday, June 22, 2020 - link
>An older-gen 3W ARM chip outperforms a 125W Intel chip single-core wise
Did you actually look at the graph in this very article, or are you just making things up?
vladx - Tuesday, June 23, 2020 - link
He's a delusional Apple fanboy; only someone like that could make such a laughable claim.
Kurosaki - Tuesday, June 23, 2020 - link
Have not owned an Apple product my whole life. What I'm seriously excited about is the raw performance at extremely low energy levels. :)
Look at what the ARM X1 is doing: slap it on a motherboard and turn the frequencies up a notch. Cool it with a Noctua good for 200W.
I'm just tired of the stagnation in the PC industry, where higher prices for the same performance have become very common. ARM and Imagination are very much welcome to fix this; that Apple is first out in this shift, good for them. I welcome the new PC master race, developed for mobile.
The Apple GPU seems to deliver 5.7 fps per watt in 3DMark according to this site; that's friggin' amazing if we could get it to scale with the power envelope. Would gladly buy a 150W GPU with that type of performance.
TEAMSWITCHER - Tuesday, June 23, 2020 - link
Your thinking is stuck in the "Motherboard" paradigm... I think Apple is continuing to move away from these dinosaurs. Apple's best selling professional computers are the MacBooks. If they can take the performance crown among all laptops, it's going to be very disruptive for the PC world.
vladx - Tuesday, June 23, 2020 - link
Imagine supporting fewer hardware choices, just disgraceful.
Bp_968 - Tuesday, June 23, 2020 - link
Except AMD and Intel are hitting those performance levels by running at 4+ GHz on all cores. As Intel so kindly showed us over a decade ago, you don't get good performance/watt by chasing GHz. Yet we also see that most of the code people write for a CPU doesn't scale well past 4-6 cores, so you need high clock speeds and IPC to run many programs well on a desktop. Then we have the multitasking conundrum, not something your typical ARM SoC has to deal with.
I expect these ARM SoC designs to be excellent at low-power and single-task performance, with solid design tricks to boost speeds on a couple of cores for running a game, etc. But I don't expect them to manage heavy-hitting CPU apps that hoover up all the available cores - compression/decompression, crypto, animation, video transcoding, etc. At least not with current designs, or while maintaining those power levels.
But in the desktop world or the gaming world very few of us are complaining about a lack of CPU performance. Doubling CPU performance would give most users very little. While doubling GPU performance is hugely beneficial. A major architecture change would also likely reap more utility long term, like some unified memory controller between the dGPU and the CPU.
BenSkywalker - Tuesday, June 23, 2020 - link
If you look at the last time one of the desktop GPU makers actually gave mobile a shot, the Tegra X1, they eviscerated the top mobile part; it was years before the mobile market caught up.
On the CPU side it's a stretch to think ARM can actually compete; on the GPU side it's laughable.
iphonebestgamephone - Wednesday, June 24, 2020 - link
Didn't the iPad Pro with the A9X beat it a few months later?
Kurosaki - Thursday, June 25, 2020 - link
Yeah, the X1 was really bandwidth-starved and not all that they seem to have portrayed.
Deicidium369 - Tuesday, June 23, 2020 - link
No need to devolve into calling people fanboys. Take that to WCCFtech or Tom's.
andrewaggb - Tuesday, June 23, 2020 - link
Pretty sure they're just making stuff up.
A 10900K has 20MB of cache, 10c/20t @ 5.3GHz max turbo. Sure it's 125W, but that's for the whole thing. Similarly, the 3950X is 72MB of cache, 16c/32t and 105W TDP.
Even if single-core speeds are comparable, the number of cores, threads, clocks, cache, etc. aren't comparable at all. The processor bus, memory bus, PCIe, etc. are also all way different.
smalM - Thursday, June 25, 2020 - link
An i9-10900K is 225W @ 4.9GHz all cores. 125W is the pretty much meaningless TDP.
Kurosaki - Thursday, June 25, 2020 - link
20 megs of cache on 10 cores equals 2 megs per core; that's sweet...
AnandTech: "The amount of SRAM that Apple puts on the A13 is staggering, especially on the CPU side: We're seeing 8MB on the (2) big cores, 4MB on the small cores, and 16MB on the SLC which can serve all IP blocks on the chip."
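Summing the figures in that quote: 8 + 4 + 16 = 28MB of on-die SRAM on the A13, versus the 10900K's 20MB of cache - more SRAM than the desktop chip, on a phone SoC's power budget. (The two pools aren't directly comparable in role or speed, but the size comparison is the point being made here.)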
voicequal - Tuesday, June 23, 2020 - link
A13 isn't an ARM design. It uses the ARM ISA, but it's Apple's own design. There are a ton of mediocre, memory-bandwidth-starved ARM cores out there with poor voltage regulation and coarse frequency adjustment that burn hot when you push them for any length of time.
Deicidium369 - Tuesday, June 23, 2020 - link
It is an ARM design, not from ARM itself - but it is still an ARM design.
bananaforscale - Tuesday, June 23, 2020 - link
"An older gen ARM chip of 3W outperforms a 125W Intel chip single-core-wise."
Got a source?
Kurosaki - Thursday, June 25, 2020 - link
https://gadgetversus.com/processor/intel-core-i9-9...
The A13 only has 2 large cores; in single-threaded Geekbench 5, the A13 comes out on top, at half the frequency and with a power draw that's just amazing.
Let's throw in a couple more of those larger cores and see what they can do with some serious cooling.
Kurosaki - Thursday, June 25, 2020 - link
Like 8 of the big, newly revised cores, better cooling capabilities and higher clocks? https://www.theverge.com/2020/4/23/21232441/apple-...
vladx - Tuesday, June 23, 2020 - link
"I predict that most of new PC's sold from five years from now will be ARM-based"
Lmao, you do know Mac PCs are only around 5% of the whole PC market? Sorry to burst your bubble, but that has no chance of happening.
PeterCollier - Tuesday, June 23, 2020 - link
What's a MacPC?
Strom- - Tuesday, June 23, 2020 - link
It's a personal computer sold by Apple. You can read more about it on Wikipedia: https://en.wikipedia.org/wiki/Macintosh
Kurosaki - Tuesday, June 23, 2020 - link
Where did I mention Apple? I meant Windows PCs. Some will pick up the X1 license and develop it further. Smack it on a mobo and voila, change of platform!
Deicidium369 - Tuesday, June 23, 2020 - link
And overnight the vast majority of all software will magically become ARM binaries.
There is a place for ARM - and it's at 5% market share at most.
Kurosaki - Wednesday, June 24, 2020 - link
And that is where I hope you are wrong! :)
Bp_968 - Tuesday, June 23, 2020 - link
Lol. 5 years? Keep a note of this. In 5 years Apple will actually have lost PC market share. Trust me, this is about Apple owning everything in their stack, not about out-CPUing AMD or Intel or out-GPUing AMD or Nvidia.
Deicidium369 - Tuesday, June 23, 2020 - link
Right, vertical integration driven. Would expect Apple is already looking to purchase a display manufacturer.
Kurosaki - Wednesday, June 24, 2020 - link
Not talking about Apple; I'm talking about you, me, HP and everyone else putting little ARM chips in our beast PCs instead of Intel or AMD. This is going to pick up pace. If Apple is this sure about a change of platform, they ain't doing it for a measly 20% performance increase.
xenol - Tuesday, June 23, 2020 - link
The ISA doesn't determine the performance of the processor. It's the implementation that does.
Also, the ARM ISA itself is over 30 years old.
Deicidium369 - Tuesday, June 23, 2020 - link
Very little remains from Acorn RISC Machines.
JakeHarvey - Wednesday, June 24, 2020 - link
What the heck are you on about? Do you seriously think Apple is going to license out the tech and design that makes their ARM CPUs so great in their iPads, iPhones etc. to other companies? Have you seen how thoroughly Apple outperforms all the other ARM chips in the smartphone space? Qualcomm ARM chips look like garbage compared to Apple's.
Unless other companies can quickly develop ARM designs that catch up to what Apple is doing and where they are going, your suggestion that everyone is going to be putting ARM CPUs in their desktops in a few years makes no sense.
Apple obviously does NOT just use off the shelf ARM CPUs. It's going to continue to keep the in-house design + tech which makes their ARM CPUs so great to THEMSELVES and ONLY use them in their OWN devices.
I don't think any other company that supplies for the rest of the computer market will catch up to Apple in ARM CPU performance anytime soon.
Quantumz0d - Monday, June 22, 2020 - link
"as they have accomplished some amazing things with their A-series silicon thus far""the PC market is about to become fractured once again."
For the first line, Can I know what an iPhone can do over my Android phone ? I ask two things here - One, does it allow the usage of filesytem by user without any fence or such like installing apps outside. Nope. Second being, is it stopping anything for even direct similar application performance comparisons on OnePlus 8 Pro vs an iPhone 11 Pro max ? None so far. Except the uber insane SPEC scores.
And for the second line, how is PC getting fractured here lol. Is M$ going to bet big on Apple A series processors and make Surface on them, or is that ARM has taken over the Server/Data Center market ? The numbers show AMD is rising in their Server marketshare chipping away Intel's which sits over 95% while ARM is at 4.5% so the 0.5% is from ARM lol.
Qualcomm abandoned their Centriq processor and its whole IP, Samsung is shutting down its own custom ARM cores, and Huawei got slapped by the US for its CCP bootlicking. Ampere is still about to show its potential, and AWS Graviton 2, according to STH and Phoronix, is no match for an EPYC 7742 or Xeon - and AMD trumps Xeon by a huge margin.
Now coming to notebook and desktop OS market share from NetMarketShare: Windows is at 86.7%, Mac at 9.7%, Linux at 3.x% and Chrome OS at 0.4% (lol). By those numbers alone, Mac is not even used by many people across the world. How is the PC market, which is mostly Windows, going to be fractured "once again"? I would like to ask when it got fractured the first time, such that it's becoming fractured again now.
Now for the x86 innovation: AMD is spearheading new Zen 3 processors on TSMC 7NP with an improved cache layout; then we have Intel's 10nm or whatever 7nm, along with DDR5 and Gen 5 coming in 2022, which is supposedly bringing massive improvements in memory alone; and MCM is getting more traction on the Intel side as well, with rumors of RKL having a Foveros-based 10nm die for the GPU and 14nm++ for the CPU with improved IPC.
And the biggest of all: choice. DIY systems or even DC machines - so many sockets, so many OEMs, so many vendors, plus Linux OS support with a massive stack of software, programming and gaming, along with next-gen consoles and AI. And in DIY we have the ability to choose a sound card, memory, mobo, LAN, GPU, CPU, cooling, OS, display, software and a fucking shitton of other things, along with affordability. How is this going to change for the worse for x86? ARM is always custom BS, no choice, closed like a drum, just like iPhones and locked-bootloader phones with so little wiggle room. Don't even start on the BGA BS that Apple does to their Macs and their users, with everything fully soldered and locked down, from super expensive RAM and storage to the damned basic repairs like LCD cables (Louis Rossmann).
So how is this going to FRACTURE, lol? SPEC scores from Andrei, I guess. The uber fast Kryptonian-class SPEC score of the A series, so fast that it moves at the speed of light and doesn't experience time, like a photon - but in reality it's nothing but a benchmark.
Quantumz0d - Monday, June 22, 2020 - link
Shit, that 4.5% is AMD, not ARM - massive typo - from Q4 2019 data of Intel and AMD reports.
anonomouse - Monday, June 22, 2020 - link
Ah, this guy back again.
quadra - Monday, June 22, 2020 - link
Quantumz0d said: "One, does it allow the user to use the filesystem without any fence, such as installing apps from outside? Nope."
Yup. Are you up to date? Since iOS 13, there is a Files browser with hierarchical files and folders. Any application can read them. Plug in an external USB drive and the files are visible as-is, and any iOS application can list them. It is revolutionizing how I use my iPad. (I still get most work done on a laptop, but that is changing rapidly thanks to iOS 13.)
iOS 13 also allows connection to SMB shares. So I have turned on file sharing on my computer, connected to it from my iPad, used the iPad's native Files app to browse the files on my computer, opened them in a compatible iPad app (they were photos), edited them, and transferred them back to the computer. Same goes for iPhone.
You better keep up, because that was back in iOS 13, and Apple just announced iOS 14 today.
sonny73n - Tuesday, June 23, 2020 - link
Took over a decade for a POS OS to implement a Files browser with hierarchical files and folders, so Apple sheep can brag about it. But hey, you still don't get to see your own pictures that you transferred from your PC. You can see them in the Photos app but still have no control over them, such as move, rename, delete or view their metadata. Isn't it great?
Brag all you can now, because the day your government can no longer act like a gangster, affordable products from around the world will flood your market and consumers will push your Apple god aside.
vladx - Tuesday, June 23, 2020 - link
But could 3rd-party file explorer apps have the same access to the file system as the Apple file browser? Doubt it.
Bp_968 - Tuesday, June 23, 2020 - link
I can't help but get a laugh from the fact that someone described being able to access the actual file system as "revolutionary". But then I see him mention it as "new" in iOS 13 (THIRTEEN!!!) and I cry a little.
Apple does nothing for *you*; Apple is only concerned about Apple. I could think of dozens of features Apple still refuses to put on any of their devices because doing so might earn them a few pennies less or give the user some semblance of choice.
Apple isn't switching to ARM to give you guys performance or lower power consumption. They are doing it so *they* are the supplier and so *they* own and rule the entire stack.
I can still play games on the PC I purchased decades ago. Apple removed the ability for 32-bit apps to function at all on their iOS devices. Virtualization exists; it's used extensively and quite effectively all over the industry. They *could* have allowed those apps to run in sandboxed virtual machines, letting me still use the software I purchased. But nope. That didn't benefit them; it didn't earn them more hardware sales or new software fees from the App Store. So they just outright removed the apps and the ability to use them.
On my PC I have my older apps backed up; I can install them myself outside of some walled store. Support for x86 machines and software will be gone (along with the ability to run that software) by 2025-2027.
star-affinity - Tuesday, June 23, 2020 - link
At the same time, there's been access to files stored on the phone (and the cloud) via services such as Dropbox, Google Drive and Box for a long time on iOS. There's also been the app Documents: https://apps.apple.com/us/app/documents-by-readdle...
I haven't suffered much from the lack of a file browser, but one can sure wonder why it took Apple so long to add it. It's there now, at least. Apple sometimes does take its time adding some iOS features, but I think the OS has often offered most features I've needed via apps - for me it's geeky enough. :)
star-affinity - Tuesday, June 23, 2020 - link
@BP_968
Apple sure is progressive with their OS and deprecating legacy stuff. But I'm not sure it's all for the worse, as you seem to imply. I agree Microsoft is doing a good job with backwards compatibility in Windows, but many also seem to think they will eventually need to start somewhat anew with Windows too.
MarcusMo - Tuesday, June 23, 2020 - link
Ah, the inverse correlation between randomly inserted "lol"s and valuable content continues to hold true.
vladx - Tuesday, June 23, 2020 - link
Exactly, it's unbelievably dumb how some people wish for a more closed hardware space with little to no choice.
Personally I don't worry about it, because there's no chance that the translation layer offers enough performance for the decades-long library of x86 software.
Oxford Guy - Monday, June 22, 2020 - link
"the ramifications of Apple building the underlying hardware down to the SoC means that they can have the OS make full use of any special features that Apple bakes into their A-series SoCs. Idle power, ISPs, video encode/decode blocks, and neural networking inference are all subjects that are potentially on the table here."To paraphrase a certain famous programmer: to run the spyware faster. The nanosecond time stamping from APFS and other innovations do create overhead.
Apple has been steadily adding metadata/spyware to the Mac and even special chips like T2. It's not content with Intel and AMD being the only ones with black boxes in place.
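For anyone curious what that nanosecond timestamping looks like in practice, here is a minimal C sketch (the file path is a hypothetical placeholder) that prints a file's modification time the way macOS exposes it; on APFS the nanosecond field is actually populated, where HFS+ only kept one-second resolution:

#include <stdio.h>
#include <sys/stat.h>

int main(void) {
    struct stat st;
    /* "example.txt" is a hypothetical path used for illustration */
    if (stat("example.txt", &st) != 0) {
        perror("stat");
        return 1;
    }
    /* On macOS the BSD stat structure exposes st_mtimespec:
       seconds plus a nanosecond remainder */
    printf("mtime: %lld.%09ld\n",
           (long long)st.st_mtimespec.tv_sec,
           st.st_mtimespec.tv_nsec);
    return 0;
}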
Zerrohero - Tuesday, June 23, 2020 - link
What's your take on 5G, vaccinations and COVID-19?
back2future - Monday, June 22, 2020 - link
I would say it's not only SoCs but also peripherals - Thunderbolt, too, found its cost barrier in the change from copper to fibre for consumer-level hardware. The question is: what are we using the improved computing power for? Increased resolution for the same pictures we've already seen is not really a satisfying improvement for the materials and energy invested on mass markets, especially with the challenges in sight growing.
Oxford Guy - Monday, June 22, 2020 - link
People think everything is about faster performance. Bring on the benchmarks.
Meanwhile, Apple has a recent history of dramatically slowing performance, such as with its awful APFS filesystem, which is horrible with mechanical drives and slow with SSDs.
Apple is not in business to grant wishes to geeks, by coming out of a cloud with a wand to deliver faster performance. It will provide only as much performance as it feels necessary to provide and the vast majority of the focus will be on wringing every last bit of data/IP out of its users. That is the new model. You're not the customer. You are the fodder.
Zerrohero - Tuesday, June 23, 2020 - link
Nice, but Apple isn't a data company.
I think you meant Google and their Android platform, whose main purpose is just to collect data for their ad business.
Oxford Guy - Wednesday, June 24, 2020 - link
"Nice, but Apple isn’t a data company."Wrong.
techconc - Tuesday, June 23, 2020 - link
Seems like rather baseless comments you are making. APFS has been well received and is a great match for the needs of Apple's products. Also, while it is a modern file system designed to be optimized for SSDs, I can assure you that it works perfectly well with spinning disks also.
Oxford Guy - Wednesday, June 24, 2020 - link
"Seems like rather baseless comments you are making. APFS has been well received and is a great match for the needs of Apple's products. Also, while it is a modern file system designed to be optimized for SSDs, I can assure you that it works perfectly well with spinning disks also."
Rubbish.
Look at the actual tests. It's substantially slower with SSDs and ridiculously slow with mechanical hard disks. It is, though, a dream for "forensics".
techconc - Monday, June 29, 2020 - link
There are some tests, such as the enumeration of an entire disk, that will be slower on spinning disks with APFS because of the way metadata is stored. On HFS, all metadata for all files is blocked together; on APFS, the metadata is stored with the actual data files themselves. Yes, for this kind of test there would be many more seeks, etc., and likewise slower performance. In practice, all of that data is indexed via Spotlight and there is no real difference in actual routine usage.
GreenReaper - Monday, June 22, 2020 - link
Interesting timing, given that the board of Arm's Chinese... arm is currently in a battle with its own chief executive: https://www.telegraph.co.uk/technology/2020/06/21/...
I wonder how long before the current US administration picks up on that and beats Apple over the head with it (no matter that Arm is really a UK company bought by a Japanese investment fund and hived out to the Saudis).
Deicidium369 - Tuesday, June 23, 2020 - link
Tim Apple has it under control. Besides, we only need to deal with the current infestation until January 20, 2021, 11:59 AM.
brucethemoose - Monday, June 22, 2020 - link
"Curiously, the company has carefully avoided using the word “Arm” anywhere in their announcement"
Surprise! It's actually an array of 64 in-order MIPS cores per chiplet.
"Arm is on the precipice of announcing the Arm v9 ISA, which will bring several notable additions to the ISA such as Scalable Vector Extension 2 (SVE2)."
The dev kit is clearly ARMv8 though. I agree that this is critically important, as SVE2 as a guaranteed baseline would be *huge*.
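To spell out why a guaranteed SVE2 baseline would matter: SVE code is vector-length-agnostic, so one binary automatically uses whatever vector width the silicon provides. A minimal C sketch with the ACLE intrinsics, assuming an SVE-capable compiler and target (e.g. built with -march=armv8-a+sve):

#include <stdint.h>
#include <arm_sve.h>

/* Vector-length-agnostic add: the same binary runs on 128-bit
   or 512-bit vector hardware without recompilation. */
void vec_add(const float *a, const float *b, float *c, int64_t n) {
    for (int64_t i = 0; i < n; i += svcntw()) {    /* lanes per vector */
        svbool_t pg = svwhilelt_b32_s64(i, n);     /* predicate masks the tail */
        svfloat32_t va = svld1_f32(pg, a + i);
        svfloat32_t vb = svld1_f32(pg, b + i);
        svst1_f32(pg, c + i, svadd_f32_x(pg, va, vb));
    }
}

With NEON, by contrast, the 128-bit width is baked into the binary, which is why a mandated SVE2 baseline would change what developers can assume.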
Billy Tallis - Tuesday, June 23, 2020 - link
I'm recalling how the first few Intel Macs were 32-bit only, replaced later in the same year by 64-bit Core 2 processors. Apple just missed being able to establish a 64-bit baseline back then, and those 32-bit x86 Mac users were the first ones left out in the cold by rising system requirements. This might be one of the best reasons not to be a first adopter of ARM Macs.
Morky - Tuesday, June 23, 2020 - link
Good memory and insight!
blppt - Tuesday, June 23, 2020 - link
Actually, the kernel itself could be run in 32-bit or 64-bit mode, which allowed those first Macs to remain useful up to and including Lion, by which point those first Macs would be nearing the end of their useful lives anyway, given they were limited in hardware power.
Obviously, any 64-bit-only apps wouldn't run on a Mac with a 32-bit-only processor (like the Core Solo/Duo), but amusingly, thanks to the way macOS is designed, you could run 64-bit apps with a 32-bit kernel (if you had a 64-bit processor like a Core 2 or better). Can't do that on Windows.
Billy Tallis - Tuesday, June 23, 2020 - link
I waited for the 64-bit processors before buying my iMac, so I ran it for years with the 32-bit firmware/kernel and 64-bit applications. By the time the GPU started dying, Linux had gained support for booting a 64-bit kernel from 32-bit EFI (rather than relying on the Bootcamp BIOS CSM), and I ran it as a mostly-headless Linux machine for a while longer.
Ladis - Wednesday, August 5, 2020 - link
In 16-bit Windows 3.11 you could run 32-bit apps using Microsoft's Win32s extension (mainly meaning a memory model not split into 64KB segments). Of course, it required a 32-bit CPU (a 386 - but not the first version, with those bugs).
psychobriggsy - Tuesday, June 23, 2020 - link
Gotta wonder if that's why it's called "Apple Silicon" in the build process. It's very generic and doesn't commit to an ISA - although it's confirmed it's still ARM as expected, not RISC-V or something exotic like that.
I wonder if the Universal 2 binary currently has: (1) Intel x86-64, (2) ARMv8.2 and (3) ARMv9 in it.
(2) will be dropped prior to release. Maybe (3) will come in a couple of months.
Apple was early to 64-bit ARMv8, so I don't see why they wouldn't be early to ARMv9 as well.
But let's be honest, if ARMv9 silicon can run ARMv8.2 binaries, then there's no issue right now. Apple can add ARMv9 to universal binary when they need to and not before.
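As an aside on how those universal binaries work mechanically: each slice is compiled separately, so per-ISA code paths hang off the compiler's predefined architecture macros. A minimal C sketch (the build commands in the comments are the standard clang/lipo ones on macOS):

#include <stdio.h>

/* Build a fat binary:  clang -arch x86_64 -arch arm64 fat.c -o fat
   Inspect the slices:  lipo -info fat */
static const char *arch_name(void) {
#if defined(__arm64__) || defined(__aarch64__)
    return "arm64";
#elif defined(__x86_64__)
    return "x86_64";
#else
    return "unknown";
#endif
}

int main(void) {
    /* The loader picks the matching slice, so this prints the
       architecture of the slice actually running. */
    printf("running the %s slice\n", arch_name());
    return 0;
}

Adding a third ISA revision, as speculated above, would just mean one more slice and one more -arch flag.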
Deicidium369 - Tuesday, June 23, 2020 - link
It is ARM-derived - but Apple's ARM-derived SoCs are so far ahead of any other ARM derivative that it warrants being called Apple Silicon.
techconc - Tuesday, June 23, 2020 - link
Nothing more than a recompile would be needed to leverage ARMv9, though. Even SVE2 is handled automatically through Apple's Accelerate framework.
meacupla - Monday, June 22, 2020 - link
Sounds like PowerPC 2.0.
What I don't understand is why Apple refuses to use AMD CPUs.
When IBM/Motorola dropped the ball with PPC, they ditched them and went for x86 Intel.
Then, when Intel dropped the ball, they decided to switch... again.
quadra - Monday, June 22, 2020 - link
If Apple just wanted out of Intel, they would have gone AMD. But the difference with this CPU transition is that this time, unlike PowerPC to Intel, Apple has achieved mastery of designing and building SoCs that are fast, power-efficient and - unlike what AMD has - fully customized for Apple's needs. From integration with macOS/iOS to security/privacy and many other things, Apple can design no-compromise processors that do everything they need and nothing they don't. ARM can only offer what ARM designed for a more general market.
quadra - Monday, June 22, 2020 - link
Sorry, can't type... the last sentence above should say AMD can only offer what AMD designed for a more general market.
blppt - Monday, June 22, 2020 - link
It's actually the exact opposite. Apple left PPC because IBM couldn't deliver a G5 with power/heat levels suitable for notebooks.
This would seem to be a move for more power efficiency and, of course, more control over their products. It makes sense; the iPhone/iPad/whatever isn't going anywhere, and they are all ARM, so really, why support two ecosystems when you can get it done with one?
psychobriggsy - Tuesday, June 23, 2020 - link
This isn't about leaving Intel for anyone else. If it was, AMD would be a good choice, even this year. The Mac Pro could have been a 64-core EPYC machine with PCIe 4, for example. Renoir could have brought 8C to Apple's 13" designs instead of the flawed 4C Ice Lake.
It's about having the full platform, not just the CPU cores, available on the desktop.
Fabbing a ~150mm^2 5nm Mac ARM SoC will save Apple money in the long term compared with buying from either AMD or Intel. And they include all their accelerators as well, bringing them to the Mac for the first time. They can have far deeper integration between their OS frameworks and their hardware as a result.
Don't expect savings to affect end-user pricing.
Deicidium369 - Tuesday, June 23, 2020 - link
Ice Lake is not flawed - it works perfectly, and only a moron thinks that 8 cores in a lightweight machine is an advantage - all the while the GPU in that APU is getting owned hardcore by the follow-up to Ice Lake, Tiger Lake.
andykins - Tuesday, June 23, 2020 - link
Think we found the Intel shareholder.
tipoo - Wednesday, June 24, 2020 - link
Almost like Renoir spent its die area on twice as many cores for nearly twice the aggregate performance. Tiger Lake will have the per-core advantage and IGP, though Intel IGP 3DMark scores haven't translated that well to real titles in the past. But for multicore workloads Renoir is still going to be the champ; some "morons" actually use their cores for work.
techconc - Tuesday, June 23, 2020 - link
Going with AMD wouldn't solve any of their current problems. People need to understand that Apple has also mentioned the importance of the SoC and the types of capabilities it will bring beyond basic CPU and GPU components.
Deicidium369 - Tuesday, June 23, 2020 - link
Well - after the Opteron fiasco, and Intel Core dropping and putting AMD out to pasture for close to 12 years with no competitive products... it makes sense. Same reason there are very few AMD systems in the data center.
Apple is interested in vertical integration - not pro- or anti-Intel, just pro-Apple.
drwho9437 - Monday, June 22, 2020 - link
I don't think this is going to work out well. Of course, if all you want is a web browser or "creative" applications, then okay. But many, many professionals use Windows, and the software legacy of x86 there is just too great. Say goodbye to any technical engineering or CAD tool use on Apple. Software dev will be taken care of, but I think this is a losing move. If it were a winning move, we wouldn't still be using Ax/Bx/Cx. Apple does have the best shot at doing it given their walled-garden world, but they are actually mistaken if they think their hardship in the PowerPC days was just due to not having control over their own supply chain. Intel has stumbled, but this is their bread and butter, while Apple's bread and butter is selling UI and industrial design, not computing power. But like in the good old days, the fans will just say how awesome it is, I suspect. For those of you old enough to remember.
kusix - Monday, June 22, 2020 - link
Just as an example, Maya was primarily an SGI/IRIX program but moved to x86/PowerPC when SGI was losing ground. If Apple's workstation-class chips are as powerful as everyone hopes, I don't see a reason for CAD/engineering software to release Apple/ARM builds.
I'm not too hopeful, since Apple still holds such low market share, but seeing the entire industry pushed to shed decades of legacy x86 support would be amazing.
ThreeDee912 - Tuesday, June 23, 2020 - link
Some software like AutoCAD, Maya, Fusion, Unity, Unreal Engine, Resolve, Redshift, etc. ported their rendering engines to Metal recently, so they should run half-decently on ARM.
It's too bad SolidWorks and Inventor are still stuck as Windows-only, though.
kusix - Tuesday, June 23, 2020 - link
Ah, yep, I meant I don't see a reason for them NOT to release ARM builds.
Deicidium369 - Tuesday, June 23, 2020 - link
Apple is a small but reliable market - would expect ARM-native builds.
vladx - Tuesday, June 23, 2020 - link
Agreed, there's no chance that Rosetta 2 will be fast enough to run the huge library of x86 legacy software, so this news will have no impact on the majority of the PC world.
Legacy software by definition doesn't need the latest and greatest. Install-time binary translation may not result in perfect performance, but as long as Apple's CPU cores are decent, the apps will still run fine. And many, many applications actually spend most of their time in system functions (that's why the games ran well - it's in the OS, Metal and GPU drivers most of the time), and those are native.
The issue will be modern x86-64 Mac software that makes heavy use of assembler to speed up core functions. On ARM that will result in the slowest default codepath running. AVX will be the big issue (Rosetta 2 can do SSE, it appears, and SSE2Neon is already migrated). And that's why the Mac Pros will be the final devices to be migrated. Maybe 16" MacBook Pros will be available in Intel for a while alongside ARM versions as well, for this reason.
Or maybe Apple will simply strongarm the software vendor to put some backbone into their port.
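To make the assembler point concrete: hand-vectorized code usually ships per-ISA paths chosen at compile time, roughly like the illustrative C sketch below (not any particular vendor's code). Anything that only has an SSE/AVX path and no NEON path drops to the scalar fallback when built for ARM:

#include <stddef.h>

#if defined(__SSE__)
  #include <xmmintrin.h>
#elif defined(__ARM_NEON)
  #include <arm_neon.h>
#endif

void add_arrays(const float *a, const float *b, float *out, size_t n) {
    size_t i = 0;
#if defined(__SSE__)
    for (; i + 4 <= n; i += 4)          /* 4-wide SSE path */
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
#elif defined(__ARM_NEON)
    for (; i + 4 <= n; i += 4)          /* 4-wide NEON path */
        vst1q_f32(out + i,
                  vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
#endif
    for (; i < n; i++)                  /* scalar fallback and tail */
        out[i] = a[i] + b[i];
}

Projects like SSE2Neon shortcut the porting work by mapping SSE intrinsic names onto NEON equivalents, which is presumably what the comment above is referring to.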
vladx - Tuesday, June 23, 2020 - link
"Or maybe Apple will simply strongarm the software vendor to put some backbone into their port."Strong arm PC software vendors when Macs are less than 10% of the PC market? Laughable
Deicidium369 - Tuesday, June 23, 2020 - link
Small but reliable market - outsized for their small market share.
GreenReaper - Tuesday, June 23, 2020 - link
They are a significantly greater proportion of certain sections of the PC market, such as artists/animators, designers, marketing, and hipster web developers.smalM - Thursday, June 25, 2020 - link
Yeah, those software vendors for sure care a lot about those millions and millions of $300 commodity PCs.
wrkingclass_hero - Monday, June 22, 2020 - link
Intel seems to be following the IBM blueprint quite nicely.
MrCommunistGen - Monday, June 22, 2020 - link
FWIW, Wikipedia seems to indicate that the Apple Lightning/Thunder cores are ARMv8.4 and Vortex/Tempest are ARMv8.3 - but then they don't list what the ISA revisions bring. If anyone knows, I'm really curious whether there's anything interesting in there. I know that v8.2 brought some nifty changes.
smalM - Thursday, June 25, 2020 - link
https://en.wikipedia.org/wiki/AArch64
The A10 wasn't fully ARMv8.1, but after that each A-SoC generation did a full implementation of the next extension level.
cuxrzk53761 - Tuesday, June 23, 2020 - link
Everything I've read seems to paint the picture that, because of the superior instruction set, ARM is just better in every way when compared to x86: the iPad Pro compares favorably to MacBook Airs; a new supercomputer is out with ARM cores; etc., etc. They seem to suggest ARM has better perf/watt and scales just fine. What gives?
Did the entire industry not move to ARM solely because no one company had the power of vertical integration that Apple has? What am I missing here?
Billy Tallis - Tuesday, June 23, 2020 - link
The ARM architecture isn't much of an advantage. Aarch64 is a cleaner instruction set than x86_64 and the more relaxed memory model makes things a little bit easier for hardware designers (and harder for programmers), but those are minor factors.The important thing is that Apple hired a lot of chip design talent and gave them a lot of resources. They went with ARM architecture for the iPhone largely for historical reasons that can be traced back to the 1990s, but ARM wasn't and isn't the only instruction set architecture that's suitable for mobile processors. Apple's financial situation means their CPU cores tend to have much larger transistor budgets than ARM's own Cortex cores, and the concentration of CPU design talent at Apple means those transistors are well-spent. Apple processors stand out because they have more resources than other mobile processor designers, and a different target market from Intel's mainstream microarchitectures. (Intel has never been good at developing two CPU microarchitectures in parallel, which is why the Atom family always disappoints.)
The ARM supercomputer win can be mostly chalked up to the fact that instruction set matters little when you're being scored on the basis of vector arithmetic on highly-optimized code—in many ways, it's a derivative of Fujitsu's SPARC supercomputer processors. And minimizing the die area spent on traditional CPU core functionality means more room for more and wider SIMD units.
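A small illustration of the relaxed-memory-model point: in portable C11, the programmer has to state the ordering explicitly, as in this sketch. On x86's strong (TSO) model the release/acquire pair costs essentially nothing extra, while on ARM the compiler emits dedicated load-acquire/store-release instructions - the hardware-flexibility-for-programmer-effort trade described above:

#include <stdatomic.h>
#include <stdbool.h>

static int payload;                 /* plain data being published */
static atomic_bool ready = false;   /* flag guarding the payload */

/* Producer: write the data, then publish the flag with release ordering
   so the payload write cannot be reordered after the flag write. */
void produce(int v) {
    payload = v;
    atomic_store_explicit(&ready, true, memory_order_release);
}

/* Consumer: acquire ordering guarantees that once the flag is seen,
   the payload write is visible too. */
bool consume(int *out) {
    if (atomic_load_explicit(&ready, memory_order_acquire)) {
        *out = payload;
        return true;
    }
    return false;
}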
cuxrzk53761 - Tuesday, June 23, 2020 - link
Thank you! That clears up a lot of things.
So, if I understood correctly, the A-series chips are performing outstandingly simply because Apple is, under the surface, a very competitive chip-design company? Also owing to ARM licenses, of course.
It almost feels like Apple caught up to other well-known names in virtually no time. But I guess there must be a lot of patents and other IP floating around in the background.
MarcusMo - Tuesday, June 23, 2020 - link
Don't forget that a huge driver of this change is not the switch to the ARM ISA, but rather enabling tighter integration between hardware and software. So if you see Apple's move to ARM as discrediting x86, be a bit cautious, because there are a lot of other factors at play as well.
Morky - Tuesday, June 23, 2020 - link
This. I haven't seen a good analysis yet of what Apple can now implement in hardware for the Mac, since it's no longer relying on a generalized third-party processor. Industry-specific Mac processors, for instance.
techconc - Wednesday, June 24, 2020 - link
It's not so much a matter of discrediting x86 as it is discrediting Intel. Intel has been stagnant with regard to development in general, and they've fallen way behind in their manufacturing process.
Yes, there is some ISA advantage to going ARM, but mostly the advantage for Apple is being able to control their own roadmap.
xign - Tuesday, June 23, 2020 - link
The iPhone 4 came out in 2010 and was the first time Apple shipped custom silicon, which means they had been developing in-house silicon for at least a couple of years before that. So it's been more than a decade of experience already, plus part of the pedigree came from acquiring P.A. Semi, which was around for longer. So while it's impressive, it's not like they just started doing this yesterday. It's been a long time coming.
As for the whole ARM thing, simply put, it's just the most mature instruction set that you can actually license easily and make chips out of. Even if Apple wanted to make x86 chips, they wouldn't be able to, because Intel probably won't license it to them. The ubiquitous nature of ARM chips owes largely to the flexible licensing model, not just technical merits.
[ECHO] - Tuesday, June 23, 2020 - link
Pardon me, but can anyone tell me how to correctly read the SPECint/SPECfp graph? Is that single-core performance and total SoC power draw/energy consumption? Thanks! I'm trying to understand how to correctly interpret it.
psychobriggsy - Tuesday, June 23, 2020 - link
I absolutely detest that double-sided graph layout - it's too information-dense, I can never parse it, and strange stipples and colours are applied instead of something clear.
ABR - Tuesday, June 23, 2020 - link
Same here; I've never understood the motivation for this format, instead of just a simple bar graph of perf/watt, except to hide some conclusion they don't want you to see.
yeeeeman - Tuesday, June 23, 2020 - link
One important aspect has been skipped here... Mac users will not be able to run Windows anymore.
MarcusMo - Tuesday, June 23, 2020 - link
It's not being skipped; it's right there in the article: "... Apple isn't saying anything about BootCamp support at this time, despite the fact that dual-booting macOS and Windows has long been a draw for Apple's Mac machines."
amrs - Tuesday, June 23, 2020 - link
Are there any numbers on Boot Camp usage? I suppose Apple has those, so they might well have decided Boot Camp just isn't that important to their customers.
techconc - Tuesday, June 23, 2020 - link
Exactly. In practice, nobody cares about Boot Camp. I'm a multi-platform user. I experimented with Boot Camp years back... just because I could. In practice, I never really used it. My current Macs don't even have it installed.
MarcusMo - Wednesday, June 24, 2020 - link
Personally, I'm not sure Boot Camp was ever used for dual booting as much as it was used by Windows users who simply preferred Apple hardware. Unfortunately, though, Apple has shown very little interest in actually developing proper software support for their hardware in Windows in recent years. Power management is atrocious, and don't get me started on the trackpad driver situation.
This also highlights the way Apple has been headed over the last couple of years, where their computers have an increasing amount of custom hardware with deep ties to complex custom software. Trying to replicate that software on another platform is a gargantuan task. The move to custom silicon is the ultimate step on this journey, and the one that puts the final nail in Boot Camp's coffin, but the journey has been going on for a long time.
alumine - Tuesday, June 23, 2020 - link
I think what a lot of people are forgetting here is that this was never a technical exercise as much as a commercial one. I'm quite confident that, given the power budget of workstations and some optimisation, it's not that difficult to push a custom ARM core to go toe-to-toe with MOST midrange x86/x64 CPUs.
What potentially allows Apple to succeed here is their mature market on the ARM platform: the iPad, and the budding prevalence of iPad apps that make the iPad a legitimate laptop replacement. They waited until the iPad/iOS market share was great enough, and obviously they see that now is a great time to jump ship to ARM - people need to realize they've spent years building this ship from the commercial side of things (I'm guessing since the days of the first iPad Pro).
We know that iOS/iPadOS share a common core with macOS, so the transition between iPad and Mac, now that macOS can run on ARM, shouldn't be too difficult (a lot of the optimized codebase already exists - it's just a UI change).
However this is probably something only Apple could've pulled off at this stage - Microsoft tried this multiple times and failed (Windows RT, Windows 10 on ARM) - it's just such a compromise running Windows on ARM given the software ecosystem.
Mackeyser - Tuesday, June 23, 2020 - link
Been thinking about this for a moment. Apple has ALWAYS preferred vertical integration, and the current political environment seems to be encouraging this (nary a whisper of anti-trust anything these days). We see it at Apple, Tesla, Amazon and a bunch of other giants.
Apple has seen a bunch of success on the appliance, never-upgrade side with the iPhone/iPad/iMac, and it's clear from their "right to repair" issues that this is another move in the direction of sealed EVERYTHING. I'm old school and I've never liked that trajectory, but that's been the constantly reaffirmed path for a while now.
Custom silicon isn't gonna open options, but restrict them. I would expect the futures of the iPhone and iPad are only solidified going forward to be very bright. I think the iMac is gonna be a mixed bag, with continued TCO issues and horrible upgrade options mostly consisting of external boxes, and the Pro space, especially at the low end, is gonna be borked. But that's been the case for a really long time.
Just glad I didn't make that Hackintosh I was thinking about. Seems that to run macOS Big Sur and have the freedom to swap drives/memory and I/O cards (network, breakout boxes), one is gonna have to go big or go home.
Sorry, but hearing all of this without the Reality Distortion Field is weird. All I can see are the future Alka-Seltzer moments for the IT pros who are gonna have to sort out all the nonsense that the custom silicon will yield as Apple vertically integrates everything.
But at least the iPhones/iPads will be pretty great...
vladx - Tuesday, June 23, 2020 - link
There's no point in worrying; this news will have little impact outside of Mac PCs, which are less than 10% of the whole PC market. The macOS library cannot compete with the huge x86 software library.
TEAMSWITCHER - Tuesday, June 23, 2020 - link
There is one point of worry... people love laptops, and if Apple can deliver MacBook performance that Intel and AMD laptops cannot, then it could be very disruptive.
For example, PC laptops today can play games, but they all suffer from loud fans screaming at you the whole time to dissipate the heat, and sub-two-hour battery life - it's a shitty experience. If Apple can solve this problem... and then attract game developers... it might work. Game companies like money, and Apple users like spending money. You call them 'sheep', but they must be doing something right, because they seem to have a lot more money to spend.
Bp_968 - Tuesday, June 23, 2020 - link
Your point was valid until you mentioned game developers switching due to loud fans in high-performance laptops. As soon as you talk about high-performance laptops, you're talking about a laptop with a dGPU. Even a midrange laptop-specific dGPU like Nvidia's RTX 2060 Max-Q consumes *65* watts all by itself. So after memory, CPU, storage, etc., you now have the bottom half of that clamshell needing to dissipate over 100W of power.
Apple isn't going to change that math anytime soon.
This will (like always) benefit Apple. It *won't* usher in a new gold rush of massive computing performance gains or a massive shift in client and datacenter system purchases.
vladx - Tuesday, June 23, 2020 - link
"You call them 'sheep', but they must be doing something right, because they seem to have a lot more money to spend."Most of the "sheep" will just prefer using an iPhone or iPad, hence the small market share for Macs
techconc - Tuesday, June 23, 2020 - link
Yet, look at all the "options" in, say, the Android market. All those options, and none can beat what Apple has done with their own silicon. Seriously, the advantage for Apple is pretty obvious. Moreover, they will be in control of their own destiny rather than waiting on product delays from Intel. Great move for Apple.
vladx - Tuesday, June 23, 2020 - link
Your comparison is completely asinine, because smartphones, unlike PCs, don't need top performance to satisfy the habits of most smartphone users. ARM could design a competitive SoC, but it would be useless beyond bragging rights.
techconc - Wednesday, June 24, 2020 - link
Lol... great example of cognitive dissonance. My phone doesn't have good performance, therefore phones as a category don't need performance! Yeah, talk about asinine arguments...
Why do you think people buy new devices? High performance leads to longer practical product life. It also leads to the best overall user experience. Ever notice how on iPhones, when you go to take a picture, you see real-time previews of effects, while that is horribly lacking on Android? Just a small example of an everyday user-experience difference.
sharath.naik - Tuesday, June 23, 2020 - link
Microsoft will follow suit. This will be the end of Intel; serves them right for playing possum, trying to milk the cash on the same node for the past 6-7 years.
nevcairiel - Tuesday, June 23, 2020 - link
Microsoft doesn't sell laptops or PCs (only a Surface tablet, which might as well be ARM). They have no interest in favoring one CPU or another; they'll provide software for whatever their customers want to run it on.
rrinker - Tuesday, June 23, 2020 - link
What rock have you been hiding under? There is a whole range of Surface devices. Not just one tablet.
lilo777 - Tuesday, June 23, 2020 - link
"Follow suit"? Really? Microsoft already has an ARM version of Windows, and it's their second attempt. Apple is following suit here, and so far Microsoft's "suit" has been a failure. Also, Microsoft does not design CPUs and they have no reason to do so.
fazalmajid - Tuesday, June 23, 2020 - link
I'd be interested in finding out if the ARM Macs can boot Linux. The options so far have been mostly underpowered systems using SoCs like the RK3399, roughly comparable to 5-year-old phones, or ultra-expensive ones using server-grade processors.
psychobriggsy - Tuesday, June 23, 2020 - link
Well, they've shown Linux running in the Parallels VM on macOS.
That indicates either an ARM Linux build running in the VM, or that the virtualisation system can run x86 VMs via Rosetta 2 just fine (in which case Windows would run there too).
Boot Camp is a very valid question for natively booting Linux or Windows (on ARM), of course, if you absolutely want no VM involved. I fear this may take a while.
Dolda2000 - Tuesday, June 23, 2020 - link
>I fear this may take a while.
I fear Apple might lock it down to the point of being impossible. It would be completely in line with everything else they're doing at the moment.
toomanylogins - Tuesday, June 23, 2020 - link
To coincide with the transition, when they launch they should announce licensing for Hackintoshes. That would expand the Mac user base dramatically, many of whom would then switch to ARM later.
davidedney123 - Tuesday, June 23, 2020 - link
Also they should give everyone free ice cream, and let them come round for a go on their sister! What planet do you live on to think that Apple gives a damn about twerps pratting around with "Hackintosh" systems? I am sure almost none of them ever turn into revenue-generating customers.
Dolda2000 - Tuesday, June 23, 2020 - link
I don't doubt that you're right that Apple is never going to do that, but I think there are many software developers (including myself) who would greatly welcome it. I myself am indeed never going to be a revenue-generating customer for Apple, but being able to compatibility-test some of my programs without having to own a Mac would be tremendously useful - and with enough people like that, even beneficial to Apple to at least some degree.
web2dot0 - Tuesday, June 23, 2020 - link
You are living in your own culture war. 😝 How juvenile.
poohbear - Tuesday, June 23, 2020 - link
Is this Intel's fault? They haven't innovated much in the past 10 years and have been stuck on 14nm forever. Did this lead Apple to conclude that they can do things better?
prophet001 - Tuesday, June 23, 2020 - link
Pretty sure Intel doesn't have fabs though, do they?
They just use existing fabs with their own architectures.
GreenReaper - Tuesday, June 23, 2020 - link
Uh, Intel has a whole bunch of fabs: https://en.wikipedia.org/wiki/List_of_Intel_manufa...
Maybe you're thinking of Apple? Or AMD, which spun theirs off into GlobalFoundries.
Sahrin - Tuesday, June 23, 2020 - link
Anandtech: Wants to be taken seriously.
Also Anandtech: Compares cell phone SoCs to 90W desktop CPUs.
prophet001 - Tuesday, June 23, 2020 - link
True, but it would be nice to see Intel in competitive mode and not just resting on their laurels.
xenol - Tuesday, June 23, 2020 - link
This'll probably be lost in the sea of comments, but it's bugging me how many people think the ISA (that is, ARM vs. x86) is what determines a processor's performance. It's not. It's the implementation of that ISA that determines the processor's performance. If the ISA were all that mattered, then performance jumps like Netburst to Core or Bulldozer to Zen wouldn't have happened.
And if the ISA were all that mattered, how come Qualcomm's Snapdragons lag Apple's SoCs in performance, even though Apple's chips have, at least on paper from the consumer point of view, similar or even worse-looking specs (i.e., lower core count or lower clock speed)?
All that matters is how the company implemented their processors. I'm certain this is why Apple chose to use terms like "Apple Silicon" as opposed to "ARM" because Apple's silicon _is_ what's giving them their performance edge, not because they're using ARM.
OreoCookie - Wednesday, June 24, 2020 - link
While you have a point, differences in ISA *can* lead to differences in power efficiency: things ranging from the necessity of a CISC-to-RISC translation layer (where you split ops into micro-ops) to a lower number of directly addressable registers put x86 at a disadvantage. ARM, on the other hand, was conceived and optimized for power efficiency, and that has had an effect on its ISA design as well.
outsideloop - Tuesday, June 23, 2020 - link
AMD will shove Cezanne down Apple's throat.
Smell This - Tuesday, June 23, 2020 - link
LOL ... yeah, probably.
BUT, our good friends at AMD & APPL have been well known to canoodle, zig instead of zag, psyche, deep-fake, 'shens, et al., six months prior in anticipation of product announcements and major architectural design changes.
I'm not suggesting anything - much less Mac-related - but the next "Master" AMD could include big.LITTLE, AM5, Zen 3, DDR5/LPDDR5 memory, two (or more!) PCIe 4.0 devices, USB4, __________________ (insert wish here).
Throw in a chiplet for some RDNA-Next action and while we're at it, some HBM as a last-level cache.
Now, that's a Mac ;-)
A much bigger update is expected with Rembrandt. This design will not only feature Zen3+ design but also RDNA graphics. Furthermore, Rembrandt is also expected to support DDR5 and LPDDR5 memory. Expreview data even mentions that the APU will support USB4 and (if translations are correct) two PCIe 4.0 devices at the same time.
Smell This - Tuesday, June 23, 2020 - link
HA! Too bad there is no edit function.
techconc - Tuesday, June 23, 2020 - link
The article states:
"At the same time, however, even with the vast speed advantage of x86 chips over PPC chips, running PPC applications under the original Rosetta was functional, but not fast."
Huh? Vast speed advantage? Maybe compared to the G4 laptops, but not the G5 desktops. The whole point of the move to Intel was that Motorola had effectively dropped out and IBM was only interested in doing their Power-based workstation-level chips. While that would have been fine on the desktop, it wasn't suitable for a mobile laptop chip.
Incidentally, Apple has the same motivation today to move to ARM: superior performance per watt, etc. This also benefits Apple in the form of vertical integration. Macs will also get the benefit of better power management, the Neural Engine and a host of other specialty functions in Apple SoCs today. Seriously, nobody should be questioning whether or not this is a good move for Apple. It's a great move for Apple and even better for Apple's customers.
LarsBars - Tuesday, June 23, 2020 - link
Can you secure an exclusive interview with your namesake about this?
jeremyshaw - Tuesday, June 23, 2020 - link
Given that he is at the center of a lawsuit regarding an ARM-based CPU company, I'd say that's not going to happen, at least within any relevant timespan.
BOBOSTRUMF - Tuesday, June 23, 2020 - link
For their sake they must use the new Ryzen mobile processors, at least in their high-end laptops. No ARM can compete with 8-core mobile Ryzens.
WaltC - Tuesday, June 23, 2020 - link
It's been coming ever since Jobs lopped the word "computer" from the corporate logo. Apple is phasing out of its traditional Mac business, imo. I think they'll probably concentrate on low-performance "devices" like iPads and so on. OS X will likely just merge with iOS. I guess they couldn't sell enough $1k monitor stands, etc.
web2dot0 - Wednesday, June 24, 2020 - link
iPad Pros are low-performance devices? 🤣
Just watch: in the next 2 years, Apple is gonna revolutionize the PC market with their "Apple Silicon".
Imagine getting desktop performance on a laptop.
Quantumz0d - Sunday, June 28, 2020 - link
No, not happening, lmao. Stop this BS. A-series chips are heralded as lightning fast, yet in real-world application performance an iPhone 11 Pro Max vs a OnePlus 8 Pro shows no noticeable differences, which says otherwise.
ARM didn't scale up. Apple is solely doing this to combat their inefficient cooling designs, a result of their thin-and-light, BGA-soldered trash. Their Macs always overheated: solder issues, IC reballing, errors and locks in the mobo design, ICs not available for repair, plus charging insane amounts for basic things such as display cable replacement. Go to Louis Rossmann to learn more.
Apple ain't doing jackshit to the PC market. Apple Macs are sub-10% in market share and in Apple's own profit share; spending millions on Intel and the engineering around it, while also spending billions on TSMC funding and A-series processors, ain't viable when their own macOS market is dwindling against the growth of their Services revenue stream (17.7% vs the Mac at 9.x%). It is a purely business decision.
And the Fujitsu supercomputer is not a consumer product; IBM had that crown with SMT8-based PPC, and Tianhe is Chinese, built on the Sunway RISC design - all non-consumer. This white-knighting of ARM taking over the world, especially with AnandTech SPEC, is absolute rubbish. We will see when their first Mac ships; they even blocked the A12Z developer Mac Mini from offering benchmarks - there it goes, the "revolutionize" drama.
AMD is on a roll with their Rome EPYC 7742 processor, which is the king of the DC market; AMD has a 4.5% share, Intel has 95%, and the rest (IBM, ARM, etc.) make up the remaining 0.5% - and these all run consumer-class software. Apple is not going to revolutionize any of it. They do not make those products, nor have any interest in them.
The iPad Pro is not a computer; it cannot be a computer replacement. Just because it got Office tools and first-party-optimized Adobe doesn't make it a computer. It doesn't even have a Terminal to work in; the Mac has that. So there goes your high-performance Geekbench trash.
varase - Friday, July 17, 2020 - link
Here's an article from AnandTech comparing some modern cores to Lightning - the high-performance cores in the A13 Bionic in the iPhone 11: https://www.anandtech.com/show/15875/apple-lays-ou...
Of course, he did put a cold plate against the iPhone or something to keep it from throttling... but that won't be a problem in a Mac (which will probably be actively cooled).
Ever notice how, when comparing decent-performing x86 PCs, the discussion always devolves into how well the beast is cooled? That's 'cause you need pressure plates, heat pipes and big fans to get them to perform as advertised.
As you can see from the chart in the article, Lightning (this year's high-performance core) is bested by the Skylake 10900K on integer, and by the 10900K, Ryzen 3950X and Cortex-X1 on floating point.
The Mac SoC will be based on the A14 Bionic and will be built on a 5nm process, rumored to be the first ARMv9 implementation and the first ARM core running at more than 3GHz. Because it's custom silicon, Apple can stick as many Firestorm and Icestorm cores in it as they want - as well as graphics cores and neural engine cores.
The 10900K is built on a 14nm process and is reaching the limits of what can be cooled - they had to shave the top of the chip to get enough surface-area contact to cool it. It's supposedly a 125W TDP part, but under real load it can draw over 300W.
The 10900 going into this year's iMac is rumored to be a 95W TDP part, but lord knows what the maximum draw will be.
It's rumored that the first Mac SoC will have 8 Firestorm and 4 Icestorm cores, though we don't know what machine that's going in.
You make the same mistake as many x86 bigots: because A-series chips use the AArch64 instruction set, you equate A-series silicon with Qualcomm and Exynos. The latter may be architectural licensees, but they pretty much pick from a menu of big/little ARM standard cores, adjust the caching, add some IP blocks and call it a day.
Apple is an architectural licensee that has been designing its own silicon for a decade now. They've added parallel arithmetic units, optimized their vector units, produced out-of-order execution units, widened the data paths, designed their own instruction decoders, and added hundreds of voltage domains so they can turn off parts of the SoC not in use. They have silicon engineers at least as good as those of any x86 designer, and they have only one customer, so their silicon doesn't have to carry all the cruft that other silicon designers wouldn't think of removing.
The silicon, hardware, and software teams probably have regular meetings and they all know what their objective is for the future product roadmap.
Apple silicon in Macs will offer an enormous competitive advantage on multiple fronts: efficiency, power consumption, and millions of iOS and iPad apps that will run natively because they use the same instruction set architecture and frameworks. This could easily accelerate market share increases, since all the iPhone and iPad users will find they can also run their mobile software on their laptops or desktops.
As part of the Mac SoC, each Mac will also be getting a state-of-the-art image signal processor capable of very fast contextual image processing with Smart HDR, a Secure Enclave capable of storing/comparing fingerprint/face geometry math, a neural engine (with A13 core counts) capable of 5 trillion ops/second, a multicore graphics processor within the SoC's memory bandwidth, a hardware-based work dispatcher, and an inertial motion processor.
Oh... and the A12Z was just the SoC they used to cobble together the developer transition machine. This is basically a two-year-old design that had an additional graphics core added for the iPad Pro. Any benchmarks you've seen are Geekbench run through a binary recompiled from Intel code to ARM - proving the efficacy of Rosetta 2's x86-to-ARM translation.
So ... we'll see how all this plays out, because the first machine running on a Mac SoC should be appearing this year.
HardwareDufus - Tuesday, June 23, 2020 - link
Acorns to Apples... just sayin'.
twtech - Tuesday, June 23, 2020 - link
Is it just me, or is that the most confusing performance graph you've ever seen?
Tilmitt - Wednesday, June 24, 2020 - link
Yeah, it's brutal.
techconc - Wednesday, June 24, 2020 - link
It's not exactly rocket science... One side shows how much energy is being used. The other side shows how much performance you are getting for that energy. What part is tripping you up?
The only thing missing is the power usage for the Intel/AMD chips.
dotjaz - Wednesday, June 24, 2020 - link
Regarding the ARMv9 thing, methinks ARMv9 is not a real thing. It is just the ARMv8.x extensions being reorganized to mandate certain features and create more "profiles" to suit the new server/desktop segment.
For example, ARMv9-A could make NEON optional again if SVE2 (minus the legacy NEON support) is mandatory.
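To illustrate why SVE2-as-baseline would matter: unlike NEON's fixed 128-bit vectors, SVE code is vector-length agnostic, so one binary scales across hardware vector widths. A rough sketch using the ACLE intrinsics - toolchain support is an assumption on my part (this only uses base-SVE calls, e.g. gcc -march=armv8-a+sve):

```c
/* Vector-length-agnostic SAXPY (y = a*x + y) with SVE intrinsics.
 * The same binary works on any SVE vector width (128-2048 bits)
 * because the loop asks the hardware how many lanes it has and
 * uses a predicate to mask off the tail iteration. */
#include <arm_sve.h>

void saxpy(float a, const float *x, float *y, int n) {
    for (int i = 0; i < n; i += svcntw()) {            /* svcntw() = 32-bit lanes per vector */
        svbool_t pg = svwhilelt_b32_s32(i, n);         /* predicate: active while i < n */
        svfloat32_t vx = svld1_f32(pg, x + i);
        svfloat32_t vy = svld1_f32(pg, y + i);
        vy = svmla_n_f32_m(pg, vy, vx, a);             /* y += a * x on active lanes */
        svst1_f32(pg, y + i, vy);
    }
}
```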
zodiacfml - Wednesday, June 24, 2020 - link
Intel and AMD would simply step up their efforts to match or beat Apple's SoCs in the future. x86 will remain faster, but Apple's will be a lot more efficient.
NOTELLN - Wednesday, June 24, 2020 - link
Why do people continue to do business with a company like Apple, which treats its customers so badly? You could not give me an Apple product.
techconc - Wednesday, June 24, 2020 - link
How exactly are Apple's customers treated so badly? By making bold moves to improve the platform beyond what others in the industry are doing or are capable of doing?
I'll take the Apple products you don't want... thanks!
Santoval - Wednesday, June 24, 2020 - link
During the two-year transition the sales of x86-based Macs should suffer, particularly during the second year. Who on Earth would buy a super expensive Mac(Book) Pro that will be obsolete in a year or so? Even if Apple supports x86 for a few more years, there is no guarantee third-party companies and developers will, and besides, it will be obsolete hardware-wise. It will be like owning a PowerPC Mac during the first few years of the transition to x86 - back when Intel still innovated and offered significantly more performance every year, right up until Dennard scaling collapsed and clocks stopped rising.
If ARM-based Macs deliver the same or higher performance at 1/2 to 1/3 the TDP, owning a power-hungry x86-based Mac will be like owning an inefficient dinosaur.
Morky - Wednesday, June 24, 2020 - link
This. As I sit here with a fairly warm MacBook Pro running nothing but a web browser, I know an iPad doing the same thing would be stone cold. Macs are going to be so much better.
TheMighty - Wednesday, July 1, 2020 - link
Agreed - this is the first time I'm actually considering buying a Mac.
varase - Saturday, July 18, 2020 - link
I plan to buy the 2020 iMac when it comes out. The thought of a 10900 coupled with a Radeon Pro 5xxx - even if the design language remains the same - makes my mouth water.
My current primary was purchased in late 2018 because my 2014 was going into the shop - and I was forced to buy a year-old machine with a Core i7 and a Radeon Pro 580 8GB because the 2018 model was tardy. Then in 2019 Apple released the iMac I wanted - a Core i9 (8 cores, 16 threads) with a Radeon Vega 48 (I think).
I do a *lot* of intense 7-10 hour transcode sessions, and a 10 core 20 thread processor will cut my transcodes down to a reasonable timeframe.
With the 2020 iMac 5K I expect to have a fairly competent gaming machine - maybe in Parallels, but certainly in Boot Camp. Catalina cut off a *lot* of games when it went 64-bit only.
I expect Intel support to linger on for years - there are a lot of enterprise customers Apple wants to keep happy who have to run legacy Intel software in a native hypervisor - at least until you can run a hypervisor on an ARM Mac at native speeds.
Meanwhile, I expect to skip many of the road bumps sure to be revealed by the transition, while you folks graciously spur developers into producing performant Universal 2 binaries.
After the transition, I expect this iMac 5K to retain a lot of its value as the best-performing x64 Mac ever produced that can still run Boot Camp.
At that point, if ARM hypervisors can run Win x86 AAA games at native speeds I'll purchase an ARM iMac - if not, I'll buy an ARM Mac and a cheap windows gaming machine. Or ... if AAA games start to appear on the ARM Mac natively I'll just buy an ARM Mac and forget Windows ever existed 🤣.
I'm retired now, so the only things I need Windows for are gaming and the occasional firmware flasher for devices which purportedly support the Mac but never get a Mac firmware updater 🤬.
Apple is a much more prosperous company now and can afford the engineers required to support Intel Macs long term - and they've been leaning toward longer support windows, obsoleting old machines not because they're tired of supporting them but because the hardware lacks facilities the new OS release needs (like Metal).
Ladis - Wednesday, August 5, 2020 - link
Intel Macs will keep their value because they are the bridge for people who live in both the Mac and PC worlds (requiring x86 Windows running natively via Boot Camp/dual boot, not a virtual machine). It was different for PowerPC Macs, because those were not usable for anything else.
nfineon - Wednesday, June 24, 2020 - link
Just like that, Apple removed all future AAA game developer support for anything but mobile-based games. Sure, you'll be able to play the mobile version of Fortnite, Candy Crush, and many other indie titles, but they just prevented 99% of existing and future real game titles from running in their ecosystem. Pretty much all modern titles for the PlayStation, Xbox, and PC run on the same AMD chips using the x64 ISA, so porting between those platforms is trivial.
Not only would you have to port your code to run on Linux/macOS - which is already very hard to get developers to do on Intel x64 chips - but your program also has to run on ARM, which at that point means you are basically writing the application from scratch again, unless you want to emulate things and take a massive, game-breaking performance hit.
This effectively LOCKS Apple into their own ecosystem in regards to gaming, which has always been a problem on Macs, but now the problem is exponentially worse. Porting iPad and iPhone games to the platform won't be a problem, sure, but those aren't serious games. Unless they plan on continuing to use AMD GPUs - which I find highly unlikely at this point - they will be even further behind when it comes to gaming experience outside of the Apple Reality Distortion Field(tm), as they'll most likely just use their onboard iGPU solution in unnecessarily thin devices with terrible cooling.
I think the move is good for Apple overall for other reasons, but make no mistake about it: all major AAA gaming on the Mac platform is officially dead. You can easily port code from PC to PlayStation to Xbox and back, but try doing that for ARM processors - including the graphics engines, libraries, drivers, etc. - and it just isn't going to happen on a platform that has only 5% desktop market penetration.
Quantumz0d - Thursday, June 25, 2020 - link
Very good point, but sadly the stupid world is moving to that GaaS bullshit of no ownership and game streaming - that garbage Stadia, xCloud, PSN, and Amazon. AAA gaming is dead nowadays because of the political baggage they shove down our throats. EA, Ubisoft, Bethesda, and Activision are all in the same league - except a few studios in Ukraine/Russia/CZ and Japan.
StuntFriar - Sunday, June 28, 2020 - link
This isn't as difficult as you make it out to be. All modern game engines are multi-platform by nature - even 10 years ago, bespoke in-house engines tended to run on x86 (PC) and PowerPC (consoles) natively (and ARM, if you were also doing mobile/handheld), so the engineers have already been doing stuff like this for years, if not decades.
In engines like UE4, you can already make iOS builds which use desktop-class Metal shaders, so they can already look as good as their desktop counterparts - performance notwithstanding (and I wouldn't bet against Apple getting a high-performance CPU/GPU combination out).
Epic has been updating the UE4 rendering pipeline on mobile, unifying it with the desktop/console one to bring them to parity (which is possible thanks to Metal and Vulkan), so the feature set between platforms would soon be identical. The only thing that's not standardised yet is ray tracing, which is still very much hardware dependent, but there's nothing stopping Apple from including it in their future GPUs.
StuntFriar - Sunday, June 28, 2020 - link
Also worth pointing out that 99.9% of your game code and scripting would be platform-independent; the small remainder usually accounts for differences between platforms.
When porting an engine over to a new platform, the bulk of the work is on the rendering pipeline and asset importers. Ditto for optimizations such as concurrency. A lot of the existing frameworks for user input, memory management, game-object lifecycle, and UI can be re-used or modified for new platforms, so it's hardly rewriting from scratch.
Nobody rewrites games from scratch to support new platforms.
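To give a feel for how small the ISA-specific slice typically is: it's usually a handful of SIMD kernels kept behind compile-time switches, with identical call sites everywhere else. A hand-rolled sketch (the function name is made up, not from any real engine):

```c
/* One small SIMD kernel per ISA behind an #ifdef; everything that
 * calls add_vec4() is portable C and never changes per platform. */
#include <stddef.h>

#if defined(__x86_64__)
#include <immintrin.h>
void add_vec4(const float *a, const float *b, float *out, size_t n) {
    for (size_t i = 0; i < n; i += 4)                 /* n assumed a multiple of 4 */
        _mm_storeu_ps(out + i,
                      _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
}
#elif defined(__aarch64__)
#include <arm_neon.h>
void add_vec4(const float *a, const float *b, float *out, size_t n) {
    for (size_t i = 0; i < n; i += 4)                 /* NEON: same 4-lane float math */
        vst1q_f32(out + i, vaddq_f32(vld1q_f32(a + i), vld1q_f32(b + i)));
}
#else
void add_vec4(const float *a, const float *b, float *out, size_t n) {
    for (size_t i = 0; i < n; i++)                    /* portable scalar fallback */
        out[i] = a[i] + b[i];
}
#endif
```

Porting that kernel from SSE to NEON is a mechanical line-for-line swap - nothing like a rewrite.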
scottrichardson - Tuesday, June 30, 2020 - link
Fair comment. But what if - and this is a huge if - Apple ends up releasing silicon with GPUs that start to pull away from the big guys? What if they make their platform so damn compelling that more and more people make the switch, and suddenly the lowest-performing systems are the measly old Windows PCs still stuck with giant graphics cards and hot CPUs?
tripadago - Wednesday, June 24, 2020 - link
Great, finally official.
JustMe21 - Thursday, June 25, 2020 - link
Apple is doing as Apple has always tried to do: control everything. You might see the EU looking into it one day, but the US never would.
I think we will see ARM performance in general improve as it scales up to match Intel and AMD. At first, if Microsoft keeps on the ball with Windows on ARM, we will start to see low-cost PCs and laptops running it. This also means Microsoft will need to encourage developers to code for UWP so their software works on any platform, and we may see the reemergence of Windows tablets and possibly Windows phones. We will also see more Linux distributions compiled for ARM as well.
Hopefully, this also means that as ARM performance improves, we will see cheaper x86 processors. I also hope we will see socketed ARM processors we can drop in ourselves, not just integrated SoC motherboards. I like having the ability to build my own computer, but I wouldn't be surprised if one day everything is integrated. After all, we are already seeing laptops with integrated memory, so I expect SSDs to be soldered on at some point too. Of course, the EU might go on some kind of e-waste reduction initiative and require companies to make devices user-repairable to reduce e-waste.
techconc - Thursday, June 25, 2020 - link
"You might see the EU looking into it one day, but the US never would."Huh? What law are you implying Apple is breaking by designing their own silicon? Comments like this are absolutely ridiculous.
"as ARM performance improves"
One ISA isn't inherently faster than another. It's the chip that's designed around that ISA that determines performance. Outside of Apple, nobody seems willing to step up and design a custom chip anymore. Everyone is content to use ARM reference designs. As for Apple, I believe you'll see with the A14 that they've already passed Intel in core design from both a performance and power efficiency perspective.
varase - Saturday, July 18, 2020 - link
I think even the protectionist EU would find it hard to stomach forcing Apple to sell its in-house A-series SoC silicon to competitors - and AFAIK they don't even have any PC or laptop makers in the EU, do they?
The EU really seems content to sit on its laurels, and for some reason has had trouble moving into the 21st century... not sure why.
As for the Wintel alliance - I expect the Mac to gain a huge competitive advantage in energy efficiency, performance, and the ability to run millions of iOS and iPadOS apps natively thanks to a common instruction set architecture and common frameworks - something I can't see anyone else cashing in on.
If they add touchscreen to the Macs, you'll have the ability to interact with the Mac just as you do on your phone - though the undo gesture may have to be changed 😏.
ksec - Monday, June 29, 2020 - link
Implementing AArch32 in ARMv8 has always been optional. Maybe ARMv9 will get rid of it completely while making SVE2 standard?