Overpriced compared to what? What else offers this SoC performance, battery, and mini-LED display, just to name a few unique selling points? They can charge basically whatever they want.
To previous-generation MacBook Pros, that's what. Sure, you can list off the fancy features and yes, you are totally right. But for the majority of people, regardless of features, the price will be a huge deterrent.
There's no *device* out there that has this level of performance AND efficiency. You may get a mobile device with similar performance and much worse efficiency, or a workstation class device with better performance but abysmal (by comparison) efficiency.
But you certainly won't find a $2000 device that's really comparable unless you look at one very specific thing and say "aha, the cheaper one is better at this". But let's be honest, such comparison is not useful to anyone but Intel :).
Realistically if this had official Windows or Linux support, even with the expected loss in performance or efficiency, I'd get one in a heartbeat. But as I'm not a big MacOS fan I'll just watch.
Indeed, impressive chip. I noted my satisfaction/dissatisfaction a whole year ago with the original Apple M1. I even suggested that Apple should release a family of chipsets for their devices, mainly to be more competitive and have better product segmentation. That didn't happen then, and it looks like it's only somewhat happening now. They could also update their "chipset family" with the subsequent architectural improvements each generation. For instance:
Apple M10, ~7W, 4 large cores, 8cu GPU... for 9in tablet, ultra thin, fanless
Apple M11, ~10W, 8 large cores, 8cu GPU... for 11in laptop, ultra thin, fanless
Apple M13, ~15W, 8 large cores, 16cu GPU... for 14in laptop, thin, active cooled
Apple M15, ~25W, 8 large cores, 32cu GPU... for 17in laptop, thick, active cooled
Apple M17, ~45W, 16 large cores, 32cu GPU... for 29in iMac, thick, AC power
Apple M19, ~95W, 16 large cores, 64cu GPU... for Mac Pro, desktop, strong cooling
...and after 1.5 years, they can move on to the next refined architecture/node (e.g. Apple M20, M23, M25, M27, M29, etc., repeating the cycle every 18 months). I'll reiterate, this was the lineup that I had in mind, and I commented about this an entire year ago.
The film industry is deeply rooted in Quadro, so while they might spend $100,000 on camera lenses, they would not be buying Apple just because. This appeals to the small freelancers, photographers and videographers, not the big blockbuster-churning cornerstone movie studios. And even among the low-budget folks, there are many who are invested in the NVIDIA and/or Windows ecosystem, because Apple lacks many key codecs, APIs, and software.
I can tell you for sure that there are plenty of shops that have paid more for macOS software licenses than for all the hardware put together.
Sure, places like Pixar swim in Quadros, Teslas, Radeon Pro SSGs. But everywhere else you're most likely to find Apple Final Cut or Avid Media Composer, Nuke or After Effects, Blackmagic Fusion or Davinci Resolve, Autodesk Maya or Cinema 4D running mainly on Macs. The cost of the metal is less than the cost of the licenses.
Only a certain subsection of the film industry uses Windows, in particular corporate VFX houses. And they're not the companies that buy $100,000 lenses.
The vast majority of production houses use Macs.
I still don't know why you think a company that can afford a $100,000 lens would think a $6000 laptop would be too expensive?
There are plenty of TV and movie crews that use Macs. Ted Lasso is edited and output entirely on Mac laptops. I can't imagine how much faster the editing would be on these new MBPs; I am forecasting a 5-10x speed increase in ProRes. So let us say it takes 2 weeks to complete a whole series on their existing Macs and 40% of that time is limited by rendering: the new M1 Max could bring that down to roughly 9 days. Is that worth it?
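(If you want to sanity-check estimates like that, here's a minimal Amdahl-style sketch in Python; the inputs are just the rough figures above, not measured data.)

```python
# Rough Amdahl-style check: only the render-bound fraction of the schedule gets
# faster. All numbers are the hypothetical figures from the comment above.
def new_schedule(total_days: float, render_fraction: float, speedup: float) -> float:
    """Remaining schedule when only the render-bound portion is accelerated."""
    return total_days * (1 - render_fraction) + total_days * render_fraction / speedup

for s in (5, 10):
    print(f"{s}x ProRes speedup -> about {new_schedule(14, 0.40, s):.1f} days")
```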
I'll never understand why people don't apply context to these costs. People were complaining about the entry-level Mac Pro tower's graphics card being "anemic" for a $6000 configuration, not realizing that there were a ton of audio pros out there who would be putting a $4500 audio card in a tower that didn't require a powerhouse GPU and wouldn't want to pay for it.
So... it's fine that the graphics are only "anemic" in a $6000 configuration because with good graphics it would be even more expensive? Great logic there.
I’m not sure what you mean. If you view both of these as potential heirs of the 15/16” MacBook Pro, the pricing is virtually the same or better. This isn’t meant as the next generation of the M1 13” MacBook Pro.
You know, I _owned_ a 2019 16" MacBook Pro, core-i9 2.3ghz, AMD Radeon Pro 5500M 8 GB, 32 GB RAM, 2 TB SSD purchased from Adorama via their AppleInsider link for a discount for $3499.
I purchased the new 2021 16" MacBook Pro, M1 Max, 32 GPU cores, 32 GB RAM, 2 TB SSD directly from Apple with no discount for $3899. In case there were any issues, I wanted to be sure I owned an Apple SKU so Apple Store managers would have more flexibility in terms of returns or upgrades (they have far less flexibility when the SKU is for some other vendor).
The AppleInsider offer would probably have gotten me a $200 discount (that's what they're currently offering on an M1 Max, 32 GPU, 32 GB RAM, 1 TB SSD), so the net difference is probably $200, a 5.7% increase. Over a two-year period, that could easily reflect inflation and increased tariffs.
So no ... this new model _does not_ reflect a big increase over the last model.
It's a Surface Book killer, that's for sure, if you were in the market for one. However, if you are an industry pro, there are many lacking features. For one, if you are in the film industry, you can already ignore this. Quadro is the standard there for the major pillar studios. You'd get laughed out of the room just because of the lack of CUDA, never mind that this has an inexplicable lack of hardware AV1 codec support.
Well, Apple does a really good job of making people think that anyhow. Much like they do a really good job of making people think they invented everything on a smartphone and that they are faster than the competition.
This is where the PC SoC was always supposed to end up; Apple just got there first. Intel, in their attempt to make money, tried hard to avoid integrating memory into the CPU SoC so they could sell more chips. This will change the landscape in Apple's favor if Intel and AMD do not have an equivalent answer to this soon.
Agreed. The whole concept of an APU is rooted in a design like this. Too bad the two desktop CPU manufacturers have little desire to produce a high end product like this.
Dude, let it go. Are you going to disagree with everyone on this forum with the goal of showing you are the smartest? Go for a walk, take a breath, read a book.
It’s not. All of Pixar/Disney Animation, ILM, etc. are using custom applications built with heavy cloud farm processing, and the latest RenderMan is heavily CUDA-based.
XPU is a proprietary CUDA solution, and support for Maxwell requires OptiX, i.e. NVIDIA CUDA.
The main workstations are Windows/Linux. Sure, artists doing textures and such have Macs, but their entire workflow is a series of custom applications leveraging CUDA.
Steve has been dead for over a decade. We know much of the post-processing film industry isn't using FCP; it has mainly improved thanks to the Apple TV+ studios.
The one application that has large support is Logic Pro, and speaking as someone who uses it daily, a decked-out Mac Pro 2019 is a beast. You have two routes: invest heavily in standard analog hardware to offload major portions of your workflow, leverage automation in LP, then export stems to others working in Pro Tools for final mix/mastering; or get a 16-core refurb Xeon with Afterburner and duo 6800XT Pro GPGPUs. Or you dump another $10-20k into studio Neve gear, like the Rupert Neve Portico II master bus stereo processor for $4k alone, or the recommended complete mastering stack from Rupert Neve. Ultimately, the idea of a quality home studio with treated acoustic paneling and more isn't the cute laptop in the back seat of a cab mixing, or strumming a guitar on a bed. You make a Rick Beato-scale studio investment and run older copies of LP on a 2012 Mac Pro (until today, LP 10.7 required Big Sur), or you move to Xeon, because everything is still via Rosetta 2, or use straight x86.
Even Apogee Digital isn’t certified for Apple Silicon yet and requires Rosetta 2, if you must be Silicon or die.
A lot of studios will buy the Mac Pro 2019 load it up and use it for the next decade.
I work in both video production and music production, and I'm going the hardware route (Neve analog for tracking, hardware FX, etc.), but for computing and software I'm holding out for the Apple Silicon Mac Pro for my production company. I was very close to switching to PC (had my Threadripper build all specced out and funds available), and then Apple released the M1. So I decided to wait for the Mac Pro, based on the performance of that and taking a leap of faith that it can be satisfactorily scaled up for maximum performance. (There was a graph of the performance growth trend over generations of Apple ARM CPUs here on AnandTech; pretty impressive.) I'd assume the same growth rate for Apple Silicon in years to come. I trialled the M1 MacBook Air (my wife's computer!) and it did a fine job for all of my apps, Rosetta included. The M1 Max seems like a step in the right direction, but I do wonder how they can cater for ultimate maximum performance where power consumption is no concern and there is an endless need for RAM, etc., as well as how they will allow for user expandability, for example if the user wants to replace the CPU/SoC. That Mac Pro had better be a proper, modular pro workstation or there is going to be a new generation of burnt, upset pro users who slowly migrate to PC (assuming they don't use Logic etc.; I'm on Pro Tools and other cross-platform software).
This is the biggest myth, and it's not true. Most film/animation work is done on either pre-built machines or workstations running Linux or other proprietary software, most of the time with Xeons.
Just do a simple Google search on what companies like DreamWorks, Disney, Pixar etc. use.
Render farms aren't that important for most video and even motion-picture pros anymore.
Apple tried with the rack models with fibre channel Xsan ages ago, but rendering is a really specialist task for the biggest of the big guys.
Most of our rendering and encoding is now handled at AWS anyway where TCO is an order of magnitude lower than having on-premises hardware which dates quickly and costs a fortune in manpower to keep ticking over. Our major competitors also now use AWS or Google Cloud.
Being able to spin up massively powerful instances with multiple NVIDIA GPUs on demand is the best case for us. If you don't require CUDA and AMD's your thing, well they do that too now.
Apple are targeting this thing right. There's a lot of semi-pro content people who are making serious money out of YouTube/FB/Insta. FCP X is great for them and as mentioned Logic is very popular.
None of that target audience needs Avid Media Composer or DaVinci Resolve, but even those now run natively on Apple ARM. (Incidentally, Blackmagic Design ATEM gear and cameras are now getting wildly popular at the low and even higher end.)
So on the capture/edit side, these new MBPs are going to be huge within the industry. For the extreme high end - definitely not the target demo. Being able to scale up and out when required is far better for us.
Who needs CUDA when Apple has Create ML and Metal? TensorFlow 2.4+ is optimized for Apple GPUs, with the puny M1 outperforming an i9-with-Radeon MacBook by a considerable margin (these are Google's own benchmarks).
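(A minimal sketch of what that looks like in practice, assuming the tensorflow-macos and tensorflow-metal packages are installed so TensorFlow can see the Apple GPU through the Metal plugin:)

```python
import tensorflow as tf

# The Metal plugin exposes the Apple GPU as a regular TF device.
print(tf.config.list_physical_devices("GPU"))  # expect one GPU entry on M1-class chips

# Tiny matmul just to confirm work is dispatched to the GPU device.
with tf.device("/GPU:0"):
    a = tf.random.normal((2048, 2048))
    b = tf.random.normal((2048, 2048))
    c = tf.matmul(a, b)
print(c.shape)
```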
Yeah, I guess Blackmagic, Adobe, Otoy, Maxon (just to name a few) all got laughed out of the room with M1-native Resolve, Premiere Pro, Octane Render, Cinema 4D and Redshift. Also, the film industry doesn't give a damn about built-in ProRes hardware acceleration either, regardless of whether it supports 30 4K or 7 8K video streams simultaneously. Oh my.
The (world's) first hardware AV1 encoder just became available 6 months ago. Who TF else has it in their systems? Decoding AV1 is a relative cakewalk and that's by design in the codec.
No, it won’t. People doing the kind of work these machines will excel in won’t mind the pricing, which isn’t bad at all. I just ordered a 16” 64GB RAM and 2TB drive. I think it’s a bargain at this performance level. There’s nothing to really compete.
And since a laptop is intended to be used as such—sans power cord, the performance difference will be even greater, since most performance Windows AMD/Intel machines suffer greatly when not plugged in.
Hence the lower-end 14" with a partially disabled chip at quite a good price over the old equivalent MBP. Otherwise, if you don't want the "fancy features", the M1 MBP or Air are more reasonable. Did you expect the new high-end notebook at the low-end notebook price or something?
No one's ever going to suggest Apple isn't expensive. But at least they aren't just buying off-the-shelf shit that everyone else uses and charging 50% more anymore.
Apple has always provided pro products that are a sort of combination of really good components, where the end result is something that is either a good deal for exactly the specs they provide, or simply has no equal, but looks expensive if you don't need all of the fancy stuff they have decided should be included in the SKU.

For example, the Mac Pros and iMac Pros offered Xeon processors. Xeons are expensive whether you buy your workstation from Apple or Dell (or anyone else), and they have a few extra features like buffered ECC RAM support. If you just want a PC with a fast processor and don't care about the extra features of the Xeon, you can buy an i7 or i9 machine from PC OEMs or build one yourself, and you will end up with a machine that is faster than the Mac Pro for less money.

Similarly, when you add up the mini-LED and the 10-core CPU and the 32-core GPU with 400GB/s of bandwidth and all the other features of these new laptops, it's pretty hard to find something comparable. If you're like me and just want an M1 MacBook that supports more than one external display natively, and you don't get value from all the die area they spent on video decoders and encoders and whatnot, then they look pretty expensive. I might still buy one :p but there are definitely a lot of awesome things in this machine that many people simply don't need or won't use.
What you just said is why the M1 MacBook exists and why the M1 Pro and M1 Max MacBook Pros exist.
An M1 MacBook with a TB4 hub has your needs covered, addressing the multiple-display use case it doesn't handle out of the box. More than one external display is definitely a "pro" thing, but I anticipate them including it with the M2.
I have owned multiple generations of MacBooks over 14+ years, and these Pro models are priced right in line with the previous ones (BTW, Apple is also very good at pricing their products). They are a very good deal when you consider all the leading-edge technology in them; what other laptop even comes close to this level of performance?
How so? You get much more performance for the same price, not to mention that display or the battery. Sure, the new 14" is more expensive than the old four-port Intel 13", but the 13" M1 is significantly faster than Ice Lake anyway, for less money.
And of course, the higher-end models are pricy, but these are very serious mobile workstations. And they compare very well to other workstations at the same price level.
The majority of people don't need a laptop like this and would be perfectly happy with a regular M1 laptop like the MBP13 or MB Air, both of which are actually very reasonably priced for what you get. For the people that need or want the best of the best, the MBP14 and MBP16 are definitely not overpriced.
Just for fun, go price a desktop computer that compares with this. You would need an AMD 5800X, a fairly high-end GPU, memory, a 1 TB SSD, PSU, case, operating system, keyboard, mouse and a monitor ... I was up to $2800 and it could easily be significantly more. And that is for a clunky desktop with virtually no software. This is an extremely cleanly packaged system that is mobile. It has a battery, keyboard, mouse and again is MOBILE. Comparing this to systems that are 1/5 the power just isn't a fair comparison.
Previous-generation MacBook Pros with discrete graphics chips were also overpriced from the perspective of the majority of users. So the intended customer base is users willing to pay more for advanced technology.
Try again. These machines are priced in line with what MacBook Pros have been for years. Only there's an enormous jump in performance this time around instead of an incremental one from Intel. And now the fans won't be kicking on and drowning out the world.
Exactly. This is the world's best laptop, yet still a niche product for Apple superfans. Maybe Apple will get major studios hooked, but they would need to get Apple chips into server farms. No way the technical teams code their rendering tools for two entirely different GPU and CPU archs. I don't see that happening for another five years at least.
Esports champions don't choose their hardware; they play at LAN events with hardware chosen by the event (where a system OEM/integrator will often be a sponsor). Similarly, for their personal use, players belonging to esports teams use and/or promote whatever hardware is attached to the organization's portfolio of sponsors.
I was coding on one architecture and shipping on another back in the 2000s at a previous job. I'm doing it now with the M1. I'm sure there are projects where it can't be done right now, but there are also plenty of opportunities where it's fine.
Macs have always been expensive. But they are marketed for affluent consumers who can easily afford them. BMW, Mercedes Benz, and Teslas are expensive and no one complains. My 2016 Macbook Pro 16 i9 64GB 2TB cost $5000. The 2021 Macbook Pro 16 M1Max 64GB 2TB costs $4300. It's actually cheaper.
He is actually comparing it to what was available at that time and what he had to pay back then to get that kind of hardware. What else should he compare it to?
No. You are neglecting the memory speed, interface, bus speed, the process node for the custom chip, the mini-LED screen, etc. The biggest one is inflation. So yeah, it's cheaper.
Strawman argument. Technology goes down in price over time. 2TB SSDs cost $800-$1000 in 2015. Now, they cost just $200-$300 in 2021. Oh my goodness, yes, they are indeed overcharging compared to what they should be, commensurate with current capacity-per-dollar costs.
No question, DRAM and SSD are where Apple soaks up the profit. Their prices on these are 3 times what the PC world charges. Once hooked on the decision to buy a Mac at the entry price, the rest is just numbers on the credit card.
There are already laptops that Apple is comparing against, so what is this? Reinventing the wheel at TSMC 5nm is what they are doing here. DTRs have existed since the Alienware M18x R2, and today's Clevo X170SM is a true desktop replacement because it runs a 10900K at full speed and a mobile RTX 3080 at higher performance targets.
Once you run this M1X at such a high workload, the CPU and GPU will not keep up the efficiency; it's the laws of physics, and nobody is going to cheat physics.
Basically, you buy this for a notch and a crappy OS with a locked-down ecosystem, on BGA-soldered hardware with zero user customization, for showing off and bragging rights about a notched POS display, at an even higher cost.
but the laptops you've mentioned don't have: as good of a display (no mini-LED, no brightness, no HDR), any acceptable battery life, comparable portability and sound system. MacOS can shine if you work well with the command prompt
Film industry is laughing at you right now. Find someone who knows someone who actually works at Pixar or another CGI shop and they will straight up tell you that this won't work for their workflow. Quadro is the industry standard unless you are a little guy doing small-time freelance.
There's a big world between full frame Hollywood or Pixar level movie making and people with high needs for video editing. Youtube is kind of a big thing now. There's also no mobile GPU you can pay any amount of money for that can access 64GB of RAM with equal speed, there were SSGs that tried to bolt on SSD storage, but this is 400GB/s LPDDR5.
Pretending the uber high end of the industry which you're definitely not in invalidates what this is good for is just poor reasoning.
These ARE priced well tbh. For example we just got some Dell Precisions with i9s, basic screen, no dGPU and LIST on them is $5,500 (biz pays around 50% retail). The Dell is probably 2x thicker and about the only plus is it has, IMO, better port selection.
I’m surprised because the regular pre-m1 MacBook isn’t that great pricewise.
For our graphics needs we use a workstation with a fat $3k+ Quadro. It just encodes video all day.
You know, I'm starting to suspect you're the kind of 12-year-old that bores his classmates to death with high-volume lectures on the superiority of PCs (and your maxed-out gaming rig) over those awful Macs. One tell is your constant refrain of "the film industry is laughing at you", which is the reasoning of an insecure child. Additionally, the only things you seem to think you know about this film industry are "Quadro" and "AV1", which you have chanted about 20 times now. I've seen the "go ask any industry pro" line a lot, and it's typically wielded by people who know little to nothing about the industry in question.
> but the laptops you've mentioned don't have: as good of a display (no mini-LED, no brightness, no HDR), any acceptable battery life, comparable portability and sound system. MacOS can shine if you work well with the command prompt
My MSI Creator 17 has all of that, do some research before spouting crap like that
I have a Clevo, it's an unwieldy, heavy, loud brick with pretty lousy OS integration. Not much of a comparison to a MBP if you take laptop ergonomics remotely seriously.
I'm sure for macOS power users, whoever those people are, these machines will be second to none. For everyone else, it's a bit of a mystery what all this power would be useful for.
Tons of cameras and ubiquitous mobile monitor/recorder combos now record ProRes natively. It's actually a whole suite of codecs from lower end but still good, through to extremely high end film intermediates.
In fact it's starting to take over parts of the video and TV industry from Sony's seemingly unstoppable Betacam/HDCAM/XDCAM lineup, which is incredible considering those standards go back to half-inch analog tape in 1981.
Digital Betacam was THE standard for production in the 90s and SD era. Some of the biggest films of the 2000s were shot in HDCAM or HDCAM SR including Lucasfilm. Then they moved to file based workflow with XDCAM first on 23GB and 50GB professional discs which are still in use and then to flash based XDCAM.
But now Sony have a serious challenger on the record and ingest side.
With ProRes codecs built right into the wildly popular Atomos Ninja outboard monitor/recorders, camos (like me when I can't find an operator - for example in the middle of COVID) can record better quality out the HDSDI back of their expensive industry standard camcorders, and only use the Sony cards or discs as backup. They're incredibly cheap - for pro TV prices - and great little field monitors which include ProRes RAW mode.
AJA gear is similar, and over the past decade Blackmagic Design has become massively popular, with entire pro studio setups based on ProRes, including their stunningly good and great-value cameras. It's exceptional news. Those are seriously mainstream in production today.
With all the history of being the undisputed king of pro-video, it's astounding Sony themselves are releasing gear with ProRes built right in and releasing firmware updates to make some of the older semi-pro camcorders output it too for those Atomos recorders and others.
Add the new iPhone 13 Pro recording ProRes natively and it's hard to overstate how massive a change this seems like it's going to be.
Pointing out that you can get ~20% higher GPU performance and comparable CPU performance out of a device that weighs 2-3x as much and draws 3-5x as much power under load isn't exactly a compelling argument.
On the contrary, it's underpriced (or "right-priced"). Even purely from a performance perspective, show me a single laptop with equivalent CPU/GPU performance at Apple's prices.
400GB/s LPDDR5 and a 7GB/s SSD (which I'm assuming uses a PCIe Gen 5 interface, because goddamn, 7.5GB/s) are both quite expensive. A quantum-dot, mini-LED-backlit display is very expensive. Don't be fooled by its size; this monster should be compared to a high-end Quadro laptop.
My 2020 8-core i7 Dell laptop cost over $2500, gets smoking hot, and pales in comparison to these computers in terms of performance, design and quality. The Apple touchpads are 1,000% better than the crappy one Dell includes.
The pricing is actually very good. The lowest standard SKU will probably destroy any comparably priced Intel based laptop, and do so without having fans that sound like a leaf blower.
No it's not. You don't need these machines if you're not a creative pro using these to make a living. The Air has crazy good performance for most people.
If the performance is even close to what Apple claims here, calling this “massively overpriced” is laughable. Something is only “overpriced” if you’re being charged too much for what you’re getting. There won’t be a better price/performance combo for most users who actually NEED this much portable power.
There won’t be any other laptop for photographers. Its only competition will really be the iPad Pro, for people who can live with its limitations for ultraportable workflow. Those who want more power for on-the-road editing will be jumping all over this. Lightroom on Apple Silicon performance alone makes this worth the price.
A similarly specced Dell XPS 15 (1 TB SSD, 3050 Ti, 32 GB RAM) is in fact more expensive than the 14-inch MacBook Pro with the top M1 Pro SoC (10-core CPU, 16-core GPU, 32 GB RAM, 1 TB SSD).
Let's take a look at the closest competition. The MSI Creator 17 has a mini-LED display. It has an Intel 11800H, which is 25% slower than the M1 Max, and a 3080, which is about the same as the M1 Max. It has a 4K display that is only slightly larger than the 16-inch MacBook Pro's. The MSI maxes out at 1000 nits brightness; the 16" MBP is 1000 nits sustained with 1600 peak. The MSI does have more ports. The MSI has a 720p camera whereas the Mac has 1080p with a notch. The Mac is 0.6 pounds lighter at 4.8 lbs. The Mac supports 4 external monitors; the MSI supports 3. And the big one: the MSI's battery life is 9 hours and the MBP's is 21. So based on your comment the MSI should be around $2200-2500, and the Mac of course is $3499 with the same 32GB RAM and 1TB SSD. But wait, the MSI is actually $3499. Yup, sounds overpriced to me.
Comments like this are an opinion. For the fastest laptop ever made by a long shot, with one of the best screens ever and just about the highest battery life of any laptop, it is worth it, and it is the same price as a less competitive x86 laptop. My opinion is that it is expensive, but worth every penny to someone who needs it.
The die as sold includes either 6 performance cores or 8 performance cores. Even if the 6-performance-core model is just a binned part (2 failed cores), it is still an M1 Pro. The article should not imply that all M1 Pros have 8 performance cores, since they do not.
The base 8/14-core M1 Pro offers 6 performance cores, and the upgraded 10/14-core and 10/16-core M1 Pro offer 8 performance cores. The performance gap is likely to be higher between these two SoCs than between the 10/16 M1 Pro and the 10/32 M1 Max in most applications. Not being clear about what Apple is selling is a disservice to readers who are literally putting orders in now.
And if they’re putting in orders based on a random tech article and not looking at the specs and realizing that there are two fewer cores, then they likely won’t miss them.
Anyone who can spend $3,000 without fully researching the product they’re buying isn’t worried about money.
Anyone not smart enough to notice two missing cores in the spec sheet and subsequently investigate the consequences will likely make bigger mistakes in their life than choosing a potentially underspecced laptop.
It's true and should be noted, but let's be real. Anyone who is reaching for one of these is doing it largely for the performance (the M1 Air/Pro are already quite good for "normal" things and much cheaper). So I think very few people are going to buy the 6-core model, it's really just there for Apple to do something with some binned chips and present a lower price floor for the model lineup. Getting the base model makes little sense for almost anyone.
For someone to spend $2K on one of these and not go an extra $300 for another 2 performance cores and another 2 GPU cores would be quite insane.
It’s $550 more than a MacBook Air with 16GB RAM and a 256GB SSD. The extra $550 brings a much better screen, better speakers, microphone, and webcam, and the ability to drive 2 external displays. For some people, that might be worth it even if they don’t need the extra 2 CPU cores and 6 GPU cores.
And yet it still doesn't matter as much as you want us to believe, because the hardware is locked into an ecosystem that many have no interest in, or no ability to switch to.
I'm not going to look for a new job and stop playing all my favorite games just to be able to use a Mac with fancy new hardware.
If Apple were to just sell hardware, it would be a much bigger deal, instead they sell lifestyle, that happens to come with hardware.
Because many at work MUST have windows. Even if this mac was 10x better than it really is (it does look awesome) it is completely impossible for many industries to switch or support.
I loathe Windows as much as anyone but Windows Subsystem for Linux works quite well and in fact Docker on Windows uses it directly with no VM unlike on Mac.
I've not found that WSL provides the sort of seamless command line-native UI app integration that powers things like https://www.barebones.com/products/bbedit/benefits... and other such things. In macOS, the command line is part of a full-fledged UNIX app ecosystem that is something totally different from X Window / Gnome / KDE / Unity / etc.
You don't like MacOS fine ... this is a Hardware discussion. Not whether you love/hate MacOS.
Why do you need to get a new job to buy an MBP14/16? Is $2000 too much for you to afford? If so, maybe you should get a better-paying job. Maybe you should forget about getting a new computer and put food on the table first.
I didn't know this forum was about making ends meet because you are destitute. I feel bad that not everyone can afford luxuries in life, but this forum isn't about self-pity.
LOL. Nope. Apple's GPU, like AMD's, while amazing from a test-tube standpoint, lacks the developer support for CGI and machine learning. NVIDIA's CUDA and CGI industry ecosystem is second to none. I find it laughable that they omitted AV1 and market this as a serious film industry tool. The little one-man shops will love this, for sure, but the industry-leading pros will always use a Quadro workstation away from the studio. Wake me up when Pixar and Moving Picture Company are using MacBook Pros. I'll wait... Zzzzzzzzzz.
Pros really don't have the kind of 'fanboyism' you're demonstrating.
If an M1 Max MBP is as powerful as it seems, Pixar will write the software needed to use it. Its CPU performance makes it competitive with a 12-core Intel part and its GPU performance compares to a mobile RTX 3080; of course, maybe Pixar will prefer not to write new software and just use a Razer Blade 15 (the benched laptop) instead.
That said, we're talking about laptops. Pixar (and many other companies) already use MacBook Pros because they are portable, and use Linux hardware farms because they are cheap. The question is if a Mac Pro with a 120W part (64 GPU cores) will entice the likes of Pixar, which I doubt. Again, they use Linux farms because the hardware is cheap. I don't see a Mac Pro hitting the price/performance curve they need to render their movies.
> Pros really don't have the kind of 'fanboyism' you're demonstrating.
Exactly, except your fanboyism is showing. CUDA and Quadro are the industry standard for GPU-accelerated machine learning and rendering in the professional sphere. Apple has always been the little guy on the fringes or in niche corner cases.
You don't actually do work on the server farms... you work on a workstation or mobile workstation, then render on the server farm. The workstation just needs to be able to output in the format the server farm accepts. You've said the same things over and over in these comments, but you still don't really know what you're talking about. In case you didn't know, Pixar's RenderMan runs on Windows, macOS, and Linux; it's basically platform agnostic.
It’s clear from your comments here that you don’t know anything about modern high-end animation and vfx production. There’s a multiplicity of roles (animation, fx, lighting, layout, simulation, etc.), all with different workstation requirements. CUDA is mostly useless for an animator, say, who needs responsive viewport playback of animated characters, and which is CPU bound. An fx animator or simulation artist, on the other hand, could make use of CUDA. High end studios are mostly not using CUDA for rendering, as their huge scenes don’t fit in VRAM, and out of core memory reduces the GPU render advantage significantly. These new Mac laptops could render scenes in Octane or Redshift that are currently impractical on the vast majority of NVidia cards, due to their comparatively massive memory pool.
As others have pointed out. You are just talking out of your ass because your Anti-Apple fanboyism is showing.
If movie studios think Apple hardware is all shit, why do you think they are buying Mac Pros by the truckload? LOL.
Hope we have woken you up from your excessive Kool-Aid drinking.
Specs are specs.
An MBP16 with 64GB, an M1 Max with 32 GPU cores, and the 120Hz XDR display is gonna destroy any PC laptop you throw at it at this thermal envelope. Those are just facts, buddy. Keep coping.
Competition is good. I thought almost everyone in this forum liked a healthy dose of competition.
Intel has been asleep at the wheel and AMD is coming late to the game. Apple is turning the tables.
Time for the industry to innovate. Apple took the lead, now it's time for the rest of the industry to wake up or be left behind.
The laptops they are comparing to are crippled junk. If you want a real PC laptop, look at the Clevo X170SM, which runs a 10900K at a 5.0GHz all-core OC. That will destroy this overpriced garbage, which only beats a measly 11800H (TGL, 10nm SuperFin) with a 48W PL1 and clocks locked at a 4.6GHz max turbo. The GPU on this, as per Apple's claims, is 2080 class, not 3080. And the Clevo X170SM has a closer-to-desktop 3080 MXM GPU.
Finally, the NVMe storage: PC laptops have replaceable SSDs, HDDs / 2.5" SATA SSDs. And replaceable components if anything goes wrong. This abomination is a soldered POS design at a screaming $2500 cost, for which I could build a real desktop powerhouse that can run Linux and Windows and handle anything I throw at it. Or get a maxed-out Clevo X170SM.
AMD is going to have Zen 4 Raphael on TSMC N5 that will destroy these CPUs, and once Hopper and RDNA3 with MCM arrive, this will be obliterated on pure performance, as Nvidia is getting a 150% CUDA core count increase over GA102 on their new Hopper or whatever they call the arch.
Without a power cable? Did you even see the M1? Once you run it at full speed, the efficiency drops like a brick. It's basic physics. There's no way this SoC is going to sip power and perform around RTX 2070/2080 class. The Clevo X170SM is going to wreak havoc if unleashed, and that is 2020 CPU and 2020 GPU performance.
And just think, this is only a mobile chip! They clearly have a high-end desktop chip coming for the Mac Pro. I'm guessing it will be called the M1 Extreme. Whatever they call it, it will completely redefine Apple's position in the market, and you can bet that unified memory will play a huge role in its success.
This isn't really true. Just like you can't expect to turn a 3080 down to 100w and get exactly 1/3rd the performance you can't expect to take a 30w part and goose it up to 300w and expect 10 times the performance.
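(A crude way to see why, with purely illustrative numbers: dynamic power goes roughly as C·V²·f, and voltage has to rise with frequency across the DVFS range, so power grows roughly with the cube of clock speed.)

```python
# Toy model of power-vs-performance scaling (illustrative only, not real silicon data):
# assume power ~ f^3 because voltage rises roughly with f, so perf ~ power^(1/3).
def perf_from_power_ratio(power_ratio: float, exponent: float = 3.0) -> float:
    """Approximate clock/performance multiplier for a given power multiplier."""
    return power_ratio ** (1.0 / exponent)

print(perf_from_power_ratio(10.0))   # ~2.15x performance for 10x the power, not 10x
print(perf_from_power_ratio(1 / 3))  # ~0.69x performance at 1/3 the power, not 1/3
```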
Apple is specifically targeting this power usage. Intel and AMD are designing an arch to perform from 15W up to 200W; Apple is designing an arch to work at one main target power level (from tablet to lightweight laptop). Nvidia does the same thing in reverse, and so their laptop parts always seem to struggle on a per-watt basis vs an integrated GPU (I mean, is that really a surprise considering the laptop 3080 requires PCIe access, its own memory subsystem, etc., all of which take a significant power budget?).
I suspect if AMD decided to make an APU with similar specs it could reach similar power levels. The potential weak link with x86 is that they don't have complete control over the OS like Apple does. Of course, many of us here also consider that a feature, not a weakness.
My wife asked about the new Apple hardware and what it might mean. I said it's a technical achievement, but not one that's likely to affect us in any meaningful way, other than potentially pushing forward more competitive hardware from PC suppliers. And it's true. If you're not already an Apple "follower", it's unlikely a faster, lower-power laptop is suddenly going to turn you into one. Another poster said it would be a different thing altogether if they were releasing the hardware for direct sale, but Apple has no desire or intention to do that.
TLDR: I can put the most powerful, most efficient engine ever made in a truck and it doesn't really change much for you as a customer if what you need is a car.
Apple doesn’t need to turn up the power consumption and decrease the efficiency of its individual GPU and CPU cores to increase performance. The idea being talked about is that Apple will increase the number of cores by a factor of 4: 40 CPU cores, 128 GPU cores, in something similar to a chiplet package. It would lose some overall efficiency in performance per watt, but Apple would continue to have an advantage in power consumption and would be extremely competitive in performance.
Silver5urfer: "PC laptops have replaceable SSDs, HDDs / 2.5" SATA SSDs."
Are you a comedian? Are you saying that proper laptops only have HDDs or SATA SSDs?
Do you realise how unbelievably slow a HDD is compared to the NVMe storage on these MBPs?
Let's take one of the best HDDs on the market, a Seagate Barracuda 7200.14, but feel free to use any other HDD, how does it do at 4K random read? 770KB/sec or so. Let's call it 0.8MB/sec to be generous.
Now take one of various NVMe PCIE4.0 SSDs, the WD Black SN850 or the Samsung 980 Pro. They all max around 1 million IOPs at 4k random read, which if I have my maths right, is about 4GB/sec. Or around 5,000 times faster than your 'fast HDD'. That's the kind of drive a proper modern performance laptop should have. Which is the kind of drive the Apple MBPs have.
What about your precious SATA SSDs? SATA tops out at 600MB/s for SATA III. The NVMe drives I mentioned above do 7+GB/s. That's 12 times faster.
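(Spelling the maths out, in case anyone wants to check it; the figures are the approximate ones above and assume 4 KiB transfers.)

```python
# Back-of-the-envelope throughput comparison using the rough figures above.
hdd_mb_s = 0.8                       # generous 4K random read for a fast 7200rpm HDD
nvme_iops = 1_000_000                # SN850 / 980 Pro class drives at 4K random read
nvme_mb_s = nvme_iops * 4096 / 1e6   # 4 KiB per I/O -> ~4,096 MB/s

print(f"NVMe 4K random: ~{nvme_mb_s:.0f} MB/s, ~{nvme_mb_s / hdd_mb_s:.0f}x the HDD")
print(f"Sequential: 7000 MB/s NVMe vs 600 MB/s SATA III = ~{7000 / 600:.0f}x")
```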
As for the rest of your claims, you're making them before this new MBP has even been reviewed and independently benchmarked. And you're also comparing it to your imaginary Zen 4 Raphael which hasn't even been released yet.
You're full of the worst sort of FUD and vapourware combined with an extremely poor grasp of how tech is used in the real world.
Dude, what are you on? I said SATA HDDs or SSDs, which is the SATA III standard. PC laptops already have NVMe, which I already mentioned. You are strawmanning to the peak; go do your BS elsewhere. A soldered POS is a soldered POS; nothing is going to change that fact. 8TB of soldered junk: if it goes kaput, you and your precious $6000 laptop are junk.
Apple's SSDs are always encrypted since T2 (2018), and the key is stored in the Secure Enclave, so you won't be able to read the chips without a functioning original SoC w/ its own Secure Enclave.
With FileVault enabled (which you can do on first startup when you create your first user account), not only the original SoC is required w/ its hardware key, but also the user's password.
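(For the curious, a minimal sketch of checking FileVault status from a script; it assumes macOS's standard fdesetup tool is available, and only reads status.)

```python
import subprocess

# Query FileVault full-disk-encryption status via the built-in macOS 'fdesetup' tool.
# This is read-only; enabling or disabling FileVault would require admin rights.
result = subprocess.run(["fdesetup", "status"], capture_output=True, text=True, check=True)
print(result.stdout.strip())  # e.g. "FileVault is On."
```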
Hahahahaha who in their right mind would compare that Clevo machine to the new MBPs? The Clevo has nearly 600W TDP, requires 2 PSUs, and probably can't run for more than 30 minutes unplugged under workload. That thing is a monstrosity and will still be way worse at a lot of workloads. Also what does that say about the MBP's that THAT is the computer you have to compare them to?
Not sure how to break it to you, but the 11800H at 4.6Ghz is faster than the 10900K at 5.0Ghz - and when it comes to perf/watt this chip kicks the snot out of both.
"Closer to desktop" is a relative term for a mobile 3080, they're all based on the GPU that powers the 3070 because Nvidia blew the thermal budget.
The soldered-down criticism is pretty valid (although not so much for a professional workflow where service contracts and network storage come into play) but you really just come across as a fanboy when you act like this device is going to come out looking bad in direct competition with a *4.7Kg* DTR that pulls 500W and produces 62dB of sound under maximum load.
I'll certainly be interested to see how it compares to AMD CPUs and GPUs on 5nm, but it's a little bit like comparing an F1 car to a Bugatti Veyron - they do different things in different ways. Apple are going for maximum efficiency in specific tasks under their own OS, AMD can't optimise to that extent.
Okay... Supreme performance is one thing, but why does M1 have to have a list of games it can play, rather than the entire steam library? Why is there still software, like Adobe illustrator, that still can't run perfectly on M1. Why does M1 still have SSD thrashing issues? Why can't you upgrade the SSD or RAM on your M1? It would be a shame if something were to happen to the batteries, like degradation.
AMD and Intel already have good enough mobile APUs for most usage cases. So they can't run fanless, or at full power while on battery, so what? At least they can run a majority of games and apps flawlessly.
You brush off the "it's too expensive" talking point, but you are aware that the overwhelmingly vast majority of people don't spend more than US$700 for a laptop, right? Like, this is literally where AMD, Intel, and MS make something like 90% of their profit from. You know this, right?
Now, If Apple were to lower the price of vanilla M1 macbooks to around US$650, then sure, AMD, Intel and MS should be really scared. But with the current software and price limitations? pfffffft yeah, whatever.
I can't believe these arguments still happen. Value is in the eye of the beholder. Clearly myriad people place high value in the seamless, well-designed, secure ecosystem that Apple provides with their hardware/software/services combination. Meanwhile other people don't see the same value, and prefer products with more fundamental modularity and lower prices, at the expense of not having premium/cutting-edge industrial design and components (mini-LED, M1 Max, etc).

Apple products are designer, boutique products; they will almost always be owned by the minority and will continue to be seen as 'premium' options. The vast majority will of course not own an Apple laptop, and I don't think Apple or any Mac fan has a problem with that.

There are literally next to no arguments you can win with many Mac/Apple fans, because there's simply a totally different understanding and appreciation of what value is to them vs others. So rather than argue about what's better (Win vs Mac, Apple vs Android, etc), just accept that people like what they like for the reasons they choose, and arguing helps nobody.
I think that there's also a bit of elitism at play here. A lot of comments in here come across almost as if they pity macOS users for somehow not being as enlightened, buying 'overpriced crap', and it's their civic duty to protect the sheeple.
The Creator 17 is the first and only other miniLED laptop I'm aware of, and it's coincidentally priced almost the exact same as the new MBP. But yes, that does appear to be the only semi-comparable machine on the market right now.
The MBP has a higher peak brightness, better webcam, better battery life, and is thinner, but the MSI has lots more ports and is (presumably) upgradeable.
2. The selling of new Macs capped at 8 GB of RAM, while simultaneously refusing security fixes for machines that have 16 GB?
3. The removal of MagSafe in previous generations for an inferior solution?
4. The defective butterfly keyboard?
5. The rapidity of software incompatibility, due to frequent changes of the operating system's innards?
6. The ever-increasing, increasingly aggressive spyware?
7. The not-enough-ports strategy to create a dongle-selling side business?
8. The inefficient touch bar?
9. The removal of WindowShade, which was better than Dock minimization in many cases?
10. The inability to turn off button flashing (something that could be done in System 6, for goodness' sake)?
11. The addition of extremely irritating window shake, with the shaking not being able to be turned off?
12. The inability to disable Dock minimization animation?
13. The stupidity of not having two separate docks, one on the left side of the screen, and one on the right, so that application position doesn't change (interfering with muscle memory) and new users aren't confused about where things should be placed? Also stupid is placing it along the bottom of the screen given the fact that vertical real estate is worth more generally.
14. The incredible number of hidden background processes, a huge number of which phone home with data that really shouldn't be transferred?
15. The inability to have good control over file metadata without resorting to 3rd-party software (and even then)?
16. The lack of adequate documentation of the operating system?
17. The inability to truly erase files and do secure erase on SSDs?
18. The inability to fully disable automatic sleep? (Perhaps this has been fixed but in Catalina it's not.)
19. The ugly new icon shape and loud + ugly new icons (versus skeuomorphic) to go with garish backgrounds?
20. The APFS file system's extreme inefficiency with hard disks?
All sorts of remote monitoring and control tools are included in the operating system by default, rather than being able to be manually downloaded and added by the 'owner' of the machine.
Apple will say it's for ease of use. I say these things are for a certain type of ease of use.
None of the remote monitoring tools are enabled by default. They have to be enabled by the owner of the machine with administrator privileges, password, etc. How is this different from Remote Desktop Connection included with Windows 10?
1. Windows having the same sort of spyware in it is irrelevant.
2. Closed-source operating systems with various kooky bits of hardware (like the T2, and CPUs with black-box processors built into them) can't be relied upon to enable and disable things according to the intentions of the 'consumer'.
As for the ad hom at the end of your post... citation needed.
Scared of what? These laptops are thousands above what the vast majority spend on one. These represent a tiny fraction of sales. These chips are the equivalent of a RTX 3090. Absurdly powerful and absurdly expensive.
This. Anyone who thinks this isn't part of the goal is fooling themselves. Apple wants complete control of the stack, top to bottom. MacOS is still far too open and old school in its design. They want *all* software to be funnelled through the app store.
This is an amazing chip, but its relevance doesn't leak outside the Apple ecosystem very much. I would suggest it's a much bigger concern to Android than to Windows or Linux. Linux will continue to dominate the server market, and Windows will continue to dominate the office and home user market, and certainly the PC gaming market.
Hopefully the combo of this and the (hopeful) success of the Steam Deck shows AMD and Intel that there is a market for a really efficient *and* powerful SoC as a portable gaming system.
In the next 6 months 5nm TSMC Zen 4 based EPYC CPUs designed around 3D MCM will be splashed all over these pages. In an additional 4 months the first 5nm Zen 4 3D MCM Workstation class TR chips will be available. The conservative claims by Lisa Su and company a year ago were 50% improvements over the previous Zen solution.
During these months the newly merged AMD-Xilinx will be revealing to the world their designs and some new accelerator solutions that Xilinx currently shows are best of breed. Nothing Apple has is a long-term industry first. I know the folks. I love my old company, and I'm a lifelong OS X user from the days when I worked at NeXT.
ARM isn't going to dethrone x86, and Apple couldn't care less about dethroning Windows. The company most concerned is Intel. It's got a lot of old talent ready to retire.
Microsoft is steadily diversifying itself and continues to expand and grow its valuation. It's not scared.
AMD in the next two to four years will be a >$500 billion corporation with a massive portfolio of markets to continue expanding in.
Apple will continue expanding its EV projects and eventually have their line show up, into a highly contested industry with thin margins, so Tim and company will be a niche player, by design, and continue expanding its services divisions.
General consumer computer purchases do not have the world leaping onto Macs the way it did with the iPhone or the iPod.
Apple will never enter the enterprise markets. We were fully prepared to do so until Steve saw the books and realized the only way this company survives is through the general consumer.
Another company scared is Nvidia. They are banking their future on selling ARM based CPUs to manage custom solutions for AI, CFD, FEA, and other scientific markets that they can sell truckloads of custom 3D MCM future SoC designs that they can then charge massive license fees for ala Oracle.
Without the ARM merger, long term, Nvidia will go the way of SGI. Sold off, piece by piece.
It will be interesting to see if there are any under-the-hood improvements to the cores. We'll see when the single-core CPU specs become available. Also, it appears Apple is binning aggressively, with chips with 6 vs 8 performance cores, and multiple GPU configurations.
No. They compared it to laptop GPUs. ;) This seriously lacks CUDA, video codecs (O AV1, AV1, wherefore art thou AV1?) and has just a third of the TFLOPs of a stock RTX 3080.
A laptop with 1/3 the TFLOPS of a desktop RTX 3080! Or equivalent to an RTX 3060, really, which is roughly the same performance as the mobile RTX 3080.
@michael2k no idea as to the power numbers or whether those are peak figures. We'll see once proper benchmarks come out; I would hazard it can peak to an RTX 3080 in specific workloads tailored to it.
@Blark64 focus on AV1 over HEVC: AV1 is about 45% better on compressed size at the same decoded quality for "movie" quality at 4K, and nears 50% at 8K. Normal proxy footage is 1080p for most; that's around 32% smaller for AV1 vs HEVC at the same quality. (I'm looking at MacX and a comparison paper titled "A comparative performance evaluation of VP9, x265, SVT-AV1, VVC codecs leveraging the VMAF perceptual quality metric" from June 2020.)
SVT-AV1 from Netflix and Intel is great though; it brings software encode/decode to only 2x hardware encode/decode, so I don't think Apple needs a hardware encoder/decoder tbh. It will be fine for most, considering the performance of the device.
@RSAuser Sure, AV1 has some efficiency advantages over HEVC, at the cost of higher computational requirements and limited hardware and platform support. It's a wash. The more important point I was making is that the OP was focussed for some reason on a delivery codec, not an editing/production codec, which is a focus of the coverage of these new machines (ProRes encode/decode blocks). The two classes of codec have very different goals and functional requirements, which people unfamiliar with TV/Film/VFX production rarely understand. Production codecs prioritize lossless or visually lossless quality, symmetric encode/decode, intraframe compression, high bit depths, alpha (or depth) channel support, and dynamic range. Delivery codecs prioritize lossy compression, efficiency, and asymmetric encode/decode (meaning long compression times, lightning fast decode).
Why the focus on AV1? AV1 is a nice quality delivery codec, but really no better than Hevc/mp4, and completely irrelevant on the Mac platform. What AV1 is not is a pro codec: it’s asymmetric and interframe, which kills editing performance. That’s why these new macs have hardware support for a pro codec (Prores), and the M1 has already shown that it has the grunt to decode other pro codecs like Redcode Raw in real time. In other words they are meant for content creators, not consumers.
Totally right about the difference between capture/ingest and delivery/streaming codecs. Very few tech people who aren't media professionals get that. The focus on AV1 is of course openness and a lack of any royalties on the consumer side.
HEVC is great and VVC looks better, but the issue is the MPEG LA and the licensing costs and the patent pool. Apple is a founder member of the Alliance for Open Media - it's hard to find a company that isn't on board.
ProRes is a totally different beast - it's looking like the Sony Betacam /HDCAM/XDCAM of the future.
Why though? I run most of what I run on Linux pretty much natively on macOS. A lot of things cross-compile, especially when it comes to development.
Linux doesn't destroy the viability of useful machines via the withholding of security patches nearly to the degree Apple (and now MS with 11) do. That's why.
Shockingly slow? It's been a year since they started, and they are still not done. To me this indicates that it is quite hard. And IMO that's enough reason not to buy these Apple machines as sexy as they may otherwise be. If they want our business, they should support Linux rather than making it hard. And that's why our group has not bought an M1 Mac despite some of us considering it.
Apple is Unix. Tools like Homebrew (disclosure: I had a small hand in converting it to native ARM during the NDA pro dev kit era) mean that if the code is open and available for Linux, it will almost certainly compile and run on ARM. As a side note, I helped port an open x86 macro assembler to ARMv8 via brew... where it still compiles to native x86/x86_64 code :)
My 10-core/16-core 16GB 1TB 14" is ordered. Can't stand my 2020 13" MBP 2.3GHz i7; it gets hot doing somewhat banal tasks and is literally uncomfortable to use as a laptop some of the time.
Decided to hold back and not go all-out -- I'll wait to do that when the M2 Pro/Max are out in a year or two, hopefully w/ Arm v9 and its SVE instruction set.
The M2 would be using the same cores as the A15 chip found in the iPhone 13. Not shabby at all, but Apple isn't going to be rolling out SVE2-enabled chips for the Mac next year. Beyond that is anyone's guess, but Apple hasn't shown interest in developing a Mac-only CPU core design. The biggest benefits of SVE2 are currently targeted by dedicated accelerator blocks in Apple's SoC designs.
Gaming Laptop when Apple? Most of us plebs don't have any use for your 16/32 core GPUs. Of course we shouldn't and wouldn't be buying a Mac for gaming anyway but it'd be cool to see Doom Eternal running at 60+ fps on M1 Max.
Apple has never really shown any interest in gaming, and all the years of alienating the ecosystem won't be turned around over night, even if they tried.
The first piece of software for the Lisa was a game. It was called Alice.
It became the first piece of software for the Mac.
After that, though... gaming was 'deprecated'.
Jobs wowed the crowd with Halo running on a smurf G3 with the first Radeon card. Vapourware. It is rather hilarious to watch that presentation, given how much Jobs wanted everyone to believe Apple was serious about games.
> Apple has never really shown any interest in gaming
I don't think that's entirely true. While they haven't been doing anything with gaming on a Mac, they've made Metal a quite nice framework on iOS.
And now it's available on Macs, complete with all that GPU power.
Also: if Apple avoids silicon shortages, they might be selling some of the most affordable (!) gaming-worthy hardware in 2021 and 2022. It might resonate well with game publishers.
The golden age of Apple Silicon Mac Software isn't here yet. Let's hope that when that day comes, game makers are on board. Just don't hold your breath.
Apple simply don't care about gaming on the macOS platform. iOS however is a totally different story because it's aimed at a very different market segment.
Apple have chosen some critical market segments where they can make the most money and they have executed on those plans brilliantly.
Gamers are the exact opposite of people who use professional work machines; content creators, executives, people whose time is precious and consequently price is less of an issue.
Bluntly, Apple's target demographic make money on their machine by saving time. Gamers, generally speaking, try to save money on buying a machine to spend time.
"AMD advertises 26.8bn transistors for the Navi 21 GPU design at 520mm², Apple here has over double the transistors at a lower die size."
Why not mention that AMD is at TSMC 7N while Apple is on 5nm?
"In terms of performance, Apple is battling it out with the very best available in the market, comparing the performance of the M1 Max to that of a mobile GeForce RTX 3080, at 100W less power (60W vs 160W). Apple also includes a 100W TDP variant of the RTX 3080 for comparison, here, outperforming the Nvidia discrete GPU, while still using 40% less power."
Did you actually look at the MSI GE76 Raider? It's crippled RTX 3080 garbage, which is around 2080 performance. Not desktop Ampere level. And again, it's on Samsung 10nm / Samsung 8N vs TSMC 5nm.
I think these don't matter because it's AnandTech and Apple coverage.
Right, Andrei didn't specifically call out why Apple is leading in this area, so it's like they're not even leading? Apple is cheating by using better technology. It's not fair I tells ya!
About the CPU performance: why not even look at the processors used? Just saying "massively" doesn't cut it, tbh. Those are crippled BGA i7-11800H processors with a puny 48W TDP and a 70W short-duration power limit, locked at 4.6GHz max turbo on top of that. x86 parts scale up with power, so on a BGA board they are literal trash. Unfortunately the market is not like that, and people love to buy them.
I hope when you review these you will make everything clear, e.g. running CBR23 on, say, an 11900K or 12900K or 5950X with full processor package power on a sustained load, versus the same for the M1X processors at stock and fully unlocked.
Also, a notch on a laptop? Insanely stupid. Finally, the price is $2000 for a fully soldered design with no consumer rights at all, a completely locked-down OS, and an ecosystem black box.
As a consumer, I prefer my right to have a secure machine to your right to demand that my machine must be vulnerable so that you, if it was yours, would be able to do whatever you wanted on it.
You can't remove the storage, the RAM, and the battery.
This means that you have to rely upon a closed-source operating system with various odd hardware bits (like the T2 chip) in the mix — for file deletion.
The APFS file system has an interesting way of handling file deletion, and even prior to APFS, forensics people had documented shenanigans like deleted files (e.g. from the hidden FSEvents folder) being left in unallocated space for later retrieval.
Does APFS do anything different when deleting files? Because most file systems don't actually delete anything, just the entries from their master tables. Concerning the fsevents log, even NTFS has something analogous: the USN Journal, introduced in Windows 2000.
It’s being compared to mobile BGA Intel CPUs because it’s a laptop-class SoC replacing (and now competing with) mobile BGA Intel CPUs. This seems incredibly obvious.
Apple isn’t selling desktops with unlimited power budget containing this silicon … yet. So comparing performance to Intel chips with unlimited power budget would be useless. Why compare what’s been designed for a laptop to a 280W CPU, when that 280W CPU will never show up in a laptop?
"Why not mention while AMD is at TSMC 7N and Apple is on 5nm ?"
Noted and clarified. The point was more about the transistor count than the die size, but it never hurts to add more details.
"Did you look at the MSI GE76 Raider actually ?"
Yes. We've even reviewed it. It's the most powerful gaming laptop we've seen so far, typically coming out well ahead of the next fastest laptop GPU, the Radeon RX 6800M.
The MSI GE76 might be one of the most powerful ones, but the real one is the Clevo X170SM, which is the king - and on a Z490 chipset with fully socketed hardware, so it can run everything at max: an LGA CPU at fully unlocked power limits and a 2080 SUPER MXM GPU without any limits, and if purchased from a good reseller you get a fully unlocked BIOS too. Unfortunately, with the market being so small, they didn't refresh it with Ampere.
I'm struggling to understand the desire to compare the monstrosity that is the Clevo X170SM (over 10 lbs) with this MacBook Pro (4.8 lbs) as though they're in the same class.
Price - at $2000+, if anyone is looking for what people here are expecting, i.e. something to dethrone AMD and Intel in the PC space, everyone is going to look at what options they have.
Performance - the Clevo X170SM-G has a G-Sync ultra-fast IPS display, and on top of that it has a big heatsink which can handle a 10900K at a sustained, constant power draw of 250W+, maintaining an all-core 4.9GHz-5.0GHz clock speed. That's a 10C/20T processor with no BS power caps like the ones used here, where the 11800H is held to 4.6GHz. Plus many can simply buy a 10600K and upgrade later, or even add an 11900K (this is hard to cool in that chassis, but it can still manage). The GPU is a 200W-capable MXM module as well, which is not going to throttle like most of the BGA junktops.
Next, this MBP: you guys think this is going to consume a 20W load as Apple states and give you 2080 (TU104) performance? And do that constantly, without throttling? Same for the CPU: magically this M1X can deliver full no-throttle, no-C-state max performance at just 30W? The MBP has a 140W brick, and that would probably be capped at 96W-120W for this machine.
And on a 96Whr battery, you guys think this is going to constantly hold its high clock speed on a denser 57bn-transistor TSMC 5N die? No, it's not going to happen. The M1 already throws out its efficiency when put on high workloads; this will repeat the same. Why do you think Apple is advertising the fans on this laptop, etc.?
To put it shortly: the X170SM has bad battery backup vs. this, true. This one has good efficiency, but in peak performance it is not going to compete at all. On top of that, for the GPU and CPU workloads, where did Apple perform the benchmarks? Don't say Final Cut Pro; ARM-based designs have dedicated blocks for encode and decode. We do not have any benchmark like Fire Strike or Unigine Superposition or any great AAA title to compare these. Not even sure what Apple is claiming here.
Finally, the user servicing. Everything is darn soldered on this. Ever seen a MacBook teardown? Go and watch, and see how the keyboard and battery are glued to the chassis. Ever heard of Louis Rossmann? Search and see what Apple charges for a simple IC repair, instead throwing a motherboard-swap cost at the user - and on top of that the storage is soldered. That's a big-time no. Paying over $2500 and having a hunk of junk in case of an issue is not at all acceptable.
You remind me of the guys who remap the ECU on their car to get an extra 50bhp for the low low price of doubling their fuel consumption and getting 1/4 of the life out of the engine. You're welcome to it, just don't think that obsession with getting the last little bit of performance out of your system represents most users.
"M1 already throws out it's efficiency when put on high workloads" As far as I know that is not the typical behaviour under high workloads. M1 CPU power consumption on most high demanding task overs around 20W, or even 15W.
Have you heard of Louis Rossmann? Because he talks about Apple products and how they are susceptible to dihydrogen monoxide infiltration and soldered to the motherboard, and I’d rather have a Clevo 'cause it’s heavy and I’ll get really jacked?
It will be interesting to see the Clock Frequency of the CPU. Judging from the bump the A15 got, perhaps 10% more might be possible here? And how much will the massive caches contribute?
I've been following CPU chips since 1968, and first with the M1 jump, and now with today's announcements, I don't recall ever seeing such a large step change. With all due respect to Intel's leadership role in the earlier years, the cynical comment would be that Apple has proven that Moore's law is not dead, but Intel is. Fortunately, there are enormous resources available there, which should keep competition healthy.
As we look to the coming year of Apple silicon, the logical thing to do is the iMac and Mac Pro. We can then speculate whether, the year after, Apple plays the final card by applying its chip efficiency prowess to server farms, where the green benefits would be huge. Whether a consumer-oriented company would do this or not is an interesting question, but the societal impact would be significant, so one could argue that they have a moral obligation to do it.
Yikes, there's a bunch that spring to mind! 286 to 386 to 486 to Pentium. The Pentium jump was huge! K6 to K7, P4 to Conroe, 'dozer to Zen. Hell, back in the day a CPU could be shrunk *twice* over its lifetime.
We should hold off a bit on judgment until we see real-life benchmarks come in from the various review sites. Having said that, if the review-site consensus confirms Apple's claims, then this shift could be bigger than when Intel "conroed" the CPU market, and I agree with Carstenpxi, it could very well be the biggest CPU leap step. If it holds true, it shows the limits of the x86 incremental approach (like we did not know that already) and what is possible with large financial muscle and the market position to get access to the most advanced process on the planet, prioritized by TSMC (through Apple's $$$) ahead of AMD and Nvidia (both of which are a fraction of Apple's market cap and revenue). As a business and tech guy you must admire Apple's strategic play here... and this is just the beginning!
More fundamentally, taking a few steps back and looking at the complete picture:
this is a game of financial muscle, so you have to look at the numbers: Apple market cap and revenue vs. Intel market cap (220.94) and revenue.
Comparing market cap to revenue for Apple and Intel based on your numbers, that's quite impressive for Intel. Or it just shows how little perceived value the public assigns to Intel compared to Apple. But when considering market cap for the massive companies, like Apple, I don't think you can really use it as an indicator of performance. It exists as a speculative measure for the big, big companies. Intel is not a big, big company, so comparing revenues and margins is a better metric. With the numbers you posted, Apple is *only* 4x bigger than Intel. But then, they can always sell a bit more stock to raise a crap ton more capital too!
I'll give you 8080 to 8086. That was a massive step up, but if you remember 486DX4 at 100MHz or AMD's 133MHz equivalent, the Pentium 75MHz seemed a bit on the slow side. It did of course take huge strides forward with every new clock increase.
And if you've been following CPUs since 1968, you're doing better than me. Was that in DTL logic? That's pre Intel's 4004. It's even pre Texas Instruments' TTL-based 4-bit Arithmetic Logic Unit - the 74181, which powered those old PDP-11 and VAX machines! That chip didn't get released in TTL till, I think, late 1969 or early 1970!!
Arguably the key story is the use of LPDDR5. Intel's Tiger Lake Core i7-11800H may be 8 cores, but it has only 2 channels of DDR4-3200, so it is amazing it can even post 60% of the perf of the M1 Pro. With 25% of the memory throughput, it would be pouring power into speculation and other Core tricks to try to compensate.
DDRx memory is a trap for CPU designers. It has a purely unwarranted reputation as the ideal. Apple are way out in front because they realize that LPDDR actually has far higher performance potential at much lower energy per bit. All you need to do is focus on how to package it. Perhaps Intel gets it by now.
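A quick back-of-the-envelope check of the bandwidth gap described above; this is only a sketch using nominal spec numbers (2 x 64-bit DDR4-3200 for the i7-11800H, and the 256-bit LPDDR5-6400 interface reported for the M1 Pro), not measured figures:

```python
# Theoretical peak memory bandwidth: (bus width in bits / 8) * MT/s.
def peak_bw_gb_s(bus_width_bits: int, transfer_rate_mt_s: int) -> float:
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000

tiger_lake_h = peak_bw_gb_s(128, 3200)   # 2 x 64-bit DDR4-3200 -> 51.2 GB/s
m1_pro       = peak_bw_gb_s(256, 6400)   # 256-bit LPDDR5-6400  -> 204.8 GB/s

print(f"i7-11800H: {tiger_lake_h:.1f} GB/s")
print(f"M1 Pro   : {m1_pro:.1f} GB/s")
print(f"ratio    : {tiger_lake_h / m1_pro:.0%}")   # ~25%, as stated above
```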
>so it is amazing it can even post 60% of the perf of the M1 Pro
To be fair, very few consumer CPU workloads are memory-bandwidth-bound: ideally, they don't even want to go to DRAM, right? Before today, only the AMD EPYC / Threadripper Pro behemoths, relatively speaking, had eight-channel DDR4, and there are some workstation benefits, so perhaps this is where the M1 Pro / Max are positioned.
But, gosh: what Apple's done with the LPDDR5 @ 400 GB/s is much more important for the GPU. I'm still hesitating to write it, "Wait, is that a typo? 400 GB/s?!"
Apple's cache sizes are the real story -- they're absolutely massive, and extremely local at the expense of the most efficient per-transistor layouts (or their layout is genius). If Intel spent that much die space on caches, they'd go out of business just off of the cost per die of their sub-$400 segment.
The only route to competing with this madness is with chiplets in laptops, and hope the interconnect doesn't hose idle power.
The CPU performance claims are probably reasonable, but GPUs aren't held back by legacy instruction sets in the same way that CPUs are. How is the GPU performance/watt claim even remotely realistic?
As long as it doesn't say 'edited' in the post. That's really annoying for someone who makes plenty of typos. It makes posts seem suspicious when, in reality, they're posts that someone took the time to clean up.
Apple could have reduced weight by including a smaller battery. I, for one, am very glad they haven’t. I’m totally cool if the pro machine is heavier in exchange for battery life and power.
If I want a light machine, I’ll get an Air. (Keep in mind that the current Air’s enclosure has not yet been designed for the much lower TDP of the M1, so I reckon a redesign will be a bit lighter.)
For the performance? No, not really. It's neither extremely light nor particularly heavy; you bottom out at around 1Kg for a low-performance 14" device and go up to 2Kg for something with GPU performance that will slightly exceed the M1 Pro.
I hate you, Apple: you surpassed my expectations. 👏 I look at AMD's APU in the PS5/Xbox as the best innovation currently in consumer PCs, but considering AMD's humble size, a tech company bigger than AMD can outdo it, and this is exactly that. I've been preaching recently that AMD should immediately build Arm chips with huge AMD graphics, where the end game is selling these to Apple, as they don't have graphics prowess; turns out this game plan is not a walk in the park.
My thoughts echo yours. The console chip design from AMD is something that needs to be replicated fast in the PC space. The era of discrete GPUs is very fast approaching the end, and the PC needs highly integrated SoCs with large memory bandwidths like Apple's to compete on the efficiency front.
I wonder if Qualcomm's Nuvia team is going in this direction with their Apple-esque CPUs? It will be exciting next year to see how these turn out and to see how they're adopted into the Microsoft PC world.
Not quite related to the SoC specifically, but anyone know if Apple's ProMotion refreshes on demand a la gsync, or does it change between fixed refresh rates on the fly?
So they are going all-in on soldered die-carrier memory - no expandability: that, together with the large pools of last-level SLC, gives them GDDRx/HBM-like bandwidth with LPDDRx-like latencies (as well as huge DRAM-related power savings), which is great as long as your CPU/GPU workload demands lie right on that linear line of CPU/GPU-core/RAM capacities for their 1x/2x/4x configurations.
The M1x chips basically become appliances in the four basic sizes (I guess some intermediate binning-related survivors will round out the offer), which I've imagined for some time via a PCIe or Infinity Fabric backend using AMD "APU-type" CCDs, HBM/GDDRx and IODs.
What they sacrifice is the last vestiges of what had me buy an Apple ][ (clone) and let me switch to the PC afterwards: the ability to make it a "personal computer" by adding parts and capabilities where I wanted them throughout its life-cycle (slots!).
I can see how that wouldn't matter to most, because they can fit their needs into these standard sizes, especially since they may be quite reasonable for mainstream (my own systems tend to have 2x-4x the RAM).
Of course it would be nice if there still was some way to hang a CXL, IF or PCIe off the larger chips, but Apple will just point out that this type of compromise would cost silicon real-estate they prefer to put into performance and interest only a few.
Of course they could still come out with server variants sans GPUs (or far reduced parts) that in fact do offer some type of scale-up for RAM and workstation expandability. But somehow I believe I get the message, that their goal is to occupy that productivity niche and leave everything else to "niche" vendors, which now includes x86.
Well executed, Apple!
And believe me, that doesn't come easy to someone whose professional career has been x86 since 1984.
I still don't see myself buying anything Apple, but that's because I am an IT professional who has built his infrastructure tailor-made to his needs for decades, not a "user".
I'd get myself one for curiosity's sake (just like I got myself a Raspberry Pi as a toy), but at the prices I am expecting for these, curiosity will stop short of getting one that might actually be usable for something interesting (the M1 Max), when I get paid for doing things with CUDA on Linux.
Getting enough machine-learning training power into a battery-operated notebook is still much further away than electrical power anywhere I sit down to work. Just like with "phones", I barely use the computational power or battery capacity of the notebooks I own. My current Ryzen 5800U is total overkill, while I'd happily put 64GB of RAM in it (but it's 16GB, soldered). So if I actually do want to run a couple of VMs at a resort, I'll have to pack the other, slightly heftier one (64GB, and it will do CUDA, but not for long on battery).
I can probably buy two or three more, add 8TB NVMe and double RAM on each and still have money left vs. what Apple will charge.
Yes, they won't have as much power per watt taken from the battery, but that does not matter to me... enough to get my Raspberry a fruity companion ;-)
OK, so now that we've all got it out of our systems - costs too much, sucks compared to team Intel/AMD/Nvidia, doesn't include <weird specialist feature I insist every computer on earth has to include> - let's try to return to the technology.
Note the blocks (in red) at the very bottom of the M1 Max. On the left-most side we have a block that is mirrored higher up, above SLC and just to the left of the 8 CPUs. Next we have a block that is mirrored higher up above SLC, to the right of the 8 CPUs. Apple tell us that with the Max we get 2x ProRes encoders and decoders. Presumably those are the blocks; one minor question of interest is whether those blocks are *only* ProRes or are essentially generic encoders and decoders, i.e. you may get double the generic media encode/decode on the Max, which may be useful for videographers beyond just ProRes users?
It certainly also looks like the NPU was doubled. Did I miss that in the event? I don't recall Apple saying as much. (It also looks like the NPU -- or something NPU-relevant -- extends beyond the border drawn by Andrei, when you compare the blocks in the two locations.)
Finally we get the stuff at the right of the Max's bottom edge, which replicates the area in blue above the NPU. Any suggestions? Is that more NPU support hardware (??? it's larger than what Andrei draws as the NPU)? Lots of SRAMs -- just by eye, comparing it to the P-cluster L2, it could be 16MB or so of cache.
So this all suggests that (a) with the Max you also get doubled NPU resources (presumably to search through more video streams for whatever NPUs search for -- faces, body poses, cats, etc.)
(b) the NPU comes with a fairly hefty secondary cache (unless you can think of something else that those blocks represent). Looking at the M1 die, you can find blocks that look kinda similar near the NPU, but nothing that's a great match to my eyes. So is this an additional M1X change, that it comes with the same baseline NPU as the M1/A14, but augmented with a substantial NPU-specific "L2" (which might be specialized for training or holding weights or whatever it is that people want from NPUs)?
Well, I love your speculation, but on the Apple shop page, the SoC configuration makes no difference to the core count of the neural engine; it remains at 16 "cores" for all three variants, M1/Pro/Max.
You may argue unique differentiation for the M1 SoC and how they do RAM with it, but SSD storage is just a commodity. And all their cleverness about using DRAM to produce GDDR5-class bandwidth leaves a bad taste when they sell it at HBM prices.
Usury around here starts at 20% above market price, and Apple is at 200% for SSD and RAM.
After the minimally interesting config got me beyond €6000, my curiosity died.
Apple used to charge absolute rip-off prices for bog-standard SODIMMs in their models. By comparison, this new pricing of $400 for a 32GB RAM upgrade to 64GB is actually not too bad.
This is NOT DDR4 RAM. This is LPDDR5 RAM, and it's the first consumer laptop / desktop in the world to run LPDDR5. You're paying for that first-adopter advantage. (There are some very recent phones that run LPDDR5, but I think they max out at 12GB and use a slower variant.)
Mid-range DDR4 for desktops seems to run at about $5/GB for 2 x 16GB. But go up to 2 x 32GB, and suddenly it's around $10/GB especially on the high end, so you're looking at around $320 for 32GB of fast high-end DDR4.
The M1 Max runs extremely fast extremely specialist RAM that is a generation faster than DDR4 / LPDDR4, and the 64GB is concentrated down into only 4 on-chip RAM modules.
Getting that at only $12.50/GB for the extra 32GB is a bit of a bargain at this point in time.
(I previously said this was DDR5 RAM, I was wrong. As for storage, yes $400/TB is stupidly steep and shouldn't cost so much extra even for fast pcie 4.0 storage. That's more or less a commodity by now.)
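For what it's worth, a small sketch of the per-gigabyte arithmetic above (the DDR4 street prices are the rough estimates quoted in the comment, not actual quotes):

```python
# Per-GB cost comparison using the figures quoted above.
apple_upgrade_usd = 400                   # Apple's 32 GB -> 64 GB option
apple_per_gb = apple_upgrade_usd / 32     # $12.50/GB for the extra LPDDR5

ddr4_midrange_per_gb = 5                  # rough estimate, 2 x 16 GB kits
ddr4_highend_per_gb  = 10                 # rough estimate, 2 x 32 GB kits

print(f"Apple LPDDR5 upgrade: ${apple_per_gb:.2f}/GB")
print(f"DDR4 2x16GB kits    : ~${ddr4_midrange_per_gb}/GB")
print(f"DDR4 2x32GB kits    : ~${ddr4_highend_per_gb}/GB "
      f"(~${ddr4_highend_per_gb * 32} for an extra 32 GB)")
```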
Remember there's probably like 1 or 2 NAND options they've rated their SoC-internal controller for, which means they've probably had to select for top binning on both power and performance due to sharing architecture with the M1 iPad Pro.
Which, honestly, is less of a sacrifice than I expected from Apple's ARM transition. This whole thing has been disturbingly smooth even considering the software incompatibility lumps.
Thanks for the lecture, but I actually already admired their creative super-wide four-channel interface for the M1 Max, which unfortunately sacrifices any external expandability.
Still, while it's a special part and needs to be managed with a complex die carrier and assembly, in overall cost it's relatively normal technology and thus normal cost; everything involved is a high-volume item.
So they make their typical >200% margin also on the DRAM.
One charges a premium for a halo product. That is simply what one does; it's a very simple economic calculation. I expected the upcharge to go to 64GB to be north of $1k; it was not, it was $400.
> Note the blocks (in red) at the very bottom of the M1 Max. On the left-most side we have a block that is mirrored higher up, above SLC and just to the left of the 8 CPUs. Next we have a block that is mirrored higher up above SLC, to the right of the 8 CPUs.
My guess is that these blocks are for redundancy. Apple already does a lot of binning, and having some IP blocks as spares should further increase yield. Note the four identical blocks in the upper right corner? The M1 has two of them, and I think these are four Thunderbolt controllers with only three of them used.
The fact that the Pro looks pretty much like the upper part of the Max makes me wonder whether they only manufacture Max dies and then cut off the lower part if it's going to be a Pro. It's a tradeoff between two manufacturing runs and only one run with, admittedly, lots of wasted silicon. I guess we will see actual die shots at the end of next week, and it should be visible if there are some "residuals" from a larger die on the Pro die.
Apple’s M1 Max is alien technology for a laptop! Wow! 400GB/second throughput! Best fiber optic throughput is 10GB/second! Standard is up to 1 GB/second! Amazing!
400 GByte/s of bandwidth in a laptop is impressive, but you're clearly underestimating where fiber networking currently is. 800 Gbit Ethernet is a thing, which is roughly 100 GByte/s of bandwidth at the high end. 1 Tbit and 1.6 Tbit Ethernet are in the draft stages. 2.5, 3.2 and 5.0 Tbit speeds are in proposal, though they'll likely need silicon photonics to scale to such speeds.
400 GByte/s is a lot of bandwidth, but you only have to move it a few centimeters, whereas those fiber specifications are able to move data at those rates over ranges of tens of kilometers.
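To make the comparison concrete, here is the unit conversion behind those figures (line rates only, ignoring protocol overhead):

```python
# Ethernet line rates converted to GB/s, against the M1 Max's ~400 GB/s DRAM bandwidth.
m1_max_dram_gb_s = 400

for gbit in (400, 800, 1000, 1600):
    gbyte_s = gbit / 8
    print(f"{gbit:4d} Gbit/s Ethernet ≈ {gbyte_s:6.1f} GB/s "
          f"-> DRAM is {m1_max_dram_gb_s / gbyte_s:.1f}x wider")
```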
This goes to show what you can do when you don't have to care about a product priced for ordinary people.
Apple's mobile die sizes have already been massive compared to competing mobile SoCs. It's not shocking that they made the same tradeoff for their laptop SoCs. It's just sad to think how many chiplet-type designs could be harvested from the same silicon wafers.
Microsoft had better be working on a secret SoC of its own, as relying on Intel is not getting it much. The Surface line needs a major boost, with a bigger performance jump and better battery life.
It is kind of hard to compare now that Apple has gone ARM, but Microsoft's Surface laptop pricing (in the UK at least) has been pretty comparable to Apple's x86 laptop pricing.
One thing I really like here (as a non-Mac user) is that Apple is finally breaking this nonsensical barrier to wider, higher-throughput memory buses for CPUs and APUs wide open. For whatever reason, we x86 users have been told that there is no good reason for, or no benefit to, such wide RAM access. Well, maybe now AMD and Intel will reconsider. Memory bandwidths of 200 - 400 GB/s using working RAM (yes, it's LPDDR5, but so what) are something to aspire to!
AMD's been selling APU's with high memory bandwidth and graphics performance for years in the current and previous gen consoles. Maybe they'll finally start selling them into the consumer PC market.
It's all about economics. Perhaps this will open a pathway for AMD and Intel to pursue larger APUs, but for the time being the cost vs. performance trade-offs haven't made sense.
If a PC company does soldered RAM the way Apple did with the M1 Pro & Max, then I'm finally okay-ish with soldered RAM.
I'm still losing the flexibility and the affordability of self-upgrading, but man... seeing this compared to 8GB & 16GB soldered is a night-and-day difference, especially with the memory bandwidth. I'm not complaining here, period.
RAM upgrades have always been the most important thing to extend the useful lifespan of laptops, except in rarer cases where someone gets far more than they feel they need when they buy the machine (something much more common now that 32 GB has become a lot more affordable).
It should always be assumed that soldered RAM in laptops is about making planned obsolescence faster more than anything else.
Is that extra bandwidth such a great advantage, once the amount of RAM is no longer enough to prevent slowdowns?
Apple sold Macbook Pro machines with 16 GB of RAM. I have a 15" 2013 model with that much. It will shortly become 'deprecated' — no longer able to function with Apple's level of security on the Internet. I know a number of people with older 16 GB Macs that haven't been able to be secure for a long time now, machines that are 100% adequate for their needs in all other respects (particularly given the fact that they have SSDs). Meanwhile, the company has been introducing new machines that are capped at 8 GB.
It's insanity for consumers on a parade float of 'wicked fast' banality. Yes, yes... very fast. Very quick to end up in the landfill because of inadequate RAM.
'I know a number of people with older 16 GB Macs that haven't been able to be secure for a long time now, machines that are 100% adequate for their needs in all other respects (particularly given the fact that they have SSDs). Meanwhile, the company has been introducing new machines that are capped at 8 GB.'
99% of people don't ever upgrade their laptop... Those are just facts. Most laptops become slow/obsolete in 5-6 years.
My MBP15.4 from 2012 w/16GB still works fine. It's getting slow now, but never had memory problems for general tasks.
At some point, more ram doesn't do anything. You just need more compute. If you need 32GB of ram, chances are, that laptop you bought in 2012-2015 isn't fast enough no matter how much RAM you put in it.
In my over 30 years of computing, I've never once used a Mac or Apple product. As a filmmaker I am starting to think it may be time to bite the bullet and learn.
Nothing in the PC market will be able to touch this from a video editing standpoint. I just wish it could play AAA games.
An interesting announcement. Very high memory bandwidth (especially for M1 max), probably for the benefit of the GPU.
The transistor density of the max is more than twice as much as that of the Navi 21 despite the Navi 21 having more (dense) cache; they must have made their logic more than twice as dense as AMD, which is more than I would expect from the process advantage.
I wonder about the 4 RAM packages, each of which contains 16GB (128Gb); AFAIK RAM dies nowadays have at most 16Gb; so the packages contain 8 dies; stacked or in a flat layout?
Why is Apple doing such a big GPU? Is this important for their existing markets, or do they want to get into new markets? Maybe a game console?
Overall: The good: Very impressive efficiency. The bad: Non-expandable. The ugly: Apple.
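A rough density comparison to put numbers on that claim; the Navi 21 figures are the ones quoted earlier in the thread (26.8bn transistors, 520mm²), while the M1 Max die area is not stated here, so the ~430mm² below is only an assumed placeholder, not an official figure:

```python
# Logic density in millions of transistors per mm^2.
def mtr_per_mm2(transistors_bn: float, area_mm2: float) -> float:
    return transistors_bn * 1000 / area_mm2

navi21 = mtr_per_mm2(26.8, 520)   # TSMC N7, figures from the thread above
m1_max = mtr_per_mm2(57.0, 430)   # TSMC N5, die area is an assumption

print(f"Navi 21: {navi21:.1f} MTr/mm^2")
print(f"M1 Max : {m1_max:.1f} MTr/mm^2 ({m1_max / navi21:.1f}x denser)")
```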
LPDDR5 uses 16-bit interfaces. The M1 Max has a 512-bit memory interface and 4 memory modules. Each memory module would have a 128-bit interface. Combine 8 dies of 16 Gb for a memory module with a 128-bit interface and 16 GB of memory. Four of those and you have a M1 Max with a 512-bit memory interface and 64 GB RAM memory.
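Spelling that arithmetic out, using standard LPDDR5 figures (16-bit channels, 6400 MT/s) and the 512-bit total width; just a sketch of the same math:

```python
# M1 Max memory-interface arithmetic as described above.
total_bus_bits   = 512
packages         = 4
channels         = total_bus_bits // 16            # 32 x 16-bit LPDDR5 channels
bits_per_package = total_bus_bits // packages      # 128-bit per package
dies_per_package = bits_per_package // 16          # 8 x 16 Gbit dies
gb_per_die       = 16 / 8                          # a 16 Gbit die is 2 GB
capacity_gb      = packages * dies_per_package * gb_per_die
peak_bw_gb_s     = total_bus_bits / 8 * 6400 / 1000  # LPDDR5-6400

print(f"{channels} channels, {packages} packages x {bits_per_package}-bit, "
      f"{dies_per_package} dies each -> {capacity_gb:.0f} GB, ~{peak_bw_gb_s:.0f} GB/s peak")
```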
Hey Andrei, in the M1 Max die shot, aren't there two of the structures homologous to what you're calling the NPU in the M1 Pro? There's one in the portion copied from the M1 Pro and one in the lower left corner. There's also something that looks like an SLC block next to each as well.
The bottom of the M1 max contains duplicates of at least 4 IPs. The only IP Apple has claimed to be doubled (excluding the GPU itself) is the ProRes encoder and decoder. I drew some boxes around the IPs I found: https://imgur.com/a/u1onxdI. The blue box is the Neural Engine (compare with the annotated die shot of M1 https://images.anandtech.com/doci/16252/M1.png).
My best guess: It's for yield. They need all of their dies to have those components functional, and that area between the memory interfaces would otherwise not be doing much.
Don't know if anyone noticed, but in the M1 Max, at the bottom of the SoC, there is an additional 16-core Neural Engine. It must be disabled, but it's interesting that they just copied and pasted it onto the other half.
It looks like there are several spare parts down there. Cheaper to have them on every die than to be throwing away usable CPU/GPUs because of some defect in some random IP block. Given the size of the die, a lot of that area might have ended up being scrap silicon anyway.
Are these die shots real? AFAIK Apple has a habit of showing conceptualized die shots to prevent competitors from seeing the layout (which they can easily do afterwards anyway, but I digress). I'd be surprised if they actually doubled the neural engine, but just kept one active. Wouldn't it be more efficient to include a few more “units”/“cores” than they needed instead?
I don't put much credence in Apple marketing hype; you really, really can't make comparisons with x86 unless there's parity in the software and setup.
It's still a phone CPU, but marketing is a wonderful thing: you can turn a pig's ear into silk. In the land of warped reality, where the sky is the colour of the rainbow and flying unicorns nest in the treetops, anything is possible; it's about perception, not facts.
In terms of energy efficiency, kudos to Apple: the M1, or whatever incarnation, sets the benchmark. Not sure if the upcoming RISC-V processors would improve the power envelope further. The herd will always follow the fruity cult, but I'd rather piss over their garden wall than be part of the grass grazers.
How is a 400mm plus die with multiple cores each of which can run multiple workloads at the same speed if not faster than workstation CPU cores from Intel and AMD a phone CPU? It can’t fit in a phone. It can’t be powered by a phone PCB.
Math is math and computations are computations. All these cores do the same computations, so I would define a phone CPU by its size and power envelope. But your definition somehow includes a CPU that has 3 times more transistors than the new IBM Power10, is going to draw 30W or more for light loads, and will pull more than 100W on heavy loads.
I would love to hear your view as to how it could be put in a realistic normal phone.
I’m with you. Upcoming processors from AMD et al. in the coming years are going to make these chips look underpowered. That’s what I’ve always said about Apple's unicorn blue-skies philosophy: your current technology will be absolutely no match for your competitors' products from, say, five or six years from now. Take your apple-shaped ball and go home.
It is most people's hope that future technology outperforms current technology; otherwise what's the point? Dare I suggest that Apple's own future CPUs will outperform the M1 generation too? I feel the pain of Windows/PC fans... I really do, but your time to shine will come soon enough once AMD and Intel get their new tech out into the wild. Looks like Intel are working on some big/little tech.
You seem to think the M1 is the end of the road for Apple... what have Apple demonstrated since the M1 came out last year? They deliver on their promises.
Intel? Not so much. AMD? No response to Apple. Nvidia? Still needing 300W to power their graphics chips.
I guarantee you the Mac Pro next year will have 4x the compute and 4-8x the GPU of the 32-core M1 Max, and yeah, over 1.6TB/s of memory bandwidth and unified memory (imagine all that graphics memory).
I'm genuinely stunned by what they're offering here - and the claims about performance per watt. Between the refined N5 process and their profligate use of transistors to hit their performance targets, I don't doubt the sincerity of their claims. I just wish I could get one of these in a system that runs the software I use!
It's likely close to the best CPU industry-wide (certainly the best for efficiency), and the rest is comparable to a good console SoC but better tuned for power efficiency (use of LPDDR5). It's about time the game console's superior, fully integrated architectural style made it to personal computers; it had been stalled by the grip of Intel, with their lack of their own graphics IP and their margins from selling an additional chipset at an older node keeping the antiquated desktop and laptop bus architecture alive. There are some benefits to modularity in terms of upgradeability and just plain fun, but the vast majority of consumers never crack open their case once.
Amen! The discrete GPU shall be remembered, but not missed! It's finally time to sail away from that ancient tech. (Add-in cards in the 2020s?! We had add-in cards way back in the 80s and 90s. Sound cards seem such a preposterous thing to have in a modern computer, and so should discrete graphics, at least in laptops.)
We may yet be getting these soon in the form of Qualcomm chips. Whether they can run the software we love to use, well, that will be up to Microsoft. If Apple's chips are fast enough to translate x86, can Qualcomm's do so as well? That's my hope. More and more, I see the future of Microsoft Windows living in the ARM world with x86 translation support. And then I see the personal computing landscape somewhat mirroring that of iOS/Android, but with a more organized front (Windows) competing against Apple's offering.
Agreed. Qualcomm, Microsoft et al should work on Rosetta-like solutions so we can move on from aging architectures without losing software compatibility.
It appears that folk are putting down this SoC because of an aversion to Apple, and won't acknowledge, in the spirit of good sportsmanship, that this is an impressive piece of tech. They just can't admit to this silly company from Cupertino coming close to, equalling, or beating the fellows they're fond of. As a result, we've got the remarks criticising price, soldering, MacOS, and lack of AV1, as a way to say, "This thing is rubbish. Don't bother."
We're all supposed to be lovers of computers here, and this article is discussing an SoC. What does price, or even soldering, have to do with that? We can't own a Space Shuttle, but isn't it nice to discuss it and say, "Certainly, I'm no fan of NASA, but this is pretty good stuff."
My personal favourite is the AV1 attack, a tactic used to throw stones at this giant. Surely, anyone who's doing editing, will not encode to AV1 in the middle stages? AV1, if used, will be saved for the final step, using software encoding and libaom. Decoding would've been nice; but Apple could add it easily, and the sort done in software isn't that bad.
I can't stomach Apple, their status connotation, or their products. Quite frankly, they put me off. But that won't stop me from admitting the merit of these SoCs. And Apple worshippers, the same applies to you, when you're putting down Windows, Intel, and AMD as if they're from the bin. It's that spirit of superiority which grates on x86, Windows people, causing them to make fun of Apple. Nobody likes when someone acts as if they're better than everybody else.
Your space shuttle example is flawed, at least in terms of the long history made prior to for-profit space travel. Shuttles were about enriching humanity more than lining the pockets of yacht buyers. Yes, some of it was nationalism, which was about the latter. But, the main idea, at least on the surface, was not profiteering.
These companies have too much money and make terrible decisions, like the shattering panel on the 13" M1 and using security patches to fill landfills.
We may love tech but we also love getting a good deal. Apple could have made another chip that's even bigger. It left some die area on the table. With corporations, as profiteering comes first, 'just enough' is the goal — not 'let's max out the possibility'. A bigger chip would have meant a lower price for this 'Max'. Instead of using its advantage to push humanity forward, it's content to do 'just enough' to maximize margin.
While it can be argued that that's the best strategy for keeping a corporation alive, putting it into the position where it can create innovative products, 'sell less for more' is the overarching mantra of the corporation. That relies upon marketing, which is about inculcating delusion.
It's also a fact that all that money is used to keep innovation down. IP serfdom. It's great for the wealthy powerful few. It's a huge impediment for creative invention for those who aren't. Look at how long copyright lasts now. It's a vampiric parasitic neo-serfdom apparatus — like the corporation.
I agree, Oxford Guy, with all my heart. These rotters are only out to make money, and will say anything to fill their coffers. Sustainability and yoga in vogue today? Well, play into that and your product will sell. "Here at Pineapple, we care about sustainability, and that's why we're using ethically-sourced green materials." Add a bit of emotional music and it's a hit.
I'd go so far to say that today's tech companies---Facebook, Google, Apple, and co.---are wielding a species of soft totalitarianism, wrapped up in marketing that plays into consumers' desires and vanities. Sprinkle with "empower" here and "empower" there, and you've got them. You're not buying a phone: you're buying empowerment and liberation. Nor is one buying a chemical concoction called make-up that probably ruins the skin, but a ticket to youth, attractiveness, and success in courtship. Further tips: add the idea that everyone's doing it ("try the app that's been downloaded by most Americans"), lip-service to choice, and correct alignment with current politics.
The reality is the PC industry hasn't innovated for over a decade. All they've done is add more fans, coolers, and more optimizations, when we should be following Moore's Law.
Apple comes along and redefines the industry with Apple Silicon, which is clearly YEARS ahead of the competition. No credible person would think that Apple isn't gonna keep 2x-ing their products for the next few years. Apple is already designing the M4 for all we know. They are just flexing their muscles in small chunks at a time.
Yet PC folks continue to ridicule Apple as a "piece of junk". It's embarrassing for people who call themselves computer enthusiasts. Tech is tech. It's not a religion.
Apple has their shortcomings (getting rid of ports, excessive thinness to their laptops at the expense of performance, the butterfly keyboard, etc.), but no PC fanboy wants to admit that Apple does produce quality products compared to their competition.
Apple fanboys want "acknowledgment", while PC fanboys go to great lengths to deny them and continue to ridicule them. No Apple fanboy is gonna just take that lying down. It's a vicious cycle.
If PC fanboys just admit that Apple makes quality products, I'm 100% certain Apple fanboys will also admit that choices are GOOD.
Some people like a souped-up Honda Civic, while others like their BMW maintained under the factory warranty. To each their own. It doesn't mean all BMWs are trash and Civics are infinitely better and cheaper.
I agree that if people could just admit a competitor is good, when good, all would be well. It's hard, I know, but has a medicinal effect on the mind, almost if a burden were lifted off one's chest. Such is truth.
I don't agree that the PC space hasn't innovated. How about Sandy Bridge and Zen? Even Bulldozer, despite being a disaster. If Zen's turning the tables on Intel and raising IPC ~15% each year isn't astounding, I give it up. And as far as I remember, Renoir wasn't that far behind the M1---and that's with the handicap of x86's decoding overhead, among other things (5 vs. 7 nm). I'm confident that if AMD built an ARM CPU, after a couple iterations, if not on the first, they'll match or surpass Apple. And I even doubt whether ARM's all it's cut out to be. If x86 must go down, let's hope the industry chooses RISC-V.
While excellent and worthy of applause, the M1 is hardly years ahead of the competition. Where does it stand against Zen 3? Is it really that big of a difference as the story's being painted? Once more, the search for truth. The ultimate test, to see who's the best in design, would be to let Apple craft an x86 CPU or AMD an ARM one.
I think it is, in terms of packaging and efficiency. Outright performance, maybe not, but the fact that it makes *no compromises* in its beating of anything the PC space can offer is the major news here. There are no negatives about this chip. It's better in just about everything, and in major ways such as efficiency and parallelism.
If anything, this should be lauded by the PC community. This SHOULD give the kick in the proverbial butt to the likes of Intel/AMD/Qualcomm/Nvidia to change their thinking in CPU design, to get back on track with Moore's law. I'm excited to see how the PC industry reacts to this.
Will it regain the performance lead at some point, or will it forever be stuck losing to Apple, à la Android/iOS SoC designs?
When the M1 first came out, I felt it would recalibrate the frequency/width/IPC axes, and still do. AMD and Intel only had themselves to compare against all this time. Though Apple's not a direct competitor at present, I'm confident AMD could beat them if they had to, now that they see what sort of performance they've got to aim for. Those who are making fun of x86 underestimate what AMD's capable of. Intel learnt the hard way.
Hmm, you really think so? I mean, AMD's Ryzen is good, but it's not really any better than Intel's best (Tiger Lake) and will soon be eclipsed by Alder Lake. Ryzen has just caught up to what Intel's been able to offer, but I don't see it as much better. At the very least, compared to these new M1 chips, AMD and Intel chips are nearly identical.
I suppose I just don't see AMD as the one challenging Apple's CPU prowess. They don't have the R&D budget to do so. And Intel? I'm not sure they can ever recover, they're not hiring enough young engineers to rethink the paradigm shifts needed to compete with the coming of ARM.
That leaves Qualcomm and their Nuvia acquisition, which no one really knows how seriously to take. If Nuvia's design roadmap has them developing M1-like CPUs, then I think Qualcomm's future is bright.
Or perhaps it's not so black and white. x86 might survive just fine, and we'll continue to see a healthy battle and innovation. After all, that's the best case for us consumers.
I think it takes more than big R&D budget to make a winning CPU: it was Bulldozer-era AMD that designed Zen. And we've seen that dollars thrown left and right, in Intel fashion, may but doesn't necessarily produce excellence.
Whether x86 will go down, no one can tell right now. As it stands, there is no true competitor on the desktop, Apple being isolated in its own enchanted realm. Qualcomm, who knows? There's a possibility Intel or AMD could announce an ARM CPU (RISC-V being less likely because of no Windows version yet), causing x86 to fade away. I won't be surprised to see Intel trying some trick like this. "If we can't fight Ryzen, why not pull out the carpet from under it?"
As for paradigm shifts, while innovation and flexible thinking are excellent, drastic change has often been disastrous: Pentium 4 and Bulldozer. It's the tried-and-tested ways that work, and further perfecting those. As for ARM, apart from the fixed-length instructions, I don't think there's anything really special about it, as is often painted in the ARM-x86 narrative.
To think Apple will 2x their product is insane. In reality, this is not new the way the M1 seemed; its main cores have been in development for years in the iPhone. All the easy gains have been made already. I would not be surprised to see a 15% per-generation improvement from here.
These are impressive chips, with the M1 Pro hitting the midrange sweet spot. I'd love to see Mac Minis and iMacs using these chips soon, where they can ride the frequency/voltage curve a notch or two higher to really see what these designs are capable of.
The layout of the extra encoders on the M1 Max seems to be targeted at an odd niche versus what else Apple could have used that die space for. I will argue for the first set of encoders found in the baseline M1 Pro; it's just that the extra units serve an ultra-small niche who will actively utilize them.
That die space would have been better leveraged for two other things: an on-die FPGA or even more memory channels. The FPGA programmability would permit *some* additional acceleration for codecs, obviously not hitting the same performance/die space or performance/watt as the dedicated units, but it would help those that need more than the first set of encoders. The other idea, additional memory controllers, is less about increasing memory bandwidth than about increasing raw memory capacity: 64 GB isn't a lot when venturing into workloads like 8K video editing. Boosting capacity up to 48/96 GB would see more usage than the secondary encoders and have a better fit across more workloads. The downside of adding additional memory controllers would be greater die size (~500 mm^2?), which leads to a higher cost for the SoC itself. Total system cost would also increase due to the additional memory chips. Even with these tradeoffs, I think it would have been the better choice than a second set of hardware encoders.
The M1 Max is clearly targeting high-end laptops. Odd niche? Apple obviously is gonna target their platform for optimizations. LOL.
Otherwise, it just means they have no confidence that their encoders are any good. Which makes no sense. If you don't endorse your own products, who will?
The industry will give Apple a second look once they see the performance per watt. Money talks.
Absolutely it's about increasing memory bandwidth. Otherwise, all that memory isn't gonna do much in graphics-intensive tasks. Why do you think GPU memory has such high bandwidth compared to system memory on a PC?
If anything, M1 and M1Pro/Max have demonstrated that Apple knows what they are doing.
Agreed. Apple hit it out of the park with the M1 Pro and M1 Max. We shouldn’t have been surprised since the M1 was so powerful at 10W, but some people won’t believe it until they see it.
I have no problem with Apple including a ProRes encoder on the M1 Pro and M1 Max. That does make sense in terms of performance, power consumption, and silicon investment in the design. My issue is that adding a *second* encoder to the M1 Max is incredibly niche. Ditching that in favor of two 128-bit-wide interfaces to mostly increase memory capacity would have been the better trade-off given the market Apple works in. The memory bandwidth boost would be nice, but not the primary driver, when the M1 Max as-is already has 400 GByte/s of bandwidth. When doing high-resolution video editing, 64 GB of memory can be consumed quickly. Simply put, more RAM would have had a bigger impact on more end users than the second encoder block on the M1 Max.
Obviously the 2nd encoder is specifically targeting video editors/serious content creators. You also get 2x memory BW (200 to 400GB/s) to go along with it.
That's why they made you pay an extra $200, plus another $400 for the memory upgrade (16GB to 32GB), to get it - $600 total.
I mean, to go from 33.7B transistors to 57B transistors... no one seriously expects Apple not to charge a pretty penny for it. People who bitch and moan are people who expect to get their 2-channel DDR4 64GB for $400...
Except Apple is giving you a 400GB/s LPDDR5-6400 unified memory architecture.
Only a layman would equate those two products as the same.
I am very impressed with these chips! The unified memory architecture seems to be a tremendous leap forward in the laptop class CPU and GPU landscape. Apple has disrupted Intel and AMD and NVIDIA with the M1 and now has pulled so far ahead of them with the M1 Pro and Max. Nothing anyone else is currently shipping in this category is even close to competing in ALL the important laptop metrics.
I have a $1100 M1 MacBook Air that outperforms my $3300 Intel Core i7-8850H MacBook Pro in most of the tasks that I perform on a daily basis. Mostly ingest, edit, and export of 42MP RAW files in Lightroom Classic. Secondarily, transcoding 1080p and 4K H.264 to ProRes and/or HEVC. Lastly email, web, streaming, etc. And that is even before Lightroom was optimized for Apple Silicon. All while being silent versus annoyingly loud. All while being cool or warm to the touch versus too hot to put on my lap. All with 1/4 the RAM. Oh yeah, I still think constantly about my battery life, but that is from 25 years of laptop use! When I actually check, I still have plenty of battery life to complete my task and then watch hours of streaming videos while reading websites like this side by side. I will definitely upgrade to a laptop built around one of these SoCs as soon as the reviews determine which is the best in price-to-performance.
An aside, I have disabled comments on all websites for many years and just recently enabled them again. Not much has changed; 9 out of 10 comments are from Windows / Android fanboys or Intel / AMD / NVIDIA proselytizers. The rest are Apple Defenders spiced with actual talk about the SOC itself.
Actually, what I am is a customer. That means I purchase the products that I speak of because they meet my needs based on real world value and performance tests. Not hypothetical scenarios or aspersions cast from a misguided tribal allegiance to a particular corporation.
If you mean the memory bandwidth of the chip I think it's ~400 GB/s. The SSD read speed in the new MacBook Pros Apple states as 7.4 GB/s (very fast, about 35% faster than a PS5). I don't think they mentioned the write speed, which probably means it's slower.
Apple press embargo lifts at 9am ET on Monday. Detailed reviews will hit a few seconds later.
As far as SSD goes, it's either PCIe 3.0 x8 or PCIe 4.0 x 4 (I favour the latter).
I speculate there will be a Mac Pro with 4x these NVMe SSDs for a total of 30GB/s aggregate bandwidth to backing storage, either 32TB in total or 64TB total by the time it's released.
* 32TB or 64TB, all soldered on a single motherboard with up to 4 Pro Max CPU packages also soldered on it. If slots are provided, which could just be slots on a daughtercard connected via TB4 to the main motherboard, then you could put additional NVMe drives on a riser or attach SATA drives for more storage. Hopefully the case will still provide somewhere to put them & keep the whole thing tidy.
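A quick sanity check on those numbers; the per-lane PCIe 4.0 rate is the standard spec figure, and the four-drive aggregate is just the speculation above multiplied out:

```python
# PCIe 4.0 runs at 16 GT/s per lane with 128b/130b encoding.
pcie4_gb_s_per_lane = 16e9 * 128 / 130 / 8 / 1e9   # ~1.97 GB/s per lane

x4_ceiling = 4 * pcie4_gb_s_per_lane               # ~7.9 GB/s theoretical
quoted     = 7.4                                   # Apple's stated read speed
aggregate  = 4 * quoted                            # four such drives ~29.6 GB/s

print(f"PCIe 4.0 x4 ceiling: {x4_ceiling:.2f} GB/s")
print(f"Quoted SSD read    : {quoted:.1f} GB/s ({quoted / x4_ceiling:.0%} of the link)")
print(f"4-drive aggregate  : {aggregate:.1f} GB/s")
```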
It's a shame that Apple is the only company that has tried to make a beefier ARM chip. It would be interesting to see how virtualization performance with a Windows machine turns out; if it were faster than a real x86 Windows machine, it would be a clear victory, otherwise it would only be great for the MacOS ecosystem behind their iron curtain.
I apologize if this was mentioned already but the diagrams don’t match the die photos, so the diagrams are just abstractions? Also, are all those performance vs power graphs idealized? Because all the curves are so smooth.
AnandTech reviews of Apple are abysmally bad. The former head, Anand, is of course now an Apple employee... The people at AnandTech covering Apple are really nothing more than cheap Apple shills :-(
Ian Cutress has mentioned that Anand Lal Shimpi has little to no interaction (let alone involvement) with AnandTech anymore. Apple doesn't need to pay sites for coverage when their products are just that good and every website comes to the same consensus. Time to step your game up, PC.
As amazing as these chips are, they’re largely based on an extension of A14 tech (the same 5nm process, and the CPU/GPU cores are the same, just clocked higher). It’s cool to be able to estimate the performance of 2022/2023 Macs just by looking at the current iPhones and scaling appropriately. The M2, expected mid-next year, should have about 1900 single-core Geekbench, 9000 multi-core, and 3 TFLOPS of GPU in a 15-watt TDP. That's already giving the M1 Pro CPU a run for its money, with much faster machine learning acceleration and a good-enough GPU for most applications in a super-low-power design. Then the M2 Max will have up to a 50% faster GPU than the M1 Max: 15 TFLOPS in a thin laptop! The Apple chip team isn’t slowing down…
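Restating that projection as a simple scaling exercise; the baselines are rough public Geekbench 5 figures for the M1, and the uplift factors are assumptions taken from the comment above, not Apple data:

```python
# Hypothetical M2 projection by scaling rough M1 Geekbench 5 scores.
m1_single, m1_multi = 1700, 7400    # approximate M1 GB5 scores
single_uplift = 1.12                # assumed A15-class per-core gain
multi_uplift  = 1.20                # assumed gain from extra cores/clocks

m2_single = m1_single * single_uplift   # ~1900, matching the estimate above
m2_multi  = m1_multi  * multi_uplift    # ~8900

print(f"Projected M2: ~{m2_single:.0f} single-core, ~{m2_multi:.0f} multi-core (GB5)")
```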
Hifihedgehog - Monday, October 18, 2021 - link
Exciting yet massively overpriced.jaju123 - Monday, October 18, 2021 - link
Overpriced compared to what? What else offers this SOC performance, battery, and Mini LED display, just to name a few unique selling points? They can charge basically what they want.Hifihedgehog - Monday, October 18, 2021 - link
To previously generation MacBook Pros, that's what. Sure, you can list off the fancy features and yes, you are totally right. But for the majority of people, regardless of features, the price will be a huge deterrent.Hifihedgehog - Monday, October 18, 2021 - link
*previous generation MacBook Proat_clucks - Tuesday, October 19, 2021 - link
There's no *device* out there that has this level of performance AND efficiency. You may get a mobile device with similar performance and much worse efficiency, or a workstation class device with better performance but abysmal (by comparison) efficiency.But you certainly won't find a $2000 device that's really comparable unless you look at one very specific thing and say "aha, the cheaper one is better at this". But let's be honest, such comparison is not useful to anyone but Intel :).
Realistically if this had official Windows or Linux support, even with the expected loss in performance or efficiency, I'd get one in a heartbeat. But as I'm not a big MacOS fan I'll just watch.
Absolutely impressive chip though.
s.yu - Tuesday, October 19, 2021 - link
Ah yes, if it had Windows...preferably Windows Hello (FP sensor) and a touchscreen...and a digitizer and flips 180 degrees :)
Kangal - Wednesday, October 20, 2021 - link
Indeed, impressive chip.
I noted my satisfaction/dissatisfaction a whole year ago with the original Apple M1. I even suggested that Apple should release a family of chipsets for their devices. It was mainly for being more competitive and having better product segmentation. This didn't happen, and it looks like it's only somewhat happening. Also they could update their "chipset-family" with the subsequent architectural improvements per generation. For instance;
Apple M10, ~7W, 4 large cores, 8cu GPU... for 9in tablet, ultra thin, fanless
Apple M11, ~10W, 8 large cores, 8cu GPU... for 11in laptop, ultra thin, fanless
Apple M13, ~15W, 8 large cores, 16cu GPU... for 14in laptop, thin, active cooled
Apple M15, ~25W, 8 large cores, 32cu GPU... for 17in laptop, thick, active cooled
Apple M17, ~45W, 16 large cores, 32cu GPU... for 29in iMac, thick, AC power
Apple M19, ~95W, 16 large cores, 64cu GPU.... for Mac Pro, desktop, strong cooling
...and after 1.5 years, they can move unto the next refined architecture/node (ex Apple M20, M23, M25, M27, M29 etc etc, and repeat the cycle every 18 months). I'll reiterate, this was the lineup that I had in mind, and I commented about this an entire year ago.
robotManThingy - Thursday, October 21, 2021 - link
That's not going to happen.
florian - Thursday, October 21, 2021 - link
If you wanna spend the extra dollars you can just get and run Parallels Desktop on it.
vFunct - Tuesday, October 19, 2021 - link
This is a pro machine.
You're saying a company that spends $100,000 on camera lenses is going to find this laptop to be too expensive?
Hifihedgehog - Tuesday, October 19, 2021 - link
The film industry is deeply rooted in Quadro so while they might spend $100,000 on camera lenses, they would not be buying Apple just because. This appeals to the small freelancers, photographers and videographers, not the big blockbuster-churning cornerstone movie studios. And even among the low budget folks, there are many who are vested in NVIDIA and/or Windows ecosystem because Apple lacks many key codecs, APIs, and software.
at_clucks - Tuesday, October 19, 2021 - link
I can tell you for sure that there are so many shops that have paid more in software licenses for MacOS than all the hardware put together.
Sure, places like Pixar swim in Quadros, Teslas, Radeon Pro SSGs. But everywhere else you're most likely to find Apple Final Cut or Avid Media Composer, Nuke or After Effects, Blackmagic Fusion or Davinci Resolve, Autodesk Maya or Cinema 4D running mainly on Macs. The cost of the metal is less than the cost of the licenses.
vFunct - Tuesday, October 19, 2021 - link
Only a certain subsection of the film industry use Windows, in particular, corporate VFX houses. And they're not the companies that buy $100,000 lenses.
The vast majority of production houses use Macs.
I still don't know why you think a company that can afford a $100,000 lens would think a $6000 laptop would be too expensive?
xcalybur - Sunday, October 24, 2021 - link
There are plenty of TV and Movie crews that use Macs. Ted Lasso is completely edited and outputted on Mac Laptops. I can't imagine how much faster the editing would be on these new MBP's. I am forecasting a 5-10 times speed increase in Prores. So let us say it takes 2 weeks to complete a whole series on their existing Mac and 40% of the time is limited by rendering times. The new M1 Max would bring that down to 3 days. Is that worth it?
Godofbiscuits - Saturday, October 23, 2021 - link
I'll never understand why people don't apply context to these costs. People were complaining about the entry-level Mac Pro tower's graphics card being "anemic" for a $6000 configuration, not realizing that there were a ton of audio pros out there who would be putting a $4500 audio card in a tower that didn't require a powerhouse GPU and wouldn't want to pay for it.
sfg2 - Tuesday, November 9, 2021 - link
So... it's fine that the graphics are only "anemic" in a $6000 configuration because with good graphics it would be even more expensive?
Great logic there.
tbutler13 - Thursday, October 21, 2021 - link
I'm not sure what you mean. If you view both of these as potential heirs of the 15/16" MacBook Pro, the pricing is virtually the same or better. This isn't meant as the next generation of the M1 13" MacBook Pro.
Godofbiscuits - Saturday, October 23, 2021 - link
Pricing is more or less the same. Price per unit performance is about ½-⅔ of what it was.
varase - Monday, October 25, 2021 - link
You know, I _owned_ a 2019 16" MacBook Pro, core-i9 2.3GHz, AMD Radeon Pro 5500M 8 GB, 32 GB RAM, 2 TB SSD purchased from Adorama via their AppleInsider link for a discount for $3499.
I purchased the new 2021 16" MacBook Pro, M1 Max, 32 GPU cores, 32 GB RAM, 2 TB SSD directly from Apple with no discount for $3899. In case there were any issues, I wanted to be sure I owned an Apple SKU so Apple Store managers would have more flexibility in terms of returns or upgrades (they have far less flexibility when the SKU is for some other vendor).
The AppleInsider offer probably offered me a $200 discount (that's what they're currently offering on a M1 Max, 32 GPU, 32 GB RAM, 1 TB SSD), so the net difference is probably $200, a 5.71% increase. Over the two year period, that could easily reflect inflation and increased tariffs.
So no ... this new model _does not_ reflect a big increase over the last model.
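For anyone who wants to check the math, the figures above do work out; a quick sketch using the commenter's own numbers (the $200 discount is their assumption):

```python
# Sanity check of the price comparison above, using the commenter's own figures.
old_price = 3499        # 2019 16" MBP with the AppleInsider discount
new_price = 3899        # 2021 16" MBP M1 Max, bought directly from Apple
assumed_discount = 200  # discount the commenter assumes would have applied again

net_difference = new_price - assumed_discount - old_price
print(net_difference)                               # 200
print(round(net_difference / old_price * 100, 2))   # ~5.72, the ~5.71% cited above
```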
varase - Monday, October 25, 2021 - link
Addendum: Last time I did max out the GPU, but did *not* max out the CPU - there was a core-i9 2.6GHz available which I did not buy.
This time, I _did_ max out what the M1 Max could give me except for memory.
brucethemoose - Monday, October 18, 2021 - link
There's nothing even close to this level of performance+power efficiency you can get in another laptop.
IMO it's the *previous* gen MacBooks that were overpriced, as they were basically hot, thin conventional laptops. This is a whole different animal.
Hifihedgehog - Monday, October 18, 2021 - link
It's a Surface Book killer, that's for sure, if you were in the market for one. However, if you are an industry pro, there are many lacking features. For one, if you are in the film industry, you can already ignore this. Quadro is the standard there for the major pillar studios. You'd get laughed out of the room just because of the lack of CUDA. Never mind that this has an inexplicable lack of hardware AV1 codec support.
melgross - Monday, October 18, 2021 - link
Don't make things up. Much, if not most film and Tv work is done on Nacs.
goatfajitas - Monday, October 18, 2021 - link
Well, Apple does a really good job of making people think that anyhow. Much like they do a really good job of making people think they invented everything on a smartphone and that they are faster than the competition.
sharath.naik - Monday, October 18, 2021 - link
This is where the PC SoC was always supposed to end up. Apple just got there first. Intel, in their attempt to make money, tried hard to stop integrating memory on the CPU SoC so they could sell more chips. This will change the landscape in Apple's favor if Intel and AMD do not have an equivalent answer to this soon.
damianrobertjones - Tuesday, October 19, 2021 - link
Where did they go first?
FLORIDAMAN85 - Monday, October 25, 2021 - link
Agreed. The whole concept of an APU is rooted in a design like this. Too bad the two desktop CPU manufacturers have little desire to produce a high end product like this.
caribbeanblue - Tuesday, October 19, 2021 - link
They're faster than the competition though.
Hifihedgehog - Tuesday, October 19, 2021 - link
> Well, Apple does a really good job of making people think that anyhow.
You can see it in the comments here. I bet not one of them actually has spoken to someone who works in TV and film. It is ridiculous. Apple=/=pro.
Nowmich - Friday, October 22, 2021 - link
Dude, let it go. Are you going to disagree with everyone on this forum with the goal of showing you are the smartest? Go for a walk, take a breath, read a book.
marcolorenzo - Tuesday, October 19, 2021 - link
They are faster than the competition. It's not just Apple saying this. Unless you think Anandtech base their reviews on Apple's marketing lol
mdriftmeyer - Tuesday, October 19, 2021 - link
It's not. All of PIXAR/Disney Animation, ILM, etc. are using custom applications built with heavy cloud farm processing, and the latest RenderMan is heavily CUDA based.
https://rmanwiki.pixar.com/display/REN24/Installat...
XPU is a proprietary CUDA solution. Support for Maxwell requires Optix—NVIDIA CUDA
The main workstations are Windows/Linux. Sure Artist textures and such have Macs but their entire workflow is a series of custom applications leveraging CUDA.
Steve has been dead for over a decade. We know much of the post processing film industry aren’t using FCP. It’s mainly improved thanks to AppleTV+ studio.
The one application that has large support is Logic Pro, and speaking as someone who uses it daily, a decked-out Mac Pro 2019 is a beast. You have two routes: invest heavily in standard analog hardware to offload major portions of your workflow, leverage automation in LP, then export stems to others working in Pro Tools for final mix/mastering; or get a 16-core refurb Xeon with Afterburner and dual 6800XT Pro GPGPUs, or dump another $10-20k into studio Neve gear like the Rupert Neve Portico II master bus stereo processor for $4k alone, or the recommended complete mastering stack from Rupert Neve. Ultimately, the idea of a quality home studio with treated acoustic paneling and more isn't the cute laptop in the back seat of a cab mixing, or strumming a guitar on a bed. You pull a Rick Beato studio investment and keep older copies of LP on a 2012 Mac Pro—until today, with 10.7 requiring Big Sur—or you move to Xeon, because everything is still via Rosetta 2, or use straight x86.
Even Apogee Digital isn’t certified Apple Silicon yet and requires Rosetta 2 if you must be Silicon or die.
A lot of studios will buy the Mac Pro 2019 load it up and use it for the next decade.
MakePerceive - Tuesday, October 19, 2021 - link
I work both in video production and music production, and I'm going the hardware route (Neve analog for tracking, hardware fx etc), but for computing and software I'm holding out for the Apple Silicon Mac Pro for my production company. I was very close to switching to PC (had my Threadripper build all specced-out and funds available), and then Apple released M1. So I decided to wait for the Mac Pro, based on the performance of that and taking a leap of faith that it can be satisfactorily scaled up for max performance. (There was a graph of the trend of the performance growth over generations of Apple ARM CPUs here on AnandTech - pretty impressive). I'd assume the same growth rate for Apple Silicon in years to come. I trialled the M1 MacBook Air (my wife's computer!) and it did a fine job for all of my apps, Rosetta included. Seems like M1 Max is a step in the right direction, but I do wonder how they can cater for ultimate maximum performance where power consumption is no concern, and there is an endless need for RAM etc. As well as how they will allow for user-expandability, for example if the user wants to replace the CPU / SoC. That Mac Pro had better be a proper, modular pro workstation or there is going to be a new generation of burnt, upset pro users who slowly migrate to PC (assuming they don't use Logic etc - I'm on Pro Tools and other cross-platform software)
Oxford Guy - Tuesday, October 19, 2021 - link
All that tech, and music is worse than it used to be.
hemedans - Tuesday, October 19, 2021 - link
This is the biggest myth, and it's not true. Most film/animation work is done on either pre-built machines or workstations running Linux or other proprietary software, most of the time with Xeons.
Just do a simple Google search on what companies like DreamWorks, Disney, Pixar, etc. use.
andygrace - Tuesday, October 19, 2021 - link
Render farms aren't that important for most video and even motion-picture pros anymore.
Apple tried with the rack models with Fibre Channel Xsan ages ago, but rendering is a really specialist task for the biggest of the big guys.
Most of our rendering and encoding is now handled at AWS anyway where TCO is an order of magnitude lower than having on-premises hardware which dates quickly and costs a fortune in manpower to keep ticking over. Our major competitors also now use AWS or Google Cloud.
Being able to spin up massively powerful instances with multiple NVIDIA GPUs on demand is the best case for us. If you don't require CUDA and AMD's your thing, well they do that too now.
Apple are targeting this thing right. There's a lot of semi-pro content people who are making serious money out of YouTube/FB/Insta. FCP X is great for them and as mentioned Logic is very popular.
None of that target needs Avid Media Composer or DaVinci Resolve but even that now runs natively on Apple ARM. (Incidentally, Blackmagic Design ATEM and cameras are now getting wildly popular at the low and even higher end.)
So on the capture/edit side, these new MBPs are going to be huge within the industry. For the extreme high end - definitely not the target demo. Being able to scale up and out when required is far better for us.
powerarmour - Tuesday, October 19, 2021 - link
Correct, at least someone here knows what they're talking about.
damianrobertjones - Tuesday, October 19, 2021 - link
I'd love a Nac.
oryanh - Wednesday, October 20, 2021 - link
I have a Nac for that.
Hifihedgehog - Tuesday, October 19, 2021 - link
> Much, if not most film and Tv work is done on Nacs.
Nacs. Good one! Ask someone at Pixar, Dreamworks, ILM, etc. Quadro is the industry standard and most certainly NOT Macintosh.
sfg2 - Tuesday, November 9, 2021 - link
No, most of it is not done on either Nacs or Macs. Stop lying.
tipoo - Monday, October 18, 2021 - link
That Surface Laptop Studio looks even sillier having gone with 4 cores now, now that's overpriced.
gescom - Monday, October 18, 2021 - link
CUDA & Quadro. The last unsolved mystery.
Karaqx - Monday, October 18, 2021 - link
Who needs CUDA when Apple has Create ML and Metal? TensorFlow 2.4+ is optimized for Apple GPUs, with the puny M1 outperforming the i9 MacBook with Radeon by a considerable margin (these are Google's own benchmarks).
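For what it's worth, here is a minimal sketch of what running TensorFlow on an Apple GPU looks like; it assumes the tensorflow-macos and tensorflow-metal packages are installed on an M1 machine, and the matrix size is arbitrary:

```python
# Minimal sketch: run a matrix multiply on the Apple GPU through TensorFlow's
# Metal plugin (assumes `pip install tensorflow-macos tensorflow-metal`).
import time
import tensorflow as tf

print(tf.config.list_physical_devices("GPU"))  # lists the Apple GPU when the Metal plugin is active

x = tf.random.normal((4096, 4096))
start = time.perf_counter()
y = tf.linalg.matmul(x, x)
_ = y.numpy()  # force execution before stopping the clock
print(f"4096x4096 matmul took {time.perf_counter() - start:.3f} s")
```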
iphone4ever - Tuesday, October 19, 2021 - link
So you are comparing a laptop, yes a laptop, to a workstation that costs about 8x as much. LOL.
kpbendi - Tuesday, October 19, 2021 - link
Yeah, I guess Blackmagic, Adobe, Otoy, Maxon (just to name a few) all got laughed out of the room with M1-native Resolve, Premiere Pro, Octane Render, Cinema 4D, Redshift.
Also, the film industry doesn't give a damn about built-in ProRes hardware acceleration either, regardless of whether it supports 30x 4K or 7x 8K video streams simultaneously.
Oh my.
varase - Tuesday, October 19, 2021 - link
Actually, AV1 support was introduced with Big Sur (macOS 11).
Godofbiscuits - Saturday, October 23, 2021 - link
The (world's) first hardware AV1 encoder just became available 6 months ago. Who TF else has it in their systems? Decoding AV1 is a relative cakewalk and that's by design in the codec.
melgross - Monday, October 18, 2021 - link
No, it won't. People doing the kind of work these machines will excel in won't mind the pricing, which isn't bad at all. I just ordered a 16" with 64GB RAM and a 2TB drive. I think it's a bargain at this performance level. There's nothing that really competes.
And since a laptop is intended to be used as such—sans power cord, the performance difference will be even greater, since most performance Windows AMD/Intel machines suffer greatly when not plugged in.
danielfranklin - Monday, October 18, 2021 - link
Hence the lower-end 14" partially disabled chip at quite a good price over the old equivalent MBP.
Otherwise, if you don't want the "fancy features", the M1 MBP or Air are more reasonable.
Did you expect the new high-end notebook at the low end notebook price or something?
No one's ever going to suggest Apple isn't expensive. But at least they aren't just buying off-the-shelf shit that everyone else uses anymore and charging 50% more.
TeaMat - Monday, October 18, 2021 - link
Apple has always provided pro products that are a sort of combination of really good components where the end result is something that is either a good deal for exactly the specs they provide, or simply has no equal, but looks expensive if you don't need all of the fancy stuff they have decided should be included in the SKU. For example, the Mac Pros and iMac Pros offered Xeon processors. Xeons are expensive whether you buy your workstation from Apple or Dell (or anyone else), and they have a few extra features like buffered ECC RAM support. If you just want a PC with a fast processor and don't care about the extra features of the Xeon you can buy an i7 or i9 machine from PC OEMs or make one yourself and you will end up with a machine that is faster than the Mac Pro for less money. Similarly, when you add up the mini-led and the 10 core CPU and the 32 core GPU with 400GB/s of bandwidth and all the other features of these new laptops, it's pretty hard to find something comparable. If you're like me and just want an M1 macbook that supports more than one external display natively, and you don't get value from all the area they spent on video decoders and encoders and whatnot, then they look pretty expensive. I might still buy one :p but there are definitely a lot of awesome things in this machine that many people simply don't need or won't use.
lilkwarrior - Sunday, October 24, 2021 - link
What you just said is why the M1 MacBook exists vs. the M1 Pro and M1 Max MacBook Pro.
An M1 MacBook with a TB4 hub has your needs covered, addressing the multiple-display use case that it doesn't handle out of the box. More than one external display is definitely a "pro" thing, but I anticipate them making that included with M2.
iphone4ever - Tuesday, October 19, 2021 - link
I have owned multiple generations of MacBooks for 14+ years and these Pro series are priced right in line with the previous ones (BTW, Apple is also very good at pricing their products) and are a very good deal when you consider all the leading-edge technology in them. What other laptop even comes close to this level of performance?
misan - Tuesday, October 19, 2021 - link
How so? You get much more performance for the same price, not to mention that display or the battery. Sure, the new 14" is more expensive than the old four-port Intel 13", but the 13" M1 is significantly faster than Ice Lake anyway for less money.
And of course, the higher-end models are pricey, but these are very serious mobile workstations. And they compare very well to other workstations at the same price level.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
"But for the majority of people, regardless of features, the price will be a huge deterrent"True, but then the majority of people never bought a previous generation MacBook Pro either.
These machines are not overpriced compared to previous generations of MacBook Pro.
marcolorenzo - Tuesday, October 19, 2021 - link
The majority of people don't need a laptop like this and would be perfectly happy with a regular M1 laptop like the MBP13 or MB Air, both of which are actually very reasonably priced for what you get. For the people that need or want the best of the best, the MBP14 and MBP16 are definitely not overpriced.
ABR - Wednesday, October 20, 2021 - link
Actually the 16" is priced identically to the previous generation (2019). Source: bought a 2019, optioned out a comparable 2021.
ledsled - Wednesday, October 20, 2021 - link
Just for fun, go price a desktop computer that compares with this. You would need an AMD 5800X, a fairly high-end GPU, memory, a 1 TB SSD, PSU, case, operating system, keyboard, mouse and a monitor ... I was up to $2800 and it could easily be significantly more. And that is for a clunky desktop with virtually no software. This is an extremely cleanly packaged system that is mobile. It has a battery, keyboard, mouse and again is MOBILE. Comparing this to systems that are 1/5 the power just isn't a fair comparison.
yitwail - Friday, October 22, 2021 - link
Previous generation MacBook Pros with discrete graphics chips were also overpriced from the perspective of a majority of users. So the intended customer base is users willing to pay more for advanced technology.
Godofbiscuits - Saturday, October 23, 2021 - link
Try again. These machines are priced in line with what MacBook Pros have been for years. Only there's an enormous jump in performance this time around instead of an incremental one from Intel. And now the fans won't be kicking on and drowning out the world.
nevcairiel - Monday, October 18, 2021 - link
You can be the only product and still be overpriced.
Hifihedgehog - Monday, October 18, 2021 - link
Exactly. This is the world's best laptop, yet a niche product for Apple super fans. Maybe Apple will get major studios hooked, but they would need to get Apple chips into server farms. No way the technical teams code their rendering tools for two entirely different GPU and CPU architectures. And I don't see that happening for another five years at least.
name99 - Monday, October 18, 2021 - link
Let's see if you're still trotting out that same old boring complaint about prices when these become the systems of choice for eSports champions…
Fulljack - Monday, October 18, 2021 - link
That's just speculation. Currently no eSports games run on ARM64 natively.
hughJ- - Monday, October 18, 2021 - link
Esports champions don't choose their hardware; they play at LAN events with hardware chosen by the event (where a system OEM/integrator will often be a sponsor). Similarly for their personal use, players belonging to esports teams use and/or promote whatever hardware is attached to the organization's portfolio of sponsors.
I was coding on one architecture and shipping on another back in the 2000s at a previous job. I'm doing it now with the M1. I'm sure there are projects where it can't be done right now, but there are also plenty of opportunities where it's fine.
Macs have always been expensive.
But they are marketed for affluent consumers who can easily afford them.
BMW, Mercedes Benz, and Teslas are expensive and no one complains.
My 2016 Macbook Pro 16 i9 64GB 2TB cost $5000.
The 2021 Macbook Pro 16 M1Max 64GB 2TB costs $4300.
It's actually cheaper.
Hifihedgehog - Monday, October 18, 2021 - link
Has little to do with affluence. I have easily $10K-plus in DIY hardware alone and Microsoft Surface is a whole other ball of wax.
By the way, you are comparing a five year old system when SSDs and RAM cost significantly more.
hanssonrickard - Monday, October 18, 2021 - link
He is actually comparing it to what was available at that time and what he had to pay back then to get that kind of hardware. What else should he compare it to?
Dug - Monday, October 18, 2021 - link
No. You are neglecting the memory speed, interface, bus speed, process node for a custom chip, mini-LED screen, etc. The biggest one is inflation. So yeah, it's cheaper.
Hifihedgehog - Tuesday, October 19, 2021 - link
> The biggest one is inflation.
Strawman argument. Technology goes down in price over time. 2TB SSDs cost $800-$1000 in 2015. Now, they cost just $200-$300 in 2021. Oh my goodness, yes, they are indeed overcharging compared to what they should be, commensurate with current capacity-per-dollar costs.
Doan Kwotme - Wednesday, October 20, 2021 - link
No question, DRAM and SSD are where Apple soaks up the profit. Their prices on these are 3 times what the PC world charges. Once hooked on the decision to buy a Mac at the entry price, the rest is just numbers on the credit card.
Sailor23M - Tuesday, October 19, 2021 - link
Just the screen alone makes me wanna buy one.
Swole4life - Tuesday, October 19, 2021 - link
Apple makes overhyped garbage. Mini LED is trash tech; OLED kills mini LED in every way..
Apple is good at marketing trash for weak-minded people
The AMD/Samsung team-up will eventually kill anything Apple claims...
Apple's chips overheat and bog down performance within minutes...
Swole4life - Tuesday, October 19, 2021 - link
Ditched the Apple garbage after the 12 Pro Max and M1 iPad... Apple is nothing more than a fashion statement...
Henry 3 Dogg - Tuesday, October 19, 2021 - link
So by your own admission you are weak-minded.
Hold onto that and leave the serious stuff for the grown-ups.
zony249 - Tuesday, October 19, 2021 - link
I have a feeling I've seen this guy somewhere. I remember because his "behaviour" stood out.
TEAMSWITCHER - Tuesday, October 19, 2021 - link
OLED has limited peak brightness... It's the only way to prevent burn-in.
Batmeat - Tuesday, October 19, 2021 - link
They do that with all their products regardless.
Oxford Guy - Tuesday, October 19, 2021 - link
Blue pixel aging is still the tech’s main weak point, although, yes, image retention/burn is a biggie.StinkyPinky - Monday, October 18, 2021 - link
It really isn't. Desktop-level GPU and CPU performance, mini-LED 120Hz display, fastest-in-class storage. This is a desktop replacement laptop.
Silver5urfer - Monday, October 18, 2021 - link
There are already laptops like the ones Apple is comparing against, so what is that? Reinventing the wheel at TSMC 5nm is what they are doing here. DTRs have existed since the Alienware M18x R2, and today's Clevo X170SM is a true desktop replacement because it runs a 10900K at full speed and a mobile RTX 3080 at higher performance targets.
Once you run this M1X at such a high workload, the CPU and GPU will not keep up the efficiency; it's the law of physics, and nobody is going to cheat physics.
Basically, buy this for a notch and a crappy OS with a locked-down ecosystem on BGA-soldered hardware with zero user customization, for show-off value and a notched POS display for bragging rights, at an even more expensive cost.
askar - Monday, October 18, 2021 - link
but the laptops you've mentioned don't have: as good of a display (no mini-LED, no brightness, no HDR), any acceptable battery life, comparable portability and sound system. MacOS can shine if you work well with the command prompt.
Hifihedgehog - Monday, October 18, 2021 - link
Film industry is laughing at you right now. Find someone who knows someone who actually works at Pixar or another CGI shop and they will straight up tell you that this won't work for their workflow. Quadro is the industry standard unless you are a little guy doing small-time freelance.
TanjB - Monday, October 18, 2021 - link
What if you put 8 of these on a card? Or 16? You are assuming the way into professional workflow does not include changing things up.
tipoo - Monday, October 18, 2021 - link
There's a big world between full-frame Hollywood or Pixar level movie making and people with high needs for video editing. YouTube is kind of a big thing now. There's also no mobile GPU you can pay any amount of money for that can access 64GB of RAM with equal speed; there were SSGs that tried to bolt on SSD storage, but this is 400GB/s LPDDR5.
Pretending the uber high end of the industry, which you're definitely not in, invalidates what this is good for is just poor reasoning.
web2dot0 - Tuesday, October 19, 2021 - link
Are you saying all the people who are buying MBPs are people who work at Pixar and CGI shops and only use Quadro as their card of choice?
Seems like you are the delusional one.
There are millions of people who would gladly pay a good hefty amount of money for an MBP ...
Icehawk - Tuesday, October 19, 2021 - link
These ARE priced well tbh. For example we just got some Dell Precisions with i9s, basic screen, no dGPU and LIST on them is $5,500 (biz pays around 50% retail). The Dell is probably 2x thicker and about the only plus is it has, IMO, better port selection.
I'm surprised because the regular pre-M1 MacBook isn't that great pricewise.
For our graphics need we use a workstation with a fat $3k+ Quadro. It just encodes video all day.
adda - Wednesday, October 20, 2021 - link
You know, I'm starting to suspect you're the kind of 12-year-old that bores his classmates to death with high-volume lectures on the superiority of PCs (and your maxed-out gaming rig) over those awful Macs. One tell is your constant refrain of "the film industry is laughing at you", which is the reasoning of an insecure child. Additionally, the only things you seem to know about this film industry are "Quadro" and "AV1", which you have chanted about 20 times now. I've seen the "go ask any industry pro" line a lot, and it's typically wielded by people who know little to nothing about the industry in question.
vladx - Tuesday, October 19, 2021 - link
> but the laptops you've mentioned don't have: as good of a display (no mini-LED, no brightness, no HDR), any acceptable battery life, comparable portability and sound system. MacOS can shine if you work well with the command prompt
My MSI Creator 17 has all of that, do some research before spouting crap like that.
blargh4 - Monday, October 18, 2021 - link
I have a Clevo, it's an unwieldy, heavy, loud brick with pretty lousy OS integration. Not much of a comparison to a MBP if you take laptop ergonomics remotely seriously.
I'm sure for macOS power users, whoever those people are, these machines will be second-to-none. For everyone else it's a bit of a mystery what all this power would be useful for.
Silver5urfer - Monday, October 18, 2021 - link
Sell your Clevo and get this then; it's better.
gescom - Monday, October 18, 2021 - link
"Alienware M18x R2, Clevo X170SM"Please don't :)
cfenton - Monday, October 18, 2021 - link
Those aren't even close to the same weight or battery life. They might be as fast, but they're barely portable.
Chinoman - Monday, October 18, 2021 - link
Have you seen how much less heat this generates vs. an Intel CPU performing the same task?
web2dot0 - Tuesday, October 19, 2021 - link
You know M1Max has ... ProRes Encoders/Decoders and able to handle 7 streams of 8k ProRes on a laptop chip right?
Does your M18x R2 with RTX3080 do that? With/without a power cable?
You are drunk on specs and forget that this is the age of ACCELERATORS. Most of the stuff is offloaded from the CPU and onto custom silicon or the GPU.
vladx - Tuesday, October 19, 2021 - link
> You know M1Max has ... ProRes Encoders/Decoders and able to handle 7 streams of 8k ProRes on a laptop chip right?
No one besides Apple users cares about ProRes, which is a proprietary Apple format.
Chirpie - Tuesday, October 19, 2021 - link
Well... YEAH. No one but Apple users own an Apple product. LOL
Blark64 - Wednesday, October 20, 2021 - link
Yeah, I guess that's why there are no cameras that record directly to ProRes Raw: https://www.bhphotovideo.com/c/buy/prores-raw-comp...
(Even more that record to ProRes 422 or 444)
andygrace - Friday, October 22, 2021 - link
Tons of cameras and ubiquitous mobile monitor/recorder combos now record ProRes natively.
It's actually a whole suite of codecs from lower end but still good, through to extremely high end film intermediates.
In fact it's starting to taking over parts of the video and TV industry from Sony's seemingly unstoppable Betacam/HDCAM/XDCAM lineup which is incredible considering those standards go back to half-inch analog tape in 1981.
Digital Betacam was THE standard for production in the 90s and SD era. Some of the biggest films of the 2000s were shot in HDCAM or HDCAM SR including Lucasfilm. Then they moved to file based workflow with XDCAM first on 23GB and 50GB professional discs which are still in use and then to flash based XDCAM.
But now Sony have a serious challenger on the record and ingest side.
With ProRes codecs built right into the wildly popular Atomos Ninja outboard monitor/recorders, camos (like me when I can't find an operator - for example in the middle of COVID) can record better quality out the HDSDI back of their expensive industry standard camcorders, and only use the Sony cards or discs as backup. They're incredibly cheap - for pro TV prices - and great little field monitors which include ProRes RAW mode.
AJA gear is similar, and over the past decade Blackmagic Design has become massively popular with entire pro studio setups based on ProRes including their stunningly good and great value cameras it's exceptional news. Those are seriously mainstream in production today.
With all the history of being the undisputed king of pro-video, it's astounding Sony themselves are releasing gear with ProRes built right in and releasing firmware updates to make some of the older semi-pro camcorders output it too for those Atomos recorders and others.
Add the new iPhone 13 Pro recording ProRes natively and it's hard to overstate how massive a change this seems like it's going to be.
A
Spunjji - Tuesday, October 19, 2021 - link
Pointing out that you can get ~20% higher GPU performance and comparable CPU performance out of a device that weighs 2-3x as much and draws 3-5x as much power under load isn't exactly a compelling argument.
dejuknow - Monday, October 18, 2021 - link
On the contrary, it's underpriced (or "right-priced"). Even purely from a performance perspective, show me a single laptop with equivalent CPU/GPU performance at Apple's prices.
Karaqx - Monday, October 18, 2021 - link
400GB/s DDR5 and a 7 GB/s SSD, which I'm assuming uses the PCIe gen 5 interface because goddamn, 7.5 GB/s; they are both quite expensive. A quantum dot and mini-LED backlit display is very expensive. Don't be fooled by its size; this monster should be compared to a high end Quadro laptop.
Alistair - Tuesday, October 19, 2021 - link
PCIe 4 is 7.5 GB/s
Rudde - Tuesday, October 19, 2021 - link
PCIe gen 4 x4 or PCIe gen 3 x8. Source: Apple
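A rough sketch of the lane math behind those two answers (the per-lane figures are approximate usable bandwidth, not numbers from the comments):

```python
# Approximate usable bandwidth per PCIe lane in GB/s (after encoding overhead).
PCIE_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PCIE_LANE_GBPS[gen] * lanes

print(link_bandwidth(3, 8))  # ~7.9 GB/s: Gen 3 x8
print(link_bandwidth(4, 4))  # ~7.9 GB/s: Gen 4 x4
# Either link comfortably covers a ~7 GB/s SSD; Gen 5 isn't needed.
```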
iphone4ever - Tuesday, October 19, 2021 - link
My 2020 8-core i7 Dell laptop cost over $2500, gets smoking hot, and pales in comparison to these computers in terms of performance, design and quality. The Apple touchpads are 1,000% better than the crappy one Dell includes.
damianrobertjones - Tuesday, October 19, 2021 - link
Totally agree. They're milking their customers. It also gives them an easy pass to INCREASE the price for later releases.
Fools and their money... (Although, saying that, they are pretty damn good).
TEAMSWITCHER - Tuesday, October 19, 2021 - link
The pricing is actually very good. The lowest standard SKU will probably destroy any comparably priced Intel based laptop, and do so without having fans that sound like a leaf blower.
Morky - Tuesday, October 19, 2021 - link
No it's not. You don't need these machines if you're not a creative pro using these to make a living. The Air has crazy good performance for most people.
shelbystripes - Tuesday, October 19, 2021 - link
If the performance is even close to what Apple claims here, calling this "massively overpriced" is laughable. Something is only "overpriced" if you're being charged too much for what you're getting. There won't be a better price/performance combo for most users who actually NEED this much portable power.
There won't be any other laptop for photographers. Its only competition will really be the iPad Pro, for people who can live with its limitations for ultraportable workflow. Those who want more power for on-the-road editing will be jumping all over this. Lightroom on Apple Silicon performance alone makes this worth the price.
haeli05 - Tuesday, October 19, 2021 - link
It seems like every Apple hater is just poor and envious.
knolf - Friday, October 22, 2021 - link
A similarly specced Dell XPS 15 (1 TB SSD, 3050 Ti, 32 GB RAM) is in fact more expensive compared to the 14 inch MacBook Pro with the top M1 Pro SoC (10 core CPU, 16 core GPU, 32 GB RAM, 1 TB SSD).
xcalybur - Sunday, October 24, 2021 - link
Let's take a look at the closest competition. The MSI Creator 17 has a MiniLED display. It has an Intel 11800H which is 25% slower than the M1 Max. It has a 3080 which is about the same as the M1 Max, and it has a 4K display, which is only slightly larger than the 16-inch MacBook Pro's. The MSI has a max of 1000 nits brightness. The 16" MBP is 1000 nits sustained with 1600 peak. The MSI does have more ports. The MSI has a 720p camera whereas the Mac has 1080p with a notch. The Mac is 0.6 pounds less at 4.8 lbs. The Mac supports 4 external monitors. The MSI supports 3. And the big one: the MSI's battery life is 9 hours and the MBP's is 21. So based on your comment the MSI should be around $2200-2500, and the Mac of course is $3499 with the same 32GB RAM and 1TB SSD. But wait, the MSI is actually $3499. Yup, sounds overpriced to me.
Comments like this are an opinion. The fastest laptop ever made by a long shot, with one of the best screens ever and just about the highest battery life of any laptop, is worth it, and it is the same price as a less competitive x86 laptop. My opinion is it is expensive, but worth every penny to someone who needs it.
FLORIDAMAN85 - Monday, October 25, 2021 - link
What, did we expect a rational price/performance ratio from the silver-eyed monster?
snowdrop - Monday, October 18, 2021 - link
The M1 Pro CPU has variants with either 6 or 8 performance cores. The base M1 Pro has 6 performance cores. Please update the article to reflect this.
catinthefurnace - Monday, October 18, 2021 - link
Yes, and only on the 14 inch model. The 16 inch model base is 8 perf, and 2 efficiency.
Ryan Smith - Monday, October 18, 2021 - link
Note that this is a silicon-focused article, not a laptop article. So we're talking about the dies as-is, which is 8 performance cores.
snowdrop - Monday, October 18, 2021 - link
The die "as-is" includes either 6 performance cores or 8 performance cores. Even if the 6 performance core model is just a binned part (2 failed cores) it is still an M1 Pro - the article should not imply that all M1 Pros have 8 cores... since they do not all have 8 cores.
Ppietra - Monday, October 18, 2021 - link
Technically all of them have 8 performance cores.
And since they are not discussing the laptops being sold...
snowdrop - Monday, October 18, 2021 - link
The base 8/14c M1 Pro offers 6 performance cores and the upgraded 10/14c and 10/16c M1 Pro offers 8 performance cores. The performance gap is likely to be higher between these two SoCs than between the 10/16 M1 Pro and the 10/32c M1 Max in most applications. Not being clear about what Apple is selling is a disservice to readers who are literally putting orders in now.
GraXXoR - Monday, October 18, 2021 - link
And if they're putting in orders based on a random tech article and not looking at the specs to realize that there are two fewer cores, then they likely won't miss them.
Anyone who can spend 3,000 without fully researching the product they're buying isn't worried about money.
Anyone not smart enough to notice two missing cores in the spec sheet and subsequently investigate the consequences will likely make bigger mistakes in their life than choosing a potentially underspecced laptop.
TEAMSWITCHER - Tuesday, October 19, 2021 - link
I don't think you have your specs right.
Nigel Tufnel - Wednesday, October 20, 2021 - link
It's true and should be noted, but let's be real. Anyone who is reaching for one of these is doing it largely for the performance (the M1 Air/Pro are already quite good for "normal" things and much cheaper). So I think very few people are going to buy the 6-core model, it's really just there for Apple to do something with some binned chips and present a lower price floor for the model lineup. Getting the base model makes little sense for almost anyone.
For someone to spend $2K on one of these and not go an extra $300 for another 2 performance cores and another 2 GPU cores would be quite insane.
KPOM - Thursday, October 21, 2021 - link
It's $550 more than a MacBook Air with 16GB RAM and a 256GB SSD. The extra $550 brings a much better screen, better speakers, microphone, and webcam, and the ability to drive 2 external displays. For some people, that might be worth it even if they don't need the extra 2 CPU cores and 6 GPU cores.
web2dot0 - Monday, October 18, 2021 - link
Intel, AMD, and MS should be scared SHITLESS right now.
No PC laptop in the world can match the power of the MBP14, M1 Max, 64GB, 8TB in 3.5 lbs.
And can run at FULL SPEED without the power cable.
Let the NotchGate and the "it's too expensive" talking points begin. LOL.
But let's face it, it's gonna take YEARS before the PC market can catch up to Apple.
All you guys know deep in your heart.
nevcairiel - Monday, October 18, 2021 - link
And yet it still doesn't matter as much as you want us to believe, because the hardware is locked into an ecosystem that many have no interest in, or no ability to switch to.
I'm not going to find a new job and stop playing all my favorite games just to be able to use a Mac with fancy new hardware.
If Apple were to just sell hardware, it would be a much bigger deal, instead they sell lifestyle, that happens to come with hardware.
Chinoman - Monday, October 18, 2021 - link
Who says you have to quit your job to buy a MacBook? Like, where is this line of logic even from?
markiz - Tuesday, October 19, 2021 - link
Because many at work MUST have Windows. Even if this Mac was 10x better than it really is (it does look awesome) it is completely impossible for many industries to switch or support.
ddps - Monday, October 18, 2021 - link
You play games, I work in a Unix command line. You use Windows, I use macOS. I find your Windows ecosystem confining.
fazalmajid - Tuesday, October 19, 2021 - link
I loathe Windows as much as anyone but Windows Subsystem for Linux works quite well and in fact Docker on Windows uses it directly with no VM unlike on Mac.
ddps - Tuesday, October 19, 2021 - link
I've not found that WSL provides the sort of seamless command line-native UI app integration that powers things like https://www.barebones.com/products/bbedit/benefits... and other such things. In macOS, the command line is part of a full-fledged UNIX app ecosystem that is something totally different from X Window / Gnome / KDE / Unity / etc.
ABR - Wednesday, October 20, 2021 - link
No VM? :) What do you think WSL (2+) is then?
web2dot0 - Tuesday, October 19, 2021 - link
You've never heard of VMs?
You don't like MacOS? Fine ... this is a hardware discussion, not whether you love/hate MacOS.
Why do you need to get a new job to buy a MBP14/16? Is $2000 too much for you to afford? If so, maybe you should get a better paying job. Maybe you should forget about getting a new computer and put food on the table first.
I didn't know this forum is about making ends meet because you are destitute. I feel bad that not everyone can afford luxuries in life, but this forum isn't about self-pity.
markiz - Tuesday, October 19, 2021 - link
They obviously use their computer for work and their work requires Windows.
Oxford Guy - Tuesday, October 19, 2021 - link
Unless Apple is going to make it possible to run the hardware without macOS, macOS is part of the discussion.
vladx - Tuesday, October 19, 2021 - link
> If Apple were to just sell hardware, it would be a much bigger deal, instead they sell lifestyle, that happens to come with hardware.
Well said. Outside the US I doubt MacBooks will sell much better than in the past.
Hifihedgehog - Monday, October 18, 2021 - link
LOL. Nope. Apple's GPU, like AMD's, while amazing from a test-tube standpoint, lacks the developer support for CGI and machine learning. NVIDIA's CUDA and CGI industry ecosystem is second-to-none. I find it laughable they omitted AV1 and market this as a serious film industry tool. The little one-man shops will love this, for sure, but the industry-leading pros will always use a Quadro workstation away from the studio. Wake me up when Pixar and Moving Picture Company are using MacBook Pros. I'll wait... Zzzzzzzz.
michael2k - Monday, October 18, 2021 - link
I don't understand your point.
Pros really don't have the kind of 'fanboyism' you're demonstrating.
If an M1M MBP is as powerful as it seems, Pixar will write the software needed to use it. Its CPU performance makes it competitive with a 12 core Intel part and its GPU performance compares to a 3080 RTX mobile; of course, maybe Pixar will prefer to not write new software and just use a Razer Blade 15 (the benched laptop) instead.
That said, we're talking about laptops. Pixar (and many other companies) already use MacBook Pros because they are portable, and use Linux hardware farms because they are cheap. The question is if a Mac Pro with a 120W part (64 GPU cores) will entice the likes of Pixar, which I doubt. Again, they use Linux farms because the hardware is cheap. I don't see a Mac Pro hitting the price/performance curve they need to render their movies.
Hifihedgehog - Monday, October 18, 2021 - link
> Pros really don't have the kind of 'fanboyism' you're demonstrating.
Exactly, except your fanboyism is showing. CUDA and Quadro are the industry standard for GPU-accelerated machine learning and rendering in the professional sphere. Apple has always been the little guy on the fringes or in niche corner cases.
defferoo - Monday, October 18, 2021 - link
You don't actually do work on the server farms... you work on a workstation or mobile workstation, then render on the server farm. The workstation just needs to be able to output in the format the server farm accepts. You've said the same things over and over in these comments, but you still don't really know what you're talking about. In case you didn't know, Pixar's RenderMan runs on Windows, macOS, and Linux; it's basically platform agnostic.
Blark64 - Monday, October 18, 2021 - link
It's clear from your comments here that you don't know anything about modern high-end animation and vfx production. There's a multiplicity of roles (animation, fx, lighting, layout, simulation, etc.), all with different workstation requirements. CUDA is mostly useless for an animator, say, who needs responsive viewport playback of animated characters, and which is CPU bound. An fx animator or simulation artist, on the other hand, could make use of CUDA. High end studios are mostly not using CUDA for rendering, as their huge scenes don't fit in VRAM, and out of core memory reduces the GPU render advantage significantly. These new Mac laptops could render scenes in Octane or Redshift that are currently impractical on the vast majority of NVidia cards, due to their comparatively massive memory pool.
web2dot0 - Tuesday, October 19, 2021 - link
As others have pointed out, you are just talking out of your ass because your anti-Apple fanboyism is showing.
If movie studios think Apple hardware is all shit, why do you think they are buying Mac Pros by the truckload? LOL.
Hope we have woken you up from your excessive Kool-Aid drinking.
Specs are specs.
MBP16 64GB, M1Max 32GPU Cores, XDR Display 120Hz are gonna destroy any PC laptops you throw at it at their thermal envelope. Those are just facts buddy. Keep coping.
Competition is good. I thought almost everyone in this forum liked a healthy dose of competition.
Intel has been asleep at the wheel and AMD is coming late to the game. Apple is turning the table upside down.
Time for the industry to innovate. Apple took the lead, now it's time for the rest of the industry to wake up or be left behind.
R_Type - Monday, October 18, 2021 - link
Let's have some perspective here. This is their second N5 SoC. Everybody else is playing this game on second-class processes: "AMD advertises 26.8bn transistors for the Navi 21 GPU design at 520mm², Apple here has over double the transistors at a lower die size."
The same goes for Intel (but that's their own fault).
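To put the quoted figures in perspective, here is a back-of-the-envelope density comparison; the M1 Max transistor count is Apple's published 57 billion, and its die area is an estimate (roughly 430 mm²), not a number from the comment:

```python
# Back-of-the-envelope transistor density. The M1 Max die area is an estimate.
navi21_transistors, navi21_area_mm2 = 26.8e9, 520   # figures quoted above
m1max_transistors, m1max_area_mm2 = 57e9, 430       # Apple's count; area estimated

print(round(navi21_transistors / navi21_area_mm2 / 1e6))  # ~52 MTr/mm^2
print(round(m1max_transistors / m1max_area_mm2 / 1e6))    # ~133 MTr/mm^2, over 2.5x denser
```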
Silver5urfer - Monday, October 18, 2021 - link
What are you talking about?
The laptops they are comparing to are crippled junk. If you want a real PC laptop, look at the Clevo X170SM which runs a 10900K at a 5.0GHz all-core OC. That will destroy this overpriced garbage, which beats a measly 11800H TGL 10nmSF with 48W PL1 and locked-down clocks at 4.6GHz max Turbo. The GPU on this, as per Apple's claims, is 2080 class, not 3080. And the Clevo X170SM has a closer-to-desktop 3080 MXM GPU.
Finally, the NVMe storage: PC laptops have replaceable SSDs, HDDs / 2.5" SATA SSDs, and replaceable components if anything goes wrong. This abomination is a soldered POS design at a screaming $2500 cost, for which I can build a real desktop powerhouse that can run Linux and Windows and run anything that I can throw at it. Or get a maxed-out X170SM Clevo.
AMD is going to have Zen 4 Raphael on TSMC 5N that will destroy these CPUs, and once Hopper and RDNA3 with MCM arrive, this will be obliterated in pure performance, as Nvidia is planning a 150% CUDA core count increase over GA102 on their new Hopper or whatever arch they call it.
Without a power cable? Did you even see M1? Once you run it at full speed the efficiency drops like a brick. It's basic physics. There's no way this SoC is going to sip power and perform around RTX 2070/2080 class. The Clevo X170SM is going to wreak havoc if unleashed, and that is 2020 CPU and 2020 GPU performance.
robotManThingy - Monday, October 18, 2021 - link
And just think, this is only a mobile chip! They clearly have a high-end desktop chip coming for the Mac Pro. I'm guessing it will be called the M1 Extreme. Whatever they call it, it will completely redefine Apple's position in the market and you can bet that unified memory will play a huge role in its success.
Bp_968 - Wednesday, October 20, 2021 - link
This isn't really true. Just like you can't expect to turn a 3080 down to 100w and get exactly 1/3rd the performance you can't expect to take a 30w part and goose it up to 300w and expect 10 times the performance.Apple is specifically targeting this power usage. Intel or AMD are designing an arch to perform from 15w up to 200w, apple is designing an arch to work at one main target power usage (from tablet to lightweight laptop). Nvidia does the same thing in reverse and so their laptop parts always seem to struggle on a per watt basis VS a integrated gpu (i mean, is that really a surprise considering the laptop 3080 requires pcie access, its own memory subsystem, etc all that require significant power budget).
I suspect if AMD decided to make an APU with similar specs it could reach similar power levels. The potential weak link with x86 is that they don't have complete control over the OS like Apple does. Of course many of us here also consider that a feature, not a weakness.
My wife asked about the new Apple hardware and what it might mean. I said it's a technical achievement, but not one that's likely to affect us in any meaningful way other than potentially pushing forward more competitive hardware from PC suppliers. And it's true. If you're not already an Apple "follower" it's unlikely a faster, lower power laptop is suddenly going to turn you into one. Another poster said it would be a different thing altogether if they were releasing the hardware for direct sale, but Apple has no desire or intention to do that.
Tldr: i can put the most powerful most efficient engine ever made in a truck and it doesn't really change much for you as a customer if what you need is a car.
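The point about not getting 10x the performance from 10x the power can be illustrated with the usual dynamic-power rule of thumb, P ≈ C·V²·f, where voltage has to rise along with the clock; the constants below are invented purely for illustration:

```python
# Toy model of dynamic power: P ~ C * V^2 * f, with voltage rising alongside clock.
# All constants are made up for illustration only.
def power(freq_ghz, base_v=0.7, v_per_ghz=0.15, c=10.0):
    v = base_v + v_per_ghz * freq_ghz
    return c * v * v * freq_ghz

p_low, p_high = power(2.0), power(3.5)
print(round(3.5 / 2.0, 2), "x clock")        # 1.75x frequency...
print(round(p_high / p_low, 2), "x power")   # ...costs ~2.6x power in this toy model
```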
Ppietra - Wednesday, October 20, 2021 - link
Apple doesn't need to turn up the power consumption and decrease the efficiency of its individual GPU and CPU cores to increase performance.
The idea that is being talked about is that Apple will increase the number of cores by a factor of 4: 40 CPU cores, 128 GPU cores, in something similar to a chiplet package.
It would lose some overall efficiency in performance per watt, but Apple would continue to have an advantage in power consumption and would be extremely competitive in performance.
Tomatotech - Monday, October 18, 2021 - link
Silver5urfer: "PC laptops have replaceable SSDs, HDDs / 2.5" SATA SSDs."
Are you a comedian? Are you saying that proper laptops only have HDDs or SATA SSDs?
Do you realise how unbelievably slow a HDD is compared to the NVMe storage on these MBPs?
Let's take one of the best HDDs on the market, a Seagate Barracuda 7200.14, but feel free to use any other HDD, how does it do at 4K random read? 770KB/sec or so. Let's call it 0.8MB/sec to be generous.
Now take one of various NVMe PCIE4.0 SSDs, the WD Black SN850 or the Samsung 980 Pro. They all max around 1 million IOPs at 4k random read, which if I have my maths right, is about 4GB/sec. Or around 5,000 times faster than your 'fast HDD'. That's the kind of drive a proper modern performance laptop should have. Which is the kind of drive the Apple MBPs have.
What about your precious SATA SSDs? SATA tops out at 600MB/s for SATA III. The NVMe drives I mentioned above do 7+GB/s. That's 12 times faster.
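The arithmetic behind those ratios, using the figures in the comment and assuming 4 KiB per random read:

```python
# Throughput math behind the comparison above (figures from the comment).
hdd_4k_mb_s = 0.8                              # "generous" HDD 4K random read
nvme_iops = 1_000_000                          # ~1M IOPS at 4K random read
nvme_4k_gb_s = nvme_iops * 4 * 1024 / 1e9      # ~4.1 GB/s

sata_mb_s = 600                                # SATA III ceiling
nvme_seq_gb_s = 7.0                            # PCIe 4.0 NVMe sequential

print(round(nvme_4k_gb_s * 1000 / hdd_4k_mb_s))    # ~5120x the HDD at 4K random read
print(round(nvme_seq_gb_s * 1000 / sata_mb_s, 1))  # ~11.7x SATA III sequential
```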
As for the rest of your claims, you're making them before this new MBP has even been reviewed and independently benchmarked. And you're also comparing it to your imaginary Zen 4 Raphael which hasn't even been released yet.
You're full of the worst sort of FUD and vapourware combined with an extremely poor grasp of how tech is used in the real world.
Silver5urfer - Monday, October 18, 2021 - link
Dude, what are you on? I said SATA HDDs or SSDs, which is the SATA III standard. PC laptops already have NVMe, which I already mentioned. You are strawmanning to the peak; go and do your BS elsewhere. A soldered POS is a soldered POS; nothing is going to change that fact. If 8TB of soldered junk goes kaput, you and your precious $6000 laptop are junk.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
"8TB of soldered junk if it goes kaput you and your precious $6000 laptop is a junk."No. An Apple motherboard repair is the same few hundred dollar price regardless of the amount of storage on your motherboard.
Yet again you don't know what you are talking about.
Oxford Guy - Tuesday, October 19, 2021 - link
Send your machine to California so all the data on it can be hoovered — in particular, whatever data hasn't been sucked up by all the cloud processes.
photovirus - Wednesday, October 20, 2021 - link
Apple's SSDs are always encrypted since T2 (2018), and the key is stored in the Secure Enclave, so you won't be able to read the chips without a functioning original SoC w/ its own Secure Enclave.
With FileVault enabled (which you can do on first startup when you create your first user account), not only the original SoC is required w/ its hardware key, but also the user's password.
Oxford Guy - Sunday, October 24, 2021 - link
For ordinary folk.
colinstalter - Monday, October 18, 2021 - link
Hahahahaha who in their right mind would compare that Clevo machine to the new MBPs? The Clevo has nearly 600W TDP, requires 2 PSUs, and probably can't run for more than 30 minutes unplugged under workload. That thing is a monstrosity and will still be way worse at a lot of workloads. Also what does that say about the MBP's that THAT is the computer you have to compare them to?
Spunjji - Tuesday, October 19, 2021 - link
Not sure how to break it to you, but the 11800H at 4.6GHz is faster than the 10900K at 5.0GHz - and when it comes to perf/watt this chip kicks the snot out of both.
"Closer to desktop" is a relative term for a mobile 3080, they're all based on the GPU that powers the 3070 because Nvidia blew the thermal budget.
The soldered-down criticism is pretty valid (although not so much for a professional workflow where service contracts and network storage come into play) but you really just come across as a fanboy when you act like this device is going to come out looking bad in direct competition with a *4.7Kg* DTR that pulls 500W and produces 62dB of sound under maximum load.
I'll certainly be interested to see how it compares to AMD CPUs and GPUs on 5nm, but it's a little bit like comparing an F1 car to a Bugatti Veyron - they do different things in different ways. Apple are going for maximum efficiency in specific tasks under their own OS, AMD can't optimise to that extent.
meacupla - Monday, October 18, 2021 - link
Okay...
Supreme performance is one thing, but why does M1 have to have a list of games it can play, rather than the entire Steam library? Why is there still software, like Adobe Illustrator, that still can't run perfectly on M1? Why does M1 still have SSD thrashing issues? Why can't you upgrade the SSD or RAM on your M1? It would be a shame if something were to happen to the batteries, like degradation.
AMD and Intel already have good enough mobile APUs for most usage cases. So they can't run fanless, or at full power while on battery, so what? At least they can run a majority of games and apps flawlessly.
You brush off the "it's too expensive" talking point, but you are aware that the overwhelmingly vast majority of people don't spend more than US$700 for a laptop, right? Like, this is literally where AMD, Intel, and MS make something like 90% of their profit from. You know this, right?
Now, If Apple were to lower the price of vanilla M1 macbooks to around US$650, then sure, AMD, Intel and MS should be really scared. But with the current software and price limitations? pfffffft yeah, whatever.
scottrichardson - Tuesday, October 19, 2021 - link
I can't believe these arguments still happen. Value is in the eye of the beholder. Clearly, myriad people place high value in the seamless, well-designed, secure ecosystem that Apple provides with their hardware/software/services combination. Meanwhile, other people don't see the same value and prefer products with more fundamental modularity and lower prices, at the expense of not having premium/cutting-edge industrial design and components (MiniLED, M1 Max, etc). Apple products are designer, boutique products, will almost always be owned by the minority, and will continue to be seen as 'premium' options. The vast majority will of course not own an Apple laptop, and I don't think Apple or any Mac fan has a problem with that. There are literally next to no arguments you can win with many Mac/Apple fans, because there's simply a totally different understanding and appreciation of what value is to them vs others. So rather than argue what's better (Win vs Mac, Apple vs Android, etc), just accept that people like what they like for the reasons they choose, and arguing helps nobody.
Shorty_ - Tuesday, October 19, 2021 - link
I think there's also a bit of elitism at play here. There are a lot of comments in here that come across almost as if the posters pity macOS users for not being as enlightened, for buying 'overpriced crap', and as if it's their civic duty to protect the sheeple.
vladx - Tuesday, October 19, 2021 - link
My Creator 17 has a MiniLED screen; I guess you didn't know the MacBook isn't the first laptop with MiniLED.
colinstalter - Tuesday, October 19, 2021 - link
The Creator 17 is the first and only other MiniLED laptop I'm aware of, and it's coincidentally priced almost exactly the same as the new MBP. But yes, that does appear to be the only semi-comparable machine on the market right now.
The MBP has a higher peak brightness, better webcam, better battery life, and is thinner, but the MSI has lots more ports and is (presumably) upgradeable.
Oxford Guy - Tuesday, October 19, 2021 - link
'well designed'
Would that be:
1. The shattering 13" M1 MacBook Pro screen?
2. The selling of new Macs capped at 8 GB of RAM, while simultaneously refusing security fixes for machines that have 16 GB?
3. The removal of MagSafe in previous generations for an inferior solution?
4. The defective butterfly keyboard?
5. The rapidity of software incompatibility, due to frequent changes of the operating system's innards?
6. The ever-increasing increasingly-aggressive spyware?
7. The not-enough-ports strategy to create a dongle-selling side business?
8. The inefficient touch bar?
9. The removal of WindowShade, which was better than Dock minimization in many cases?
10. The inability to turn off button flashing (something that could be done in System 6, for goodness' sake)?
11. The addition of extremely irritating window shake, with the shaking not being able to be turned off?
12. The inability to disable Dock minimization animation?
13. The stupidity of not having two separate docks, one on the left side of the screen, and one on the right, so that application position doesn't change (interfering with muscle memory) and new users aren't confused about where things should be placed? Also stupid is placing it along the bottom of the screen given the fact that vertical real estate is worth more generally.
14. The incredible number of hidden background processes, a huge number of which phone home with data that really shouldn't be transferred?
15. The inability to have good control over file metadata without resorting to 3rd-party software (and even then)?
16. The lack of adequate documentation of the operating system?
17. The inability to truly erase files and do secure erase on SSDs?
18. The inability to fully disable automatic sleep? (Perhaps this has been fixed but in Catalina it's not.)
19. The ugly new icon shape and loud + ugly new icons (versus skeuomorphic) to go with garish backgrounds?
20. The APFS file system's extreme inefficiency with hard disks?
et cetera
Oxford Guy - Tuesday, October 19, 2021 - link
One specific complaint about the spyware... All sorts of remote monitoring and control tools are included in the operating system by default, rather than being available for the 'owner' of the machine to manually download and add.
Apple will say it's for ease of use. I say these things are for a certain type of ease of use.
ghoppe - Thursday, October 21, 2021 - link
None of the remote monitoring tools are enabled by default. They have to be enabled by the owner of the machine with administrator privileges, password, etc. How is this different from Remote Desktop Connection included with Windows 10?Methinks your tinfoil hat's a little tight.
Oxford Guy - Sunday, October 24, 2021 - link
1. Windows having the same sort of spyware in it is irrelevant.
2. Closed-source operating systems with various kooky bits of hardware (like the T2, and CPUs with black-box processors built into them) can't be relied upon to enable and disable things according to the intentions of the 'consumer'.
As for the ad hom at the end of your post... citation needed.
web2dot0 - Tuesday, October 19, 2021 - link
Apple put a price on the value of not hearing a fan spin at 5000rpm, and able to run at full speed without a power cord.If you think it is not worth it, don't buy it.
But to say Apple provides no value because they are overpriced is moronic at best.
That's like saying I have a desktop PC that can out perform a laptop ... but not everyone wants a desktop computer. Get it my guy?
You clearly don't know much or anything other than spew fud.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
Priceless.
You complain that there is still software that can't run perfectly on an M1.
And then you say that AMD / Intel machines can run the majority of games and apps flawlessly.
You don't exactly set a level playing field, do you?
Jumangi - Monday, October 18, 2021 - link
Scared of what? These laptops are thousands above what the vast majority spend on one. These represent a tiny fraction of sales. These chips are the equivalent of an RTX 3090: absurdly powerful and absurdly expensive.
Chinoman - Monday, October 18, 2021 - link
They have a sub-$1000 M1 line already for people who don't need this much performance.web2dot0 - Tuesday, October 19, 2021 - link
Apple literally has the MBA, MBP13, and MBP14/16, from $999 ($850 with a discount) all the way to a fully decked-out $6000 MBP16/64GB/M1 Max 32-core/8TB SSD. Choose your price point.
Hell, if you are really cheap, get the MacMini for $699 ...
Henry 3 Dogg - Tuesday, October 19, 2021 - link
"These laptops are thousands above what the vast majority spend on one."These laptops start at $1,999.00 (14" M1 Pro) so its hard to see how that is thousands above what the vast majority spend on a laptop.
Tomatotech - Tuesday, October 19, 2021 - link
Or get an M1 MBA, an excellent machine, for well under $1000.
PixyMisa - Monday, October 18, 2021 - link
If Apple weren't so determined to destroy MacOS it would indeed be a threat.But they are, so it's not.
fazalmajid - Tuesday, October 19, 2021 - link
I can’t wait for Asahi Linux to be usable on an M1 or M2 Mac Mini.
Bp_968 - Wednesday, October 20, 2021 - link
This. Anyone who thinks this isn't part of the goal is fooling themselves. Apple wants complete control of the stack, top to bottom. MacOS is still far too open and old-school in its design. They want *all* software to be funnelled through the App Store.
This is an amazing chip, but its relevance doesn't leak outside the Apple ecosystem very much. I would suggest it's a much bigger concern for Android than for Windows or Linux. Linux will continue to dominate the server market, and Windows will continue to dominate the office and home-user market, and certainly the PC gaming market.
Hopefully the combo of this and the (hopeful) success of the steamdeck show AMD and intel that there is a market for a really efficient *and* powerful SoC as a portable gaming system.
Doan Kwotme - Wednesday, October 20, 2021 - link
praise TSMC for that...mdriftmeyer - Thursday, October 21, 2021 - link
In the next 6 months, 5nm TSMC Zen 4-based EPYC CPUs designed around 3D MCM will be splashed all over these pages. In an additional 4 months, the first 5nm Zen 4 3D MCM workstation-class TR chips will be available. The conservative claims by Lisa Su and company a year ago were 50% improvements over the previous Zen solution.
During these months, the newly merged AMD-Xilinx will be revealing to the world their designs and some new accelerator solutions that Xilinx currently shows are best of breed. Nothing Apple has is a long-term industry first. I know the folks. I love my old company, and I'm a lifelong OS X guy from the days when I worked at NeXT.
ARM isn't going to dethrone x86, and Apple couldn't care less about dethroning Windows. The company most concerned is Intel. It's got a lot of old talent ready to retire.
Microsoft is steadily diversifying itself and continues to expand and grow its valuation. It's not scared.
AMD in the next two to four years will be > $500 billion corporation within a massive portfolio of markets to continue expansion in.
Apple will continue expanding its EV projects and eventually bring its lineup to market, showing up in a highly contested industry that has thin margins, so Tim and company will be a niche player there, by design, while continuing to expand the services divisions.
General consumer computer purchases won't have the world leaping onto Macs the way it did with the iPhone or the iPod.
Apple will never enter the enterprise markets. We were fully prepared to do so until Steve saw the books and realized the only way this company survives is through the general consumer.
Another company scared is Nvidia. They are banking their future on selling ARM based CPUs to manage custom solutions for AI, CFD, FEA, and other scientific markets that they can sell truckloads of custom 3D MCM future SoC designs that they can then charge massive license fees for ala Oracle.
Without the ARM merger, long term, Nvidia will go the way of SGI. Sold off, piece by piece.
Tams80 - Thursday, October 21, 2021 - link
Heard it all when the M1 came out.
And while it is a very impressive piece of silicon, look what difference that made...
palladium - Friday, October 22, 2021 - link
> But let's face it, it's gonna take YEARS before the PC market can catch up to Apple.
This will depend largely on TSMC's 5nm capacity.
KPOM - Monday, October 18, 2021 - link
It will be interesting to see if there are any under-the-hood improvements to the cores. We'll see when the single-core CPU specs become available. Also, it appears Apple is binning aggressively, with chips with 6 vs 8 performance cores, and multiple GPU configurations.web2dot0 - Monday, October 18, 2021 - link
Why wouldn't they? Their 6-CPU-core and 14-GPU-core variant is already enough to destroy most PC laptops.
There's no reason why they wouldn't sell binned chips. It just makes sense.
If this is not the time to rake in the profits, when is?
arnabmarik - Monday, October 18, 2021 - link
My $2000, 60 lb RTX 3080 rig: Am I just a big joke now?
Hifihedgehog - Monday, October 18, 2021 - link
No. They compared it to laptop GPUs. ;) This seriously lacks CUDA and video codecs (O AV1, AV1, wherefore art thou AV1?) and has just a third of the TFLOPS of a stock RTX 3080.
michael2k - Monday, October 18, 2021 - link
Isn't that awesome?
A laptop with 1/3 the TFLOPS of a desktop RTX 3080! Or equivalent to an RTX 3060, really, which is roughly the same performance as the mobile RTX 3080.
But yeah, this is also Apple we are talking about. Who knows when they add support for AV1 in CoreMedia:
https://developer.apple.com/documentation/coremedi...
They only added VP9 in iOS 14:
https://developer.apple.com/documentation/coremedi...
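For the "1/3 the TFLOPS" point above, a rough back-of-the-envelope sketch; the shader counts and clocks below are approximate public figures (boost clocks vary by card and workload), so treat the results as illustrative rather than measured performance:

```swift
// Rough FP32 throughput: ALUs x 2 FLOPs per clock x clock (GHz) = GFLOPS.
// All figures are approximate published specs, not benchmark results.
func teraflops(alus: Double, ghz: Double) -> Double {
    alus * 2.0 * ghz / 1000.0
}

let m1Max   = teraflops(alus: 4096, ghz: 1.296) // 32 cores x 128 ALUs, ~10.6 TFLOPS
let rtx3080 = teraflops(alus: 8704, ghz: 1.71)  // desktop 3080, ~29.8 TFLOPS
let rtx3060 = teraflops(alus: 3584, ghz: 1.78)  // desktop 3060, ~12.8 TFLOPS

print("M1 Max:", m1Max, "RTX 3080:", rtx3080, "RTX 3060:", rtx3060)
print("M1 Max / RTX 3080:", m1Max / rtx3080)    // ~0.36, i.e. roughly a third
```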
RSAUser - Tuesday, October 19, 2021 - link
@michael2k No idea about the power numbers, or whether those are peak numbers. We'll see once proper benchmarks come out; I'd hazard it can peak near an RTX 3080 in specific workloads tailored to it.
@Blark64 On the focus on AV1 over HEVC: AV1 is about 45% better in compression size and decode at the same quality levels for "movie" quality at 4K, and nears 50% at 8K. Normal proxy footage is 1080p for most, and that's around 32% smaller for AV1 vs HEVC at the same quality (I'm looking at MacX and a comparison paper titled "A comparative performance evaluation of VP9, x265, SVT-AV1, VVC codecs leveraging the VMAF perceptual quality metric" from June 2020).
SVT-AV1 from Netflix and Intel is great though; it brings software encode/decode to only 2x the cost of hardware encode/decode, so I don't think Apple needs a hardware encoder/decoder, tbh. It will be fine for most, considering the performance of the device.
VVC is looking quite interesting.
Blark64 - Wednesday, October 20, 2021 - link
@RSAuser Sure, AV1 has some efficiency advantages over HEVC, at the cost of higher computational requirements and limited hardware and platform support. It's a wash.
The more important point I was making is that the OP was focussed for some reason on a delivery codec, not an editing/production codec, which is a focus of the coverage of these new machines (ProRes encode/decode blocks). The two classes of codec have very different goals and functional requirements, which people unfamiliar with TV/Film/VFX production rarely understand.
Production codecs prioritize lossless or visually lossless quality, symmetric encode/decode, intraframe compression, high bit depths, alpha (or depth) channel support, and dynamic range. Delivery codecs prioritize lossy compression, efficiency, and asymmetric encode/decode (meaning long compression times, lightning fast decode).
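A tiny illustrative sketch of why the intraframe point matters for scrubbing in an editor; the GOP length and frame number are made-up values for illustration, not properties of any particular codec:

```swift
// Seeking to frame N in an intraframe ("all-I") codec decodes one frame;
// a long-GOP delivery codec has to decode from the previous keyframe onwards.
func framesToDecode(seekingTo frame: Int, gopLength: Int) -> Int {
    (frame % gopLength) + 1
}

let target = 1_000
print(framesToDecode(seekingTo: target, gopLength: 1))   // intra-only: 1 frame
print(framesToDecode(seekingTo: target, gopLength: 48))  // long-GOP: 41 frames here, up to 48
```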
gescom - Monday, October 18, 2021 - link
Cuda alternative will come sooner than you think, oh and what's the power consumption of that 3080 alone?Blark64 - Monday, October 18, 2021 - link
Why the focus on AV1? AV1 is a nice-quality delivery codec, but really no better than HEVC/MP4, and completely irrelevant on the Mac platform. What AV1 is not is a pro codec: it’s asymmetric and interframe, which kills editing performance. That’s why these new Macs have hardware support for a pro codec (ProRes), and the M1 has already shown that it has the grunt to decode other pro codecs like Redcode RAW in real time. In other words, they are meant for content creators, not consumers.
andygrace - Friday, October 22, 2021 - link
Totally right about the difference between capture/ingest and delivery/streaming codecs. Very few tech people who aren't media professionals get that. The focus on AV1 is of course openness and a lack of any royalties on the consumer side.HEVC is great and VVC looks better, but the issue is the MPEG LA and the licensing costs and the patent pool. Apple is a founder member of the Alliance for Open Media - it's hard to find a company that isn't on board.
ProRes is a totally different beast - it's looking like the Sony Betacam /HDCAM/XDCAM of the future.
blargh4 - Monday, October 18, 2021 - link
What a shame that this remarkable bit of hardware will be wasted on running macOS.KPOM - Monday, October 18, 2021 - link
Then suggest to Microsoft that they sell Windows 11 ARM licenses for use on Macs.brucethemoose - Monday, October 18, 2021 - link
Linux support is progressing at a shocking rate.You're not wrong though. I WANT this, but OSX-only is kinda a dealbreaker.
joelypolly - Monday, October 18, 2021 - link
Why though? I run most of what I run on Linux pretty much natively on MacOS. A lot of things cross-compile, especially when it comes to development.
Oxford Guy - Tuesday, October 19, 2021 - link
Linux doesn't destroy the viability of useful machines via the withholding of security patches nearly to the degree Apple (and now MS with 11) do. That's why.AntonErtl - Tuesday, October 19, 2021 - link
Shockingly slow? It's been a year since they started, and they are still not done. To me this indicates that it is quite hard. And IMO that's enough reason not to buy these Apple machines as sexy as they may otherwise be. If they want our business, they should support Linux rather than making it hard. And that's why our group has not bought an M1 Mac despite some of us considering it.andygrace - Friday, October 22, 2021 - link
Apple is Unix. Tools like Homebrew (disclosure: I had a small hand in converting it to native ARM during the NDA pro dev kit era) mean that if the code is open and available for Linux, it will almost certainly compile and run on ARM. As a sidenote, I helped port an open x86 macro assembler to ARMv8 via brew... where it still compiles to native x86/x86_64 code :)
ex2bot - Tuesday, October 19, 2021 - link
These new laptops are *not* actually capable of running OS X.Oxford Guy - Tuesday, October 19, 2021 - link
I see what you did there. macOS.Change the capitalization and, suddenly, it's not System 1–9 (aka Mac OS).
Lavkesh - Monday, October 18, 2021 - link
I am glad it runs a Linux-based OS and not the clusterduck that Windows is. Now go back to the dark hole you came from.
nickyg42 - Tuesday, October 19, 2021 - link
macOS is UNIX, not Linux, Linux-based, or UNIX-based.Silver5urfer - Tuesday, October 19, 2021 - link
That dumb guy barely has any knowledge of what ARM is, what a PC is, or even what an OS is. Good that you corrected him on the fundamental mistake.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
How much do you actually know about MacOS ?nickyg42 - Monday, October 18, 2021 - link
My 10-core/16-core 16GB 1TB 14" is ordered. Can't stand my 2020 13" MBP 2.3GHz i7; it gets hot doing somewhat banal tasks and is literally uncomfortable to use as a laptop some of the time.
Decided to hold back and not go all-out -- I'll wait to do that when the M2 Pro/Max are out in a year or two, hopefully w/ Armv9 and its SVE instruction set.
brucethemoose - Monday, October 18, 2021 - link
Imagine an M2 MAX with wide ARMv9 SIMD, and all that bandwidth.And all that GPU memory... it would be an absolute media processing monster.
Kevin G - Wednesday, October 20, 2021 - link
The M2 would be using the same cores as the A15 chip found in the iPhone 13. Not shabby at all, but Apple isn't going to be rolling out SVE2-enabled chips for the Mac next year. Beyond that is anyone's guess, but Apple hasn't shown interest in developing a Mac-only CPU core design. The biggest benefits of SVE2 are currently targeted by dedicated accelerator blocks in Apple's SoC designs.
blanarahul - Monday, October 18, 2021 - link
Gaming laptop when, Apple? Most of us plebs don't have any use for your 16/32-core GPUs. Of course we shouldn't and wouldn't be buying a Mac for gaming anyway, but it'd be cool to see Doom Eternal running at 60+ fps on the M1 Max.
nevcairiel - Monday, October 18, 2021 - link
Apple has never really shown any interest in gaming, and all the years of alienating the ecosystem won't be turned around over night, even if they tried.Oxford Guy - Tuesday, October 19, 2021 - link
The first piece of software for the Lisa was a game. It was called Alice.It became the first piece of software for the Mac.
After that, though... gaming was 'deprecated'.
Jobs wowed the crowd with Halo running on a smurf G3 with the first Radeon card. Vapourware. It is rather hilarious to watch that presentation, given how much Jobs wanted everyone to believe Apple was serious about games.
Oxford Guy - Tuesday, October 19, 2021 - link
It may have been the Rage 128, which predated the Radeon. Anyway... very early ATI tech and Apple was at the forefront — for about two seconds.tipoo - Thursday, October 21, 2021 - link
Not exactly Vaporware, Microsoft just bought Bungie and tied Halo to XboxOxford Guy - Sunday, October 24, 2021 - link
Mac gaming was vaporware, despite all that hype.That game being on the Mac was vaporware.
photovirus - Sunday, October 24, 2021 - link
> Apple has never really shown any interest in gaming
I don't think that's entirely true.
While they haven't been doing anything with gaming on a Mac, they've made Metal a quite nice framework on iOS.
And now it's available on Macs, complete with all that GPU power.
Also: if Apple avoids silicon shortages, they might be selling some of the most affordable (!) gaming-worthy hardware in 2021 and 2022. It might resonate well with game publishers.
Chinoman - Monday, October 18, 2021 - link
Gaming laptops are a niche product. They already sell more casual games than anyone on the App Store so I don't think they're that concerned lol.TEAMSWITCHER - Tuesday, October 19, 2021 - link
The golden age of Apple Silicon Mac Software isn't here yet. Let's hope that when that day comes, game makers are on board. Just don't hold your breath.Henry 3 Dogg - Tuesday, October 19, 2021 - link
"Most of us plebs don't have any use for your 16/32 core GPUs."Then why are you reading / posting on, this article ??
Spunjji - Tuesday, October 19, 2021 - link
With GPU resources like the M1 Max has, Doom Eternal would probably be pushing 120fps at native res.andygrace - Friday, October 22, 2021 - link
Apple simply don't care about gaming on the macOS platform. iOS however is a totally different story because it's aimed at a very different market segment.Apple have chosen some critical market segments where they can make the most money and they have executed on those plans brilliantly.
Gamers are the exact opposite of people who use professional work machines; content creators, executives, people whose time is precious and consequently price is less of an issue.
Bluntly, Apple's target demographic make money on their machine by saving time.
Gamers, generally speaking, try to save money on buying a machine to spend time.
Silver5urfer - Monday, October 18, 2021 - link
"AMD advertises 26.8bn transistors for the Navi 21 GPU design at 520mm², Apple here has over double the transistors at a lower die size."Why not mention while AMD is at TSMC 7N and Apple is on 5nm ?
"In terms of performance, Apple is battling it out with the very best available in the market, comparing the performance of the M1 Max to that of a mobile GeForce RTX 3080, at 100W less power (60W vs 160W). Apple also includes a 100W TDP variant of the RTX 3080 for comparison, here, outperforming the Nvidia discrete GPU, while still using 40% less power."
Did you actually look at the MSI GE76 Raider? Its crippled RTX 3080 is garbage, delivering around 2080-level performance. Not Ampere level. And again, it's on Samsung's 10nm-class node (Samsung 8N) vs TSMC 5nm.
I guess none of this matters, because it's AnandTech and Apple coverage.
catinthefurnace - Monday, October 18, 2021 - link
Right, Andrei didn't specifically call out why Apple is leading in this area, so it's like they're not even leading? Apple is cheating by using better technology. It's not fair I tells ya!Silver5urfer - Monday, October 18, 2021 - link
About the CPU performance: why not even look at the processors used, lol? Just saying "massively" doesn't cut it, tbh. Those 11800H parts are crippled BGA i7 processors with a puny 48W TDP and a 70W short-duration power limit, locked at a 4.6GHz max turbo on top of that. x86 parts scale up with power, so on a BGA board they are literal trash. Unfortunately the market is not like that, and people love to buy them.
I hope when you review you will make everything clear, e.g. running CBR23 on, say, an 11900K or 12900K or 5950X with full processor package power consumption under a sustained load, vs the same for the M1X processors at stock and fully unlocked.
Also, a notch on a laptop? Insanely stupid. Finally, the price is $2000 for a fully soldered design with no consumer rights at all, a completely locked-down OS, and an ecosystem black box.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
As a consumer, I prefer my right to have a secure machine over your right to demand that my machine be vulnerable so that you, if it were yours, would be able to do whatever you wanted on it.
Oxford Guy - Tuesday, October 19, 2021 - link
Secure from being able to erase your files? Because that's what all-soldered means.
It means snoops can get your IP if they want it, even if you've somehow managed to evade the 'cloud'.
Blark64 - Wednesday, October 20, 2021 - link
Can you explain these assertions? I'm at a loss as to what you are referring to...
Oxford Guy - Sunday, October 24, 2021 - link
You can't remove the storage, the RAM, or the battery.
This means that you have to rely upon a closed-source operating system with various odd hardware bits (like the T2 chip) in the mix — for file deletion.
The APFS file system has an interesting way of handling file deletion and forensics people already documented shenanigans like placing deleted files (e.g. from the hidden FS_Events folder) into unallocated space for later retrieval, prior to APFS.
GeoffreyA - Sunday, October 24, 2021 - link
Does APFS do anything different when deleting files? Because most file systems don't actually delete anything, just the entries from their master tables. Concerning the fsevents log, even NTFS has something analogous: the USN Journal, introduced in Windows 2000.shelbystripes - Tuesday, October 19, 2021 - link
It’s being compared to mobile BGA Intel CPUs because it’s a laptop-class SoC replacing (and now competing with) mobile BGA Intel CPUs. This seems incredibly obvious.Apple isn’t selling desktops with unlimited power budget containing this silicon … yet. So comparing performance to Intel chips with unlimited power budget would be useless. Why compare what’s been designed for a laptop to a 280W CPU, when that 280W CPU will never show up in a laptop?
Spunjji - Tuesday, October 19, 2021 - link
"48W puny TDP at 70W short power duration"Do you understand what a mobile CPU is or not
Ppietra - Monday, October 18, 2021 - link
I truly don’t get your comment! The MSI GE76 Raider is one of the best gaming laptops performance wise.Ryan Smith - Monday, October 18, 2021 - link
"Why not mention while AMD is at TSMC 7N and Apple is on 5nm ?"Noted and clarified. The point was more about the transistor count than the die size, but it never hurts to add more details.
"Did you look at the MSI GE76 Raider actually ?"
Yes. We've even reviewed it. It's the most powerful gaming laptop we've seen so far, typically coming out well ahead of the next fastest laptop GPU, the Radeon RX 6800M.
https://www.anandtech.com/show/16928/the-msi-ge76-...
Silver5urfer - Monday, October 18, 2021 - link
Thank you for the corrections.
The MSI GE76 might be one of the most powerful ones, but the real king is the Clevo X170SM, and on a Z490 chipset with fully socketed hardware at that; it can run everything at max: an LGA CPU at fully unlocked power limits and a 2080 SUPER MXM GPU without any limits, and if purchased from a good reseller it gets a fully unlocked BIOS too. Unfortunately, the market being so small, they didn't refresh it with Ampere.
blargh4 - Monday, October 18, 2021 - link
A 10+lb "laptop" with a battery life of a couple hours and hair dryer fans is simply not Apple's competition here.catinthefurnace - Monday, October 18, 2021 - link
I'm struggling to understand the desire to compare the monstrosity that is the Clevo X170SM (over 10 lbs) with this MacBook Pro (4.8 lbs) as though they're in the same class.Lavkesh - Monday, October 18, 2021 - link
Ignore him. He is just a troll, and a sore loser of a troll at that.
catinthefurnace - Monday, October 18, 2021 - link
Agreed.
Silver5urfer - Monday, October 18, 2021 - link
Because of many things.
Price - $2000+. If anyone is looking for what people here are expecting, i.e. dethroning AMD and Intel in the PC space, everyone is going to look at what options they have.
Performance - The Clevo X170SM-G has a G-Sync ultrafast IPS display, and on top of that it has a big heatsink which can handle a 10900K at a sustained, constant power draw of 250W+ while maintaining an all-core 4.9GHz-5.0GHz clock speed. That's a 10C/20T processor with no BS power caps like the ones used here, where the 11800H is capped at 4.6GHz. Plus, many can simply buy a 10600K and then upgrade later, or even add an 11900K (this is hard to cool in that chassis, but it can still manage). The GPU is a 200W-capable MXM module as well, which is not going to throttle like most of the BGA junktops.
Next, this MBP: you guys think this is going to consume a 20W load, as Apple states, and give you 2080 (TU104) performance? And do that constantly, without throttling? Same for the CPU: magically this M1X can deliver full, no-throttle, no-C-state max performance at just 30W? The MBP has a 140W brick, and that would probably be capped at 96W-120W for this machine.
And on a 96Wh battery, you guys think this is going to constantly hold its high clock speed on a denser 57bn-transistor TSMC N5 die? No, it's not going to happen. The M1 already throws out its efficiency when put under high workloads, and this will repeat the same. Why do you think Apple is advertising the fans on this laptop, etc.?
To put it shortly: the X170SM has bad battery life vs this, true. This one has good efficiency, but in peak performance it is not going to compete at all. On top of that, for the GPU and CPU workloads, where did Apple perform the benchmarks? Don't say Final Cut Pro; ARM-based designs have dedicated blocks for encode and decode. We do not have any benchmark like Fire Strike or Unigine Superposition or any great AAA title to compare these. I'm not even sure what Apple is claiming here.
Finally, the user servicing. Everything is darn soldered on this. Ever seen a MacBook teardown? Go and watch; see how the keyboard and battery are glued to the chassis. Ever heard of Louis Rossmann? Search and see what Apple charges for a simple IC repair, instead throwing a mobo-swap cost at the user, and on top of that the storage is soldered. That's a big-time no. Paying over $2500 and having a hunk of junk in case of an issue is not at all acceptable.
Henry 3 Dogg - Tuesday, October 19, 2021 - link
"The MBP has a 140W brick"True. True but that doesn't necessarily mean that the machine ever draws more than 30w.
It might just mean that it charges from flat, very quickly.
photovirus - Sunday, October 24, 2021 - link
> that doesn't necessarily mean that the machine ever draws more than 30W.
From Apple's graphs: the GPU alone can eat up to 60W, and the CPU accounts for some 30-40W.
These machines can get quite hot, should you tap into their power.
Spunjji - Tuesday, October 19, 2021 - link
You remind me of the guys who remap the ECU on their car to get an extra 50bhp for the low low price of doubling their fuel consumption and getting 1/4 of the life out of the engine. You're welcome to it, just don't think that obsession with getting the last little bit of performance out of your system represents most users.Oxford Guy - Tuesday, October 19, 2021 - link
Right to repair is somehow equivalent to damaging a machine?Fascinating!
Ppietra - Tuesday, October 19, 2021 - link
"M1 already throws out it's efficiency when put on high workloads"As far as I know that is not the typical behaviour under high workloads. M1 CPU power consumption on most high demanding task overs around 20W, or even 15W.
ex2bot - Tuesday, October 19, 2021 - link
Have you heard of Louis Rossmann? Because he talks about the Apple products and how they are susceptible to dihydrogen monoxide infiltration and soldered on the motherboard, and I’d rather have a Clevo cause it’s heavy and I’ll get real jacked?
Carstenpxi - Monday, October 18, 2021 - link
It will be interesting to see the clock frequency of the CPU. Judging from the bump the A15 got, perhaps 10% more might be possible here? And how much will the massive caches contribute?
I've been following CPU chips since 1968, and between the M1 jump and now today's announcements, I don't recall ever seeing such a large step change. With all due respect to Intel's leadership role in the earlier years, the cynical comment would be that Apple has proven that Moore's law is not dead, but Intel is. Fortunately, there are enormous resources available there, which should keep competition healthy.
As we look to the coming year of Apple silicon, the logical next steps are the iMac and Mac Pro. We can then speculate whether, the year after, Apple plays the final card by applying its chip-efficiency prowess to server farms, where the green benefits would be huge. Whether a consumer-oriented company would do this or not is an interesting question, but the societal impact would be significant, so one could argue that they have a moral obligation to do it.
R_Type - Monday, October 18, 2021 - link
Yikes, there's a bunch that spring to mind!
286 to 386 to 486 to Pentium. The Pentium jump was huge! K6 to K7, P4 to Conroe, 'dozer to Zen. Hell, back in the day a CPU could be shrunk *twice* over its lifetime.
SystemsBuilder - Monday, October 18, 2021 - link
We should hold off a bit on judgment until we see real-life benchmarks come in from the various review sites. Having said that, if review-site consensus confirms Apple's claims, then this shift could be bigger than when Intel "Conroe'd" the CPU market, and I agree with Carstenpxi: it could very well be the biggest CPU leap. If it holds true, it shows the limits of the x86 incremental approach (as if we did not know that already) and what is possible with large financial muscles and the market position to get access to the most advanced process on the planet, prioritized by TSMC (through Apple's $$$) ahead of AMD and Nvidia (both of whom are a fraction of Apple's market cap and revenue). As a business and tech guy, you have to admire Apple's strategic play here... and this is just the beginning!
More fundamentally, taking a few steps back and looking at the complete picture:
this is a game of financial muscle, so you have to look at the numbers:
Apple market cap: revenue:
intel market cap: revenue: 220.94
SystemsBuilder - Monday, October 18, 2021 - link
Apple market cap: 2,394.23B, revenue: 274.5B (2020)
Intel market cap: 220.94B, revenue: 77.9B (2020)
In this game, market cap and free cashflow is your currency.
Farfolomew - Wednesday, October 20, 2021 - link
Comparing the market cap to revenue of Apple and Intel based on your numbers, that's quite impressive for Intel. Or it just shows how little perceived value the public assigns Intel compared to Apple. But when considering market cap for massive companies like Apple, I don't think you can really use it as an indicator of performance; it exists as a speculative measure for the big, big companies. Intel is not a big, big company, so comparing revenues and margins is a better metric. With the numbers you posted, Apple is *only* 4x bigger than Intel by revenue. But then, they can always sell a few more shares to raise a crap-ton more capital too!
Kvaern1 - Wednesday, October 20, 2021 - link
The Pentium jump was huge?
The first Pentiums were slower than the last 486s.
GeoffreyA - Thursday, October 21, 2021 - link
The Pentium's superscalar architecture raised performance drastically.andygrace - Friday, October 22, 2021 - link
I'll give you 8080 to 8086. That was a massive step up, but if you remember the 486DX4 at 100MHz or AMD's 133MHz equivalent, the Pentium 75MHz seemed a bit on the slow side. It did of course take huge strides forward with every new clock increase.
And if you've been following CPUs since 1968, you're doing better than me. Was that in DTL logic? That's pre Intel's 4004. It's even pre Texas Instruments' TTL-based 4-bit arithmetic logic unit - the 74181, which powered those old PDP-11 and VAX machines! That chip didn't get released in TTL until, I think, late 1969 or early 1970!!
gescom - Monday, October 18, 2021 - link
+1TanjB - Monday, October 18, 2021 - link
Arguably the key story is the use of LPDDR5. Intel's Tiger Lake Core i7-11800H may be 8 cores, but it has only 2 channels of DDR4-3200, so it is amazing it can even post 60% of the perf of the M1 Pro. With 25% of the memory throughput, it would be pouring power into speculation and other Core tricks to try to compensate.
DDRx memory is a trap for CPU designers. It has a purely unwarranted reputation as the ideal. Apple are way out in front because they realize that LPDDR actually has far higher performance potential at much lower energy per bit. All you need to do is focus on how to package it. Perhaps Intel gets it by now.
ikjadoon - Monday, October 18, 2021 - link
>so it is amazing it can even post 60% of the perf of the M1 Pro
To be fair, very few consumer CPU workloads are memory-bandwidth-bound: ideally, they don't even want to go to DRAM, right? Before today, the AMD EPYC HEDT behemoths, relatively speaking, had eight-channel DDR4 and there are some workstation benefits, so perhaps this is where the M1 Pro / Max are positioned.
https://www.anandtech.com/show/16478/64-cores-of-r...
But, gosh: what Apple's done with the LPDDR5 @ 400 GB/s is much more important for the GPU. I'm still hesitating to write it, "Wait, is that a typo? 400 GB/s?!"
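It isn't a typo; as a sanity check, here's the peak-bandwidth arithmetic, assuming (my assumption, not an Apple statement) LPDDR5-6400 behind the M1 Max's 512-bit interface:

```swift
// Peak bandwidth = bus width in bytes x transfer rate per pin.
let busWidthBits = 512.0
let transfersPerSec = 6.4e9            // LPDDR5-6400: 6400 MT/s per pin (assumed)
let bytesPerSec = (busWidthBits / 8.0) * transfersPerSec
print("Peak bandwidth ≈ \(bytesPerSec / 1e9) GB/s")   // ≈ 409.6 GB/s, i.e. the ~400 GB/s figure
```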
lmcd - Tuesday, October 19, 2021 - link
Apple's cache sizes are the real story -- they're absolutely massive, and extremely local at the expense of the most efficient per-transistor layouts (or their layout is genius). If Intel spent that much die space on caches, they'd go out of business just off of the cost per die of their sub-$400 segment.The only route to competing with this madness is with chiplets in laptops, and hope the interconnect doesn't hose idle power.
TanjB - Monday, October 18, 2021 - link
400 GBps memory on the Max makes it the equivalent of a latest generation gaming console, at least in principle.RedGreenBlue - Monday, October 18, 2021 - link
Moore’s Law is dead.Long live Moore’s Law!
ftlbaby - Wednesday, October 20, 2021 - link
LOL.This.
pSupaNova - Monday, October 18, 2021 - link
It'd be good to see the hashrate of these new offerings. Otherwise they look good, but for the price there are much better laptops with dual screens, etc.
boozed - Monday, October 18, 2021 - link
The CPU performance claims are probably reasonable, but GPUs aren't held back by legacy instruction sets in the same way that CPUs are. How is the GPU performance/watt claim even remotely realistic?
Tomatotech - Monday, October 18, 2021 - link
Wait for the reviews and benchmarks and we'll see.boozed - Tuesday, October 19, 2021 - link
Indeed. Historically, Apple's simplified publicity graphs have been... optimistic, to say the list.boozed - Tuesday, October 19, 2021 - link
Least. To say the least. My kingdom for a time limited edit function!Oxford Guy - Tuesday, October 19, 2021 - link
As long as it doesn't say 'edited' in the post. That's really annoying for someone who makes plenty of typos. It makes posts seem suspicious when, in reality, they're posts that someone took the time to clean up.caribbeanblue - Tuesday, October 19, 2021 - link
Nope, they've generally understated their performance.
bill44 - Monday, October 18, 2021 - link
Still no HDMI 2.1 port. No 4K/120Hz over HDMI for my HDMI 2.1 monitor. Do we at least get a UHS-II SD card reader? The rest is great.
zamroni - Monday, October 18, 2021 - link
The lightest is 1.6 kg, which is very heavy for a high-end 14" laptop.
PixyMisa - Monday, October 18, 2021 - link
That is pretty hefty. My 14.5" Dell is 1.25kg.OreoCookie - Tuesday, October 19, 2021 - link
Apple could have reduced weight by including a smaller battery. I, for one, am very glad they haven’t. I’m totally cool if the pro machine is heavier in exchange for battery life and power.If I want a light machine, I’ll get an Air. (Keep in mind that the current Air’s enclosure has not yet been designed for the much lower TDP of the M1, so I reckon a redesign will be a bit lighter.)
Dug - Monday, October 18, 2021 - link
If you consider the battery takes up most of the weight, you get that 17hr claimed battery life that others don'tSpunjji - Tuesday, October 19, 2021 - link
For the performance? No, not really. It's neither extremely light nor particularly heavy; you bottom out at around 1Kg for a low-performance 14" device and go up to 2Kg for something with GPU performance that will slightly exceed the M1 Pro.zodiacfml - Monday, October 18, 2021 - link
I hate you, Apple; you surpassed my expectations. 👏 I look at AMD's APU in the PS5/Xbox as the best innovation currently in the consumer PC space, but considering AMD's humble size, a tech company bigger than AMD can outdo it, and this is exactly that.
I've been preaching recently that AMD should immediately build Arm chips with huge AMD graphics, where the endgame is selling them to Apple since they lack graphics prowess; turns out this game plan is not a walk in the park.
Farfolomew - Wednesday, October 20, 2021 - link
My thoughts echo yours. The console chip design from AMD is something that needs to be replicated fast in the PC space. The era of discrete GPUs is very fast approaching the end, and the PC needs highly integrated SoCs with large memory bandwidths like Apple's to compete on the efficiency front.I wonder if Qualcomm's Nuvia team is going in this direction with their Apple-esque CPUs? It will be exciting next year to see how these turn out and to see how they're adopted into the Microsoft PC world.
blargh4 - Monday, October 18, 2021 - link
Not quite related to the SoC specifically, but does anyone know if Apple's ProMotion refreshes on demand, a la G-Sync, or does it change between fixed refresh rates on the fly?
mikegrok - Tuesday, October 19, 2021 - link
It probably refreshes either when the next frame is ready, or every 1/10 of a second, whichever comes first.
abufrejoval - Monday, October 18, 2021 - link
So they are going all-in on soldered die-carrier memory -- no expandability. That, together with the large pools of last-level SLC, gives them GDDRx/HBM-like bandwidth with LPDDRx-like latencies (as well as huge DRAM-related power savings), which is great as long as your CPU/GPU workload demands lie right on that linear line of CPU/GPU-core/RAM capacities for their 1x/2x/4x configurations.
The M1x variants basically become appliances in the four basic sizes (I guess some intermediate binning-related survivors will round out the offer), something I've imagined for some time via a PCIe or Infinity Fabric backend using AMD "APU-type" CCDs, HBM/GDDRx and IODs.
What they sacrifice is the last vestiges of what had me buy an Apple ][ (clone) and let me switch to the PC afterwards: the ability to make it a "personal computer" by adding parts and capabilities where I wanted them throughout its life-cycle (slots!).
I can see how that wouldn't matter to most, because they can fit their needs into these standard sizes, especially since they may be quite reasonable for mainstream (my own systems tend to have 2x-4x the RAM).
Of course it would be nice if there still was some way to hang a CXL, IF or PCIe off the larger chips, but Apple will just point out that this type of compromise would cost silicon real-estate they prefer to put into performance and interest only a few.
Of course they could still come out with server variants sans GPUs (or far reduced parts) that in fact do offer some type of scale-up for RAM and workstation expandability. But somehow I believe I get the message, that their goal is to occupy that productivity niche and leave everything else to "niche" vendors, which now includes x86.
Well executed, Apple!
And believe me, that doesn't come easy to someone whose professional career has been x86 since 1984.
I still don't see myself buying anything Apple, but that's because I am an IT professional who has been building his infrastructure tailor-made to his needs for decades, not a "user".
I'd get myself one for curiosity's sake (just like I got myself a Raspberry Pi as a toy), but at the prices I am expecting for these, curiosity will stop short of getting one that might actually be usable for something interesting (the M1 Max), when I get paid for doing things with CUDA on Linux.
Getting enough machine-learning training power into a battery-operated notebook is still much further away than electrical power anywhere I sit down to work. Just like with "phones", I barely use the computational power or battery capacity of the notebooks I own. My current Ryzen 5800U is total overkill, while I'd happily put 64GB of RAM in it (but it's 16GB soldered). So if I actually do want to run a couple of VMs at a resort, I'll have to pack the other, slightly heftier one (64GB, and it will do CUDA, but not for long on battery).
I can probably buy two or three more, add 8TB NVMe and double RAM on each and still have money left vs. what Apple will charge.
Yes, they won't have as much power per Watt taken from the battery, but that does not matter to me... enough to get my Raspberry a fruity compagnon ;-)
name99 - Monday, October 18, 2021 - link
OK, so now that we've all got it out of our systems:
- cost too much
- suck compared to team Intel/AMD/nVidia
- don't include <weird specialist feature I insist every computer on earth has to include>
let's try to return to technology.
Note the blocks (in red) at the very bottom of the M1 Max. On the left-most side we have a block that is mirrored higher up, above SLC and just to the left of the 8 CPUs. Next we have a block that is mirrored higher up above SLC, to the right of the 8 CPUs.
Apple tell us that with Max we get 2x Pro Res Encoders and Decoders. Presumably those blocks; one minor question of interest is whether those blocks are *only* ProRes or are essentially generic encoders and decoders; ie you may get double the generic media encode/decode on Max, which may be useful for videographers beyond just Pro Res users?
It certainly also looks like the NPU was doubled. Did I miss that in the event? I don't recall Apple saying as such. (It also looks like the NPU -- or something NPU-relevant -- extends beyond the border drawn by Andrei, when you compare the blocks in the two locations.)
Finally we get the stuff at the right of the Max bottom edge, which replicates the area in blue above the NPU. Any suggestions? Is that more NPU support hardware (??? it's larger than what Andrei draws as the NPU). Lots of SRAMs -- just by eye, comparing it to the P cluster L2, it could be 16MB or so of cache.
So this all suggests that
(a) with the Max you also get doubled NPU resources (presumably to search through more video streams for whatever NPU's search for -- faces, body poses, cats, etc)
(b) the NPU comes with a fairly hefty secondary cache (unless you can think of something else that those blocks represent). Looking at the M1 die, you can find blocks that look kinda similar near the NPU, but nothing that's a great match to my eyes. So is this an additional M1X change, that it comes with the same baseline NPU as the M1/A14, but augmented with a substantial NPU-specific "L2" (which might be specialized for training or holding weights or whatever it is that people want from NPUs)?
abufrejoval - Monday, October 18, 2021 - link
Well, I love your speculation, but on the Apple shop page, the SoC configuration makes no difference to the core count of the Neural Engine; it remains at 16 "cores" for all three variants, M1/Pro/Max.
You may argue unique differentiation for the M1 SoC and how they do RAM with it, but SSD storage is just a commodity. And all their cleverness about using DRAM to produce GDDR5-class bandwidth leaves a bad taste when they sell it at HBM prices.
Ursury around here starts at 20% above market price and Apple is at 200% for SSD and RAM.
After the minimally interesting config got me beyond €6000, my curiosity died.
abufrejoval - Monday, October 18, 2021 - link
Usury (no bear included, just greed)--net edit!Tomatotech - Monday, October 18, 2021 - link
Apple used to charge absolute rip-off prices for bog-standard SODIMMs in their models. By comparison, this new pricing of $400 for a 32GB RAM upgrade to 64GB is actually not too bad.
This is NOT DDR4 RAM. This is LPDDR5 RAM, and this is the first consumer laptop / desktop in the world to run LPDDR5. You're paying for that first-adopter advantage. (There are some very recent phones that run LPDDR5, but I think they max out at 12GB and use a slower variant.)
Mid-range DDR4 for desktops seems to run at about $5/GB for 2 x 16GB. But go up to 2 x 32GB, and suddenly it's around $10/GB especially on the high end, so you're looking at around $320 for 32GB of fast high-end DDR4.
The M1 Max runs extremely fast extremely specialist RAM that is a generation faster than DDR4 / LPDDR4, and the 64GB is concentrated down into only 4 on-chip RAM modules.
Getting that at only $12.50/GB for the extra 32GB is a bit of a bargain at this point in time.
(I previously said this was DDR5 RAM, I was wrong. As for storage, yes $400/TB is stupidly steep and shouldn't cost so much extra even for fast pcie 4.0 storage. That's more or less a commodity by now.)
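For what it's worth, the per-gigabyte arithmetic behind those figures; the street prices are the rough ballpark quoted above, not current quotes:

```swift
// Rough $/GB comparison using the approximate prices quoted above.
let midRangeDDR4 = 160.0 / 32.0   // 2 x 16 GB kit at ~$160: ~$5/GB
let highEndDDR4  = 320.0 / 32.0   // fast 32 GB on the 2 x 32 GB tier: ~$10/GB
let appleUpgrade = 400.0 / 32.0   // 32 GB -> 64 GB build-to-order option: $12.50/GB
print(midRangeDDR4, highEndDDR4, appleUpgrade)   // 5.0 10.0 12.5
```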
lmcd - Tuesday, October 19, 2021 - link
Remember there's probably like 1 or 2 NAND options they've rated their SoC-internal controller for, which means they've probably had to select for top binning on both power and performance due to sharing architecture with the M1 iPad Pro.Which, honestly, is less of a sacrifice than I expected from Apple's ARM transition. This whole thing has been disturbingly smooth even considering the software incompatibility lumps.
abufrejoval - Tuesday, October 19, 2021 - link
Thanks for the lecture, but I actually already admired their creative, super-wide four-channel interface for the M1 Max, which unfortunately sacrifices any external expandability.
Still, while it's a special part and needs to be managed with a complex die carrier and assembly, in overall cost it's relatively normal technology and thus normal cost; everything is high-volume items.
So they make their typical >200% margin also on the DRAM.
dc443 - Thursday, October 21, 2021 - link
One charges a premium for a halo product. That is simply what one does; it's a very simple economic calculation. I expected the upcharge to go to 64GB to be north of $1k; it was not, it was $400.
Bambel - Tuesday, October 19, 2021 - link
> Note the blocks (in red) at the very bottom of the M1 Max. On the left-most side we have a block that is mirrored higher up, above SLC and just to the left of the 8 CPUs. Next we have a block that is mirrored higher up above SLC, to the right of the 8 CPUs.
My guess is that these blocks are for redundancy. Apple already does a lot of binning, and having some IP blocks as spares should further increase yield. Note the four identical blocks in the upper right corner? The M1 has two of them, and I think these are four TB controllers, of which only three are used.
The fact that the Pro looks pretty much like the upper part of the Max makes me wonder whether they only manufacture Max dies and then cut off the lower part if it's going to be a Pro. It's a tradeoff between two manufacturing runs versus only one run with, admittedly, lots of wasted silicon. I guess we will see actual die shots at the end of next week, and it should be visible whether there are some "residuals" from a larger die on the Pro die.
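On the redundancy/binning point, a minimal sketch of why spare IP blocks help: with a simple Poisson defect model, a die this large has a sizeable chance of at least one defect, so being able to fuse off a duplicated block rescues parts that would otherwise be scrapped. The die area and defect density below are assumed, illustrative numbers, not TSMC or Apple figures.

```swift
import Foundation

// Simple Poisson yield model: P(zero defects) = exp(-defectDensity * area).
let defectDensity = 0.1          // defects per cm² (assumed, illustrative)
let dieAreaCm2    = 4.3          // a ~430 mm²-class die (assumed, illustrative)
let perfectDieYield = exp(-defectDensity * dieAreaCm2)
print("Defect-free dice: \(Int(perfectDieYield * 100))%")   // ~65%
// The other ~35% aren't all lost: a defect landing in a duplicated GPU core,
// TB controller, or media block can be fused off and the die sold as a
// cut-down bin, which is exactly what spare blocks enable.
```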
Harry_Wild - Monday, October 18, 2021 - link
Apple’s M1 Max is alien technology for a laptop! Wow! 400GB/second throughput! Best fiber optic throughput is 10GB/second! Standard is up to 1 GB/second! Amazing!
Kevin G - Monday, October 18, 2021 - link
400 GByte/s of bandwidth in a laptop is impressive, but you're clearly underestimating where fiber networking currently is. 800 Gbit Ethernet is a thing, which is roughly 100 GByte/s of bandwidth at the high end. 1 Tbit and 1.6 Tbit Ethernet are in the draft stages. 2.5, 3.2 and 5.0 Tbit speeds are in proposal, though they'll likely need silicon photonics to scale to such speeds.
400 GByte/s is a lot of bandwidth, but it only has to move a few centimeters, whereas those fiber specifications are able to move data at those rates over ranges of tens of kilometers.
Nick B - Tuesday, October 19, 2021 - link
Please note the memory bandwidth is measured in GB/s, not Gbps or Gbit/s.
400GB/s converted = 3,200 Gbit/s, or 3.2 Tbit/s.
lmcd - Monday, October 18, 2021 - link
This goes to show what you can do when you don't have to care about a product priced for ordinary people.Apple's mobile die sizes have already been massive compared to competing mobile SoCs. It's not shocking that they made the same tradeoff for their laptop SoCs. It's just sad to think how many chiplet-type designs could be harvested from the same silicon wafers.
Nick B - Tuesday, October 19, 2021 - link
A15 die size – 8.58mm x 12.55mm = 107.68 mm²
Can you share the dimensions of the Snapdragon 888 or Exynos 2100?
I'd like to see the info you're basing your opinions on.
Karaqx - Monday, October 18, 2021 - link
Apple just announced the fastest Windows laptop: the M1 Max variant running Windows 11 in a VM 😂😂
Teckk - Monday, October 18, 2021 - link
Microsoft had better be working on a secret SoC of its own, as relying on Intel is not getting it much. It needs a major boost to the Surface line with a bigger performance jump and better battery life.
lmcd - Tuesday, October 19, 2021 - link
No one can charge the luxury prices that Apple gets away with. That means that there's no way to subsidize such a massive chip.
This is the same problem that Intel has, let alone AMD and Nvidia. MS? Wouldn't sell nearly the volume needed.
Ppietra - Tuesday, October 19, 2021 - link
Also Apple has the A series chips that help spread some of the development costs.andynormancx - Thursday, October 21, 2021 - link
It is kind of hard to compare now that Apple has gone ARM, but Microsoft's Surface laptop pricing (in the UK at least) has been pretty comparable to Apple's x86 laptop pricing.Teckk - Monday, October 18, 2021 - link
So the frequency war is over eh? No mention of frequency by Apple.photovirus - Sunday, October 24, 2021 - link
I don't think Apple ever published frequencies for their own chips, only Intel ones.eastcoast_pete - Tuesday, October 19, 2021 - link
One thing I really like here (as a non-Mac user) is that Apple is finally breaking this nonsensical barrier to wider, higher-throughput memory buses for CPUs and APUs wide open. For whatever reason, we x86 users have been told that there is no good reason or no benefit to such wide RAM access. Well, maybe now AMD and Intel will reconsider. Memory bandwidths of 200 - 400 GB/s using working RAM (yes, it's DDR5, but so what) is something to aspire to!andrewaggb - Tuesday, October 19, 2021 - link
AMD's been selling APU's with high memory bandwidth and graphics performance for years in the current and previous gen consoles. Maybe they'll finally start selling them into the consumer PC market.Oxford Guy - Tuesday, October 19, 2021 - link
'Maybe they'll finally start selling them into the consumer PC market.'Maybe with competition. Unlikely without. AMD is competing against the PC gaming market with its 'consoles'.
Spunjji - Tuesday, October 19, 2021 - link
It's all about economics. Perhaps this will open a pathway for AMD and Intel to pursue larger APUs, but for the time being the cost vs. performance trade-offs haven't made sense.Xajel - Tuesday, October 19, 2021 - link
If a PC company does soldered RAM the way Apple did with the M1 Pro & Max, then I'm finally okay-ish with soldered RAM.
I'm still losing the flexibility and the affordability of self-upgrading, but man... seeing this compared to 8GB & 16GB soldered is an Earth/sky difference, especially with the memory bandwidth. I'm not complaining here, period.
Oxford Guy - Tuesday, October 19, 2021 - link
RAM upgrades have always been the most important thing for extending the useful lifespan of laptops, except in the rarer cases where someone gets far more than they feel they need when they buy the machine (something much more common now that 32 GB has become a lot more affordable).
It should always be assumed that soldered RAM in laptops is about making planned obsolescence faster, more than anything else.
Is that extra bandwidth such a great advantage, once the amount of RAM is no longer enough to prevent slowdowns?
640K wasn't enough for anyone for long.
Oxford Guy - Tuesday, October 19, 2021 - link
Case in point... Apple sold MacBook Pro machines with 16 GB of RAM. I have a 15" 2013 model with that much. It will shortly become 'deprecated' — no longer able to function with Apple's level of security on the Internet. I know a number of people with older 16 GB Macs that haven't been able to be secure for a long time now, machines that are 100% adequate for their needs in all other respects (particularly given the fact that they have SSDs). Meanwhile, the company has been introducing new machines that are capped at 8 GB.
It's insanity for consumers on a parade float of 'wicked fast' banality. Yes, yes... very fast. Very quick to end up in the landfill because of inadequate RAM.
dc443 - Thursday, October 21, 2021 - link
What does security have to do with system memory size?Oxford Guy - Sunday, October 24, 2021 - link
'I know a number of people with older 16 GB Macs that haven't been able to be secure for a long time now, machines that are 100% adequate for their needs in all other respects (particularly given the fact that they have SSDs). Meanwhile, the company has been introducing new machines that are capped at 8 GB.'web2dot0 - Wednesday, October 20, 2021 - link
99% of people don't ever upgrade their laptop ... Those are just facts. Most laptops become slow/obsolete in 5-6 years.
My MBP15.4 from 2012 w/16GB still works fine. It's getting slow now, but never had memory problems for general tasks.
At some point, more RAM doesn't do anything. You just need more compute. If you need 32GB of RAM, chances are that laptop you bought in 2012-2015 isn't fast enough no matter how much RAM you put in it.
Oxford Guy - Sunday, October 24, 2021 - link
'My MBP15.4 from 2012 w/16GB still works fine. It's getting slow now, but never had memory problems for general tasks.'
I guess you missed this:
'Meanwhile, the company has been introducing new machines that are capped at 8 GB.'
palladium - Tuesday, October 19, 2021 - link
512-bit LPDDR5 can't be cheap, power- or cost-wise. I'll wait for the benchmarks.
web2dot0 - Tuesday, October 19, 2021 - link
Power will be amazing for the performance. Industry leading. If history serves, the benchmarks will reflect that.
Every time people questioned Apple about their claims about the M1, they ended up outperforming their claims.
Those are just facts.
Pacinamac - Tuesday, October 19, 2021 - link
In my over 30 years of computing, I've never once used a Mac or Apple product. As a filmmaker, I am starting to think it may be time to bite the bullet and learn.
Nothing in the PC market will be able to touch this from a video editing standpoint. I just wish it could play AAA games.
Bakaburg1 - Tuesday, October 19, 2021 - link
Do you expect performance differences in CPU-only tasks between the Max and the Pro?
AntonErtl - Tuesday, October 19, 2021 - link
An interesting announcement. Very high memory bandwidth (especially for the M1 Max), probably for the benefit of the GPU.
The transistor density of the Max is more than twice that of Navi 21, despite Navi 21 having more (dense) cache; they must have made their logic more than twice as dense as AMD's, which is more than I would expect from the process advantage.
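For reference, that density comparison works out as follows; a rough back-of-the-envelope sketch, assuming the commonly reported figures (Apple's stated 57 billion transistors on an estimated ~432 mm² die for the M1 Max; AMD's stated 26.8 billion transistors on 520 mm² for Navi 21). The M1 Max die area is an estimate, not an Apple-published number:

```python
# Back-of-the-envelope transistor density comparison.
# The M1 Max die area is an estimate; transistor counts are vendor-stated.
m1_max_transistors = 57e9     # Apple's stated count for the M1 Max
m1_max_area_mm2 = 432         # estimated die size, not official
navi21_transistors = 26.8e9   # AMD's stated count for Navi 21
navi21_area_mm2 = 520         # AMD's stated die size

m1_density = m1_max_transistors / m1_max_area_mm2      # ~132 MTr/mm^2
navi_density = navi21_transistors / navi21_area_mm2    # ~52 MTr/mm^2

print(f"M1 Max:  {m1_density / 1e6:.0f} MTr/mm^2")
print(f"Navi 21: {navi_density / 1e6:.0f} MTr/mm^2")
print(f"Ratio:   {m1_density / navi_density:.1f}x")    # roughly 2.5x
```

Even with generous error bars on the die-area estimate, the ratio stays well above 2x, which is the point being made above.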
I wonder about the 4 RAM packages, each of which contains 16GB (128Gb); AFAIK RAM dies nowadays have at most 16Gb; so the packages contain 8 dies; stacked or in a flat layout?
Why is Apple doing such a big GPU? Is this important for their existing markets, or do they want to get into new markets? Maybe a game console?
Overall:
The good: Very impressive efficiency.
The bad: Non-expandable.
The ugly: Apple.
Rudde - Tuesday, October 19, 2021 - link
LPDDR5 uses 16-bit interfaces. The M1 Max has a 512-bit memory interface and 4 memory modules, so each memory module would have a 128-bit interface. Combine 8 dies of 16 Gb and you get a memory module with a 128-bit interface and 16 GB of memory. Four of those and you have an M1 Max with a 512-bit memory interface and 64 GB of RAM.
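Written out explicitly, that arithmetic looks like this; a minimal sketch in Python, assuming 16-bit LPDDR5 channels, 16 Gb dies, and the 6400 MT/s data rate quoted for these chips:

```python
# LPDDR5 configuration arithmetic for the M1 Max, following the breakdown above.
channel_width_bits = 16      # LPDDR5 channel width
total_bus_bits = 512         # M1 Max memory interface
packages = 4                 # memory packages on the SoC substrate
dies_per_package = 8
die_capacity_gbit = 16       # largest common LPDDR5 die capacity
data_rate_mtps = 6400        # LPDDR5-6400

channels = total_bus_bits // channel_width_bits                     # 32 channels
bits_per_package = total_bus_bits // packages                       # 128 bits per package
capacity_gb = packages * dies_per_package * die_capacity_gbit / 8   # 64 GB
bandwidth_gbps = total_bus_bits / 8 * data_rate_mtps / 1000         # ~409.6 GB/s

print(channels, bits_per_package, capacity_gb, bandwidth_gbps)
# -> 32 128 64.0 409.6
```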
web2dot0 - Wednesday, October 20, 2021 - link
Overall:
Me: Sounds like Apple is innovating.
You: Sounds like a hater.
Just being Apple doesn't make it "ugly". They are just as corporate as AMD/Intel/Nvidia.
Apple is not some evildoer.
Raqia - Tuesday, October 19, 2021 - link
Hey Andrei, in the M1 Max die shot, aren't there two of the structures homologous to what you're calling the NPU in the M1 Pro? There's one in the portion copied from the M1 Pro and one in the lower left corner. There's also something that looks like an SLC block next to each as well.
Rudde - Tuesday, October 19, 2021 - link
The bottom of the M1 Max contains duplicates of at least 4 IPs. The only IP Apple has claimed to be doubled (excluding the GPU itself) is the ProRes encoder and decoder. I drew some boxes around the IPs I found: https://imgur.com/a/u1onxdI. The blue box is the Neural Engine (compare with the annotated die shot of the M1: https://images.anandtech.com/doci/16252/M1.png).
Spunjji - Tuesday, October 19, 2021 - link
My best guess: It's for yield. They need all of their dies to have those components functional, and that area between the memory interfaces would otherwise not be doing much.
zony249 - Tuesday, October 19, 2021 - link
Don't know if anyone noticed, but in the M1 Max, at the bottom of the SoC, there is an additional 16-core Neural Engine. It must be disabled, but it's interesting they just copied and pasted it onto the other half.
The Hardcard - Tuesday, October 19, 2021 - link
It looks like there are several spare parts down there. Cheaper to have them on every die than to be throwing away usable CPU/GPUs because of some defect in some random IP block. Given the size of the die, a lot of that area might have ended up being scrap silicon anyway.
The Hardcard - Tuesday, October 19, 2021 - link
Actually, the memory controllers take the die that far anyway, so they might as well use it for some backup blocks.
OreoCookie - Wednesday, October 20, 2021 - link
Are these die shots real? AFAIK Apple has a habit of showing conceptualized die shots to prevent competitors from seeing the layout (which they can easily do afterwards anyway, but I digress). I'd be surprised if they actually doubled the neural engine, but just kept one active. Wouldn't it be more efficient to include a few more “units”/“cores” than they needed instead?
Jasonovich - Tuesday, October 19, 2021 - link
I don't put much credence in Apple marketing hype; you really, really can't make comparisons with x86 unless there's parity in the software and setup.
It's still a phone CPU, but marketing is a wonderful thing: you can turn a pig's ear into silk. In the land of warped reality, where the sky is the colour of the rainbow and flying unicorns nest in the treetops, anything is possible; it's about perception, not facts.
In terms of energy efficiency, kudos to Apple: the M1, or whatever incarnation, sets the benchmark. I'm not sure if the upcoming RISC-V processors will improve the power envelope further. The herd will always follow the fruity cult, but I'd rather piss over their garden wall than be part of the grass grazers.
Carstenpxi - Tuesday, October 19, 2021 - link
You are really good at poetry!
The Hardcard - Tuesday, October 19, 2021 - link
How is a 400 mm²-plus die, with multiple cores each of which can run multiple workloads at the same speed if not faster than workstation CPU cores from Intel and AMD, a phone CPU? It can’t fit in a phone. It can’t be powered by a phone PCB.
Math is math and computations are computations. All these cores do the same computations, so I would define a phone CPU by its size and power envelope. But your definition somehow includes a CPU that has 3 times more transistors than the new IBM Power 10, is going to draw 30W or more for light loads, and will pull more than 100W on heavy loads.
I would love to hear your view as to how it could be put in a realistic normal phone.
ex2bot - Tuesday, October 19, 2021 - link
THAT would be quite a phone, tho, wouldn’t it? Sure, it will need a 140 W power supply and will burn up in your hand, but wow! Fast.
ex2bot - Tuesday, October 19, 2021 - link
I’m with you. Upcoming processors from AMD et al in the coming years are going to make these chips look underpowered. That’s what I’ve always said about Apple's unicorn blue-skies philosophy: your current technology will be absolutely no match for your competitors' products from, say, five or six years from now. Take your apple-shaped ball and go home.
scottrichardson - Tuesday, October 19, 2021 - link
It is most people's hope that future technology outperforms current technology, otherwise what's the point? Dare I suggest that Apple's own future CPUs will outperform the M1 generation too? I feel the pain of Windows/PC fans... I really do, but your time to shine will come soon enough once AMD and Intel get their new tech out into the wild. Looks like Intel are working on some big/little tech.
Kvaern1 - Wednesday, October 20, 2021 - link
Your sarcasm detector is malfunctioning.
web2dot0 - Wednesday, October 20, 2021 - link
You seem to think M1 is the end of the road for Apple ... what has Apple demonstrated since the M1 came out last year? They deliver on their promises.
Intel? Not so much.
AMD? No response to Apple.
Nvidia? Still needing 300W to power their graphics chips.
Guarantee you the Mac Pro next year will have 4x the compute and 4-8x the GPU of the 32-core M1 Max, and yeah, over 1.6TB/s of memory bandwidth and unified memory (imagine all that graphics memory).
Surely you will be curious too, no?
Justiniana - Wednesday, October 20, 2021 - link
It’s amazing how hypothetical future tech always beats current, existing Apple tech. This argument doesn't make any more sense now than it ever did.
Spunjji - Tuesday, October 19, 2021 - link
I'm genuinely stunned by what they're offering here - and the claims about performance per watt. Between the refined N5 process and their profligate use of transistors to hit their performance targets, I don't doubt the sincerity of their claims. I just wish I could get one of these in a system that runs the software I use!
Raqia - Tuesday, October 19, 2021 - link
It's likely close to the best CPU industry-wide (certainly best for efficiency), and the rest is comparable to a good console SoC but better tuned for power efficiency (use of LPDDR5). It's about time that the game console's superior, fully integrated architectural style made it to personal computers; it had been stalled by the grip of Intel, whose lack of its own graphics IP, and whose margins from selling an additional chipset at an older node, kept the antiquated desktop and laptop buses alive. There are some benefits to modularity in terms of upgradeability and just plain fun, but the vast majority of consumers never crack open their case once.
Farfolomew - Wednesday, October 20, 2021 - link
Amen! The discrete GPU shall be remembered, but not missed! It's finally time to set sail on from that ancient tech. (Add-in cards in the 2020s?! We had add-in cards way back in the 80s and 90s. Sound cards seem like such a preposterous thing to have in a modern computer, and so should GFX cards, at least in laptops.)
scottrichardson - Tuesday, October 19, 2021 - link
What software do you use?
Farfolomew - Wednesday, October 20, 2021 - link
We may yet be getting these soon in the form of Qualcomm chips. Whether they can run the software we love to use, well, that will be up to Microsoft. If Apple's chips are fast enough to translate x86, can Qualcomm's do it as well? That's my hope. More and more, I see the future of Microsoft Windows living in the ARM world with x86 translation support. And then I see the personal computing landscape somewhat mirroring that of iOS/Android, but with a more organized front (Windows) competing against Apple's offering.
KPOM - Thursday, October 21, 2021 - link
Agreed. Qualcomm, Microsoft et al should work on Rosetta-like solutions so we can move on from aging architectures without losing software compatibility.
andynormancx - Thursday, October 21, 2021 - link
Microsoft already have a Rosetta-like solution for their ARM version of Windows.
It seems to be a bit of a work-in-progress so far: https://www.techrepublic.com/article/windows-on-ar...
Hard to know whether it performs as well as Rosetta or not though, given that it is typically running on much less performant, non-Apple ARM SoCs.
GeoffreyA - Thursday, October 21, 2021 - link
In fact, the WOW subsystem, which provides this support, at least for 16-bit x86 applications, predates Rosetta.
GeoffreyA - Tuesday, October 19, 2021 - link
It appears that folk are putting down this SoC because of an aversion to Apple, and won't acknowledge, in the spirit of good sportsmanship, that this is an impressive piece of tech. They just can't admit this silly company from Cupertino coming close to, equalling, or beating the fellows they're fond of. As a result, we've got the remarks criticising price, soldering, MacOS, lack of AV1, as a way to say, "This thing is rubbish. Don't bother."
We're all supposed to be lovers of computers here, and this article is discussing an SoC. What does price, or even soldering, have to do with that? We can't own a Space Shuttle, but isn't it nice to discuss it and say, "Certainly, I'm no fan of NASA, but this is pretty good stuff."
My personal favourite is the AV1 attack, a tactic used to throw stones at this giant. Surely anyone who's doing editing will not encode to AV1 in the middle stages? AV1, if used, will be saved for the final step, using software encoding and libaom. Decoding would've been nice; but Apple could add it easily, and the sort done in software isn't that bad.
I can't stomach Apple, their status connotation, or their products. Quite frankly, they put me off. But that won't stop me from admitting the merit of these SoCs. And Apple worshippers, the same applies to you, when you're putting down Windows, Intel, and AMD as if they're from the bin. It's that spirit of superiority which grates on x86/Windows people, causing them to make fun of Apple. Nobody likes it when someone acts as if they're better than everybody else.
ddps - Tuesday, October 19, 2021 - link
@GeoffreyA - great post. One thing I always like to say, though, is we should never love something that can't love us back! Cheers!
GeoffreyA - Wednesday, October 20, 2021 - link
Absolutely, ddps. Great saying!
Oxford Guy - Tuesday, October 19, 2021 - link
Corporate fanboyism is just as biased.
Your space shuttle example is flawed, at least in terms of the long history made prior to for-profit space travel. Shuttles were about enriching humanity more than lining the pockets of yacht buyers. Yes, some of it was nationalism, which was about the latter. But the main idea, at least on the surface, was not profiteering.
These companies have too much money and make terrible decisions, like the shattering panel on the 13" M1 and using security patches to fill landfills.
We may love tech but we also love getting a good deal. Apple could have made another chip that's even bigger. It left some die area on the table. With corporations, as profiteering comes first, 'just enough' is the goal — not 'let's max out the possibility'. A bigger chip would have meant a lower price for this 'Max'. Instead of using its advantage to push humanity forward, it's content to do 'just enough' to maximize margin.
While it can be argued that that's the best strategy for keeping a corporation alive, putting it into the position where it can create innovative products, 'sell less for more' is the overarching mantra of the corporation. That relies upon marketing, which is about inculcating delusion.
It's also a fact that all that money is used to keep innovation down. IP serfdom. It's great for the wealthy powerful few. It's a huge impediment for creative invention for those who aren't. Look at how long copyright lasts now. It's a vampiric parasitic neo-serfdom apparatus — like the corporation.
GeoffreyA - Wednesday, October 20, 2021 - link
I agree, Oxford Guy, with all my heart. These rotters are only out to make money, and will say anything to fill their coffers. Sustainability and yoga in vogue today? Well, play into that and your product will sell. "Here at Pineapple, we care about sustainability, and that's why we're using ethically-sourced green materials." Add a bit of emotional music and it's a hit.
I'd go so far as to say that today's tech companies---Facebook, Google, Apple, and co.---are wielding a species of soft totalitarianism, wrapped up in marketing that plays into consumers' desires and vanities. Sprinkle with "empower" here and "empower" there, and you've got them. You're not buying a phone: you're buying empowerment and liberation. Nor is one buying a chemical concoction called make-up that probably ruins the skin, but a ticket to youth, attractiveness, and success in courtship. Further tips: add the idea that everyone's doing it ("try the app that's been downloaded by most Americans"), lip-service to choice, and correct alignment with current politics.
web2dot0 - Wednesday, October 20, 2021 - link
The reality is the PC industry hasn't innovated for over a decade. All they've done is add more fans, coolers and more optimizations, when we should be following Moore's Law.
Apple comes along and redefines the industry with their Apple Silicon, which is clearly YEARS ahead of the competition. No credible person would think that Apple isn't gonna keep 2x-ing their products for the next few years. Apple is already designing the M4 for all we know. They are just flexing their muscles in small chunks at a time.
Yet, PC folks continue to ridicule Apple as a "piece of junk". It's embarrassing for people who call themselves computer enthusiasts. Tech is tech. It's not a religion.
Apple has their shortcomings (getting rid of ports, excessive thinness to their laptops at the expense of performance, butterfly keyboard, etc ...), but no PC fanboy wants to admit that Apple does produce quality products compared to their competition.
Apple fanboys want "acknowledgment", while PC fanboys go to great lengths to deny them and continue to ridicule them. No Apple fanboy is gonna just take that lying down. It's a vicious cycle.
If PC fanboys just admit that Apple makes quality products, I'm 100% certain Apple fanboys will also admit that choices are GOOD.
Some people like a souped-up Honda Civic, while others like their BMW maintained under the factory warranty. To each their own. It doesn't mean all BMWs are trash and Civics are infinitely better and cheaper.
GeoffreyA - Wednesday, October 20, 2021 - link
I agree that if people could just admit a competitor is good, when good, all would be well. It's hard, I know, but it has a medicinal effect on the mind, almost as if a burden were lifted off one's chest. Such is truth.
I don't agree that the PC space hasn't innovated. How about Sandy Bridge and Zen? Even Bulldozer, despite being a disaster. If Zen's turning the tables on Intel and raising IPC ~15% each year isn't astounding, I give it up. And as far as I remember, Renoir wasn't that far behind the M1---and that's with the handicap of x86's decoding overhead, among other things (5 vs. 7 nm). I'm confident that if AMD built an ARM CPU, after a couple of iterations, if not on the first, they'd match or surpass Apple. And I even doubt whether ARM's all it's cracked up to be. If x86 must go down, let's hope the industry chooses RISC-V.
While excellent and worthy of applause, the M1 is hardly years ahead of the competition. Where does it stand against Zen 3? Is it really as big a difference as the story's being painted? Once more, the search for truth. The ultimate test, to see who's best at design, would be to let Apple craft an x86 CPU or AMD an ARM one.
Farfolomew - Wednesday, October 20, 2021 - link
I think it is, in terms of packaging and efficiency. Outright performance maybe not, but the fact that it makes *no compromises* in its beating of anything the PC space can offer is the major news here. There are no negatives about this chip. It's better in just about everything, and in major ways such as efficiency and parallelism.
If anything, this should be lauded by the PC community. This SHOULD give a kick in the proverbial butt to the likes of Intel/AMD/Qualcomm/Nvidia to change their thinking in CPU design, to get back on track with Moore's Law. I'm excited to see how the PC industry reacts to this.
Will it gain back the performance lead at some point, or will it forever be stuck losing to Apple, à la Android/iOS SoC designs?
GeoffreyA - Thursday, October 21, 2021 - link
When the M1 first came out, I felt it would recalibrate the frequency/width/IPC axes, and still do. AMD and Intel only had themselves to compare against all this time. Though Apple's not a direct competitor at present, I'm confident AMD could beat them if they had to, now that they see what sort of performance they've got to aim for. Those who are making fun of x86 underestimate what AMD's capable of. Intel learnt the hard way.
Farfolomew - Saturday, October 23, 2021 - link
Hmm, you really think so? I mean, AMD's Ryzen is good, but it's not really any better than Intel's best (Tiger Lake) and will soon be eclipsed by Alder Lake. Ryzen has just caught up to what Intel's been able to offer, but I don't see it as much better. At the very least, compared to these new M1 chips, AMD and Intel chips are nearly identical.
I suppose I just don't see AMD as the one challenging Apple's CPU prowess. They don't have the R&D budget to do so. And Intel? I'm not sure they can ever recover; they're not hiring enough young engineers to rethink the paradigm shifts needed to compete with the coming of ARM.
That leaves Qualcomm and their Nuvia acquisition, which no one really knows how seriously to take. If Nuvia's design roadmap has them developing M1-like CPUs, then I think Qualcomm's future is bright.
Or perhaps it's not so black and white. x86 might survive just fine, and we'll continue to see a healthy battle and innovation. After all, that's the best case for us consumers.
GeoffreyA - Sunday, October 24, 2021 - link
I think it takes more than a big R&D budget to make a winning CPU: it was Bulldozer-era AMD that designed Zen. And we've seen that dollars thrown left and right, in Intel fashion, may produce excellence but don't necessarily.
Whether x86 will go down, no one can tell right now. As it stands, there is no true competitor on the desktop, Apple being isolated in its own enchanted realm. Qualcomm, who knows? There's a possibility Intel or AMD could announce an ARM CPU (RISC-V being less likely because of no Windows version yet), causing x86 to fade away. I won't be surprised to see Intel trying some trick like this. "If we can't fight Ryzen, why not pull the carpet out from under it?"
As for paradigm shifts, while innovation and flexible thinking are excellent, drastic change has often been disastrous: Pentium 4 and Bulldozer. It's the tried-and-tested ways that work, and further perfecting those. As for ARM, apart from the fixed-length instructions, I don't think there's anything really special about it, as is often painted in the ARM-x86 narrative.
Speedfriend - Thursday, October 21, 2021 - link
To think Apple will 2x their product is insane. In reality this is not a brand-new design; it has been in development for years in the iPhone at the core level. All the easy gains have been made already. I would not be surprised to see a 15% per-generation improvement from here.
WuMing2 - Tuesday, October 19, 2021 - link
Memory bandwidth is half that of the Fujitsu A64FX employed in the most powerful supercomputer in the world. In a laptop. Incredible.
KPOM - Thursday, October 21, 2021 - link
Nice.
Kevin G - Wednesday, October 20, 2021 - link
These are impressive chips, with the M1 Pro hitting the midrange sweet spot. I'd love to see Mac Minis and iMacs using these chips soon, where they can ride the frequency/voltage curve a notch or two higher to really see what these designs are capable of.
The layout of the extra encoders on the M1 Max seems to be targeted at an odd niche vs. what else Apple could have used that die space for. I will argue for the first set of encoders found in the baseline M1 Pro; it's just that the extra units serve an ultra-small niche who will actively utilize them.
That die space would have been better leveraged for one of two additional things: an on-die FPGA or even more memory channels. The FPGA's programmability would permit *some* additional acceleration for codecs, obviously without hitting the same performance/die area or performance/watt as the dedicated units, but it would help those who need more than the first set of encoders. The other idea, additional memory controllers, is less about increasing memory bandwidth and more about increasing raw memory capacity: 64 GB isn't a lot when venturing into workloads like 8K video editing. Boosting capacity up to 48/96 GB would see more usage than the secondary encoders and be a better fit across more workloads. The downside of adding memory controllers would be a greater die size (~500 mm^2?), which leads to a higher cost for the SoC itself. Total system cost would also increase due to the additional memory chips. Even with these trade-offs, I think it would have been a better choice than a second set of hardware encoders.
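To put rough numbers on that capacity-versus-encoder trade-off, here is a small sketch; the six-package variant is purely hypothetical, and both configurations assume 16 GB and a 128-bit LPDDR5-6400 sub-interface per package, matching the shipping M1 Max:

```python
# Hypothetical comparison: the shipping 4-package M1 Max vs. the 6-package
# variant suggested above. Per-package figures are assumptions, not Apple specs.
def memory_config(packages, gb_per_package=16, bits_per_package=128, mtps=6400):
    capacity_gb = packages * gb_per_package
    bandwidth_gbps = packages * bits_per_package / 8 * mtps / 1000
    return capacity_gb, bandwidth_gbps

print(memory_config(4))   # (64, 409.6)  shipping M1 Max
print(memory_config(6))   # (96, 614.4)  hypothetical 768-bit, 96 GB variant
```

The 96 GB figure is where the upper end of the 48/96 GB suggestion above comes from; the extra bandwidth would be a side effect rather than the goal.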
web2dot0 - Wednesday, October 20, 2021 - link
The M1 Max is clearly targeting high-end laptops. Odd niche? Apple is obviously gonna target their platform for optimizations. LOL.
Otherwise, it just means they have no confidence that their encoders are any good. Which makes no sense. If you don't endorse your own products, who will?
The industry will give Apple a second look once they see the performance per watt. Money talks.
Absolutely it's about increasing memory bandwidth. Otherwise, all that memory isn't gonna do much in graphics-intensive tasks. Why do you think GPU memory has such high bandwidth compared to system memory on a PC?
If anything, M1 and M1Pro/Max have demonstrated that Apple knows what they are doing.
KPOM - Thursday, October 21, 2021 - link
Agreed. Apple hit it out of the park with the M1 Pro and M1 Max. We shouldn’t have been surprised since the M1 was so powerful at 10W, but some people won’t believe it until they see it.
Kevin G - Friday, October 22, 2021 - link
I have no problem with Apple including a ProRes encoder on the M1 Pro and M1 Max. That does make sense in terms of performance, power consumption and silicon investment in the design. My issue is that adding a *second* encoder to the M1 Max is incredibly niche. Ditching that in favor of two more 128-bit-wide interfaces to mostly increase memory capacity would have been the better trade-off given the market Apple works in. The memory bandwidth boost would be nice but not the primary driver, since the M1 Max as-is already has 400 GByte/s of bandwidth. When doing high-resolution video editing, 64 GB of memory can be consumed quickly. Simply put, more RAM would have had a bigger impact on more end users than the second encoder block on the M1 Max.
web2dot0 - Saturday, October 23, 2021 - link
Obviously the 2nd encoder is specifically targeting video editors/serious content creators. You also get 2x the memory BW (200 to 400GB/s) to go along with it.
That's why they made you pay an extra $200, plus another $400 for the memory upgrade (16GB to 32GB), to get it ($600 total).
I mean, to go from 32B transistors to 53B transistors ... no one seriously expects Apple not to charge a pretty penny for it. People who bitch and moan are people who expect to get their 2-channel DDR4 64GB for $400 ...
Except Apple is giving you a 400GB/s LPDDR5-6400 unified memory architecture.
Only a layman would equate those two products as the same.
ftlbaby - Wednesday, October 20, 2021 - link
I am very impressed with these chips! The unified memory architecture seems to be a tremendous leap forward in the laptop-class CPU and GPU landscape. Apple disrupted Intel and AMD and NVIDIA with the M1 and now has pulled so far ahead of them with the M1 Pro and Max. Nothing anyone else is currently shipping in this category is even close to competing in ALL the important laptop metrics.
I have a $1100 M1 MacBook Air that outperforms my $3300 Intel Core i7-8850H MacBook Pro in most of the tasks that I perform on a daily basis: mostly ingest, edit, and export of 42MP RAW files in Lightroom Classic; secondarily transcoding 1080p and 4K H.264 to ProRes and/or HEVC; lastly email, web, streaming, etc. And that is even before Lightroom was optimized for Apple Silicon. All while being silent versus annoyingly loud. All while being cool or warm to the touch versus too hot to put on my lap. All with 1/4 the RAM. Oh yeah, I still think constantly about my battery life, but that is from 25 years of laptop use! When I actually check, I still have plenty of battery left to complete my task and then watch hours of streaming video while reading websites like this side by side. I will definitely upgrade to a laptop built around one of these SoCs as soon as the reviews determine which is the best in price-to-performance.
As an aside, I have disabled comments on all websites for many years and just recently enabled them again. Not much has changed; 9 out of 10 comments are from Windows / Android fanboys or Intel / AMD / NVIDIA proselytizers. The rest are Apple Defenders spiced with actual talk about the SoC itself.
Tams80 - Saturday, October 23, 2021 - link
The biggest fanboy here appears to be you.
ftlbaby - Tuesday, October 26, 2021 - link
Actually, what I am is a customer. That means I purchase the products that I speak of because they meet my needs based on real-world value and performance tests. Not hypothetical scenarios or aspersions cast from a misguided tribal allegiance to a particular corporation.
Nick B - Tuesday, November 2, 2021 - link
Nope, you win the prize.
techgadgetgeek - Wednesday, October 20, 2021 - link
Anyone know the read/write speeds? New PCIe 4?
Nigel Tufnel - Thursday, October 21, 2021 - link
If you mean the memory bandwidth of the chip, I think it's ~400 GB/s. Apple states the SSD read speed in the new MacBook Pros as 7.4 GB/s (very fast, about 35% faster than a PS5). I don't think they mentioned the write speed, which probably means it's slower.
Tomatotech - Thursday, October 21, 2021 - link
Apple press embargo lifts at 9am ET on Monday. Detailed reviews will hit a few seconds later.
As far as the SSD goes, it's either PCIe 3.0 x8 or PCIe 4.0 x4 (I favour the latter).
I speculate there will be a Mac Pro with 4x these NVMe SSDs for a total of 30GB/s aggregate bandwidth to backing storage, either 32TB or 64TB in total by the time it's released.
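For what it's worth, the two interface guesses land at roughly the same raw link bandwidth; a quick sanity check using approximate nominal PCIe per-lane throughput (textbook values, nothing Apple has confirmed):

```python
# Approximate usable PCIe bandwidth per lane (128b/130b encoding, before protocol overhead).
PCIE3_GBPS_PER_LANE = 0.985   # 8 GT/s
PCIE4_GBPS_PER_LANE = 1.969   # 16 GT/s

gen3_x8 = 8 * PCIE3_GBPS_PER_LANE   # ~7.9 GB/s
gen4_x4 = 4 * PCIE4_GBPS_PER_LANE   # ~7.9 GB/s; either fits the 7.4 GB/s Apple quotes

# Aggregate for the speculated four-SSD Mac Pro, using Apple's quoted 7.4 GB/s per drive.
aggregate = 4 * 7.4                 # ~29.6 GB/s, roughly the 30 GB/s mentioned above

print(round(gen3_x8, 1), round(gen4_x4, 1), round(aggregate, 1))
```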
Tomatotech - Thursday, October 21, 2021 - link
* 32TB or 64TB, all soldered on a single motherboard with up to 4 Pro Max CPU packages also soldered on it. If slots are provided, which could just be slots on a daughtercard connected via TB4 to the main motherboard, then you could put additional NVMe drives on a riser or attach SATA drives for more storage. Hopefully the case will still provide somewhere to put them & keep the whole thing tidy.
Ppietra - Thursday, October 21, 2021 - link
I would expect Apple to make a chiplet instead of soldering 4 different M1 Max packages onto the motherboard.
ruthan - Thursday, October 21, 2021 - link
It's a shame that Apple is the only company that has tried to make a beefier ARM chip. It would be interesting to see how virtualization performance with a Windows machine compares: if it were faster than a real x86 Windows machine, it would be a clear victory; otherwise it would only be great for the MacOS ecosystem behind their iron curtain.
06GTOSC - Thursday, October 21, 2021 - link
Wondering what the yields are on the M1 Max.
yitwail - Thursday, October 21, 2021 - link
I apologize if this was mentioned already, but the diagrams don’t match the die photos, so the diagrams are just abstractions? Also, are all those performance vs. power graphs idealized? Because all the curves are so smooth.
mvalborg - Friday, October 22, 2021 - link
AnandTech reviews of Apple are abysmally bad. The former head, Anand, is of course now an Apple employee. The people at AnandTech covering Apple are really nothing more than cheap Apple shills :-(
Farfolomew - Saturday, October 23, 2021 - link
Ian Cutress has mentioned that Anand Lal Shimpi has little to no interaction (let alone involvement) with AnandTech anymore. Apple doesn't need to pay sites for coverage when their products are just that good and every website comes to the same consensus. Time to step your game up, PC.
web2dot0 - Saturday, October 23, 2021 - link
If you want to call someone a shill, please point out which part of Ian Cutress's article on the M1/M1 Pro/Max is inaccurate.
Which of the stats below are lies?
1) 400GB/s 32ch LPDDR5-6400
2) 7.4GB/s SSD
3) 30W CPU TDP, 60W GPU TDP
How is the review bad? Or is it because your feelings are hurt that your AMD/Nvidia/Intel is not getting the Performance/Watt that Apple is producing?
Curious.
corinthos - Monday, October 25, 2021 - link
Just wait for Intel Awesome Lake release. It will TORCH M1 in 2026.
Ashan360 - Monday, October 25, 2021 - link
As amazing as these chips are, they’re largely based on an extension of A14 tech (the same 5nm process, and the CPU/GPU cores are the same, just clocked higher). It’s cool to be able to estimate the performance of 2022/2023 Macs just by looking at the current iPhones and scaling appropriately. The M2, expected mid next year, should have about a 1900 single-core Geekbench score, 9000 multi-core, and a 3 TFLOPS GPU in a 15-watt TDP. That's already giving the M1 Pro CPU a run for its money, with much faster machine learning acceleration and a good-enough GPU for most applications in a super-low-power design. Then the M2 Max will have up to a 50% faster GPU than the M1 Max! 15 TFLOPS in a thin laptop! Apple's chip team isn’t slowing down…
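On the GPU side, that kind of extrapolation is mostly linear scaling with core count. A minimal sketch, assuming 128 FP32 ALUs per GPU core and a clock inferred from Apple's quoted 2.6 TFLOPS for the 8-core M1 (both the ALU count and the clock here are assumptions, not Apple-published specs):

```python
# Peak FP32 throughput estimate: cores * ALUs per core * 2 FLOPs (FMA) * clock.
ALUS_PER_CORE = 128    # assumed per-core ALU count for the M1 GPU family
CLOCK_GHZ = 1.27       # inferred so that 8 cores roughly match Apple's 2.6 TFLOPS

def gpu_tflops(cores):
    return cores * ALUS_PER_CORE * 2 * CLOCK_GHZ / 1000

print(round(gpu_tflops(8), 1))    # ~2.6  (M1)
print(round(gpu_tflops(16), 1))   # ~5.2  (M1 Pro)
print(round(gpu_tflops(32), 1))   # ~10.4 (M1 Max, matching Apple's quoted figures)
```

A "50% faster GPU" on top of the M1 Max, as speculated above, is where the ~15 TFLOPS figure comes from (1.5 × 10.4).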