
  • lemurbutton - Monday, June 6, 2022 - link

    Where is the M1 Ultra review?

    The team here decided to review the SilentiumPC Fera 5 CPU Cooler over the M1 Ultra?
  • shabby - Monday, June 6, 2022 - link

    You forgot the ampere/6900/5800x3d reviews.
  • techjunkie123 - Monday, June 6, 2022 - link

    I'd second this comment. The M1 Ultra is the most interesting piece of tech here due to its dual-die CPU and GPU design. Based on other people's performance data, it seemed to me that not all applications could use both sets of cores, resulting in similar performance to the M1 Max in some cases. Would be interesting to read a deep dive.

    The Ampere and RDNA2 refreshes and X3D were kind of boring in comparison, since the performance is largely as expected, i.e., 5-15% for the GPUs and 15% for X3D in gaming, with a mild regression in compute.
  • hecksagon - Monday, June 6, 2022 - link

    Probably tremendously easier to review the Ampere and RDNA2 refreshes than the M1 Ultra, because the tools and information to do so are readily available.
  • Hifihedgehog - Monday, June 6, 2022 - link

    It's the most breakthrough hardware, no doubt. But its software and closed nature make it a no-go for most industries. It's akin to putting a rocket thruster in a sedan: it's cool, and maybe some semi-pros and amateurs will get some use out of it, but no modern, tech-smart corporation would ever adopt it when their vertical and horizontal deployment stacks demand uniformity in toolsets, management, and libraries (e.g. Windows, Linux, CUDA, etc.). So unless Apple wants to sell its hardware in servers, so that Apple silicon and its ecosystem uniformly cover the whole hardware stack in a corporation, there is a major problem when, say, a company has to optimize and target its code to a client platform that is notably different from the render farm it is running to render CGI for a movie.
  • Joe Guide - Monday, June 6, 2022 - link

    I don't think Apple cares that much about the pro market. The relentless 15-20% per year drive in chip improvement in the A/M chips is there to improve AI, which simplifies and improves the camera, video, search, software, and conferencing (all the stuff WWDC presented). That's the stuff consumers want and where the money is.

    Ask Motorola, BlackBerry, and Nokia what a vertical stack means.
  • valuearb - Wednesday, June 29, 2022 - link

    Yet somehow Apple is the biggest PC maker in the world. It's over 20% of PC revenues because it focuses on creative professionals, who spend the most.
  • Silver5urfer - Monday, June 6, 2022 - link

    From your recent comment history, especially on AM5 and X3D, you would buy this anyway regardless of AnandTech reviews. Why bother? Ah, I see: to brag about how great Apple's overpriced BGA junk is, I guess.

    When Andrei was here, the M1 got slaughtered by a batch of BGA throttling jokebooks. Once Alder Lake launched, Apple's IPC and the entire M series got owned. The Ultra and Pro cost over $3000 for basic DRAM capacity and soldered storage. So it ended up, again, as that Apple fanboy market where users will buy regardless.

    M2 will get slaughtered by Zen 4. Ada Lovelace and RDNA3 will destroy whatever is left after the humiliation of the M1 Ultra getting owned by the 2-year-old RTX 3090.
  • Oxford Guy - Saturday, June 11, 2022 - link

    'whatever is left after the humiliation of the M1 Ultra getting owned by the 2-year-old RTX 3090'

    Are you referring to the 3090 outperforming Apple's GPU by using something like 50% more power? Or was it double? I don't remember.

    From what I saw back when I paid even a little bit of attention to this issue, Apple's claim about outperforming Nvidia was correct when performance-per-watt is taken into account. Not specifying that element is crooked, and typical of how all of these companies speak to the public.
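    To put rough numbers on that (a quick sketch with made-up figures, since I don't have the measured data in front of me):

        # Illustrative perf-per-watt comparison. The scores and wattages
        # below are hypothetical placeholders, not measured results --
        # the point is only how the efficiency framing changes the picture.
        def perf_per_watt(score: float, watts: float) -> float:
            return score / watts

        rtx3090 = perf_per_watt(score=130, watts=320)   # assume: 30% faster, ~320 W
        m1_ultra = perf_per_watt(score=100, watts=110)  # assume: ~110 W GPU power

        print(f"RTX 3090: {rtx3090:.2f} pts/W")   # ~0.41
        print(f"M1 Ultra: {m1_ultra:.2f} pts/W")  # ~0.91, roughly 2x the efficiency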
  • Ryan Smith - Monday, June 6, 2022 - link

    "The team here decided to review The SilentiumPC Fera 5 CPU Cooler over the M1 Ultra?"

    As I'm sure you're aware, it's not an either/or situation. Different staff members specialize in different types of hardware.

    Unfortunately, the staff members who specialize in CPUs are no longer here. And it's proving to be a bear to try to replace them in this market. The brain drain from journalism to engineering is very real right now.
  • Oxford Guy - Saturday, June 11, 2022 - link

    We don't need engineers, just marketeers to make funny faces that can be slapped onto a YouTube clip.

    Watch us take something out of a box! It's like magic!
  • meacupla - Monday, June 6, 2022 - link

    I have just one comment to make.

    Why did Apple choose this specific thumbnail for their WWDC?
    Not only do they have that uncanny valley thing going on, why did Apple decide it was best to decapitate them from their bodies?
  • Oxford Guy - Saturday, June 11, 2022 - link

    Whatever those things are supposed to be, they're ugly and creepy.
  • Hifihedgehog - Monday, June 6, 2022 - link

    Apple Freeform is just Microsoft Whiteboard or a shared OneNote, which already integrate into Teams... yawn.
  • iceman-sven77 - Monday, June 6, 2022 - link

    Metal 3 & DirectAccess were a joke, copying 2-3 year old features from Microsoft. And there's still no support for advanced features like ray tracing.
  • RSAUser - Monday, June 6, 2022 - link

    You won't see ray tracing support for a while, as the GPU isn't really strong enough for it; e.g. on the Nvidia side it only really starts being okay at the 3070 level, and Apple is still going through a translation layer.
  • GC2:CS - Monday, June 6, 2022 - link

    Why would Apple drop compatibility for A9 and A10 iPhones but keep A9 iPads?

    macOS also dropped quite a load of devices.

    M2 seems like an A15X. But why so little, so late? The M1 was available like 2 months after the A14.
    Looks like there are some serious slowdowns in pipelines across the semiconductor industry.
  • techconc - Monday, June 6, 2022 - link

    @GC2:CS - Why so little, so late? Probably because TSMC's next process node wasn't ready. If they're staying on a 2nd-gen 5nm process, this is basically what they have to do.

    That said, honestly, an 18% CPU increase, a 25% faster GPU, a 40% faster NPU, better media blocks with ProRes, higher memory bandwidth, etc. seems like a pretty decent generational bump to me.
  • vlad42 - Monday, June 6, 2022 - link

    It's not like there have been significant performance improvements to Apple's CPUs in a few years. It has been 10-15% total performance increases for the past few generations, and that includes clock speed. Most of the gains touted have been in various accelerators such as the NPU, GPU, image processor, etc.

    If I remember correctly, Andrei found the A15 only had about a 5% IPC (performance per clock) increase. The rest of the gain was from increased clock speed.

    As far as slowdowns are concerned, TSMC's N3 is still fairly new and was unlikely to be viable for the large M-series chips Apple needs. I remember a lot of people complaining about the increased prices of the M1-based laptops and Mac mini compared with the outgoing equivalent Intel SKUs. Apple is probably trying to address that by sticking with N5; putting M2 on an N3-class process would have kept prices high.
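    Since single-thread performance factors as IPC x clock, the split is easy to sanity-check. A rough sketch (the clocks are the approximate published A14/A15 peaks; the ~13% total gain is a ballpark assumption, so treat the output as illustrative):

        # Decompose a generational perf gain into IPC and clock components:
        # perf = IPC * clock, so ipc_ratio = perf_ratio / clock_ratio.
        a14_clock = 3.00        # GHz, approx. Firestorm peak
        a15_clock = 3.24        # GHz, approx. Avalanche peak
        total_perf_gain = 1.13  # assumed ~13% overall single-thread gain

        clock_ratio = a15_clock / a14_clock        # ~1.08
        ipc_ratio = total_perf_gain / clock_ratio  # ~1.05 -> ~5% from IPC

        print(f"clock: +{(clock_ratio - 1) * 100:.0f}%, IPC: +{(ipc_ratio - 1) * 100:.0f}%")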
  • Oyeve - Monday, June 6, 2022 - link

    Nothing really WOW, yet again. And another notch?
  • brucethemoose - Monday, June 6, 2022 - link

    24GB of memory... so that must be an asymmetric setup on a 128-bit bus, right? Or does DDR5 allow for uniform bandwidth across all 24GB?

    Also, still no mention of AV1... not even decode. That's not good. Like Apple or not, this is going to hold the whole codec back.
  • SarahKerrigan - Monday, June 6, 2022 - link

    It's LPDDR5, not DDR5. It's not asymmetric. 6GB and 12GB LPDDR5 packages exist.
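    So 24GB falls out naturally as two 12GB packages on the same 128-bit bus, with uniform bandwidth across all of it. The arithmetic, assuming LPDDR5-6400 (which lines up with Apple's quoted ~100GB/s for M2):

        # Peak bandwidth of a 128-bit LPDDR5-6400 interface:
        # bytes per transfer * transfers per second.
        bus_width_bits = 128
        transfer_rate = 6400e6  # LPDDR5-6400 = 6400 MT/s

        bandwidth_gb_s = (bus_width_bits / 8) * transfer_rate / 1e9
        print(f"{bandwidth_gb_s:.1f} GB/s")  # 102.4 -> matches the ~100GB/s spec

        # Capacity: 2 x 12GB packages = 24GB, symmetric across the full bus.
        print(2 * 12, "GB")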
  • RSAUser - Monday, June 6, 2022 - link

    That CPU comparison graph is pretty bad: the 1255U is designed for a 15W TDP, being an overclocked 1250U (which is designed for a 9W TDP). Its top performance is more like a quad core or something, plus it's a 2+8 configuration, not a homogeneous 10-core. I hate the marketing; it's a good chip already, and these stupid graphs are just misleading.
