205 Comments

  • Freeb!rd - Monday, September 26, 2022 - link

    This paragraph reads like someone having a stroke while writing it...

    "Although this is overridable through manually overclocking with a maximum TJ Max of up to 115°C, it’s key tovitalte that users will need to use more premium and aggressive cooling types to squeeze every last drop of performance from ZAMD intended thistended when designing Zen 4, and as such, has opted not to bundls own CPU coolers with the retail packages."

    and someone's spell checker is broken.
  • gryer7421 - Monday, September 26, 2022 - link

    It reads like a GPT-3 bot... :(
  • Threska - Monday, September 26, 2022 - link

    We now know "Zencally" is a word.
  • Gavin Bonshor - Monday, September 26, 2022 - link

    Hi, yeah something screwy happened, but it's fixed now. Apologies. I think it may be time to update to a new system, and software. This isn't the first time it's jumbled stuff up for me.
  • Cow86 - Monday, September 26, 2022 - link

    I wish I could say that all the errors in the article are fixed, but that very paragraph even still has several (big) errors in it... A missing letter is one thing, half a sentence just missing and going into the next is another.
  • herozeros - Monday, September 26, 2022 - link

    Keep the copy editor awake, or fire them. Grammar, syntax, or CMS error, it doesn't matter which; it gives me a headache reading this.
  • Ryan Smith - Monday, September 26, 2022 - link

    Unfortunately we're having to do this kind of live. It's been a very busy past two weeks and we haven't had as much time to prepare as we like. So most of what you're seeing is first-draft copy, which I'll get around to editing as I can.

    Digital publications do not employ dedicated copy editors any more. They have all been let go for cost efficiency reasons.
  • flyingpants265 - Monday, September 26, 2022 - link

    What? Come on now.
  • Hifihedgehog - Monday, September 26, 2022 - link

    @flypants265: It's kind of like Microsoft who got rid of their QA team and made all of their developers honorary QA tests. They can't help it that their leadership is being stupid. Don't blame Ryan or Gavin. Blame these greedy cheapskates that likewise didn't want to pay Ian Cutress enough to want to stay.
  • Hifihedgehog - Monday, September 26, 2022 - link

    *QA testers
  • linuxgeex - Monday, September 26, 2022 - link

    All Microsoft customers are QA testers, lol. That's always how it's been.
  • Kangal - Tuesday, September 27, 2022 - link

    Isn't that what goes for Linux?
    The only difference is that you don't pay money, you just pay in time, effort, frustration, and your soul.
  • Hifihedgehog - Tuesday, September 27, 2022 - link

    Exactly. And you compile your own kernel for 24 hours hoping it will finish successfully.
  • at_clucks - Wednesday, October 19, 2022 - link

    Not if you use the latest Ryzen 9 7950X. You may still pray it's successful at the end but God will answer a lot faster :).
  • elforeign - Monday, September 26, 2022 - link

    Ah yes, the capitalistic adage of less is more. I'm sorry you guys have to deal with this, as with anyone in the workforce, where the powers that be sit on their asses with their cushy millions, say workers can do more with less, and pile on with disregard.

    On a further note, I have been coming to Anandtech since the mid 00's. While I can understand the expectation surrounding good grammar and flawless articles, some issues are bound to come up now and then. The vitriol you guys receive for some simple grammar or syntax mistake is crazy.
  • rarson - Wednesday, September 28, 2022 - link

    "Ah yes, the capitalistic adage of less is more."

    This is not a thing.
  • herozeros - Monday, September 26, 2022 - link

    Kind reply, thanks. Hope your week lets you catch up.

    No more copy editors?! I guess my blonde is all now truly grey . . . sigh
  • Threska - Monday, September 26, 2022 - link

    Outsourced to AI.
  • emn13 - Monday, September 26, 2022 - link

    I for one thoroughly enjoyed your article, and appreciate the technical content - a few editing nits don't detract from that.

    And hey, if I were to whine about embarrassing editing mistakes, rather than focusing on a long article written in limited time due to AMD's schedule, I'd poke fun at the 100 000 000 000 $ company's press slides touting their EXPO tech's openness in the form of public "doucments". 😀
  • linuxgeex - Monday, September 26, 2022 - link

    So long as you're open to community feedback to correct hasty errors, there's no need for copy editors, and you can push your articles faster, which we'll all appreciate. Saying thanks is much more productive than making excuses. It shows that you appreciate your community.
  • tuxRoller - Monday, September 26, 2022 - link

    When does an explanation become an excuse?
  • UltraTech79 - Friday, September 30, 2022 - link

    Well, rehire them or you're going to see a real quality loss. Is it really worth it in the longrun?
  • Ryan Smith - Friday, September 30, 2022 - link

    "Is it really worth it in the longrun?"

    That's a question for the people that pay the bills. It's not my call.
  • Iketh - Saturday, October 1, 2022 - link

    I will professionally edit for next to nothing just because I love this site. Email me [email protected]
  • ScottSoapbox - Tuesday, October 4, 2022 - link

    Grammarly is a cheap replacement that will catch the worst of it.
  • Sivar - Monday, September 26, 2022 - link

    I agree that the paragraph was in need of some work, but "thinkos" happen, esp. with an article of this depth. I like that you reported it, but I wonder if it could have been worded differently. Imagine spending days aggressively writing a detailed analysis, only to have one's writing compared to a stroke victim because of a tiny percent of the article.
  • Jasonovich - Sunday, October 9, 2022 - link

    Grammar fascism is distracting from the main body of the article. It's like the cream from your glass of Guinness pouring onto your fingers; no big deal, just lick it off. The integrity of the article is intact, and I'm sure the message was received loud and clear by Anandtech's spoof readers.
    Anyway, many thanks for the excellent article; other sites don't try half as hard as the folks from Anandtech.
  • philehidiot - Wednesday, September 28, 2022 - link

    This sentence seems perfectly cromulent. I think the point purvulates nicely and is quite unfornitabulated.
  • gryer7421 - Monday, September 26, 2022 - link

    Hi, thanks for the article. In the future, please start posting the HIGHEST all-die TEMPS hit during each benchmark.

    It would be helpful to know and see the temps for building workstations, given that Intel and AMD both just uncorked the genie by not caring about temps anymore and only caring about "top CPU speed" at any (thermal) cost.
  • Gavin Bonshor - Monday, September 26, 2022 - link

    With Zen 4, the highest all-die temp is essentially 95°C, due to the way Precision Boost Overdrive works. The idea is that it will use all of the available power/thermal headroom, so those with better CPU cooling should technically benefit more.
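    To picture the idea, here's a toy sketch in Python (this is not AMD's actual boost algorithm, just an illustration of "boost until the temperature target or the package power limit is hit"; the power and thermal numbers are made up):

        # Toy model of the Precision Boost Overdrive idea: raise clocks until either
        # the 95C temperature ceiling or the package power limit (PPT) is reached.
        TJMAX_C = 95.0     # Zen 4 default temperature target
        PPT_W = 230.0      # package power limit for a 170 W TDP part

        def sustained_clock(cooler_w_per_c, ambient_c=25.0, base_ghz=4.5, step=0.025):
            """Crude model: power grows with frequency; temperature is power divided
            by the cooler's capability (watts dissipated per degree C of headroom)."""
            freq = base_ghz
            while True:
                power = 90.0 + (freq - base_ghz) * 120.0   # invented power curve
                temp = ambient_c + power / cooler_w_per_c  # invented thermal model
                if temp >= TJMAX_C or power >= PPT_W or freq >= 5.7:
                    return freq, power, temp
                freq += step

        # A more capable cooler lets the same toy chip sustain higher all-core clocks.
        for cooler in (1.5, 2.5, 4.0):
            f, p, t = sustained_clock(cooler)
            print(f"cooler {cooler} W/C -> {f:.2f} GHz at {p:.0f} W, {t:.0f} C")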
  • TelstarTOS - Monday, September 26, 2022 - link

    Too few games tested, no 1400p tests, no 7700X tested. Waiting for more
  • Gavin Bonshor - Monday, September 26, 2022 - link

    AMD only sampled us with the 7950X and 7600X. We'll hopefully get our 7700X in the near future. In regards to game testing, you'll see many more (and better) titles as we move to our 2023 suite, which will come into effect in our next CPU review.
  • rarson - Wednesday, September 28, 2022 - link

    1400p?
  • meacupla - Monday, September 26, 2022 - link

    What I am seeing is 5800X3D being a beast

    Which is why I really look forward to 7x00X3D chips
  • Gavin Bonshor - Monday, September 26, 2022 - link

    Me too!
  • FreckledTrout - Monday, September 26, 2022 - link

    Those should be really strong. It's probably AMD's answer to Intel's 13th gen.
  • kwohlt - Monday, September 26, 2022 - link

    7000 3D is more so AMD's answer to MeteorLake, as both are expected sometime in 2023. Zen 4 is sometime in 2024 and will go up against ArrowLake.

    Vanilla 7000 and RaptorLake are direct competitors.
  • kwohlt - Monday, September 26, 2022 - link

    ****Zen 5
  • nandnandnand - Monday, September 26, 2022 - link

    https://www.theverge.com/23294064/intel-deny-meteo...

    Meteor Lake will come out in late 2023, if at all. 7000X3D sounds like it could launch in January. So 7000X3D will have free rein for the better part of a year.
  • Lothyr - Monday, September 26, 2022 - link

    Same, not to mention that it should give the firmware time to stabilize, time for DDR5 to get cheaper, time for PCIe 5 SSDs to be released, etc. So I guess 6 months-ish before I upgrade (we'll see what Intel comes up with as well).
  • RomanPixel - Tuesday, September 27, 2022 - link

    Me too!
  • kmalyugin - Monday, September 26, 2022 - link

    Wow, this article is almost unreadable. Was spellchecker turned off?
  • jonkullberg - Monday, September 26, 2022 - link

    Gaming benchmarks with DDR5-6000 CL30 please!
  • BushLin - Monday, September 26, 2022 - link

    Exactly
  • xol - Monday, September 26, 2022 - link

    wtf am I reading (context: a part with TDP up to 170W, from 105W):

    "This has been possible through superior power efficiency, as Zencally a Zen 3 refinement, but on the new TSMC 5 nm process node (from TSMC 7 nm). This efficiency has allowed AMD to increase the overall TDP to 170 W from the previous 105 W but without too much penalty."

    I can't even .. "too much penalty" ??

    .. Looks like Zen has reached the end of the road imo (it had a good run) - none of the improvements here are from AMD - new DDR5, new 5nm node. The rest is "increase clocks/tdp" just like when Intel was stuck on 14nm.

    I just don't know where they are going from here
  • Threska - Monday, September 26, 2022 - link

    Well we have " While Ryzen 7000 can drive a 2 DPC/4 DIMM setup, you’re going to lose 31% of your memory bandwidth if you go that route. So for peak performance, it’ll be best to treat Ryzen 7000 as a 1 DPC platform." and " Unfortunately, the compatibility situation is essentially unchanged from the AM4 platform, which is to say that while the CPU supports ECC memory, it’s going to be up to motherboard manufacturers to properly validate it against their boards.". The memory situation seems like a sticking point for a good while till things mature.
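    For a rough sense of what that 31% means in absolute terms, a quick back-of-the-envelope calculation (theoretical peak only, using the DDR5-5200 1 DPC rating mentioned in the review and the usual dual-channel, 8-bytes-per-channel math; real-world efficiency will be lower):

        # Theoretical peak bandwidth for dual-channel DDR5, ignoring real-world efficiency.
        def ddr5_peak_gbs(mt_per_s, channels=2, bytes_per_channel=8):
            return mt_per_s * channels * bytes_per_channel / 1000  # GB/s

        one_dpc = ddr5_peak_gbs(5200)   # rated 1 DPC speed: ~83 GB/s
        two_dpc = one_dpc * (1 - 0.31)  # the article's ~31% penalty: ~57 GB/s
        # ~57 GB/s is about what the DDR5-3600 2 DPC rating would give anyway.
        print(f"1 DPC: {one_dpc:.0f} GB/s, 2 DPC: {two_dpc:.0f} GB/s")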
  • BushLin - Monday, September 26, 2022 - link

    Did you read the article? Put it in eco mode (105W for a 170W part) and it still stomps over everything in MT performance. Zen 4 is more about platform improvements, Zen 5 will be the microarchitecture overhaul.
  • BushLin - Monday, September 26, 2022 - link

    Stomping everything at 65W even!
  • xol - Tuesday, September 27, 2022 - link

    Impressed that it's nominally $100 cheaper than a 5950X. Got to admit that.
  • xol - Tuesday, September 27, 2022 - link

    Eco mode does perform better, e.g. Cinebench is maybe +23% compared to the 5950X, but it's using DDR5-5200 vs DDR4-3200 (?), and the power advantage can be assumed to come from 5nm

    My original point still stands for me: 90% of the benefits are from the node and memory, and from allowing clocks as high as TJunction allows. I don't think that is a great showing for AMD
  • Iketh - Thursday, October 6, 2022 - link

    why are you giving so much credit to ddr5? moving to new memory has always given very small gains (if any) in the beginning

    tjunction is an arbitrary number set by AMD, so using that as an argument is irrational
  • xol - Tuesday, September 27, 2022 - link

    ...but my main criticism was of the article, e.g. phrases like "increase the overall TDP ... without too much penalty" don't really make any sense - increasing the TDP *is* the penalty

    But much of the article is written as if letting the TDP go *much* higher is some sort of gift from AMD, e.g. the examples I gave

    The article is full of nothing-burgers like this statement:
    " We feel that the higher all-core frequencies under maximum load, 95°C is a sufficient level of heat for what is on offer when it comes to overall performance"
  • kwrzesien - Monday, September 26, 2022 - link

    Whoever was the last to edit the front page needs to disable their trackpad and clean their mouse ball! 🤣
  • Threska - Monday, September 26, 2022 - link

    "But now with AMD’s modern RDNA 2 graphics architecture and TSMC fabrication process, AMD has finally seen the (ray traced) light, and is building a small GPU block into the IOD to offer integrated graphics throughout the Ryzen 7000 product stack."

    I see things like SAM and HSA being a future trend.

    https://www.electronicdesign.com/technologies/micr...
  • erotomania - Wednesday, September 28, 2022 - link

    Yes, AMD thought so too, in 2012...

    https://www.tomshardware.com/reviews/fusion-hsa-op...

    and in 2014 here at AT...

    https://www.anandtech.com/show/7677/amd-kaveri-rev...

    Hopefully this time!
  • nandnandnand - Monday, September 26, 2022 - link

    It seems that going up by 1 GHz didn't help it that much in gaming benchmarks.

    Meanwhile, the 65W results show that any Zen 4 and later APUs are going to be absurdly powerful. Especially Dragon Range.
  • Josh128 - Monday, September 26, 2022 - link

    Any way you guys can add the single core ECO mode results to the conclusion page or to the R23 results on its respective page?
  • donquixote42 - Monday, September 26, 2022 - link

    A single-threaded workload would not use more than 65W anyway, so performance should be the same in ECO and non-ECO mode.
  • Josh128 - Monday, September 26, 2022 - link

    Still using a 2080Ti for the games testing is not good. Most certainly many of these results are GPU bound.
  • snowdrop - Monday, September 26, 2022 - link

    No power consumption numbers? Will the article be updated with these when they're ready?
  • jakky567 - Monday, September 26, 2022 - link

    I'm confused by USB 2, do you mean USB 2.0 or USB 4v2, or what?
  • Ryan Smith - Monday, September 26, 2022 - link

    Yes, USB 2.0.

    USB 4v2 was just announced. We're still some time off from it showing up in any AMD products.
  • Myrandex - Thursday, September 29, 2022 - link

    lol did they share any reason why to give a single USB 2.0 port?
  • Ryan Smith - Friday, September 30, 2022 - link

    Basic, low complexity I/O. Implementing a USB 2.x port is relatively simple these days. It's a bit of a failsafe, really.
  • LuxZg - Monday, September 26, 2022 - link

    One question and one observation.

    Q: ECO mode says 170W -> 105W but tested CPU was 170W -> 65W. Is that a typo or was that just to show off? I wish that sample graph showed 7600X at 105W and 65W in addition to 7950X at 170/105/65W.

    Observation: the 5800X is $260 on Amazon. So with cheaper DDR4, cheaper mobos, and a cheaper CPU, it will be big competition inside AMD's own house. At least for those that don't "need" PCIe 5.0 or future proofing.
  • andrewaggb - Monday, September 26, 2022 - link

    I was confused by that as well.
    The way I read the paragraph suggested the 170W part's eco mode is 105W, but then it's stated the CPU was tested at 65W. Was it meant to say 105W, or can a 170W part be dialed down to 65W and the test is correctly labelled?
  • Otritus - Monday, September 26, 2022 - link

    By default, while under 95°C (203°F, 368.15 K), the 7950X will have a TDP of 170 watts and use up to 230 watts of power. You can think of it like TDP and Turbo Power on Intel. Eco mode will reduce the TDP to 105 watts (and use up to 142 watts??). You can manually set the power limits, and Anandtech set them to 65 watts to demonstrate efficiency. Meaning the 7950X was not in eco mode, but in a manual mode more efficient than eco mode.
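    For reference, a sketch of the usual AMD TDP-to-socket-power math (the ~1.35x PPT multiplier AMD has used on AM4/AM5; the 88 W figure for a 65 W TDP is inferred from that ratio, not something stated in the review):

        # AMD's package power limit (PPT) is roughly 1.35x the advertised TDP.
        PPT_RATIO = 1.35

        for tdp_w in (170, 105, 65):
            print(f"TDP {tdp_w:>3} W -> PPT ~{round(tdp_w * PPT_RATIO)} W")

        # TDP 170 W -> PPT ~230 W  (stock 7950X)
        # TDP 105 W -> PPT ~142 W  (eco mode, same limit as a stock 5950X)
        # TDP  65 W -> PPT ~88 W   (if the 65 W setting is applied as a TDP)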
  • uefi - Monday, September 26, 2022 - link

    Just by supporting Microsoft's cloud-connected hardware DRM, the 7000 series is vastly inferior to all current Intel CPUs.
  • Makaveli - Monday, September 26, 2022 - link

    So you are saying Intel is not going to implement this in any of their future processors?

    If the Raptor Lake review shows it supports that too, I'm going to come back to this message.
  • socket420 - Monday, September 26, 2022 - link

    I don't understand where these "intel rulez because they don't use pluton!!" people are coming from - one, the Intel Management Engine... exists, and two, Microsoft explicitly stated that Pluton was developed with the support of AMD, Intel and Qualcomm back in 2020. Intel is clearly on-board with it and I expect to see Pluton included in Raptor Lake or Meteor Lake, they're just late to the party because that's what Intel does best, I guess?
  • TheinsanegamerN - Tuesday, September 27, 2022 - link

    Because MS is far less trustworthy than Intel, and has been making moves to block, censor, and lock down everything whenever possible via hardware. Pluton should scare people; giving MS the keys to your hardware is a nightmare.
  • Iketh - Thursday, October 6, 2022 - link

    please keep your irrational paranoia to yourself
  • TheinsanegamerN - Tuesday, September 27, 2022 - link

    He did say current, not future.
  • AndrewJacksonZA - Monday, September 26, 2022 - link

    Hi. What happened to RDR2 at 4K, please?
  • Ryan Smith - Monday, September 26, 2022 - link

    RDR2 did not behave itself properly at 4K on some of our test systems. We're still trying to isolate why.
  • AndrewJacksonZA - Monday, September 26, 2022 - link

    Thanks Ryan. I'm really interested in that and GTA V at 4K. Thank you! :-)
  • piskov - Monday, September 26, 2022 - link

    Please add current Apple CPUs if tests allow it.
  • Ghwomb - Tuesday, September 27, 2022 - link

    Yes. That would be nice. Especially since Linux and OpenBSD support is coming along nicely on M1 and M2, making it a viable option for non-macOS users.
  • ddhelmet - Monday, September 26, 2022 - link

    No Dolphin benchmark?
  • Harry_Wild - Monday, September 26, 2022 - link

    Buying a Zen 4 7600X and motherboard, DDR5, and NVMe on Black Friday and/or Cyber Monday! Might be $1K! Use my current graphics card, PSU and SFF case. Still a lot of dough!
  • phoenix_rizzen - Monday, September 26, 2022 - link

    The Spec graphs are hard to read as you don't have the CPUs listed in the correct order. You should switch dark blue to be 5950X and light blue to be 3950X. Right now you have the CPUs (graphs) listed as:

    Intel
    7950X
    3950X
    5950X

    It really should be:
    Intel
    7950X
    5950X
    3950X

    That would make it a lot easier to see the generational improvements. Sort things logically, numerically. :)
  • Otritus - Monday, September 26, 2022 - link

    @Ryan Smith please do this. I was also having difficulty reading the Spec graph.
  • Gavin Bonshor - Monday, September 26, 2022 - link

    I apologize for doing it this way. I promise I'll sort it in the morning (UK based)
  • yeeeeman - Monday, September 26, 2022 - link

    Retaking the high end for 1 month.
  • yeeeeman - Monday, September 26, 2022 - link

    TBH, what I am most excited about is the zen 4 laptop parts, like the phoenix apu, with 8 zen 4 cores, rdna 3 igpu, lpddr5, 4nm cpu, 5nm gpu, that should bring some clear improvements over the 4000 series ryzen which are still amazingly good. 5000 and 6000 series haven't brought much improvements over the 4000 series, like my 4800H, so I am curious to see what the 7000 series will bring. Already dreaming about a fully metal body, slim laptop, 14-16 inch, OLED, 90Hz minimum, laminated screen, preferably touch and 360 hinge, 1.5kg top. that will be nice.
  • abufrejoval - Wednesday, September 28, 2022 - link

    Since you're hinting that Intel will change things, there is much less of a chance for Intel to catch up in the mobile sector on 10nm.

    For the laptops I see a different story at almost every five Watts of permissible power on the CPU side of things. But much less change between the 4000-7000 Zen generations at the same energy settings.

    Any hopes for a more-than-casual gaming iGPU can't but fail, because AMD can't overcome the DRAM bandwidth limitations, unless they were to start with stuff like extra channels of RAM on the die carrier like Apple (or HBM).

    And that basically leaves 13% of IPC improvements, some efficiency gains but much less clock gains, because that's mostly additional Wattage on the desktop parts, not available on battery.

    I haven't tried the 6800U yet, but even if it were to be 100% better than my 5800U, that's still too slow a GPU to drive my Lenovo Yoga Slim 7 13ACN notebook's 2560x1600 display at full throttle. Even 4x the speed won't change that; it takes something like a 250 Watt GPU to drive that resolution, and more like 350 Watts for 4K.

    I just bought a nice 3k 90Hz OLED 5825U based 14" notebook (Asus Zenbook 14) for one of my sons, full metal slim but without touch for less than €1000 including taxes and he's completely stunned by the combination of display brightness (he tends to use it outside) and battery life.

    As long as you think of it as a 2D machine that will do fine displaying Google Earth in 3D, you'll be happy. If you try to turn it into a gaming laptop it's outright grief or severe compromises.

    And I just don't see how a dGPU on an APU makes much sense, because you just purchase capabilities twice without the ability to combine them in something that actually works. Those hybrid approaches were only ever good in theory.
  • Makaveli - Monday, September 26, 2022 - link

    "I have a 1440p 144Hz monitor and I play at 1080p just because that's what I'm used to."

    *Insert ryan reynolds meme

    But why?
  • Gavin Bonshor - Monday, September 26, 2022 - link

    Because I fear that if I drop below 144 Hz in any title, that my life wouldn't be able to cope. Maybe I just need to upgrade from an RX 5700 XT?
  • Makaveli - Tuesday, September 27, 2022 - link

    Ah yes, it's time.

    Go RDNA3
  • kryn5 - Monday, September 26, 2022 - link

    "Despite modern-day graphics cards, especially the flagships, now at the level where 1440p and 4K gaming is viable, 1080p is still a very popular resolution to play games at; I have a 1440p 144Hz monitor and I play at 1080p just because that's what I'm used to."

    I... what?
  • emn13 - Monday, September 26, 2022 - link

    The geekbench 4 ST results for the 7600x seem very low - is that benchmark result borked, or is there really something weird going on?
  • emn13 - Monday, September 26, 2022 - link

    Sorry, I meant the geekbench 4 MT not ST results. The score trails way behind even the 3600xt.
  • Silver5urfer - Monday, September 26, 2022 - link

    Good write up.

    First, I would humbly request that you please include older Intel processors in your suite; it would be easier to understand the relative gains vs, e.g., the old 9th gen and 10th gen. I see things all over the place on other sites, while AT is at least consistent, so it would be better if we had a ton of CPUs in one spot. Thanks

    Now speaking about this launch.

    The IOD is now improved by a huge factor, so no more of that IF clock messing with the I/O controller and high voltage like on Zen 3; it's all improved, so I think the USB fallout issues are fixed on this platform now. Plus DP2.0 on the iGPU is a dead giveaway that RDNA3 will have DP2.0 as well.

    The IMC is also improved. Looking at it, AMD used to operate with clocks synchronized to the DRAM; now they can do without that, since the IF is at 2000MHz while the IMC and DRAM run higher, at 3000MHz, to match the DDR5 data rates. Plus EXPO is also lower latency; however, the MCM design causes the AIDA benchmark to show high latency vs Intel, even though Intel is operating at a Gear 2 ratio with the Uncore similarly decoupled. Surprisingly, the inter-core latencies did not change much; maybe that's one of the keys to improving more on AMD's side. Gotta see what they will do for Zen 5.

    The CPU clocks are insane; 5GHz on all 16C/32T is a huge thing, plus even the 7600X is hitting 5.4GHz. A massive boost from AMD improving their design, plus the TSMC 5N high-performance node is too good. However, AMD did do away with their temp and power restraint. It's a very good move not to castrate the CPU with power limits and clocks; now that that's out, it gets to spread its wings. But the downside is that, unlike Intel's i7 series, the 6-core Ryzen also gets hot, meaning budget buyers need to invest money in an AIO vs older Zen 3 being fine on air. That's a negative effect for AMD when they removed the power limits like Intel and let these rip to 250W.

    The chipset downlink capping out at PCIe 4.0 x4 is the biggest negative I can think of, because Intel's DMI is now 4.0 x8 on ADL and RPL (RKL had it at 3.0 x8, CML at 3.0 x4), while AMD is stuck at 4.0 x4 from X570. Many will not even care, but it is a disadvantage when you pay top money for X670E; they should have given us PCIe 5.0 x4. AMD will give that in 2024 with the Zen 5 X770 chipset, that's my guess.

    The ILM backplate engineering is solid; that alone, plus the longevity of the LGA1718 AM5 socket itself, is a major PLUS for AMD over LGA1700's bending ILM and its EOL with 13th gen. Yes, 12th gen is a better purchase given that the cooling requirement for the i7 and i5 is not as high as for the R6 and R7, plus the cheaper board costs, plus 13th gen is coming, and AMD's platform is new as well so you would be a guinea pig. It depends on what people want, how much they can spend, and what they want in longevity.

    Performance is top notch for the 7600X and 7950X, absolute sheer dominance, but the pricing is higher when you see the % variance vs Zen 3 and Intel 12th gen parts, with an AIO effectively mandatory on top because they run hot. The gaming performance is as expected, not much to see here, and the 5800X3D is still a contender there, but to me that chip is worthless as it cannot match any processor in high core count workloads. The 7600X is a champion though; a 6C/12T, it beats 12C/24T parts in many things, and the 10C/20T 10th gen Intel too. IPC is massive in ST and MT workloads, as expected. AMD Zen 4 will decimate ARM; Apple has only one thing, lol, muh efficiency, and all that BGA baggage and locked-down ecosystem comes free with it.

    RPCS3 perf in TPU's Red Dead Redemption test is weird, as I do not see any gains over Intel, given how much of a beast AVX512 is on Zen 4 with 2x256-bit and no AVX offset at that; maybe they are not using AVX512. Plus their AMD Zen 3 showing is also bad, because those do not do well vs even Intel's 9th gen. I wish you guys covered Dolphin, PCSX2, RPCS3 and Switch emulators.

    I think the best option is to wait for next year and buy these parts as prices drop; right now there are no high-capacity PCIe 5.0 SSDs, and no PCIe 5.0 GPUs either, even Nvidia skimped on it. There's no use for the new platform unless one is running a super old CPU and GPU setup.

    Shame that OC is totally dead. Zen 3 was ham-fisted with its Curve Optimizer, and memory tuning became a headache due to how AGESA was handled, the 1.4V high voltage, and the lack of documentation. Zen 4 is even at 1.0-1.2V, and there's still no OC headroom, because AMD's design is basically now pushed to the maximum with its core TJMax temps, and it works on the basis of core temperatures over everything else. There's no room here; an AIO is already saturated at 90°C. Too high a heat density on AMD's side, similar to Intel 11th and 12th gen, although Intel can go up to 350W and push all cores higher vs AMD's 250W max. Well, OC was on life support, and only Intel is basically keeping it alive at this point; after 10th gen it got worse, 12th gen ran very hot, and for 13th we'll have to see if that DLVR regulator helps or not.

    All in all a good CPU, but it has some downsides. Not much worth it for folks on existing 2020-class HW at all. Better to wait until DDR5 matures even further and PCIe 5.0 becomes more prevalent.
  • Threska - Monday, September 26, 2022 - link

    Maybe people will start delidding.

    https://youtu.be/y_jaS_FZcjI
  • Silver5urfer - Tuesday, September 27, 2022 - link

    That delid is direct-die; it will 100% ruin the AM5 socket's longevity and the whole CPU too. That guy runs HWBot, of course he will make a video for his BS delid kits. Nobody should run any CPU with the IHS completely blown off. You will have a ton of issues with that: water leaks, the CPU silicon die cracking due to thermodynamics and the pressure differences over time, liquid metal leaks. A total bust of the warranty on any parts; once that LM drops onto your machine, it's game over for a $5000 rig.

    AMD should have made some more improvements and reduced the max TJMax to, say, 90°C at least, but it is what it is, unfortunately (for high temps and cooling requirements) and fortunately (for super high performance).
  • Threska - Tuesday, September 27, 2022 - link

    There are some in the comments wondering both whether lapping would achieve the same, and whether the thicker lid is there to give some room for future additions like 3D cache, etc.
  • abufrejoval - Wednesday, September 28, 2022 - link

    I'm not sure that the PCIe 4.0 "DMI" downlink cap is a hard cap by the SoC per se, rather than the result of negotiation with the ASMedia chipset, which can't do better. I'd assume once someone comes up with a PCIe 5.0 chipset/switch, there is no reason it won't do PCIe 5.0. It's just a bundle of 4 lanes that happens to be connected to ASMedia PCIe 4.0 chips on all current mainboards.

    Likewise, I don't see why you couldn't add the second chipset/switch to the "NVMe" port of the SoC or any of the bifurcated slots: what you see are motherboard design choices, not Ryzen 7000 limitations. The SoC just has 24 PCIe 5.0 lanes to offer in many bundle variants; it's the mainboard that straps all that flexibility to slots and ports.

    I don't see that you have to invest in AIO coolers, *unless* you want/need top clocks on all cores. If your workloads are mixed, e.g. a few threads that profit from top clocks for interactive workloads (including games) and others that are more batch-oriented like large compiles or renders, you may get maximum personal value even from an air cooler that only handles 150 Watts.

    Because the interactive stuff will rev to 5.crazy clocks on say 4-8 cores, while for the batch stuff you may not wait in front of the screen anyway (or do other stuff while it's chugging in the background). So if it spends 2 extra hours on a job that might take 8 hours on AIO, that may be acceptable if it saves you from putting fluids into your computer.

    In a way AMD is now giving you a clear choice: The performance you can obtain from the high-end variants is mostly limited by the amount of cooling you want to provide. And as a side effect it also steers the power consumption: you provide 150 Watts worth of cooling, it won't consume more except for short bursts.

    In that regard it's much like a 5800U laptop, that you configure between say 15/28/35 Watts of TDP for distinct working points in terms of power vs. cooling/noise (and battery endurance).

    Hopefully AMD will provide integration tools on both Windows and Linux to check/measure/adjust the various power settings at run-time, so you can adjust your machine to your own noise/heat/performance bias, depending on the job it's running.
  • Dug - Monday, September 26, 2022 - link

    "While these comments make sense, ultimately very few users apply memory profiles (either XMP or other) as they require interaction with the BIOS"

    This is getting so old. Your assumption is incorrect, which should be obvious from the millions of articles and YouTube videos on building computers. Not to mention your entire article is not even directed at the "general public" but at enthusiasts. Otherwise why write out this entire article? Just say you put a CPU in a motherboard and it works. Say it's fast. Article done.

    Why not test with Curve Optimizer?
  • Oxford Guy - Tuesday, September 27, 2022 - link

    This text appears again and again for the same reason Galileo was placed under house arrest.
  • socket420 - Monday, September 26, 2022 - link

    Could someone, preferably Ryan or Gavin, please elaborate on what this sentence - "the new chip is compliant with Microsoft’s Pluton initiative as well" - actually means? This is the only review I could find that mentions Pluton in conjunction with desktop Zen 4 at all, but merely saying it's "compliant" is a weird way of wording it. Is Pluton on-die and enabled by default in Ryzen 7000 desktop CPUs?
  • AndrewJacksonZA - Monday, September 26, 2022 - link

    I would imagine it's a technically correct way of saying that it's certified for Windows 11. See here about the TPM:
    www DOT microsoft DOT com/security/blog/2020/11/17/meet-the-microsoft-pluton-processor-the-security-chip-designed-for-the-future-of-windows-pcs/
  • socket420 - Monday, September 26, 2022 - link

    I'm primarily asking whether or not the Pluton security coprocessor has been incorporated into Raphael/Ryzen 7000 CPUs, and I'm pretty sure that isn't what they were implying - Microsoft *does* have a "secured-core PC" baseline for Win11 they've been pushing lately, but it's currently unclear how Pluton ties into that so I don't think Win11 "certification" has anything to do with it. Pluton wasn't mentioned in AMD's desktop Ryzen 7000 press release last month, I didn't see it in any of the Zen 4 architectural slides they showed off today and AnandTech is the only outlet that's brought it up at all, which is why I'm asking this question in the first place - AMD hasn't been particularly forthcoming about the subject and I feel like they would've mentioned Pluton in a press release if it was actually present in these chips.
  • Ryan Smith - Monday, September 26, 2022 - link

    I am not privy to the implementation details. But like other parts of the IOD, Pluton is inherited from the Ryzen 6000 Mobile parts. So it has the same Pluton implementation as those mobile chips.

    TL;DR: I don't know how they're technically accomplishing it, but yes, Pluton is there and enabled.
  • socket420 - Tuesday, September 27, 2022 - link

    Thanks for the response. Just to clarify, if I reread that section correctly, the Ryzen 7000 I/O die is a new design that had most of the additions from Ryzen 6000 ported over to it, Pluton included. That sounds incredibly damning, but I'm not sure how it's possible to confirm its presence without implementation details. I'm also unsure why AMD would brag about Pluton being present in two different mobile CPU releases from the moment they were announced while seemingly ignoring it in their new and shiny desktop Ryzen lineup up until its release date (are they hoping we won't notice?), but then again, it's been months since Ryzen 6000 was launched and no one's taken a closer look at its Pluton implementation yet, so :/

    IIRC, Lenovo ships their Ryzen 6000 Thinkpads with Pluton disabled and you have to go into their BIOS to toggle it on or off, so maybe that option showing up on consumer AM5 boards will show us if Pluton's there or not? It'd also be cool if someone asked AMD directly for a response, but Robert Hallock said he "didn't know" if Pluton was in Zen 4 and he coincidentally just left the company, so I have no idea who to reach out to.
  • Silver5urfer - Tuesday, September 27, 2022 - link

    Thanks for your question. This new garbage Pluton cancer is what I did not want to see; shame how they added it.
  • Oxford Guy - Tuesday, September 27, 2022 - link

    You will own nothing and be owned by everything. You will be happy.
  • Valantar - Monday, September 26, 2022 - link

    Could you please run your per-core power draw tests for these chips like you did for Zen3?
  • takling1986 - Monday, September 26, 2022 - link

    I think this review is "streets ahead".
  • IBM760XL - Monday, September 26, 2022 - link

    All right, since they aren't ready yet, I'll ask... is it easy to set a lower TDP limit, and could you examine power efficiency when the TDP is the same as it was for Ryzen 5000?

    Looking at the numbers Tom's Hardware posted, the 7950X uses about 80W more at load than the 5950X. With AMD's own slides touting the efficiency improvements being greater at lower TDPs, what I'd really like is to have an octo-core at 65W like the 5700, or perhaps a 12/16 core at 105W like the 5900/5950.

    Though I'm very likely to wait until B650 drops before making a decision, so there's plenty of time for an answer to that question to arrive.
  • abufrejoval - Wednesday, September 28, 2022 - link

    I can only guess that it should be trivial to do via RyzenMaster, just in case it's not supported in the BIOS. And of course I'd demand CLI tools for both Linux and Windows.

    I cannot imagine that with a max TDP of 140 Watts a 7950X won't still be faster than a 5950X, even if it won't be quite as fast as if you let it drain the bottle at full hilt. The typical CMOS knee will still be there, only moved forward a bit and with a lot more of a "hot leg" showing towards the top.

    But gains per clock and Watt will be terrible the higher you go on the "hot leg" by nature of silicon physics and any sensible person will just use a "lesser cooler" to avoid that nonsense.
  • spaceship9876 - Monday, September 26, 2022 - link

    1. I was hoping you would use a new build of 7-zip as you are using an old version.
    2. I was hoping you were going to test the idle power consumption when using eco mode so we could compare.
  • boozed - Monday, September 26, 2022 - link

    Sweet Baby Jesus.
  • Arbie - Monday, September 26, 2022 - link

    I would really have liked to see Cinebench R23 multi with the 7950X in "105W" mode, for a more direct comparison to the 5950X. But thanks for all the work you did do here, of course.
  • nandnandnand - Monday, September 26, 2022 - link

    22% better at 65W, 49.7% better at 170W. I'll guess 35-40%.
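    For what it's worth, a straight line between those two points lands a bit lower than that guess; since performance-per-watt curves are concave, the real 105 W number should sit somewhere above the linear estimate, so 35-40% doesn't seem unreasonable:

        # Linear interpolation of the 7950X-vs-5950X gain between the two measured
        # power points (+22% at 65 W, +49.7% at 170 W), evaluated at 105 W.
        # Real scaling is concave in power, so treat this as a floor, not an estimate.
        lo_w, lo_gain = 65, 22.0
        hi_w, hi_gain = 170, 49.7
        target_w = 105
        gain = lo_gain + (target_w - lo_w) / (hi_w - lo_w) * (hi_gain - lo_gain)
        print(f"linear floor at {target_w} W: +{gain:.1f}%")  # ~ +32.6%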
  • Rezurecta - Monday, September 26, 2022 - link

    Absolutely great review!! I love the architectural focus of these articles rather than the '0-60 like' benchmarks of every other site! Would love a memory scaling post as well!
  • aparangement - Monday, September 26, 2022 - link

    Would it be better if you used 32GB*2 memory instead of 16GB*2?

    I remember that Anandtech did a benchmark showing that 32*2 has a performance advantage (maybe just use the same kit as in the 12900K review? https://www.anandtech.com/show/17047/the-intel-12t...
  • Jboy1450 - Monday, September 26, 2022 - link

    Seems like you handicapped Zen 4 on purpose. Why would you not test at the AMD-recommended memory settings? That's what the average user is actually going to use, as AMD made it so easy to do. Also, what's with testing with a 2080 Ti? Very disappointed and surprised that a well-regarded site like yours would make such incomprehensible decisions. Biased, maybe? Seems that way.
  • boozed - Monday, September 26, 2022 - link

    Not everything has to be a conspiracy
  • Jboy1450 - Tuesday, September 27, 2022 - link

    I agree, but coming to a conclusion on a platform where performance is deliberately left on the table (aside from overclocking or using PBO) seems disingenuous. By the same token, why not test AL with DDR4 since most users tend to be budget conscious and will probably choose it over the more expensive DDR5?

    I mean, I'm simply using their logic.
  • Oxford Guy - Tuesday, September 27, 2022 - link

    ‘Most users’ has never been a logical basis for an enthusiast site. It never will be.
  • Otritus - Wednesday, September 28, 2022 - link

    Anandtech has almost always tested at the manufacturer listed specification rather than a recommended overclock. Back in the day this methodology was disadvantageous to Intel as AMD could use DDR3-2133 while Intel was limited to DDR3-1600. If AMD was really so confident that Ryzen 7000 could run at DDR5-6000 cl36, then they should list that. Not say that 2DPC is only able to support DDR5-3600, and 1DPC can only guarantee DDR5-5200. As for the 2080Ti that’s just because the GPU lab burned down, so Anandtech doesn’t do GPU reviews anymore, and companies don’t want to send faster GPUs to not get reviewed.
  • misan - Tuesday, September 27, 2022 - link

    Could you update the article to include the SPEC geomean totals (for int and fp) like in previous reviews (makes comparisons easier). Also, what is the actual observed package power when running CB23 in the 65W TDP mode?
  • Blastdoor - Tuesday, September 27, 2022 - link

    Great idea!

    I’d also like to see more benchmarks — at least SPEC if nothing else — in ECO mode
  • haplo602 - Tuesday, September 27, 2022 - link

    I hope there will be a dedicated article exploring the ECO Mode, its limitations and what effects it has on perf/power/temps. This looks like the best feature of the generation so far. You can change the personality of the same CPU on demand, from energy efficient to maximum performance, without having to buy different models.
  • Foeketijn - Tuesday, September 27, 2022 - link

    +1
  • Blastdoor - Tuesday, September 27, 2022 - link

    Agreed. I understand that AMD feels they have to compete with Intel for the absolute performance crown, even if that means insane temperatures. But I bet there are plenty of folks who would view 80% of the performance for just 40% of the watts as a good tradeoff.
  • Gigaplex - Wednesday, September 28, 2022 - link

    Feature of the generation? ECO mode exists on older Ryzen systems too.
  • Blastdoor - Wednesday, September 28, 2022 - link

    I guess the space heater race with Intel has just made it more relevant than ever before
  • amon.akira - Tuesday, September 27, 2022 - link

    "the reality is that under sustained load, depending on the aggressiveness of the cooling, is more around 5.4 GHz."

    Is 5.4 GHz all-core or ST sustained?
  • TEAMSWITCHER - Tuesday, September 27, 2022 - link

    If you are already gaming at 4K, then these CPU upgrades are nearly worthless. Yet AMD (and soon Intel) is heavily targeting GAMERS in all their promotions on these new products. WHY ON EARTH DO THEY EVEN TRY?

    Nvidia was ganged up on by YouTubers last week, but they put out new 4000 series products that will do far more for gaming and productivity at the same time. Blender runs so much faster using Nvidia OptiX and an RTX GPU. This industry is broken when it attacks companies delivering REAL performance gains, and then praises companies giving consumers a pittance.

    HEDT is the platform I want. Nvidia is the GPU that delivers. The rest is just garbage.
  • Gigaplex - Wednesday, September 28, 2022 - link

    Because there's a massive market for gamers that aren't gaming at 4K. Competitive e-sports comes to mind, where the CPU matters.
  • scrizz - Thursday, September 29, 2022 - link

    FACTS
  • Silver5urfer - Saturday, October 1, 2022 - link

    A CPU is not just a "muh gaming" processor, which is why we have a ton of benchmarks here apart from games. The second part is that 1080p is still a resolution worth benching, especially when you're talking about sub-i5 and R6 CPUs; they are going to drive a 1080p machine, not a 4K display.

    Second, Nvidia is a trash company, do not even try to defend that. Ampere GPUs were shoved full of GDDR6X for bandwidth, nice, but on purpose for mining; okay, fine, it's all crypto etc. But the PCB design for the RTX 3090 is a dumpster fire. They put memory modules near the PCIe interface, which gets a ton of mechanical stress, AND the whole MSVDD power rail is a complete pile of BS. It is prone to failure because of horrible VRM components, and to make it worse Nvidia marked the 3090 price at 2x the 3080 for like a 15% boost, and then the VRAM on the back was a total fail. They fixed all of that with the 3090 Ti, but at the end of the cycle. Now they will "optimize" the drivers to gimp the RTX 3090 Ti to make the 40 series look good; that's how this garbage company rolls. Imagine DLSS 2 getting EOLed by FSR and then killed by DLSS 3. That's pure trash garbage.

    Now the new RTX 40 series is coming with rip-off 4070 silicon on a trash 192-bit bus, which means 4K-class pricing (the RTX 3090 is $950, the 3090 Ti is $1K) but without proper memory, just 12GB; plus DLSS 3 is fake trash, giving BS frames to get idiots drooling over the frame counter, which is funny because the frames are NOT real.

    HEDT is dead; nobody wants to buy it because nowadays people only play trash GaaS titles like Apex, Fortnite, COD, etc., and they buy a PC for that only. And how many people do you think care about the X670E PCH bandwidth being the same as X570? Nobody, you did not even mention it. The market killed it. I also want HEDT but it won't come again; AMD destroyed their own HEDT with half-cooked trash worse than X299. Intel SPR is delayed, so HEDT is not coming back ever again; it's all TR Pro and Xeon W. Shame.
  • Qasar - Sunday, October 2, 2022 - link

    you come across as one angry person. wow
  • Silver5urfer - Sunday, October 2, 2022 - link

    I bought a 3090 FE and am now planning to sell it off because of its TRASH VRAM on the back and BS power delivery system. Nvidia themselves revealed that the RTX 4090 FE uses literally the same PCB as the RTX 3090 Ti while claiming much lower power excursions.

    Now, it's my fault; the thing is I bought the 3090 FE a year back while the 3090 Ti came this year. It's a mistake on my part, but Nvidia has been doing this BS for a while now: the GTX 970 VRAM fiasco, the 2070 cutting silicon a step down just like the RTX 4070 now, and the RTX 3000 series' awful power delivery VRMs. Now DLSS 3, which is fake; the game runs at the same lower FPS, but they add fake frames to make people think they are running at higher FPS and rendering the game. Scam on top of scam.
  • vortmax2 - Sunday, October 16, 2022 - link

    Maybe focus on the content of his post instead of the demeanor.
  • Qasar - Thursday, October 20, 2022 - link

    " Maybe focus on the content of his post instead of the demeanor. "
    you obviously havent seen his previous posts.
  • Cristian - Tuesday, September 27, 2022 - link

    The section "Ryzen 9 7950X at 65 W (ECO Mode): Zen 4 has Superb Efficiency" is exactly what I was looking for (and will build).
    Thank you very much Anandtech ( Ryan Smith & Gavin Bonshor) ! ;)
  • TheinsanegamerN - Tuesday, September 27, 2022 - link

    A 7600X3D would be a superb gaming chip, if AMD makes one. Limit Zen 4 to 65W and honestly these are pretty tasty. Curious how cool a 7600 runs on a limited TDP.
  • nandnandnand - Tuesday, September 27, 2022 - link

    It would be interesting if they could put a bottom yield chip with less cache on the 7600X, and bump up the price by $30. Otherwise I don't think they'll bother.

    Limit the TDP yourself.
  • vortmax2 - Sunday, October 16, 2022 - link

    Some people don't want to limit TDP themselves. Nothing wrong with that.
  • Techie2 - Tuesday, September 27, 2022 - link

    What a screwed up launch of Ryzen 7000 CPUs and AM5 mobos by e-tailers. DDR5 EXPO DRAM showed up online a few days ago. On 9-27-22 it looks like e-tailers are actually hiding the four Ryzen 7000 CPUs to sell older stock. The AM5 mobos which have been sitting in inventory for weeks were not posted online until early morning instead of at midnight as in the past. You'd think by now they could figure out how to do a proper launch of a new CPU or platform but evidently not when it's AMD.

    No consumer-grade Gen 5 SSDs listed by e-tailers that I could find. Are PC builders supposed to just wait until Nov. to see if they actually show up? AMD's partners may be cooperating with AMD, but the purchasing experience is a piss-poor sales methodology being employed IMNHO.

    YMMV
  • nandnandnand - Tuesday, September 27, 2022 - link

    AMD said weeks ago that PCIe 5.0 SSDs would be coming in November.

    Nobody should be buying this stuff on day 1 unless they like being findom'd by corporations.
  • yhselp - Tuesday, September 27, 2022 - link

    First-draft-copy issues aside, this article is written exceptionally well. The information is excellent and extensive as usual, but I feel like there's been a step-up in the way its presented/explained. Kudos and thank you.
  • yhselp - Tuesday, September 27, 2022 - link

    it's* goddamit
  • vortmax2 - Sunday, October 16, 2022 - link

    Great post. So many grammar police on here that can't help but criticize and take away from the actual purpose of the article.
  • HardwareDufus - Tuesday, September 27, 2022 - link

    Clearly both manufacturers are producing vey compelling products this time around. At the $600-$700 mark we have two CPUs trading blows; R9-7950X (Zen4) & just announced i9-13900K (RaptorLake) We will have updates of both lines, with AMD adding 3D Cache and Intel increasing Boost Speeds. Probably can't go wrong with either choice, neither one dominates the other completely.

    I think I'll bite this time around. Yeah I know Zen5 will be a new architecture and Intel will adopt chiplets and all of the benefits that accompany them... But, I think either one of these chips, probably available in volume in 1st quarter 2023, will serve most folks well.

    In the case of Intel, you can continue to use DD4 and 600 series chipsets. However really take advantage of the capabilities of the chip, DDR5 and motherboards featureing 700 series chipsets will be available, on par with Zen 4 requiring a DDR5 and otherboards featuring the new AM5 socket and 600 series chipsets. Apples to Apples when comparing the requirements to go ALL In on performance.
  • HardwareDufus - Tuesday, September 27, 2022 - link

    Dang, that was some awesome typing I just did there...
  • nandnandnand - Tuesday, September 27, 2022 - link

    3D cache will dominate over a couple hundred extra MHz in frequency (13900KS).
  • Hifihedgehog - Tuesday, September 27, 2022 - link

    @nandnandnand: That may well be true, but that's the future and months out yet. Ryzen 7000 non-X3D has to sell between then and now or AMD is not going to be posting a pretty quarter. The 13900K is far cheaper in platform and unit price and will meet or exceed the 7950X for now. DDR5 and AM5 motherboards coupled with a higher price will be the Ryzen 7000 series' undoing, and good too. AMD needs to realize people purchase because of intrinsic quality, not brand loyalty.
  • Silver5urfer - Tuesday, September 27, 2022 - link

    Intel won't sell new mobos. They already have Z690 saturation; barely anyone will get Z790. AMD, on the other hand, will continue to sell new boards. And the quarter is not based on the client segment only, it will include HPC; Intel lost money there, and AMD won't be losing because Genoa is on track and SPR Xeon is delayed.

    AMD AM5 is not just "hey, this thing is fast" and just for gaming. It will be a socket that is going to last until Intel's Nova Lake launches, which is two Intel sockets away. That is a huge advantage for a small price for paying customers now.

    Also, why is everyone chanting the same BS that GN Steve did about AMD boards being too expensive? Did you see where Z690 was at when it launched? Same thing, it was expensive. And the DDR4 boards are worse in quality and features than the premium-cut DDR5 ones. Then Intel launched B660, and AMD's B650/E is also coming. So nope, that BS argument about mobo pricing is thrown around too much. Once the B650 launches, 13th gen will be hitting the retail market by then, and new GPUs as well. And it's November season, and in America the Black Friday sales will kick in, so expect price cuts on all the products we are seeing now.

    So ultimately AMD is not going to lose money.

    The biggest BS from a smart customer's POV is Intel's LGA1700 EOL and the whole socket bending crap; it's like AM4's unreliable IMC and poor IO die with its issues. AM5 needs to prove itself, but given how they decoupled the IF from the memory clocks, I can bet it won't have the issues from AM4.

    X3D is a niche market; it won't be a chart-topper for sales, at least if it's again a single 7800X3D SKU. Same for the KS bin. It depends on how AMD executes. Idk why every single AMD fan says X3D is going to do something; if AMD can clock it this high and also allow tuning, then it will be a true gen refresh to compete vs Meteor Lake, else it will be just a gaming juggernaut.
  • nandnandnand - Tuesday, September 27, 2022 - link

    @Silver5urfer rumored to be 3 SKUs, including a 7900X3D, and +30% average performance instead of 15%. I guess that would be a result of improved latency, bandwidth, no voltage/clock decreases, etc.
  • Silver5urfer - Wednesday, September 28, 2022 - link

    A 7950X3D means extremely high heat, because you are adding not just a single cache stack but 2 stacks atop the CCDs; how will AMD be able to remove that heat? Unless the TSMC CoWoS stacking is technically changed, OR they lap the IHS internally to reduce the thickness and compensate with better heat transfer. The current IHS is thick for many reasons, one can assume: LGA1718 stability, chiplet integrity under the high heat and pressure of the heatsink, and cooler compatibility; and it increases the heat density, which is why 95°C.

    I really think a 7800X3D is the only way for AMD, even though rumors mention 3 SKUs, because a total SKU refresh would totally cannibalize the entire 7000 lineup. A 7600X is the way to get the best gaming out of AM5 as the cheaper option, at more than a 1/2 price reduction vs a top-end R9. And the R9 7900X is basically an all-rounder like the 5900X, best for gaming and production; now if you add the cache block, it would have to fight with the 7900X.

    The voltage reduction was done on Zen 3 because AMD shoved 1.4V through all Ryzen 5000 processors, insanely high, and the IO die was also on a high voltage, causing all that instability; add 1.3V-binned silicon and everything gets better, including the heat density. Zen 4 on TSMC 5N is much better because it's just 1.2V now at a high clock rate. Voltage is not an issue anymore; the design of Zen 4 itself is like this, and how AMD intended it to breathe fire at 95°C even for the 7600X is the hint.
  • nandnandnand - Wednesday, September 28, 2022 - link

    Heat was never the problem for the 5800X3D. It was only voltage, due to using an immature 3D (2.5D) chiplet technology that could not be run at the higher voltages. So I don't think the 7950X3D can't happen. If they have to drop voltages and clocks again, then hopefully the cache has improved.

    I think AMD should do at least a 7950X3D and 7800X3D. They can prevent cannibalization by giving it a healthy price bump. Probably +$100 to the 7950X3D, +$50 to the 7800X3D, and let the 7700X price drift lower. 7900X3D doesn't make sense, and people would love a 7600X3D but AMD would not.
  • nandnandnand - Tuesday, September 27, 2022 - link

    @Hifihedgehog OP compared 7000X3D to the 13900KS, that's what I addressed.
  • Hifihedgehog - Tuesday, September 27, 2022 - link

    Wrong: the i9-13900K is less than $600. The 7950X is going to have to have its price lowered, especially with the price of DDR5 and the motherboards simply off the charts. And good too: Lisa Su needs to be running a price war and not pretend that her company has more market share.
  • The Von Matrices - Tuesday, September 27, 2022 - link

    A price war doesn't benefit AMD when they are supply constrained by TSMC and selling every chip they can manufacture. There's a reason that AMD doesn't offer any products in the <=$100 CPU market right now and it isn't because they don't want to make money.
  • Hifihedgehog - Tuesday, September 27, 2022 - link

    https://download.intel.com/newsroom/2022/2022innov...
  • dwade123 - Tuesday, September 27, 2022 - link

    Overheated and overpriced. Don't let those scumbags tell you that "95C is normal" because it's not. Avoid at all cost!
  • Thanny - Tuesday, September 27, 2022 - link

    Running the memory at JEDEC speeds is definitely the wrong choice for a review. While it may be true that most people don't set the memory profile in the BIOS, none of those people read CPU reviews. Essentially every person who would read this review will be setting memory to the XMP/EXPO settings.

    So you're essentially invalidating your test results for the only people who see them.
  • Oxford Guy - Tuesday, September 27, 2022 - link

    This has been posted for years.
  • BoredInPDX - Tuesday, September 27, 2022 - link

    I'm confused. In the 720p tests you write:
    “All gaming tests here were run using integrated graphics, with a variation of 720p resolutions and at minimum settings.”

    Yet all the prior-gen AMD CPUs tested are lacking an IGP. Am I missing something?
  • Ryan Smith - Friday, September 30, 2022 - link

    You are not missing anything; we did not run any iGPU tests. That's a bit of boilerplate text that did not get scrubbed from this article. Thanks for bringing it up!
  • Gigaplex - Wednesday, September 28, 2022 - link

    There are some odd results here, and the article commentary doesn't seem to touch on them. Why is the 7600X absolutely trounced in Geekbench 4.0 MT? The second slowest CPU (3600XT) more than doubles it. And yet the 7950X wins by a mile in that same test, so it shouldn't be architectural. And in some of the gaming tests the 7600X wins, and in some it comes dead last.
  • Dribble - Wednesday, September 28, 2022 - link

    The processors are particularly cache bound, i.e. if the workload fits in cache it runs very fast; if it doesn't, performance falls off rapidly. That is often visible in games, where it'll run amazingly in some (mostly older) titles but tends to fall off, particularly in the lows, in more complex (mostly newer) games.
  • ricebunny - Wednesday, September 28, 2022 - link

    The SPEC multithreaded tests are N separate instantiations of the single-thread tests. That's a perfect scenario where there is no dependency or serialization in the workload, and it tells us very little about how the CPUs would perform in a parallel workload. There are SPEC tests specifically designed to test parallel performance, but I do not see them included in this report. Anandtech, can you comment on this?
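
    To illustrate the distinction, here is a minimal sketch (not SPEC itself; the loop is a made-up stand-in workload) of what an N-copy "rate"-style run amounts to: N independent processes, each with private data and no synchronization between copies.

    /* Minimal sketch of an N-copy "rate"-style run (hypothetical workload). */
    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    #define N 16                               /* one copy per hardware thread */

    int main(void) {
        for (int i = 0; i < N; i++) {
            if (fork() == 0) {                 /* child = one independent copy */
                volatile long sum = 0;
                for (long j = 0; j < 100000000L; j++)
                    sum += j;                  /* works only on its own data   */
                _exit(0);
            }
        }
        for (int i = 0; i < N; i++)
            wait(NULL);                        /* nothing shared between copies */
        puts("all copies finished independently");
        return 0;
    }

    A genuinely parallel workload would instead have the copies share data and synchronize, which is exactly where real scaling limits show up.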
  • abufrejoval - Wednesday, September 28, 2022 - link

    Emerging dGPUs not supporting PCIe 5.0 is just crippleware!

    While I can easily see that 16 lanes of PCIe 5.0 won't do much for any game, I can very much see what I'd do with the 8 lanes left over when all dGPU bandwidth requirements can be met with just 8 lanes of PCIe 5.0.

    Why can't they just be good PCIe citizens and negotiate 16 lanes of PCIe 4.0 on lesser or previous-generation boards, and optimize lane allocation on higher-end PCIe 5.0 systems, which could then use bifurcation to add, say, a 100Gbit NIC, plenty of Thunderbolt 4, or better yet something CXL?

    Actually, I'd be really astonished if this wasn't just an artificial cap and the Nvidia chips weren't actually capable of PCIe 5.0.

    It's just that they'd much rather have people use NVlink.
  • TheinsanegamerN - Tuesday, October 4, 2022 - link

    Um... dude, 4.0 x16 and 5.0 x8 have the same bandwidth, and no GPU today can saturate 4.0, not even close. An overclocked 3090 Ti only manages to saturate... 2.0, and 3.0 is a whopping 7% faster.

    You've got a while, man.
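
    For reference, a back-of-the-envelope check using the published per-lane rates (both generations use 128b/130b encoding):

    \[
    \text{PCIe 4.0 x16: } 16\,\mathrm{GT/s} \times 16 \times \tfrac{128}{130} \div 8 \approx 31.5\,\mathrm{GB/s},
    \qquad
    \text{PCIe 5.0 x8: } 32\,\mathrm{GT/s} \times 8 \times \tfrac{128}{130} \div 8 \approx 31.5\,\mathrm{GB/s}
    \]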
  • abufrejoval - Wednesday, September 28, 2022 - link

    It should be interesting to see if AMD is opening the architecture for 3rd parties to exploit the actual potential of the Ryzen 7000 chips.

    The current mainboard/slot era that dates back to the 1981 IBM-PC (or the Apple ][) really is coming to an end and perhaps few things highlight this as well as a 600 Watt GPU that has a 65 Watt mainboard hanging under it.

    We may really need something more S100 or VME, for those old enough to understand that.

    Thunderbolt cables handle 4 lanes of PCIe 3.0 today and AFAIK cables are used for much higher lane counts and PCIe revisions within high-end server chassis today, even if perhaps at shorter lengths and with connectors designed for somewhat less (especially less frequent) pluggability.

    Their main advantage is vastly reduced issues with mainboard traces and much better use of 3D space to optimize air flow cooling.

    Sure those cables aren't cheap, but perhaps the cross-over point for additional PCB layers has been passed. And optical interconnects are waiting in the wings: they will use cables, too.

    If you stick PCIe 5.0 x4 fixed-length cables out from all sides of an AM5 socket and connect those either to high-bandwidth devices (e.g. a dGPU) or to a switch (a PCIe 5.0 variant of the current ASMedia chip), you get tons of flexibility in a box that may not resemble an age-old PC very much, but delivers tons of performance and expandability in a deskside form factor.

    You want to recycle all your nice PCIe 3.0 2TB NVMe drives? Just add a board that puts a PCIe 5.0 20-lane switch in between (even PCIe 4.0 might do fine if it costs half as much).

    And if your dGPU actually needs 8 lanes of PCIe 5.0 to deliver top performance, connect two of those x4 cables to undo a bit of bifurcation!

    How those cable-connected boards would then mount in a chassis and be cooled across a large range of form factors and power ranges is up to lots of great engineers to solve, while dense servers may already provide many of the design bricks.

    Unfortunately all that would require AMD to open up the base initialization code and large parts of the BIOS, which I guess currently has the ASMedia chip(s) pretty much hardwired into it.

    And AMD, with all their "we don't do artificial market segmentation" publicity in the past, seems to have become far more receptive to its bottom-line benefits recently than to allowing a free transition from console to PC/workstation and servers of all sizes.

    And it would take a high-volume vendor (or AMD itself), a client-side Open Compute project or similar, to push that form factor to the scale where it becomes economically viable.

    It's high time for a PC 2.0 (which isn't a PS/2) to bridge into the CXL universe even on desktops and workstations.
  • Oxford Guy - Wednesday, September 28, 2022 - link

    "The current mainboard/slot era that dates back to the 1981 IBM-PC (or the Apple ][)"

    Absolutely nothing about the IBM PC was new. The Micral N introduced slots in a microcomputer and the S-100 bus, introduced by the Altair, became the first big standard.
  • Tomatotech - Friday, September 30, 2022 - link

    Nice idea, but you're swimming against the flow of history. The trend is always towards integrating components more tightly into smaller and smaller packages. Apple have moved to RAM in the same package as the CPU, which has brought significant bandwidth advantages and seems to have boosted the iGPU to the level of low-end dGPUs.

    The main takeaway from your metaphor of the 650w dGPU with a 55w mainboard and 100-200w CPU is that high-end dGPUs are now effectively separate computers in their own right - especially as a decent one can be well over 50% of the cost of the whole PC - and are being constrained by having to fit into the PC in terms of physical space, power supply capacity, and cooling capacity.

    It’s a shrinking market on both the low end and high end for home use of dGPU, given these innovations and constraints and I don’t know where it’s going to go from here.

    Since I got optic fibre, I’ve started renting cloud based high-end dGPU and it has been amazing albeit the software interface has been frustrating at times. With symmetric gigabit service and 1-3ms ping, it’s like having it under my desk. I worked out that for unlimited hours and given the cost of electricity, it would take 10 years for my cloud rental costs to match the cost of buying and running a home high end dGPU.

    Not everyone has optic fibre of course but globally it’s rolling out year by year so the trend is clear again.
  • Castillan - Wednesday, September 28, 2022 - link

    "

    clang version 10.0.0
    clang version 7.0.1 (ssh://[email protected]/flang-compiler/flang-driver.git
    24bd54da5c41af04838bbe7b68f830840d47fc03)

    -Ofast -fomit-frame-pointer
    -march=x86-64
    -mtune=core-avx2
    -mfma -mavx -mavx2
    "

    ...and then later the article says:

    "The performance increase can be explained by a number of variables, including the switch from DDR4 to DDR5 memory, a large increase in clock speed, as well as the inclusion of the AVX-512 instruction set, albeit using two 256-bit pumps."

    The problem here being that those arguments to Clang will NOT enable AVX-512. Only AVX2 will be enabled. I verified this on an AVX512 system.

    To enable AVX-512, at least at the most basic level, you'll want to use "-mavx512f". There's also a whole stack of other AVX-512 capabilities, which are enabled with "-mavx512dq -mavx512bw -mavx512vbmi -mavx512vbmi2 -mavx512vl", but some may not be supported. It won't hurt to include those on the command line though, until you try to compile something that makes use of those specific features, and then you'll see a failure if the platform doesn't support those extensions.
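
    A quick way to double-check this is to ask the compiler which feature macros a given flag set defines; the following is a minimal sketch (the file name and printout are arbitrary), not part of the review's harness:

    /* check_isa.c: report which vector ISA feature macros the flags enable */
    #include <stdio.h>

    int main(void) {
    #ifdef __AVX2__
        puts("AVX2 enabled");
    #endif
    #ifdef __AVX512F__
        puts("AVX-512F enabled");
    #else
        puts("AVX-512F NOT enabled");
    #endif
        return 0;
    }

    Compiled with the flags quoted above (-march=x86-64 -mtune=core-avx2 -mfma -mavx -mavx2) it should report AVX2 only; adding -mavx512f (plus the other -mavx512* switches as needed) flips the AVX-512 line.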
  • Ryan Smith - Friday, September 30, 2022 - link

    Correct. AVX-512 is not in play here. That is an error in analysis on our part. Thanks!
  • pman6 - Thursday, September 29, 2022 - link

    Intel supports 8K60 AV1 decode.

    Does Ryzen 7000 support 8K60?
  • GeoffreyA - Monday, October 3, 2022 - link

    The Radeon Technology Group is getting 16K ready.
  • yhselp - Thursday, September 29, 2022 - link

    I'd love to see you investigate memory scaling on the Zen 4 core.
  • Myrandex - Thursday, September 29, 2022 - link

    The table on page four mentions "Quad Channel (128-bit bus)" for memory support. Does that mean we could have a four-slot solution, with one memory module per channel and four-channel support, to drastically increase memory bandwidth while maintaining those fast DDR5 frequencies?
  • Ryan Smith - Friday, September 30, 2022 - link

    No. That configuration would be no different than a 2 DIMM setup in terms of bandwidth or capacity. Slotted memory is all configured DIMMs; as in Dual Inline Memory Module.
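
    To put numbers on the "Quad Channel (128-bit bus)" wording, assuming the table is simply counting DDR5's two 32-bit subchannels per DIMM:

    \[
    2\ \text{DIMMs} \times 2 \times 32\text{-bit subchannels} = 128\text{-bit}
    \quad\text{(the same total width as } 2 \times 64\text{-bit DDR4 channels)}
    \]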
  • GeoffreyA - Friday, September 30, 2022 - link

    All in all, excellent work, AMD, on the 7950X. Undoubtedly shocking performance. Even that dubious AVX-512 benchmark where Intel used to win, Zen 4 has taken command of it. However, lower your prices, AMD, and don't be so greedy. Little by little, you are becoming Intel. Don't be evil.

    Thanks, Ryan and Gavin, for the review and all the hard work. Much appreciated. Have a great week.
  • Footman36 - Friday, September 30, 2022 - link

    Yawn. I really don't see what the big fuss is about. I currently run a 5600X and was interested to see how the 7600X compared, and while it does look like a true uplift in performance over the 5600X, I would have to factor in the cost of a new motherboard and DDR5 RAM. On top of that, the comparison is not exactly apples to apples in the testing: the 7600X has a turbo speed of 5.3 GHz versus 4.6 GHz for the 5600X, the 7600X runs with DDR5-5200 versus DDR4-3200 for the 5600X, and the 7600X has a 105W TDP versus 65W. If you look at the final page, where the 7950X is tested in Eco mode, which effectively supplies 65W instead of 105W, you lose about 18% performance. If we tried to do apples to apples and ran the 7600X in Eco mode at 65W like the 5600X, then lowered the boost to 4.6 GHz, the performance of the two CPUs would look very similar. Perhaps not the way I should be analyzing the results, but just my observation...
  • Tom Sunday - Friday, September 30, 2022 - link

    Just today I received a special sales notice from Micro Center giving away FREE 32GB of DDR5 with any purchase of a Ryzen 7000 series CPU. I wonder if AMD is sponsoring such a sales push, and this early in the game? Giving away a $190 value is a big deal in the trying times of today!
  • Castillan - Sunday, October 2, 2022 - link

    I suspect that's a Microcenter specific deal only. The RAM is 5600 at a fairly high latency (I think it was CAS40?). DDR5 prices have plummeted as well. The memory I picked up from Microcenter was 6600/CAS34 and marked down to 279 from 499.

    I'd guess that they have a surplus of a certain stock item that wasn't selling, and decided to use this promo to offload unwanted stock and still look good.
  • imaskar - Friday, September 30, 2022 - link

    It would be really great to add code compilation tests: Java, Go, C++ (linux kernel), Rust.
  • dizzynosed - Saturday, October 1, 2022 - link

    So what shall I buy? Intel, AMD? Which CPU? I only game.
  • rocky12345 - Saturday, October 1, 2022 - link

    What's wrong with the gaming scores on the 7000 series? There is no way a 5000 series CPU should be able to match or beat a 7000 series AMD CPU. I know this because I have an AMD Ryzen 5900X properly set up and tweaked. AMD is said to have sent DDR5-6000 with the test CPUs and asked reviewers to use it for testing. Let's face it, 97% of the people buying a new AMD Zen 4 setup or Intel 12th gen are not going to be using bargain-basement low-speed RAM, and if they do happen to buy cheaper RAM, most are more than likely to try to run it at the highest speed possible. Did I read that right, you used CL44 DDR5-5200? Talk about dead-heading performance.

    Also, maybe I missed it, but what was the Intel test system setup? Other than that it was a decent review. I have never seen Ryzen 5000 that close in gaming; I guess using slow DDR5 kneecaps Ryzen 7000. My own RAM is running at CL16 4000MHz with a 2000MHz IF clock, and at the numbers reported in the review, if I had the same video card I would be either faster or only slightly slower than the test results here for games, and that would give me false hope that my Zen 3 is faster than it really is, lol.
  • Oxford Guy - Sunday, October 2, 2022 - link

    The only way you're going to see movement on this is if you lobby AMD to abandon JEDEC.

    This site sees JEDEC as all there is.
  • GeoffreyA - Monday, October 3, 2022 - link

    I think it's about keeping a common baseline of memory speed, especially since Anandtech's database is about having parts directly comparable.
  • Oxford Guy - Monday, October 10, 2022 - link

    That’s not the reason that has been given again and again and it’s a terrible one anyway. The parts are different. The memory that goes best with those parts differs.
  • GeoffreyA - Tuesday, October 11, 2022 - link

    They should have set all the systems to DDR4 3200 and called it a day.
  • byte99 - Sunday, October 2, 2022 - link

    I'm a bit confused. When Anandtech was doing their efficiency analysis, it seemed they were taking the 65W Eco mode label as the actual package power, instead of actually measuring it (as they usually do). When Ars Technica measured the package power of the 7950X and 7600X in 65W Eco Mode, they found it was 90W for both.

    [ https://arstechnica.com/gadgets/2022/09/ryzen-7600... ]

    Did Anandtech miss something obvious, or am I missing something?
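
    One plausible explanation, assuming AMD's usual socket power limit of PPT = 1.35 x TDP also applies to the Eco presets:

    \[
    \mathrm{PPT} \approx 1.35 \times \mathrm{TDP} = 1.35 \times 65\,\mathrm{W} \approx 88\,\mathrm{W}
    \]

    That is close to the roughly 90 W package power Ars Technica measured; the 65 W figure would then be the TDP label rather than the actual package power limit.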
  • RestChem - Wednesday, October 5, 2022 - link

    Meh, time will tell on the ultimate price points and all that, but as it emerges I really wonder what kind of users are looking to drop this kind of money on high-end AMD builds. My gut is that they've priced themselves out of their primary demographic, and max TDP is right up there too, same as with their GPUs. When it comes down to a difference of a couple hundred bucks per build (assuming people build these with the pricey DDR5-6000 there's scant mobo support for, through whatever AMD's integrated mem-OC profile scheme is), are there going to be enough users who root hard enough for the underdog to build on these platforms, versus even high-end Alder Lake or (however much extra, which remains at time of writing to be seen) Raptor Lake builds? Before the announcements I was expecting AMD to get in cheap again and promise at least like performance for a bit of a discount, but it seems even those days are over and they want to play head-to-head. I wish them the best, but I don't see them scoring well in that fight.
  • tvdang7 - Thursday, October 6, 2022 - link

    " I have a 1440p 144Hz monitor and I play at 1080p just because that's what I'm used to."
    Is this some kind of joke? Are we supposed to listen to reviewers who are stuck in 2010?
  • Hresna - Sunday, October 9, 2022 - link

    I’m curious as to whether there’s any appreciable difference to a consumer as to whether a particular PCIe lane or USB port is provisioned by the CPU or the Chipset…. Like, is there a reliability, performance, or some other metric difference?

    I’m just curious why it’s a design consideration to even include them in the CPU design to begin with, unless it has to do with how the CPU lanes are multiplexed in/out of the CPU and somehow some of the lanes can talk inter-device via the chipset without involving the cpu…
  • bigtree - Monday, October 10, 2022 - link

    Where is octa-channel memory? Dual-channel memory on a $300 CPU?
    Where is native Thunderbolt 4 support?
    (Mac minis have had Thunderbolt 3 for over 5 years.)
    Can't even find one X670 motherboard with 4x Thunderbolt 4 ports. And you want $300? Thunderbolt 4 should be standard on the cheapest boards. It's a $20 chip.
  • Oxford Guy - Monday, October 10, 2022 - link

    The mission of corporations is to extract profit for shareholders and protect the lavish lifestyles of the rich. It is not to provide value to the plebs. Do the absolute minimum is the mantra.
  • RedGreenBlue - Tuesday, October 11, 2022 - link

    That must be why Intel made Thunderbolt royalty-free and it’s now built into USB 4.
  • Oxford Guy - Wednesday, October 12, 2022 - link

    It probably can afford to since states like Ohio are willing to bankroll half of the cost of its fabs.
  • RedGreenBlue - Tuesday, October 11, 2022 - link

    It's built into USB 4 now. Just make sure it's actually functional, because it might need a driver; AMD did that on the 600 series. Aside from that important fact, I don't care if there aren't many boards with it. The Thunderbolt ecosystem has been crap since the beginning. Peripheral makers didn't take advantage of it because USB was a more common approach and Intel didn't make Thunderbolt cheap to implement. The Mac Minis have it because Apple made a big bet on it when it came out. These days it's nice to have, but it's a throwaway feature unless you have a niche product that needs it. It's for niche purposes, and it would have been a waste of PCIe lanes. I would've liked it for external GPUs, but Intel effectively shut that down and I don't know if they've opened the door to it again. USB is way more convenient.
  • RedGreenBlue - Tuesday, October 11, 2022 - link

    And 8 channel memory, like, this sounds like a joke. That’s for server or workstation cpus because of how many layers it takes for the wiring on the board and the pins on the socket. That’s part of why server and workstation boards are so expensive. If you need that much bandwidth you’re in the wrong market segment. Look at Threadripper chips.
  • RedGreenBlue - Tuesday, October 11, 2022 - link

    It would be appreciated if architecture reviews had the pipeline differences in a chart to compare across generations. Anandtech used to have that included and it gave a good comparison of different generations and competitor architectures. I can understand not including it in the product review but I don’t remember a chart being in the previous Zen 4 overview article.
  • Yirath - Tuesday, October 11, 2022 - link

    Well, I appreciate the info on the new chip. I am a bit disappointed, reading the comments, that the chip falls short of its expectations. As a fan of AMD I'll still probably go with this on my next build.
  • fybyfyby - Tuesday, October 18, 2022 - link

    And what shortcomings do you mean? I'm a fresh user of the 7950X and I wouldn't go back to the 5900X. The 7950X is much more efficient and powerful. Of course, it's also an investment into a new platform now, so it's not as cheap, and for many people it doesn't make sense. That's absolutely understandable.
  • Vorl - Thursday, October 20, 2022 - link

    If this is a review of the 7950X and the 7600X, why isn't the 7600X in the SPEC tests?
  • namcost - Friday, October 21, 2022 - link

    1:1:1 would mean 3000:3000:3000. The Infinity Fabric clock doesn't run at 3000. So this whole article is factually wrong, except the part where you stated that the Infinity Fabric clock was running at 2000. That would mean 2000:3000:3000, which is not 1:1:1 at all.
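
    Spelling out the arithmetic with the numbers above (assuming DDR5-6000 EXPO, i.e. a 3000 MHz memory clock, with the fabric left at its stock 2000 MHz):

    \[
    \mathrm{FCLK} : \mathrm{UCLK} : \mathrm{MCLK} = 2000 : 3000 : 3000 = 2 : 3 : 3
    \]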
  • npoc - Wednesday, October 26, 2022 - link

    Why doesn't anyone report idle power consumption anymore? I don't care how much power my computer uses when it's running flat out, because that only happens 1% of the time; 99% of the time my system sits idle waiting to do things. Please report idle power consumption both at the 12V CPU rail and at the whole-system level (with similarly specced machines, i.e. same NVMe, same RAM, same GPU, same PSU, similarly specced motherboard). I don't game, but I do have a server that needs upgrading. I honestly care most about how much power this upgrade will cost or save me over my existing i7-4771 (yes, that's a thing).
