Comments Locked

153 Comments

Back to Article

  • undervolted_dc - Tuesday, January 25, 2022 - link

    A 100Wh battery and 115W... isn't that a bit high? Is all that needed to win in benchmarks?
  • shabby - Tuesday, January 25, 2022 - link

    Seems like it, except they always lose in the battery life benchmark
  • nico_mach - Tuesday, January 25, 2022 - link

    That battery life is U-G-L-Y for sure.
  • mothringer - Tuesday, January 25, 2022 - link

    It's a DTR, that's actually pretty good by DTR standards. Almost double what mine gets.
  • tipoo - Tuesday, January 25, 2022 - link

    In what world is a 5900 + 6800 also not a desktop replacement?
  • temps - Tuesday, January 25, 2022 - link

    the one where its performance is too poor for it to actually be used as a desktop replacement
  • BillBear - Tuesday, January 25, 2022 - link

    Except for the part where performance on this craters the second you unplug this from the wall?
  • lilo777 - Tuesday, January 25, 2022 - link

    Why would you unplug a desktop from the wall?
  • melgross - Wednesday, January 26, 2022 - link

    Because it’s not actually a desktop. It’s not that useful away from the socket though, which is a problem. But that’s been a problem for all Windows machines, no matter which CPU is inside. Performance drops to unusable levels for many software packages.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    What a surprise, melgross doesn't understand how DTRs work.
  • sandeep_r_89 - Monday, January 31, 2022 - link

    Depends on the performance plan, and other things like DPTF, plus the laptop manufacturer's UEFI doing screwy things with respect to power management and heat.

    Too much nonsense gets in the way.

    And of course Windows is always doing something in the background, and not actually allowing the CPU to sleep as well as it should. Can't get good battery life with E-cores if the P-cores can't actually turn off.
  • deil - Thursday, February 10, 2022 - link

    That's kinda expected when your laptop is 330W at the wall but 45W on battery. Who would think a battery that isn't brick-sized can consistently provide 115W for the CPU alone? That's why AMD wins this IMHO: their laptops lose on a plug, but on battery they win soundly. There is also something to be said about the 250W of heat under your palm that a 330W brick suggests.
    I think that if you disconnected power while under load, this thing would just explode.
  • at_clucks - Wednesday, January 26, 2022 - link

    Apple's M1 seems to be in the same performance ballpark (from the few benchmarks I've seen on Ars Technica), so if you want battery-powered performance you don't go for the 7 lb DTR gaming Christmas tree but more likely for the 4.5 lb Mac.

    On the other hand, you gotta love a CPU-focused review of a laptop that then compares its storage performance to another machine whose storage details get no mention as far as I can tell. I mean, what's the point of showing me how much faster the 2 PCIe 4.0 SSDs in the MSI are compared to the 5400RPM SATA HDD in the Asus laptop? I'm glad it's class-leading though...
  • jrocket - Wednesday, January 26, 2022 - link

    Or better yet, a much more power efficient Ryzen laptop, so you don't have to run macOS.
  • corinthos - Wednesday, January 26, 2022 - link

    M1/Pro/Max, provided they're optimized for your desired workloads. AMD Ryzen & 3070+ also do pretty well in terms of battery life. One really needs to test out their typical workloads to determine exactly how much battery life is gained by going Apple vs AMD, rather than just going by reviews based on reviewers' test scenarios. Also, being able to properly gauge how often you need to be unplugged is another thing to factor into a purchasing decision. If it's not as much as you think, then you'll get more power for your dollar with a desktop.
  • Netmsm - Wednesday, January 26, 2022 - link

    The title is about testing Alder Lake but actually it is about testing an MSI laptop! Who in their right mind would consider these gaming tests a justifiable review of Alder Lake?
    What are you doing, Doc?
  • Spunjji - Thursday, January 27, 2022 - link

    "the one where its performance is too poor for it to actually be used as a desktop replacement"
    Weird how often the benchmark level for "enough performance" magically moves to be as much performance as Intel provide...
  • evolucion8 - Wednesday, February 2, 2022 - link

    So, a laptop with Ryzen 5800X / Radeon RX 6700 XT class performance is not a desktop replacement? In which parallel world isn't it? Especially given that, according to the tests, it barely loses any CPU performance when unplugged and retains over 75% of its GPU performance as well?
  • blanarahul - Tuesday, January 25, 2022 - link

    That has always been the case with Intel CPUs for the past 4-5 years. Desktop Alder Lake touches 240 watts to beat AMD and laptop Alder Lake crosses 110 watts in a portable chassis to beat AMD.

    I am looking forward to how the 6800H performs.
  • Yojimbo - Tuesday, January 25, 2022 - link

    You're looking at the wrong laptop as a basis for this discussion.
  • FMinus - Tuesday, January 25, 2022 - link

    Looking at how this loses to the 4800U from 2020 at 30W in CB20, this does not bode well for Intel in anything but 100W+ scenarios. The 6000 U series will walk over anything Intel puts out. A shame really. The efficiency cores seem to be a waste of die space, which is what I suspected.
  • IntelUser2000 - Wednesday, January 26, 2022 - link

    Actually you are mistaken. The E-cores are what allow them to achieve this performance level. Without them it'd be worse.

    So the problem is that the P-cores are too inefficient.
  • Spunjji - Thursday, January 27, 2022 - link

    The E cores aren't actually very efficient in terms of power - just die area. Their main purpose is marketing: they let Intel advertise a "14 core" CPU and scrape out wins in multi-threaded productivity benchmarks (when power isn't constrained).
  • demian_thorne - Tuesday, January 25, 2022 - link

    I hope the “right” laptop does not cause 3rd degree burns :)
  • Spunjji - Thursday, January 27, 2022 - link

    You say they're looking at the wrong laptop, but it's the laptop Intel provided, so it's worth asking yourself why Intel would provide this specific laptop. Much like the Ice Lake and Tiger Lake launches, Intel is front-loading reviews with high benchmark scores from an over-cooled platform that the vast majority of end users will not see reflected in actual products.
  • drothgery - Tuesday, January 25, 2022 - link

    It's not great, but anything using an H-series CPU is going to spend a lot more time plugged in than on battery, and it's better than Tiger Lake H or Comet Lake H, so it's not like they're getting worse there.
  • yeeeeman - Tuesday, January 25, 2022 - link

    This is a performance-oriented laptop. I think having the OPTION to get the highest performance with close to unlimited power is good. If you keep it on the balanced power profile it will consume 70W and lose about 10% of performance, so it's still a LOT faster than any other laptop.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    Then go get a desktop AlderLake PC. You'll get even more Performance!
  • Timoo - Tuesday, January 25, 2022 - link

    Yes, and if you want your gaming rig to be portable, because you love gatherings and hate to drag around your 36" screen, case, cables, mouse, keyboard, etc., then this is a good solution.
  • melgross - Wednesday, January 26, 2022 - link

    Sure, and when ten or more people are together, they find that they can’t all plug in at the same time. Wonderful!
  • TheinsanegamerN - Friday, January 28, 2022 - link

    And when 100 get together they'll find they can't talk over each other! I love strawman arguments!
  • PeachNCream - Friday, January 28, 2022 - link

    It's more of a kid-oriented laptop than a performance-oriented one, given the branding, colorful light bulbs, and other presentation methods for the hardware. Something designed for performance in the laptop space is more along the lines of a Dell Precision, for example. This is instead a toy for little boys who want to be gamerz0rz.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    Kids are not buying several-thousand-dollar RGB PCs. Adults are.

    Just like games. It's not kids buying microtransactions by the billions. It's adults in their 30s. They're the biggest market.

    Millennials never grew up.
  • vlad42 - Friday, January 28, 2022 - link

    You're right adults are buying them...for their kids.
  • vlad42 - Friday, January 28, 2022 - link

    The overwhelming majority of the target market for these types of laptops is people in middle school through undergraduate. The thing is, most grow out of this aesthetic and, because they have more disposable income as they get older, are likely to have a lighter laptop/tablet paired with a console/desktop for high performance computing/gaming.

    Think about it: if the millennials who were interested in this type of device/aesthetic when they were in middle school to undergraduate were generally still interested in it, then there would be far more devices like it available from Dell, HP, Lenovo, etc. Instead, for every device like this one, there are dozens of ultrabooks with more of a MacBook/ThinkPad aesthetic.
  • Ananke - Tuesday, February 1, 2022 - link

    Intel's H series targets portable workstations, aka the HP ZBook and Dell Precision. It's irrelevant to compare this MSI to a corporate-market-class laptop. I would prefer this 12900H to the Xeon in my ZBook, for example. This is what large corporations are buying; none of them buys AMD regardless of performance. Priority is security and manageability. A comparison to the M1 Pro would be somewhat relevant; it falls in the same price segment and corporate market.
  • vladlazlo - Wednesday, February 2, 2022 - link

    Intel and security... That's a good one...
    Haven't heard that since the last news about the 240+ security vulnerabilities you get for free with Intel processors. AMD and all the ARM-based processors combined can't put a dent in that record, even if you triple the number of vulnerabilities they 'offer'.
    Maybe these adventurous people you speak of want more vulnerabilities?
  • vladlazlo - Thursday, February 3, 2022 - link

    You might want to read this...
    https://www.tomshardware.com/features/intel-amd-mo...
    Or this

    https://www.zdnet.com/pictures/all-the-major-intel...

    before you start talking about security.....
  • Netmsm - Tuesday, January 25, 2022 - link

    "High? not at all" said Patrick Gelsinger.
    :)))
  • maroon1 - Wednesday, January 26, 2022 - link

    115W is the max power for a short period. Sustained power consumption is 85W in extreme performance mode.

    And 75W when using balanced mode, but you lose 3% in Cinebench.
  • undervolted_dc - Wednesday, January 26, 2022 - link

    ... and the spec sheet says 45W, but during benchmarks it's probably using the full power boost, yet it gets compared against 45W parts (with 54W boost) on performance.
    The E-cores don't seem to help in low-performance tasks either; it seems less efficient than AMD despite the OS having to manage the complexity of hybrid cores.
  • Spunjji - Thursday, January 27, 2022 - link

    It's Intel's strategy ever since they got stuck on 14nm. Prior to that, they kept dropping power while incrementing performance, leading to a progressively improving overall experience (better sustained power, less bulky devices, etc.)

    Then AMD got back in the game and Intel had to compete somehow, so they started raising their TDP ratings to get benchmarks that look good. The results are easy to see when you look at sites like Notebookcheck that measure performance of large groups of devices: Intel get good benchmarks, but performance drops quickly under sustained use in anything other than a DTR, and there's a massive error bar around the average performance of an Intel mobile CPU because it's dependent on how well the chassis can remove 45W+ of heat (even the Ultra Mobile ones).
  • sandeep_r_89 - Monday, January 31, 2022 - link

    Higher battery capacity doesn't help win benchmarks........
  • bogamia - Tuesday, January 25, 2022 - link

    Wonder how it would perform against Rembrandt, which will presumably offer a 30% MT uplift.
  • Spunjji - Thursday, January 27, 2022 - link

    I wouldn't expect to see an increase like that in practice. Rembrandt may be a little faster than Cezanne, but probably not a lot.

    If you're one of the majority of people who buy a device that's designed to cool 45W sustained, though, then it'll be less of a disappointment than Alder Lake.
  • vlad42 - Friday, January 28, 2022 - link

    Maybe, maybe not. The base clock does appear to have gone up from 1.9 GHz in the 5800U to 2.7 GHz in the 6800U, a 42% increase! Granted, it is possible the 2.7 GHz base clock is only when the chip is in 28W mode - it really was not clear from the slide.

    We also do not know how much benefit RDNA2 is bringing over Vega in terms of energy efficiency (there may be more thermal/power budget available to the Zen 3 cores), the improvements in 6N vs 7N, or the yield improvements/better binning now that the process is even more mature - 6N is still just a variant of 7N after all.

    While I would normally agree with you, we clearly need more information to know for certain. The Ice Lake to Tiger Lake improvements were pretty significant and the changes Intel made look to be fairly similar to what AMD has done here.
  • Samus - Saturday, January 29, 2022 - link

    I agree. I don't think AMD is going to take the IPC performance crown back here unless they can considerably scale up clock speed at the cost of efficiency like a Black Edition part.

    The fact is all these new CPUs are SO fast that it's Lamborghinis vs Ferraris. I think people are going to prefer the one that has double the MPG at a virtually unnoticeable performance penalty.
  • FwFred - Wednesday, February 2, 2022 - link

    How does higher clock speed help IPC exactly? Let's just call it single thread performance and leave IPC to microarchitectural discussions.
  • Lbibass - Tuesday, January 25, 2022 - link

    It seems that there may be something wrong with your benchmarking process. Linus Tech Tips was getting significantly improved battery life over the previous gen GE76 Raider in watching YouTube videos, nearly a 40% improvement.

    For the Alder Lake Raider in your review to get significantly less battery life than the previous gen shows signs that there is something going on with your specific laptop that I wouldn’t be so quick to blame on background power consumption.
  • doungmli - Tuesday, January 25, 2022 - link

    You have to look at the configuration tested; here the battery life problem is surely linked to the 17" 360Hz screen. I don't even understand their comparison with the Asus G513, which has a 15" screen and only 144Hz.
  • Brett Howse - Tuesday, January 25, 2022 - link

    Our results were very close to the numbers Intel sent us for what they were getting, so I think they are fine.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    But they are not in the same ballpark as what other reviewers are getting. It suggests either a difference in the manner the laptop was tested, or something off about the specific unit you got.
  • deil - Thursday, February 10, 2022 - link

    Or the drivers still suck and work did not go to the E-cores as planned, hence the difference.
  • Silma - Tuesday, January 25, 2022 - link

    The laws of physics are the laws of physics.
    On battery life and performance per watt, Intel won't catch up with AMD until it goes 7 nm, and even then it won't catch up with Apple until it goes 5 nm, assuming the change is immediate.
    In reality Intel will probably need to go 4 or 3 nm to catch up.
  • drothgery - Tuesday, January 25, 2022 - link

    Intel 7 and TSMC N7 are similar-density processes (though the latter is more power efficient). That Intel was calling Intel 7 "10nm Enhanced SuperFin" until recently, and that people who should know better always called TSMC N7 "7nm", doesn't actually make them that different. Neither process actually has any 7nm features, or even any 10nm features; the last time marketing names for semiconductor processes actually reflected any feature size on real chips, they were still measured in microns, not nanometers.
  • Otritus - Friday, January 28, 2022 - link

    Planar transistors had the node name match the smallest dimension of the transistor, so 28nm was in fact 28nm. The move to FinFET dramatically decreased the smallest dimension (14/16nm would be called 8nm if the naming hadn't changed). Since there is no standardized naming for FinFET and GAAFET processes, you have the arbitrary naming that all companies now use, creating confusion around how performant and dense a process is.
  • Duwelon - Tuesday, January 25, 2022 - link

    Wonder why they don't offer a 144hz VA panel.
  • Calin - Wednesday, January 26, 2022 - link

    That is a $3000 laptop; they can't use "second-best" technologies in it.
    (And by the way, I have a "second-best" MVA 32-inch display, and the contrast is absolutely incredible. Colors are good, viewing angles are good, but that contrast <3)
  • KPOM - Tuesday, January 25, 2022 - link

    It would be interesting to throw in a few tests to see how this performs against the M1 Max in the MacBook Pro. Obviously the MacBook Pro isn't optimized for gaming, but I can think of some tests (video encoding, etc.) that would be suitable for a cross-platform comparison.
  • Brett Howse - Tuesday, January 25, 2022 - link

    I don't have one sadly. I think it would be a good comparison as well.
  • IGTrading - Tuesday, January 25, 2022 - link

    One of the worst Intel launches in recent history.

    Gone are the 300+ design wins from the time of Haswell. Intel has barely managed to get Alder Lake into 100 laptops, using all its clout, money & influence, while AMD ... unsurprisingly, has more than 200 design wins for the Ryzen 6000 series.

    Use a 330W power brick and the largest possible battery and call this "mobile" ? :) Pathetic.

    I'm really eager to see Alder Lake performing at 25W and see the true performance and efficiency.

    When getting down to earth, in the 15W~ 35W space, I think the competition with AMD will be tight, but definitely no "+30% performance for 200% more power consumption".

    Looking forward to seeing normal Alder Lake laptops competing with the AMD Ryzen 6000 series.
  • temps - Tuesday, January 25, 2022 - link

    Again I'm flummoxed as to how AnandTech, a very smart, extremely well written site staffed by highly educated people, has a comment section full of incredibly dumb, poorly-thought-out and blatantly partisan fanboys.
  • m53 - Tuesday, January 25, 2022 - link

    @temps: I second.
  • Spunjji - Thursday, January 27, 2022 - link

    Doctor, heal thyself
  • Meenimynimo - Tuesday, January 25, 2022 - link

    There are >300 designs for Alder Lake laptops. Of these, 100 will be Alder Lake-H, and 100 will be Alder Lake Evo (overlapping with H, P, and U).

    Ryzen 6000 is DOA
  • IGTrading - Wednesday, January 26, 2022 - link

    @Meenimynimo could you please post a link to the source of that information? Who said that Intel has more than 300 design wins for Alder Lake at launch?
  • Spunjji - Thursday, January 27, 2022 - link

    200 design wins, feature parity, and superior power efficiency in a power-constrained platform is "DOA"? Okay.
  • Techtree101 - Tuesday, January 25, 2022 - link

    When is it actually available? I missed that part.
  • soloracerx - Tuesday, January 25, 2022 - link

    NewEgg just put up the pre-orders today for the MSI line 12th gen lappys. Staggered release from Feb through April. The one reviewed here will be available 3/28/22 for $4199. https://www.newegg.com/p/pl?d=12th+gen+laptop
  • mabellon - Tuesday, January 25, 2022 - link

    “The new Alder Lake system could only average 83% of its original frames per second, because that task was deprioritized by Thread Director to free up additional resources for the foreground jobs.”

    Foreground and background detection is provided by the OS. The CPU alone has no way to identify which threads are rendering on screen; the OS and its window management tracking are responsible. This window tracking, and more, is critical for hybrid scheduling, favored-core scheduling, etc. https://docs.microsoft.com/en-us/windows/win32/procthread/quality-of-service

    If you want to test Thread Director you need to test workloads with multiple concurrent workload classifications by Thread Director: a complex workload (or multiple concurrent workloads) with different thread execution (i.e. some AVX, some not). For example, if you have two on-screen workloads (i.e. both prefer performance), but there are more concurrent workload threads to schedule than there are P-cores, which subset of those threads will get the most performance out of the limited set of P-cores? Homogeneous workloads where all threads are roughly equivalent and running the same instruction mix are unlikely to show much impact (e.g. Cinebench, Handbrake). Workloads that fit entirely on P-cores are also unlikely to show much impact, as there is no overflow to E-cores.

    Another alternative would be if a particular workload classification was inverted versus the norm. For example, normally one would expect E-cores to be the most efficient. On a laptop you might want background services and apps scheduled to the most efficient cores for battery life. But what if the workload running happens to heavily exercise instructions that are actually more efficient to run on P-cores?

    Those are some examples of how Thread Director plays a role.
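    mabellon's P-core contention question can be sketched with a toy model (purely illustrative: the thread names and numeric classes below are invented, and the real Thread Director feeds hardware telemetry to the OS scheduler rather than sorting a static list):

    ```python
    # Toy model of hybrid scheduling: more runnable threads than P-cores,
    # so the scheduler must choose which threads get the fast cores.
    P_CORES = 6  # e.g. the 6P+8E layout of an Alder Lake-H part

    # (thread_name, classification) - lower number = stronger P-core preference.
    threads = [
        ("game_render", 0),   # foreground, heavy vector/FP work
        ("game_logic", 0),
        ("video_encode", 1),  # throughput work, also wants P-cores
        ("browser_tab", 2),
        ("indexer", 3),       # background services
        ("updater", 3),
        ("av_scan", 3),
        ("telemetry", 3),
    ]

    # Rank by preference; the first P_CORES threads land on P-cores,
    # the rest overflow to E-cores.
    ranked = sorted(threads, key=lambda t: t[1])
    p_core_threads = [name for name, _ in ranked[:P_CORES]]
    e_core_threads = [name for name, _ in ranked[P_CORES:]]

    print("P-cores:", p_core_threads)
    print("E-cores:", e_core_threads)
    ```

    The interesting case mabellon describes is when several class-0 threads compete for too few P-cores; a homogeneous benchmark like Cinebench never exercises that decision.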
  • kenansadhu - Thursday, January 27, 2022 - link

    Will this be where AMD's design wins out? Because there wouldn't be any risk to the user from OS or scheduler mistakes (since all cores are the same).
  • Da W - Tuesday, January 25, 2022 - link

    I remember back in the day when an AMD GPU beating Nvidia but with 50 watts more power draw was dismissed as crap!
    How things change. Wattage does not seem to matter anymore.
  • Calin - Wednesday, January 26, 2022 - link

    I chose Nvidia because I didn't have a PCI-E power cable (or the wattage). At roughly similar price and performance, AMD broke the 75W limit and Nvidia didn't.
    With processors already installed in systems, the equation changes a bit: you can still get the performance, at a lower battery life.
  • Spunjji - Thursday, January 27, 2022 - link

    Generally speaking, you can assume that watts are important when AMD draw more of them; not so much when Intel or Nvidia draw more and win.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    Nobody disregarded AMD's Hawaii GPUs for pulling more power; tech media was singing their praises from every rooftop.

    They slammed the 390X for being just a rebranded 290X that drew even more power but couldn't keep up with Nvidia's high end.

    But context is the enemy of the fanboi.
  • BillBear - Tuesday, January 25, 2022 - link

    You guys really need to start posting performance numbers when the laptop is unplugged, now that power draw figures have become so ridiculous that laptops immediately have to throttle down when unplugged.
  • TekCheck - Tuesday, January 25, 2022 - link

    Benchmark tech-tubers and tech websites should follow AnandTech's practice and start publishing performance+power benchmarks for laptops, to give us better context behind the tall bars in any test. This is less important for desktops, but since laptops are portable devices, it is paramount that we know the power behind the performance.

    If the 12900HK is winning any bars, say by 20% over the 6900HX, we need to know at which power level. Is it at 45W? At 60W? At 115W? If the win is at high wattage only, users will get the wrong impression that such a CPU has "better" performance, which is meaningless without knowing the power needed for that performance.
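    TekCheck's point can be made concrete with a quick sketch (all scores and wattages below are hypothetical, not measured results):

    ```python
    # Hypothetical illustration: a 20% performance win can still be a
    # large perf-per-watt loss if it took ~2.5x the power to get it.
    score_a, watts_a = 1200, 115   # made-up "115W-class" result
    score_b, watts_b = 1000, 45    # made-up "45W-class" result

    perf_per_watt_a = score_a / watts_a   # ~10.4 points/W
    perf_per_watt_b = score_b / watts_b   # ~22.2 points/W

    print(f"A: {perf_per_watt_a:.1f} pts/W, B: {perf_per_watt_b:.1f} pts/W")
    ```

    Which is exactly why a bare bar chart without a wattage label can mislead.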
  • Brett Howse - Tuesday, January 25, 2022 - link

    Mentioned this in the review. The AMD 5900HX system also draws the same power at load.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    But for how long?

    How many Joules of Energy did it consume to perform the same task in the same time period?
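    Kamen Rider Blade's question is the right framing: total energy to finish the task, not instantaneous power. A minimal sketch with invented numbers:

    ```python
    # Energy (joules) = average power (watts) x time (seconds).
    # A faster, higher-power chip can still consume MORE total energy for a
    # task if its speedup doesn't outpace its extra draw. Numbers invented.
    def task_energy_j(avg_watts, seconds):
        return avg_watts * seconds

    fast_chip = task_energy_j(115, 100)   # finishes in 100 s -> 11500 J
    slow_chip = task_energy_j(45, 180)    # finishes in 180 s ->  8100 J

    print(fast_chip, slow_chip)
    ```

    Here the "slower" chip finishes later but still costs less battery, which is why joules-per-task matters for laptops.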
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    Alder Lake is the "American muscle car": max performance with no regard for fuel/energy efficiency or heat output.

    Ryzen 6000 is the thoughtful European/Japanese design that cares about energy efficiency and heat output.
  • TekCheck - Tuesday, January 25, 2022 - link

    Well put.
  • BillBear - Tuesday, January 25, 2022 - link

    Which is another way to say that Alder Lake is a chip to use in a luggable computer, but not a laptop.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    It should be considered a "BagTop" computer.

    You're going to need a Large 'BackPack' or 'LapTop Bag' to carry it around.

    It's not designed to fit or sit on your Lap.
  • drothgery - Wednesday, January 26, 2022 - link

    Alder Lake H. Just like every 45W+ notebook chip before it (though that we're now calling sub-7 lb machines "luggable computers" instead of laptops is a mark of how much has changed on that score).

    Alder Lake P and U are much lower-power parts (on basically the same die).
  • Brett Howse - Tuesday, January 25, 2022 - link

    That's a very poor analogy. I wrote in the review that the AMD system also pulls the same numbers at load.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    You wrote this:

    Battery Life Summary

    In a word the battery life could be summed up as "unimpressive". The Raider GE76 is not an ideal test bed to determine CPU efficiency under load since the underlying power draw is significant. To see how Alder Lake compares we will have to wait for more power efficient platforms to get more meaningful results.

    You showed the results.

    The battery life is piss-poor compared to the AMD equivalent when doing similar workloads.
  • Brett Howse - Tuesday, January 25, 2022 - link

    Yes exactly. Alder Lake doesn't have much to do with the battery life in the Raider GE76. There is a massive amount of power being used by the display/GPU/memory/storage which is masking any impacts Alder Lake can or can't have. As I said, we need to wait for different systems to get a better feel for how Alder Lake impacts battery life.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    So the ASUS G513QY with its Ryzen 9 5900HX + Radeon 6800M + its display/memory doesn't also impact power consumption?

    Or is it that the Asus you chose wasn't similar enough?
  • Brett Howse - Tuesday, January 25, 2022 - link

    Of course they do. AMD does way better on battery life but not because of the CPU. That's the point I've been trying to convey.

    AMD's power gating on their GPU is very impressive.

    You said Alder Lake is "max performance with no regards to Fuel/Energy Efficiency" but in both the AMD system and the Intel system, neither CPU is the determining factor for battery life. They are almost irrelevant. Even in a thin and light system with no GPU, the display draws more power than any other component including the CPU.

    I would love an apples to apples comparison but I don't have a Raider GE76 with RTX 3080 Ti and Ryzen 5900HX. Sadly we are constrained by what we have been provided to test, and what manufacturers build.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    =(
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    MSI Raider GE76
    CPU = Intel Core i9-12900HK w/ 85 Watts TDP
    GPU = NVIDIA RTX 3080 Ti for Laptops
    RAM = 32 GB DDR5-4800
    Display = 17.3" 2K | 1080p @ 360 Hz
    Battery = 99.9 Wh

    Asus ROG Strix G15 G513QY
    CPU = AMD Ryzen 9 5900HX __ w/ 45 Watts TDP
    GPU = AMD Radeon RX 6800M
    RAM = 16 GB DDR4-3200
    Display = 15.6" 2K | 1080p @ 300 Hz
    Battery = 90.0 Wh
  • IntelUser2000 - Wednesday, January 26, 2022 - link

    5900HX in the G15 isn't running at 45W, that's his whole point, but either you are choosing to ignore it or didn't see it.

    "The ASUS G513QY with the AMD Ryzen 9 5900HX has a similar system, and it also draws around 85 Watts in its maximum performance mode on a 45-Watt processor."

    That's the *same* power as Alderlake-H.
  • undervolted_dc - Wednesday, January 26, 2022 - link

    Alder Lake's maximum performance mode is not 85W, it's 115W. A good test should account for power and price: a CPU that costs 2x because it's the top 1% bin for a 10% better median benchmark result, or that consumes 2x the power for the same performance, should have that reflected in each benchmark result. Top performance is one factor, but the most important ones for laptops are perf/watt/price, and for servers perf/TCO. Perf alone is useless (or useful only to reach a certain point the press wants).
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    The specs are similar enough; the only differences are the extra 1.7" of display diagonal and the extra 16 GB of RAM, and MSI has the more efficient DDR5 instead of older DDR4.

    MSI also has an extra 9.9 Wh to its advantage.

    Yet the Alder Lake laptop's battery life was "unimpressive".
  • jjjag - Wednesday, January 26, 2022 - link

    Brett you are hilarious trying to reason with a fanboi. It's like trying to convince a trumper to get vaxxed because of all the past success we've had with smallpox and polio. For them, it's a religious argument. "Religious" because it ignores all real data and observations, and it just makes them feel better about themselves to hate something. But A+ for trying!
  • TheinsanegamerN - Friday, January 28, 2022 - link

    Funny, I didn't know the black community was full of trumpers. Or the hispanic community, for that matter......
  • Sunrise089 - Wednesday, January 26, 2022 - link

    But Brett, you’re giving the Intel system credit for the GPU+memory+storage. I don’t understand the rationale behind crediting the PCMark scores to Alder Lake but ‘crediting’ the poor battery life to MSI.
  • corinthos - Wednesday, January 26, 2022 - link

    Let's get real, how much power are those things really taking up? There are AMD Ryzen laptops using the 3080 that do better. Maybe test without a display, GPU, memory and storage, then I bet it'll hit 6 hours.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    So AMD Ryzen 5900HX laptops just use magical RAM/storage/screens that draw no power then? Because those laptops obliterate this thing in perf/watt.
  • corinthos - Wednesday, January 26, 2022 - link

    "To see how Alder Lake compares we will have to wait for more power efficient platforms to get more meaningful results."

    So in other words, wait for something possibly coming AFTER this iteration of Alder Lake?
  • kwohlt - Wednesday, January 26, 2022 - link

    I think the point is that this is the highest-performance, full desktop replacement chip. There are still P- and U-series Alder Lake parts that haven't been reviewed yet.
  • Spunjji - Thursday, January 27, 2022 - link

    I understood "platform" here to mean notebook design. Notebooks that aren't prioritising outright performance may provide better battery life even with the same Alder Lake chip.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    You can't look at CPU only Power Load, you need to factor in the total system since you're buying a LapTop.
  • Brett Howse - Tuesday, January 25, 2022 - link

    I get that you want AMD to win here. I do. But you are misinterpreting the results.
  • Kamen Rider Blade - Tuesday, January 25, 2022 - link

    You couldn't find a "Like for Like" LapTop setup is what you're stating.
  • Sergey1001 - Thursday, January 27, 2022 - link

    https://www.notebookcheck.net/MSI-GE76-Raider-Lapt...
    MSI GE76 Raider (GE76 Series)
    ProcessorIntel Core i9-12900HK 14 x 1.8 - 5 GHz, 135 W PL2 / Short Burst, 110 W PL1 / Sustained, Alder Lake-P
    https://www.notebookcheck.net/Asus-ROG-Strix-G15-g...
    Asus ROG Strix G15 G513QY (ROG Strix G15 G513 Series)
    ProcessorAMD Ryzen 9 5900HX 8 x 3.3 - 4.6 GHz, 80.6 W PL2 / Short Burst, 70.4 W PL1 / Sustained, Cezanne H (Zen 3)
    The MSI GE76 Raider is allowed 1.56 times the power consumption of the Asus ROG Strix G15 G513QY. Making biased reviews is not good.
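    That ratio checks out as a quick back-of-envelope, using only the sustained (PL1) figures quoted above:

```python
# Sustained (PL1) power limits quoted above, in watts
intel_pl1 = 110.0  # MSI GE76 Raider, Core i9-12900HK
amd_pl1 = 70.4     # Asus ROG Strix G15 G513QY, Ryzen 9 5900HX

ratio = intel_pl1 / amd_pl1
print(f"{ratio:.2f}x")  # → 1.56x
```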
  • TheinsanegamerN - Friday, January 28, 2022 - link

    This is AnandTech. Their quality is a minor echo of what it once was.

    Reviewers are so desperate to avoid the 5900HX's performance. Watch when the 6900HX comes out this year and obliterates Alder Lake in both performance and power consumption.
  • lmcd - Wednesday, January 26, 2022 - link

    Ryzen 6000 does not exist as of 1/26/2022.

    Your analogy is awful but ironically correct -- comparable European/Japanese sports cars also get horrifying mileage, and the performance gap is in the nonsense territory with fanboys on either side debating what the immeasurable differences mean.
  • Spunjji - Thursday, January 27, 2022 - link

    👆
  • PeachNCream - Friday, January 28, 2022 - link

    It's always funny to watch people try to compare computers to cars. That literally happens all of the time in the tech industry and rarely fits the situation because, well, obviously, computers are not cars.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    It's always funny to watch people complain about car comparisons simply because "well cars and computers are different" without any other context. Tells me you have no real counterargument.
  • cowymtber - Tuesday, January 25, 2022 - link

    Tim from HUB...Do your power normalization magic, and release the demons!
  • Timoo - Tuesday, January 25, 2022 - link

    I haven't really seen the comment floating around yet.
    Since Intel got bumped by Apple, could this design be the result of a failed collaboration?

    I mean, they planned to bring this design, say, 3 to 5 years ago, for Apple MacBooks etc.
    But Apple -meanwhile- was thinking they could do better by themselves?

    So, now Intel had their Big.Little design, in co-op with Apple, and nowhere to go but to the x86 market in full-force. Not backed by the idea that they would also end up in the Apple store...
  • Brett Howse - Tuesday, January 25, 2022 - link

    Not likely. Apple is a big customer, but not Intel's biggest.
  • Timoo - Tuesday, January 25, 2022 - link

    Ok, I just found it kind of "coincidental" that a year after we got the news that Apple had dumped Intel for their own M1, Intel comes out with a "similar" design. Which must have been in development already since, say, 2016 or 2017.

    Maybe I have to look for it in Intel thinking along with the ARM world, where big.LITTLE is normal, and realising they already have big.LITTLE for x86 lying around (Atom + Core). I just got the impression that it was Apple who might have triggered this idea with Intel...

    But then again; that would bring a lot of Queensize Drama to this release (the failed Intel-Apple chip). And I love drama. So I am biased, presumably :-D

    Thanks for your reply!
  • IntelUser2000 - Wednesday, January 26, 2022 - link

    No, the hybrid combo was first presented by Intel back in 2005 or so by then-CTO Justin Rattner.
    https://images.anandtech.com/reviews/tradeshows/ID...

    Intel might have been first to ship it if they hadn't had braindead management and a braindead CEO, and had executed on 10nm.
  • lmcd - Wednesday, January 26, 2022 - link

    Intel probably started work closer to 2012. The roadmap for Atom has been converging toward serving as a desktop/laptop small core since the smartphone Atoms got canned and tablet got deprioritized. By no coincidence, that happened when Apple slammed every smartphone chip with the A7, which was also Intel's public wakeup that Apple could easily reach laptop performance on a quick timeline.
  • lmcd - Wednesday, January 26, 2022 - link

    (To be clear, the point is that they probably had wind of the upcoming release's significance in 2012 and started planning then. 2013 was the A7 release; 2014 was the last Intel smartphone SoC, which was probably mostly done and committed to by the point the A7 was released.)
  • lemurbutton - Tuesday, January 25, 2022 - link

    People still compare this to AMD? AMD is far behind.

    Let's just compare Apple and Intel. At this point, Apple is far far far ahead of Intel which is far ahead of AMD.
  • web2dot0 - Wednesday, January 26, 2022 - link

    Now we know why Apple left Intel. Intel’s direction going forward is simply different from Apple.

    Their power/performance curves just don't match, and it seems Intel has no interest in satisfying Apple.

    End of relationship
  • corinthos - Wednesday, January 26, 2022 - link

    Apple also left Intel because it failed to deliver and failed to deliver on time.
  • corinthos - Wednesday, January 26, 2022 - link

    That is overly simplistic. There are some workloads at which AMD Ryzen laptops excel, blowing Apple out of the water. It is workload-dependent; know your needs to make the appropriate choice for you.
  • Spunjji - Thursday, January 27, 2022 - link

    People compare it to AMD because AMD is not actually "far behind" and is the only other option for a Windows platform. If you'd like Intel to charge whatever they want for their products then go ahead, ignore AMD.
  • maroon1 - Wednesday, January 26, 2022 - link

    The 12900HK's sustained power consumption is 85W in extreme performance mode and 75W in balanced mode, but you only lose 3% performance in Cinebench.

    So the huge power gap between the 5900HX and the 12900HK can't be coming from the CPU alone. The 12900HK laptop has a 3080 Ti, a faster SSD, and a different display, so those likely played a factor.

    I wish we could see the power consumption difference between the CPUs only, not the whole laptop.
  • ddhelmet - Wednesday, January 26, 2022 - link

    What's the difference between 6+8 processors? Better binned or just clock speeds?
  • Otritus - Friday, January 28, 2022 - link

    6+8 processors means that you have 6 high performance cores and 8 high efficiency cores. In Alder Lake 4 high efficiency cores is about equal to 1.5 big cores in multi-threaded performance, allowing Intel to achieve about 9 cores of performance with 8 cores worth of die size. It also helps boost battery life because Intel big cores are incredibly bloated. Different 6+8 processors will be binned and clocked differently.
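    The arithmetic can be sketched like this; the 1.5x-throughput and quarter-area figures are the rough rules of thumb stated above, not measured values:

```python
# Rough assumptions from the comment above:
#   4 E-cores ~ 1.5 P-cores of multi-threaded throughput
#   4 E-cores ~ 1 P-core of die area
E_THROUGHPUT = 1.5 / 4  # one E-core's MT throughput, in P-core units
E_AREA = 1 / 4          # one E-core's die area, in P-core units

p_cores, e_cores = 6, 8  # the 6+8 layout discussed here

throughput = p_cores + e_cores * E_THROUGHPUT  # "about 9 cores of performance"
area = p_cores + e_cores * E_AREA              # "8 cores worth of die size"
print(throughput, area)  # → 9.0 8.0
```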
  • Hulk - Wednesday, January 26, 2022 - link

    For us tech heads, it would be nice to know the average effective P- and E-core clocks during testing at the three power levels reported.
  • tkSteveFOX - Wednesday, January 26, 2022 - link

    Just a note for Brett here: when you do a laptop review, thermals, throttling, and noise levels are important things to have in the review.
  • Brett Howse - Wednesday, January 26, 2022 - link

    This isn't a laptop review. Please check the link to the GE 76 Raider review we did in September.
  • TheinsanegamerN - Friday, January 28, 2022 - link

    *reviews laptop CPU*
    *uses laptop*
    *this is not a laptop review*

    Whew, AnandTech quality, ladies and gentlemen! I guess throttling and thermals are not important in CPU tests either? Or would that get in the way of using the RAM to excuse the hideous power draw?
  • corinthos - Wednesday, January 26, 2022 - link

    Whole point of a laptop is portability and using unplugged for a considerable amount of the time. That's why you pay a premium. If the best Intel can do is 3.x hours on battery, this is essentially a desktop alternative/replacement, in which case you get even more for your money just buying a desktop.
  • Brett Howse - Wednesday, January 26, 2022 - link

    This is literally a desktop replacement. That's actually a product category for notebooks.
  • Spunjji - Thursday, January 27, 2022 - link

    I'm sorry you have to keep replying to comments like this...
  • Spunjji - Thursday, January 27, 2022 - link

    For me, those "percentage of no load performance" graphs may have been the most interesting. It certainly shows what Alder Lake can offer when it's being used to the fullest. It doesn't represent a use-case that I'd ever put a laptop to, but it will be very interesting for the sort of user for whom 20 threads in a mobile CPU is less a flashy selling point and more a necessity.

    Otherwise it's looking as I expected - peak performance is significantly better than the ageing Cezanne platform, sustained performance in a slightly more representative platform remains to be seen. Tiger Lake H was ~25% down on Cezanne in terms of performance/watt in multithreaded loads, so there's certainly potential here for Intel to have caught up and maybe even surpassed that.
  • abufrejoval - Thursday, January 27, 2022 - link

    The least impressive statement in this review is this: “Perhaps the most impressive result though is Intel’s Thread Director, which provides very impressive system responsiveness even when the system is at 100% CPU load…”, because to me it sounds like either paid content or a lack of reflection.

    Intel is pushing E-cores as a “must have”, because it’s exclusive to their platform, very much like MMX or AVX-512 back then.

    But it’s mostly yet another marketing smoke bomb.

    I am convinced you could achieve a very similar gain in responsiveness by emulating the 8 E-cores via the 2 P-cores they replace in terms of silicon real-estate on Alder Lake. What you perceive as a hardware benefit is mostly an OS defect in workload management.

    What happens here is that long-running batch and latency-sensitive interactive workloads are being separated and assigned to hardware-partitioned processing pools depending on whether they are running in the “foreground” or “background”. Doing that in the Windows task manager today is obviously cumbersome, but writing a tool that prohibits the use of all CPU cores once workloads are switched to the background should be trivial enough. And to my knowledge even cache partitioning has been part of x86 since Broadwell, to ensure that busy background batch tasks won’t entirely flush latency-sensitive interactive workloads from the caches.

    Yes, E-cores have been proven to squeeze longer run times out of smartphones, or more concurrent session support per watt in certain cloud servers, because they are designed to be more efficient in terms of instructions per watt-hour at the price of instructions per unit of time.

    But those constraints do not apply to a gaming laptop or most desktop computers. Actually, even on ultra-thin laptops CPU core power consumption is becoming a rather insignificant contributor to overall energy spend, outside of some synthetic fringe cases.

    Intel fits 8 E-cores into a similar space for 2 P-cores, so the high-end mobile Alder-Lake parts could just be 8 P cores or 32 E cores or any of the other permutations. And quite obviously you would be able to find workloads with an ideal fit for each, just as you’d also find workloads that violate either performance or efficiency targets on them.

    Intel promotes a hardware partitioned compromise between E- and P-cores and then interestingly charges an E-core premium on desktop parts, where collective energy savings on remotely managed always-on volume parts might actually provide an ecological benefit. But I can’t help thinking, that a software solution via “E-core emulation” would deliver more flexibility and adjustable performance on most laptops and workstations and prefer 8P+0E over 6P+4E practically everywhere except server parts.
  • Bik - Thursday, January 27, 2022 - link

    A simple thought experiment shows this is correct: a big task that feels like it bogs down the whole system will no longer do so if you're willing to sacrifice some cores for background tasks. The scheduler just isn't smart enough to do that today.

    There's another thing I notice. If it is true that 8 E-cores put up more performance than 2 P-cores, Intel's claim that the two are equal in silicon real estate may not be correct. Because if they were, we'd surely have all-E-core CPUs for heavy multi-threaded workloads. But that didn't happen, and I doubt it ever will.
  • diediealldie - Thursday, January 27, 2022 - link

    Oh, they will. There's Sierra Forest AP, which consists of 128 E-cores. Intel is not making all-E-core parts yet since the launched platforms are mass-market, general-purpose ones which need high ST performance along with good MT performance.
  • deil - Friday, January 28, 2022 - link

    Does anyone have info on the idle power of this thing, or the clocks it runs at idle on the P/E cores?
    Just curious.
  • Spunjji - Friday, January 28, 2022 - link

    No idea yet, everyone seems to have been seeded the same platform and its mediocre idle power characteristics have more to do with the way it's configured than the CPU / chipset combo.
  • Rezurecta - Friday, January 28, 2022 - link

    Agree with this article. This processor isn't too interesting. It is basically a full desktop part and we've seen that Intel is nice and fast with Alder Lake. Show me the U parts we'll see in normal laptops!
  • PeachNCream - Friday, January 28, 2022 - link

    Killer-branded network adapters. Disgusting gimmicky software is a no thanks.
  • Vitor - Friday, January 28, 2022 - link

    360Hz is useless. No game comes close to achieving that. 144Hz with better colors and/or 1440p would be much better.
  • Samus - Saturday, January 29, 2022 - link

    It's like Prescott all over again, except Intel actually edges out AMD in performance too, but at what cost? Nearly double the power draw for 10-20% IPC improvement?
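    For what it's worth, those numbers imply roughly this efficiency trade; the midpoint of the quoted 10-20% is used, purely as a back-of-envelope:

```python
power_ratio = 2.0  # "nearly double the power draw"
perf_gain = 0.15   # midpoint of the claimed 10-20% improvement

perf_per_watt = (1 + perf_gain) / power_ratio
print(f"{perf_per_watt:.3f}x")  # → 0.575x the baseline's efficiency
```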
  • ciparis - Monday, January 31, 2022 - link

    When will this laptop be available?
  • IUU - Tuesday, February 1, 2022 - link

    Intel's processors are fine. It is just that Intel possibly had to pretend they couldn't move past 14nm so they wouldn't kill the entire industry, which had not been competitive for ages. At 55 percent of the transistor density they already beat Apple, of course by employing more wattage. Can you imagine what would happen if they were a whole node ahead, that is, at a lithography comparable to TSMC's 3nm? Desktops and laptops with 3x to 5x the performance of Apple's or AMD's, at the same, higher, or lower prices.

    PS: Don't tell me you believe Intel could not transition to a new node all these years. You really believe that?
  • TekCheck - Friday, February 4, 2022 - link

    Sounds like an industrial conspiracy against progress, something similar to what Edison did to prevent Tesla's innovations moving the world forward and faster.

    Except, it is not true in Intel's case. What would be a good reason to let TSMC rapidly gain a competitive edge by stifling its own progress? Interesting hypothesis, but silly. They got complacent, their previous CEOs made several bad decisions, and now they are forced to buy TSMC's process to stay in the game.
  • IUU - Tuesday, February 1, 2022 - link

    Or to put it another way: they were able to compete, more or less efficiently, with the whole industry while using the same lithography for three Moore's Law cycles; that's centuries in computer time. Regarding money, they have never really faced a real problem. They have their way of squeezing a river of dollars out of thin air.
  • TanishqHooda - Friday, February 4, 2022 - link

    Can you do the same review on the i7 version of this chip?
  • m53 - Friday, February 4, 2022 - link

    HUB did an i7 review here https://youtu.be/OYvXx6x3AKc
  • AaronS678 - Wednesday, February 16, 2022 - link

    They absolutely handicapped the AMD system. Almost every benchmark has a graphics component, and they chose a laptop with a significantly weaker GPU that doesn't even have a MUX switch. The exact equivalent platform is out there, but these results are made to put AMD in a bad light. That's why the scores go from G531QY vs GE76 to 12900HK vs 5900HX, despite almost every benchmark being heavily affected by graphics performance. Way to go, AnandTech.
