57 Comments
DabuXian - Tuesday, October 17, 2023 - link
so basically a mere 6% better Cinebench MT score at the cost of almost 100 extra watts. I dunno in what universe would anyone want this instead of a 7950X.
yankeeDDL - Tuesday, October 17, 2023 - link
At platform level it is over 200W difference. Impressive. And I agree, nobody in their right mind should get Intel over AMD, unless they have a very specific workload in which that 6% makes a difference worth hundreds or thousands of dollars in electricity per year.
schujj07 - Tuesday, October 17, 2023 - link
If you have a workload like that then you run Epyc or Threadripper, as the task is probably VERY threaded.
shaolin95 - Thursday, December 21, 2023 - link
😆😆😆😆😆😆 AMDrip fanboys are hilarious and delusional. And what a bullshit comment about the electricity bill per year... thousands.. really???? Dang kid, you are hilariously sad.
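Whether the electricity gap is "thousands per year" or peanuts is checkable with napkin math. A minimal sketch, using the ~200 W platform-level gap mentioned earlier in the thread and an assumed $0.15/kWh rate — both placeholders to swap for your own numbers:

```python
# Rough yearly electricity cost of a wattage gap between two systems.
# The 200 W gap and $0.15/kWh rate below are illustrative assumptions.

def yearly_cost_usd(extra_watts: float, hours_per_day: float,
                    usd_per_kwh: float) -> float:
    """Cost of drawing `extra_watts` more, `hours_per_day` hours every day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# ~200 W gap, 4 h/day of all-core load:
print(round(yearly_cost_usd(200, 4, 0.15), 2))   # ≈ 43.8
# Same gap running 24/7 (render-farm territory):
print(round(yearly_cost_usd(200, 24, 0.15), 2))  # ≈ 262.8
```

At typical residential rates, even a 24/7 all-core load puts the gap in the hundreds of dollars a year, not thousands — it takes multiple machines or unusually expensive power to get beyond that.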
lemurbutton - Tuesday, October 17, 2023 - link
Who cares about Cinebench MT? It's a benchmark for niche software in a niche market.
powerarmour - Wednesday, October 18, 2023 - link
Wouldn't buy the 7950X either; not interested in any CPU that draws >200W unless I'm building a HEDT workstation.
shabby - Tuesday, October 17, 2023 - link
Lol @ the power usage, this will make a nice heater this winter.
yankeeDDL - Tuesday, October 17, 2023 - link
I find it amazing. It takes more than 200W MORE to beat the 7950X. The difference in efficiency is unbelievable. Buying Intel today still makes no sense unless that extra 5-10% in some specific benchmark really makes a huge difference. Otherwise it'll cost you dearly in electricity.
bug77 - Thursday, October 19, 2023 - link
While Anand has a policy of testing things out-of-the-box, which is fine, it is well known ADL and RPL can be power constrained to something like 125W max while losing performance only in the single-digit range. It would be really useful if we had a follow-up article looking into that.
yankeeDDL - Tuesday, October 17, 2023 - link
So, 6% faster than previous gen, a bit (10%?) faster than AMD's 7950X. Consuming over 200W *more* than the Ryzen 7950X. I'd say Intel's power efficiency is still almost half that of the Ryzen. It's amazing how far behind they are.
colinstu - Tuesday, October 17, 2023 - link
This power consumption / heat output is insane… this is putting their 90nm NetBurst Prescott / Pentium D Smithfield days to shame. Remember when Apple left the IBM/Motorola alliance? Power-architecture power consumption was going through the roof, and Intel had JUST pivoted back to the PIII/Pentium M-based Core arch. No wonder Apple dumped Intel; they called what they were seeing really early on. Arm for Windows/Linux desktop needs to get more serious; Apple's desktop Arm is proving nearly as powerful using a fraction of the power draw. Windows is ready, and can even run non-Arm code too.
herozeros - Tuesday, October 17, 2023 - link
My AMD AM5 would like a word with you…
FLEXOBENDER - Tuesday, October 17, 2023 - link
What point are you trying to make, that you have no clue how thermodynamics work? This 14900K manages to pull 430 watts peak. 430. 0.43 kilowatts. One CPU. It is still beaten by an 80-watt-peak 7800X3D. What is your point?
boozed - Wednesday, October 18, 2023 - link
I think the point was that you don't have to abandon x86 for ARM to achieve good efficiency, just Intel.
The Von Matrices - Thursday, October 19, 2023 - link
People remember NetBurst CPUs as being absurdly power hungry, but they forget that even the most power-hungry NetBurst CPUs still only had a TDP of 130W. Today that would be considered a normal or even a low TDP for a flagship CPU. And a modern chip's rated TDP actually understates its real draw compared to what a NetBurst-era TDP meant.
GeoffreyA - Friday, October 20, 2023 - link
And didn't Cedar Mill further drop that to a 65W TDP?
GeoffreyA - Friday, October 20, 2023 - link
Possibly. ISA is just a small piece of the power puzzle, and the rest of the design is what's carrying the weight. An interesting article:
https://chipsandcheese.com/2021/07/13/arm-or-x86-i...
Azjaran - Tuesday, October 17, 2023 - link
Did I miss something, or are there no temperatures shown? Because 428W isn't on the low side and demands a good cooling solution.
Gastec - Tuesday, October 17, 2023 - link
Just one question: do these AI "tools" connect to the Internet, after they "measure specific system characteristics, including telemetry from integrated sensors", to send that data to those Intel servers that are in the "cloud"?
TheinsanegamerN - Tuesday, October 17, 2023 - link
Of course they do. Even if they say they don't.
Gastec - Friday, October 20, 2023 - link
Maybe they do it through a proxy app, part of the overall package of Windows' telemetry?
pookguy88 - Tuesday, October 17, 2023 - link
You didn't have a 13700K to test against?
shabby - Tuesday, October 17, 2023 - link
Yup, pity; that would show us what those 4 extra E-cores can actually do.
TheinsanegamerN - Tuesday, October 17, 2023 - link
I mean, they still don't have a GPU test bed going, 3 years post-fire. I wouldn't expect much.
nandnandnand - Wednesday, October 18, 2023 - link
I recommend everyone go to Tom's Hardware if they are missing something here. They'll have the reviews, decent ones IMO, and are owned by the same company as AnandKek.
TheinsanegamerN - Thursday, October 19, 2023 - link
Tom's Hardware was caught shilling for Nvidia eons ago. They're another dinosaur of the tech space. TechSpot, TechPowerUp, and reviewers like Gamers Nexus are the new hotness.
wrosecrans - Tuesday, October 17, 2023 - link
Some motherboards will let you just set a power limit. I'd like to see a benchmark where the power limit is set to only the advertised number (125 W) to see what it can do with that constraint. 400+ watts just seems insane. My laptop is currently suffering terrible battery life because the CPU throttles up, gets hot, and cooks the laptop because of exactly this power-be-damned philosophy. I want a quiet desktop that isn't going to cook me if I'm sitting next to it, and isn't going to just cook the motherboard components and fail after a few years.
I was expecting the new chip to be slightly more power efficient with a year of design tweaks and improvements. (And you'll note Intel wants you to think this, because they kept the 125W marketing power usage on the box.) I am kinda baffled how Intel is executing so poorly. Nobody had a gun to their head forcing them to release this product. There's some deeply broken structural inertia in the organization to just keep pumping out products and not disrupt the flow of new model numbers. Somebody at Intel should have been screaming that the plan wasn't working, rather than just keeping their head down to deliver a new model number for no reason.
TheinsanegamerN - Tuesday, October 17, 2023 - link
If you want low power, get a Ryzen. The 7800X3D tops out at just 50 watts. The performance loss, if anything like Raptor Lake (which this is), will be 15%+ down at 125 watts, more if they make heavy use of P-cores.
schujj07 - Tuesday, October 17, 2023 - link
AnandTech did this with the 13900K vs 7950X at different TDP/PPT. Basically the Ryzen at 65W TDP or 88W PPT was faster than Intel at 125W TDP. Once the Ryzen was set to 105W TDP or 142W PPT, the Intel needed 253W TDP to be faster. In fact, the scaling on the Ryzen dropped off quite quickly above 105W TDP.
mode_13h - Wednesday, October 18, 2023 - link
This: https://www.anandtech.com/show/17641/lighter-touch...
mga318 - Tuesday, October 17, 2023 - link
Well, I just built a new system with a Ryzen 9 7900X that I got on sale for $380 a couple weeks ago, and have set it at a 105W TDP. Looks like I have no regrets here, either in performance or efficiency.
Farfolomew - Tuesday, October 17, 2023 - link
The new Pentium 5!
Gradius2 - Tuesday, October 17, 2023 - link
So the 13900K is better, as you can get one for $450.
charlesg - Tuesday, October 17, 2023 - link
Re all the justified comments about excessive power draw: is this not only when using it at peak capacity? If you're using it at peak capacity, all the time, then I agree, you've got the wrong CPU. It's like driving your vehicle at or over 6000rpm all the time.
For everyone else who's using a compatible MB and a prior-gen Intel CPU, who wants a drop-in upgrade, this may be useful?
(I'm using an AMD 5950X here, with no regrets. When I need the cores (and I do use them), they're there. The rest of the time, it just idles.)
rUmX - Tuesday, October 17, 2023 - link
I have 4 7950X machines where I encode using Handbrake SVT-AV1 almost 24/7. AT shows that the Intel is faster, but at literally 2x the power consumption, AMD is still better. Besides, my ambient temperature rises at least 6-7°C with the machines going 100%; I can't imagine how the 13900K/14900K would behave. Insane. On top of that, having all the machines at 100% alongside a 5000 BTU AC blows my circuit breaker, so I run the AC's power on an extension from another room.
PS: Before anyone says I should have gone for a 64+ core EPYC, it was still cheaper to build these 4 systems than a 64-core EPYC, taking into consideration 12-channel memory, server board, etc., and these run at least 5.1GHz all-core versus an EPYC at ~3.5GHz.
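The heat and tripped-breaker story above can be sanity-checked with back-of-envelope numbers. A rough sketch; the per-system wall draw and AC wattage below are assumptions for illustration, not rUmX's measurements:

```python
# Sanity-check the heat output and circuit load of four all-core encoding rigs.
# Wattages are illustrative assumptions, not measured values.

W_PER_SYSTEM = 350    # assumed wall draw per 7950X rig under all-core load
N_SYSTEMS = 4
AC_WATTS = 500        # assumed draw of a small 5000 BTU/h window AC
VOLTS = 120           # typical North American branch circuit
BREAKER_AMPS = 15

heat_btu_per_h = N_SYSTEMS * W_PER_SYSTEM * 3.412   # 1 W ≈ 3.412 BTU/h
total_amps = (N_SYSTEMS * W_PER_SYSTEM + AC_WATTS) / VOLTS

print(round(heat_btu_per_h))  # ≈ 4777 — nearly the AC's whole 5000 BTU/h rating
print(round(total_amps, 1))   # ≈ 15.8 — over a 15 A breaker on one circuit
```

With those assumed numbers, the four rigs alone consume almost the entire rated capacity of the AC, and adding the AC itself to the same circuit pushes past a 15 A breaker — which is consistent with running the AC from another room.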
flgt - Wednesday, October 18, 2023 - link
The insane part of what you are doing is encoding in SW. Give up some quality and run QuickSync on an Intel processor along with Arc GPUs for AV1.
rUmX - Monday, October 23, 2023 - link
I did mess around with HW AV1 encoding on an Intel Arc A380. Quality was pretty good, but the file sizes are at least double (for GPU) for very similar or even better quality (for SVT-AV1). I'm not doing live streaming, more like encoding for VOD; in this case file size and bitrates are important, as well as storage use. I'm using SSDs, so smaller file size = better. On top of that, the smaller the size, the more users I can serve at port speed.
meacupla - Wednesday, October 18, 2023 - link
You can certainly drop it in, but it's not an upgrade going from 13th to 14th gen. It's a sidegrade at best. For the price, you would be better off upgrading to watercooling with a 280mm or 360mm radiator.
SanX - Wednesday, October 18, 2023 - link
Ideally, comparisons have to include previous-gen competition too, like the AMD 5950X, to convince people to upgrade.
SanX - Wednesday, October 18, 2023 - link
AMD has to start selling 32-core consumer chips based on their new 16-core chiplets versus the older 8-core ones.
bananaforscale - Wednesday, October 18, 2023 - link
Why?
nandnandnand - Wednesday, October 18, 2023 - link
AMD has not been generous with multi-threading at the entry level, e.g. the 7600/7800X.
I don't agree that they have to start selling 32-cores soon. They have a few ways to address multi-threading with Zen 5 or Zen 6. The easiest one for Zen 5 would be to make a 24-core with an 8+16 configuration. But that probably won't stop them from releasing an entry-level 6-core.
If they boost the core counts of the normal/fast chiplets in the future, then core counts will rise across the board. For example, a 12-core chiplet would probably get disabled down to 8-10 cores for the entry level, instead of 6. A 16-core chiplet could get disabled down to 10-12. That is not happening with Zen 5 as far as we know.
SanX - Wednesday, October 18, 2023 - link
SanX - Wednesday, October 18, 2023 - link
Why don't Intel and AMD make enthusiast 100-200 core 6-7 GHz processors with a 1 kW TDP, for those who don't care how much they consume because they already have 10-20 kW solar panel systems on their roof? Let others be jealous. Green energy proponents will be dancing in joy.
bananaforscale - Wednesday, October 18, 2023 - link
Shut up, troll.
cmdrdredd - Wednesday, October 18, 2023 - link
No, there IS a place for ALL the performance at ANY cost. Some people are really like that. Not everyone lives and dies by power consumption like the comments would have you believe. I don't care; I can pay the bill, and it's really a peanuts difference between a 14900K and a 7950X in terms of what it would do to the power bill. If I'm looking for max performance for my usage, that isn't even a consideration; only performance matters. By that measure the 14900K is better sometimes and sometimes not, so it would come down to very specific use cases. What I'm saying is, everyone here moans about power usage but ignores the fact that not everyone pinches pennies on the power bill or needs to worry about it; some just want the max performance. That's why 4090s exist and sell well.
ItsAdam - Wednesday, October 18, 2023 - link
I own a 4090 and still wouldn't buy a 14900K; I was gonna buy one, then they decided to refresh. I dislike the heat element. This much possible power draw is broken; how can you cool that and maintain peak speeds? It's pointless.
The only ones who could use these are people on proper custom loops or doing LN2 overclocking.
They are literally pointless CPUs, and this new gen of Intel CPUs was just a total waste of time, development, and retail. Just think how much waste they've made for a new gen that's the same; the packaging and everything, what a waste.
Pointless. Like your post.
SanX - Thursday, October 19, 2023 - link
SanX - Thursday, October 19, 2023 - link
I also don't buy this gen of Intel processors, so what? Cheap solar energy and efficient heat pumps will slowly change the mentality. Now the power supply in a PC is 1 kW; soon they will be 2 kW, and rarely will anyone care. People don't care about power efficiency. 99.9% don't even know the price per kilowatt-hour. Kids playing games don't even want to hear about it. If you care, then find solutions for how to use that possible 0.25-1 megawatt of solar power falling on your property. Power efficiency is just a salespeople buzzword.
The Von Matrices - Wednesday, October 18, 2023 - link
They already do make super-high TDP processors. They're called EPYC and Xeon.
SanX - Thursday, October 19, 2023 - link
Server and supercomputer high-core-count chips heavily rely on power efficiency, hence they have clocks of 2-3 GHz. For example, the 64-core EPYC processor is essentially equivalent to a future 24-32 core consumer processor at 6 GHz, at 3-5x the cost. If someone worries too much about power consumption, just buy a couple of used solar panels for 100 bucks and you will cut the cost by half, or 100% with smart backup.
PeachNCream - Wednesday, October 18, 2023 - link
Those peak power numbers are disgustingly bad. I wouldn't want to pay the utility bill for that, so I'm glad I paired a keyboard to my phone and make do with less than 8W peak total system power consumption. Oddly enough, I don't feel as though I'm missing anything without some obnoxious box filled with a CPU like one of these and some obesity-level graphics card.
cmdrdredd - Wednesday, October 18, 2023 - link
You aren't even the target market for any of this, so your comment is useless and pointless. You are not an enthusiast or a gamer, and you don't need the power for work.
ItsAdam - Wednesday, October 18, 2023 - link
I own a 4090 and I wouldn't want a CPU with the same power draw as it. I was waiting for Meteor Lake to upgrade from my 5800X3D, but when I heard it was a refresh I was like, oh no.
Looks like I'm going to be a beta tester for AMD's 6000 series, and I'm quite frankly bored of AMD and its crash-test consumer development.
I know times are changing, and I loved to tinker, but it's getting too long in the tooth with AMD ATM, with all the AGESA "fixes" which usually mean a big performance loss.
I really wanted Intel to come out brawling, but all they're doing is digging their own grave.
They shouldn't have released ANOTHER refresh, especially one as bad as this.
lilo777 - Wednesday, October 18, 2023 - link
You do not pay the utility for peak power consumption. You pay for actual consumption, which is much lower because power peaks are rare and short.
mode_13h - Wednesday, October 18, 2023 - link
> power peaks are rare and short.
Depends on what you're doing. If rendering, video encoding, or lots of software compilation, then not necessarily.
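Both replies are consistent with how metering works: the bill tracks average power integrated over time, so a 430 W peak only matters in proportion to how long the CPU sits at it. A toy example, with made-up duty-cycle profiles to make the point:

```python
# What the meter records: average power over time, not momentary peaks.
# The duty-cycle profiles below are invented for illustration.

def billed_kwh(profile, hours_per_day=8, days=30):
    """profile: list of (watts, fraction_of_time) pairs summing to 1.0."""
    avg_watts = sum(watts * frac for watts, frac in profile)
    return avg_watts / 1000 * hours_per_day * days

gaming = [(430, 0.05), (150, 0.55), (60, 0.40)]   # rare, short peaks
encoding = [(430, 0.90), (60, 0.10)]              # sustained all-core load

print(round(billed_kwh(gaming), 1))    # ≈ 30.7 kWh per month
print(round(billed_kwh(encoding), 1))  # ≈ 94.3 kWh per month
```

Same peak wattage, roughly 3x the billed energy — which is why the "peaks are rare and short" argument holds for bursty desktop use but not for rendering or encoding boxes.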
mode_13h - Wednesday, October 18, 2023 - link
Thanks for continuing to run SPEC2017, but I'm really missing the cumulative scores. Also, I wish we could get cumulative scores on E-cores only and P-cores only, as well as populating that graph with some other popular CPUs, as was done up to the i9-12900K review.
For reference, please see the chart titled "SPEC2017 Rate-N Estimated Total", at the bottom of this page:
https://www.anandtech.com/show/17047/the-intel-12t...
The following page of that review goes on to explore the P & E cores.
Perhaps this would be good material for a follow-on article?
eloyard - Thursday, October 19, 2023 - link
eloyard - Thursday, October 19, 2023 - link
The 2000s called; they want their NetBurst back.
Reinforcer - Saturday, October 28, 2023 - link
Then don't let the motherboard run away with power, lol. Honestly, what is wrong with you reviewers, fixated on how much power it can draw? Set it to Intel's 253W limit and enjoy almost the same performance as one that is consuming stupid amounts of power. It's not rocket science, or do we not know how to set a motherboard up these days?
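For readers on Linux who want to try exactly this, here is a sketch of capping the package power limit through the kernel's powercap (intel-rapl) sysfs interface. The paths are the standard ones on most distros, but the right constraint index, the need for root, and whether the board firmware honors the cap are assumptions to verify on your own system; on Windows the equivalent knobs live in the BIOS or vendor tuning utilities.

```python
# Sketch: read/set a CPU package power limit via Linux's powercap
# (intel-rapl) sysfs interface. Writing requires root, and motherboard
# firmware settings can still override whatever is set here.
from pathlib import Path

# Package 0 on most systems; constraint 0 is the long-term (PL1-like)
# limit and constraint 1 the short-term (PL2-like) one.
RAPL = Path("/sys/class/powercap/intel-rapl:0")

def get_limit_watts(rapl_dir: Path, constraint: int = 0) -> float:
    uw = int((rapl_dir / f"constraint_{constraint}_power_limit_uw").read_text())
    return uw / 1_000_000  # sysfs reports microwatts

def set_limit_watts(rapl_dir: Path, watts: float, constraint: int = 0) -> None:
    path = rapl_dir / f"constraint_{constraint}_power_limit_uw"
    path.write_text(str(int(watts * 1_000_000)))  # needs root

if RAPL.exists():  # only touch real hardware when the interface is present
    print(get_limit_watts(RAPL, constraint=1))  # current short-term cap, in W
```

Setting constraint 1 to 253 (and constraint 0 to 125) would approximate Intel's stated limits; whether performance really stays "almost the same" under that cap is exactly what the review follow-up requested above would measure.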