39 Comments
erinadreno - Friday, January 27, 2023 - link
No.
nandnandnand - Friday, January 27, 2023 - link
Practically useless, even at 720p. Looking forward to the 7000X3D.
Kangal - Friday, January 27, 2023 - link
Sure, the i9-13900KS can be seen as "the best"... but it's hard to justify in the real world when the Intel Core i5-13600K is right there! You're paying half as much and using less power to achieve a very impressive score, right up there with the best.
Arguably even more impressive is the AMD Ryzen 7 7700, which trades blows with that chip in performance, is competitively priced, uses significantly less power, and, most importantly, has a long support window (AM5).
I hope we continue to see this trend, with the luxury products making it plain how good the mid-to-high-range products have become. In part this is a market correction driven by new competition. Consumers rejoice!
Andresen - Friday, January 27, 2023 - link
A very expensive and fast CPU is tested with some very slow memory. I don't see the point in doing that. The slowest memory rated at DDR5-5600 in my local webshop is CL40, the most common is CL36, and I could choose CL28. CL46 is not an option. I'm not in the market for this CPU, but if I were, I would combine it with DDR5-7200 at CL32. If memory overclocking is not an option, then I would use DDR5-5600 CL36.
Makaveli - Friday, January 27, 2023 - link
Have to agree with you on this. No one buying this CPU is going to pair it with DDR5-5600 memory.
Andresen - Monday, January 30, 2023 - link
I forgot to add that, apart from my minor criticism regarding the choice of memory, it is nice that AnandTech makes these thorough reviews. I don't think I can find such quality reviews anywhere else for PC-class hardware.
Great_Scott - Wednesday, February 1, 2023 - link
The reviews are great, although I can't imagine there's a single hand-built Ryzen PC in existence that's using memory at JEDEC timings. It was interesting to see how the Ryzen 7950X was able to keep up in a large number of tests.
GeorgeV - Saturday, February 11, 2023 - link
Honestly, I wish these high-end CPUs were tested at the speeds they can run 128 GB of RAM at. From what I've seen around the web, the tested RAM speeds here are higher than either platform can manage reliably when all four DIMM slots are filled.
jospoortvliet - Monday, February 13, 2023 - link
Well, Andresen thinks it should be faster and you think slower. I guess they did it right?
DanNeely - Friday, January 27, 2023 - link
"*We changed the motherboard to the GIGABYTE Z690 Aorus Master as the MSI MPG Z790 Carbon WIFI we used for our previous 13th Gen Core series reviews refused to play ball. We don't like to make these changes lightly, but we weren't able to source another Carbon in time for this review"

Did it fail outright, or does it still work for other chips but choke on this extra power-hungry one?
Gavin Bonshor - Friday, January 27, 2023 - link
It is doing weird and crazy things with all my chips. On half of them, it will even refuse to POST. I've reached out to MSI UK, but I'm yet to receive a response. At this point, I may as well re-test all the 13th Gen SKUs on another board for parity purposes.
PeachNCream - Friday, January 27, 2023 - link
I can think of better things to do with 360 W of power, like running ALL of the lights in my home, running my laptop at full load, charging and using my phone, and feeding the router necessary to support my internet connection (and I would still have close to 100 W left over for other things). With a 13900KS, you get a CPU and nothing else: not even cooling for said processor, let alone the rest of the hardware required to support it. Enthusiasts are apparently idiots with bottomless lines of credit that Intel thinks will brainlessly pay interest to VISA and, sadly, they might be right.
jgrimm2364 - Thursday, February 2, 2023 - link
I paid cash for mine and love it. It was cheaper than the i7-4930K I built 8 or 9 years ago. When a processor kicks most others' butts for that long, it's worth paying for. It's like buying a performance car vs. a grocery-getter. It performs!
PeachNCream - Sunday, February 5, 2023 - link
Car people "logic" also escapes me, because of how self-centered it is to expend more of our limited energy resources than necessary to accomplish something. But huge pickup trucks, gigantic SUVs, and pointlessly overpowered cars are around in large numbers, so at least you can safely say you followed the school of fish into the proverbial mouth of a demise of your own making as you suffer while living on a trashed planet. GG Earth
dontlistentome - Friday, January 27, 2023 - link
320 W. Just a disgusting waste. This is the most ill-conceived CPU since the dying days of the Pentium 4.
blppt - Saturday, January 28, 2023 - link
I think you have blotted the 9590 out of your memory.
Jorgp2 - Saturday, January 28, 2023 - link
Or the 7950X, which still uses 260 W.
emike09 - Friday, January 27, 2023 - link
Please add Microsoft Flight Simulator to your benchmark tests. Few games are as demanding on the CPU as MSFS. When you look at the posted gaming benchmarks, there's little difference between most CPUs in each game benchmarked. Throw in a game that really showcases what a CPU can do in a CPU-demanding title.
ballsystemlord - Friday, January 27, 2023 - link
Is MSFS really CPU-intensive? Or does it just use one thread, much like Quake III? If that's the case, then you could just benchmark ioquake3 or OpenArena.

Or, for an even more hilarious result, why not try a multi-threaded Python program? Then you could watch the GIL (Global Interpreter Lock) wreak havoc on the tested CPUs as the threads contend for it.
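[Editor's note: a minimal sketch of the kind of GIL-bound workload the comment above describes — pure-Python CPU-bound threads that cannot run in parallel on CPython, so extra threads add no speedup regardless of core count. The function names and iteration counts are illustrative, not from any real benchmark suite.]

```python
import time
from concurrent.futures import ThreadPoolExecutor


def busy(n: int) -> int:
    # Pure-Python CPU-bound loop; on CPython it holds the GIL while running,
    # so only one such thread can execute bytecode at a time.
    total = 0
    for i in range(n):
        total += i * i
    return total


def timed(workers: int, n: int = 2_000_000) -> float:
    """Run `workers` copies of busy() on separate threads; return wall time."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(busy, [n] * workers))
    return time.perf_counter() - start


if __name__ == "__main__":
    # With the GIL, 4 threads do ~4x the work in ~4x the time —
    # a 32-thread CPU fares no better than a 4-thread one here.
    print(f"1 thread: {timed(1):.2f}s, 4 threads: {timed(4):.2f}s")
```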
ballsystemlord - Friday, January 27, 2023 - link
PS: You'd have to request a high frame rate, like 500 fps, for a Quake III client to saturate the CPU.
They won't, but for a practical reason. It's an always-online, frequently updated game, and test data would only be relevant for a single back-to-back comparison between parts. There's no way to accurately compare to previous test results.
Peskarik - Saturday, January 28, 2023 - link
Exactly! Please add Microsoft Flight Simulator, AnandTech!
dontlistentome - Saturday, January 28, 2023 - link
There was a screenshot floating around not too long ago (from AnandTech, IIRC) showing why not: the activation process is a nightmare if you try to flip it between machines for testing. They want to use it; MS stands in the way.
scottrichardson - Friday, January 27, 2023 - link
I know this is primarily a PC/Windows-oriented review, but I would have loved to have seen the M2 Pro/Max included here. I know it would have been smoked in most of these benchmarks, especially multi-core, but it would still be interesting to see the power usage and relative performance. Whether we want it or not, there's a 3rd player in the CPU game now!
dontlistentome - Saturday, January 28, 2023 - link
There's a 3rd player taking part, but in a different game. If it can't run the same software or platform, it's irrelevant other than for comparing instruction sets.
Tunnah - Friday, January 27, 2023 - link
1/3rd more power for a single-digit performance increase. How very Intel.
Peskarik - Saturday, January 28, 2023 - link
AMD Ryzen 9 7900 wins for me.
Carmen00 - Saturday, January 28, 2023 - link
10-20 MHz extra on average, for the "Favoured Cores" only, and for an additional $110 and 25 W more. I know there are plenty of Greater Fools in the tech space, but surely there is a limit to how much even they will tolerate?
albie_ - Saturday, January 28, 2023 - link
"The peak power figures from our power testing show that the Core i9-13900K drew an impressive 359.9 W at full load."

I'm dumbfounded by this statement. Is this supposed to be a selling point and good to have? What sort of tech journalist are you?
Ryan Smith - Saturday, January 28, 2023 - link
"What sort of tech journalist are you?"

The sarcastic kind.
That statement was fully tongue in cheek. Intel made a consumer desktop chip that draws 360 Watts. It's all a bit silly, innit?
Arbie - Saturday, January 28, 2023 - link
FWIW, I got the wrong impression from that statement too.

I was a little more put off by the remark that "360 W isn't that easy to cool ... with a typical tower air cooler". That implies that someone clever enough could reasonably do so. I don't think it's possible, and if that's true the article should just say so, for the benefit of novice readers who (as I know from my own experience) will assume that *they* can.
Gavin Bonshor - Saturday, January 28, 2023 - link
You can technically use a premium tower cooler with this CPU, such as a Noctua NH-D15, and still see decent levels of performance, but you won't get all of it.

The issue is that you shouldn't, because thermal throttling will roll back the frequency and CPU V-core until it finds a thermal load the cooler can sustainably dissipate. As the conclusion makes clear, custom-loop water cooling is the way to go.
A5 - Saturday, January 28, 2023 - link
Do we know for sure that a custom loop could cool it? You alluded to it in the piece, but 320 W in that small an area is going to be hard to get proper heat transfer from.
cyrusfox - Monday, January 30, 2023 - link
Custom-loop cooling can easily transfer that heat out (water has a huge heat capacity). As a good rule of thumb, each 120-140 mm radiator section is capable of roughly 150 W of cooling, depending on fan speed and temperature delta. I run dual 280 rads, so roughly 600 W of cooling capacity. I only have 2 fans installed currently, and my system draws 400 W (13700K + GTX 1080), yet I never see temps above the mid-60s on GPU or CPU. I could bring this down further by increasing airflow (fan speed / additional fans).

This chip was built for OC world records (LN2) and for those with custom loops who aren't bothered by the price. I will likely opt for a 13900K and skip the extra binning tax personally, but it's nice to see them push the envelope in this space; it does improve the full product stack beneath it, as they bin for top turbo speed.
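[Editor's note: the back-of-the-envelope math in the comment above, written out. This uses only the commenter's stated rule of thumb (~150 W per 120/140 mm radiator section); the constant and function names are illustrative, and real capacity varies widely with fan speed and coolant-to-air delta-T.]

```python
import math

# Commenter's rule of thumb: ~150 W of dissipation per 120/140 mm
# radiator section (heavily dependent on fan speed and delta-T).
WATTS_PER_SECTION = 150


def loop_capacity(sections: int) -> int:
    """Approximate heat (W) a loop's radiators can dump."""
    return sections * WATTS_PER_SECTION


def sections_needed(load_watts: float) -> int:
    """Minimum radiator sections for a given sustained load."""
    return math.ceil(load_watts / WATTS_PER_SECTION)


# Dual 280 mm rads = 4 x 140 mm sections -> ~600 W of headroom,
# comfortably above the 13900KS's ~360 W peak draw.
print(loop_capacity(4), sections_needed(360))
```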
qwertymac93 - Saturday, January 28, 2023 - link
Do you think you could test various memory speeds to see how the latest AMD and Intel chips react to higher and lower RAM speeds? Investigate whether latency or frequency is more important?
Silver5urfer - Tuesday, January 31, 2023 - link
The 12th Gen to 13th Gen upgrade is just hogwash: a mediocre boost and a massive increase in heat density from the high clock speeds. 8P-core saturation is the big deal here. Pumping up the useless E-cores is just for those MT workloads, and as a bonus, on Intel CPUs you get very low 3.2 GHz base clocks vs. AMD's 4 GHz base clocks on all Zen 4 processors. The E-cores' base clocks are even worse.

$730 MSRP plus tax for a binned chip, but the RPL Refresh is hitting soon, so this purchase is going down the drain no matter how good the binning is. Also, with 13th Gen, SP binning is not as big a deal as it was with 10th Gen, because most 13900K and i9 KF parts are already top bins; Intel selected high-quality silicon. With the 10900K, by contrast, there were a ton of low-SP-rated i9s, and some with poorer clock-speed ratios ended up as the 10850K. Many 13900Ks can hit a 5.8 GHz all-core OC (8P max) at 400 W+ in all-core Cinebench-type workloads.
Plus, it's 39-40K on the 7950X vs. 43K max on an OC-optimized 13900K, but at what wattage? 420 W on Intel vs. 270-300 W max on AMD. It makes no sense to buy into this socket-bending LGA platform. Only if you care about the chipset-bandwidth shortage on Zen 4 (a PCIe 4.0 x4 link vs. 4.0 x8 on RPL) would you buy Intel; the performance difference is otherwise negligible. As a bonus, you don't have to buy some aftermarket ILM brace, hunt for 8000 MHz kits and try to optimize them without WHEA errors (granted you have a top MC-SP-rated i9), or try to cool 350-400 W of concentrated heat with virtually no possibility of using an air cooler. Plus you get a Zen 5 upgrade on the same socket, which will wipe the floor. Look at AMD's gen-on-gen improvements: Zen 2 to Zen 3 to Zen 4 were all giant leaps, whereas Intel's 10th-11th-12th gains were uneven (the 11th was sabotaged by the backport), and 13th is a small step over 12th.
Silver5urfer - Tuesday, January 31, 2023 - link
I forgot to mention AVX-512: Zen 4 has a massive advantage over Intel there, and RPCS3 is a major workload, especially for those old God of War, MotorStorm, Ratchet & Clank, Sonic, Red Dead Redemption, and a ton of other solid titles on Sony's PlayStation 3 (the PS4 is just junk).
byte99 - Sunday, February 19, 2023 - link
Looking at many of your benchmarks (e.g., the CPU Benchmark Performance: Science suite), there's no way to tell from what you've posted whether they are single-core or multi-core, and whether they are CPU-only or CPU+GPU.

Ideally this info should be added to the graphic header for each benchmark. You could put it in small type, right after where you say, e.g., "Time in Seconds (lower is better)". Once you've updated your headers, it would be no extra work to provide that info.