
  • Flunk - Friday, March 19, 2021 - link

    Pretty lame to limit this feature to only the most expensive consumer processors, especially since their major competitor doesn't do that.
  • meacupla - Friday, March 19, 2021 - link

    Well, how else could they artificially segment i9 and i7?
  • nandnandnand - Friday, March 19, 2021 - link

    They could add another 2 cores to the i9... oh wait (for Alder Lake).
  • RU482 - Friday, March 19, 2021 - link

    yeah...WTF were they thinking downgrading the i9 core count from 10 to 8 on the 10th to 11th gen transition?
  • deepblue08 - Friday, March 19, 2021 - link

    Because they are stuck on the 14nm process. They cannot add much to their chips.
  • dotjaz - Friday, March 19, 2021 - link

    That's simply not true. They could have just removed AVX512 and kept 10 cores; they chose not to.
  • Jorgp2 - Friday, March 19, 2021 - link

    And you're spouting nonsense.

    AVX-512 doesn't add much to the die size; it's mostly the larger core and caches.
  • rahvin - Saturday, March 20, 2021 - link

    AVX-512 adds a ton of heat if it's active. Either they'd have to disable it or just limit the core count so the processor doesn't trigger its thermal clock limits that handicap performance.

    The OP was right, they made the decision to keep it and remove the cores rather than go the other direction.
  • Calin - Monday, March 22, 2021 - link

    While AVX-512 is a thermal hog, it is efficient in work/watts - 8 cores with AVX-512 will beat 10 cores without in the same power budget.
  • Beaver M. - Monday, March 22, 2021 - link

    AVX512 cores will beat 4 times the amount of non-AVX512 cores, actually.
  • Calin - Monday, March 22, 2021 - link

    AVX512 is much more "baked in" than another two cores.
    I assume they got better timings with 8 cores as compared to 10 (Rocket Lake suffers at least a bit in internal timings - cache accesses, memory accesses - compared to its "predecessor").
    (They most certainly got lower power use and a smaller die by using 8 instead of 10 cores, even though one core is now a small proportion of almost any processor - x64, ARM, Apple M1, Apple phone/tablet.)
  • GeoffreyA - Monday, March 22, 2021 - link

    Looks like it doesn't cost them much, at least from a silicon point of view, to put in AVX512. I could be wrong, and will research this further, but it appears that AVX512 has been implemented by combining the 256-bit FMAs from Port 0 and 5, one from each. An FMA was added to Port 5 (and a "MulHi").

    In short, if I got this right, it only took adding another FMA (and MulHi?); and combining both 256-bit FMAs---one from Port 0 and one from 5---creates a single AVX512 "port," or rather the high and low order bits are dispatched to 0 and 5. Doesn't seem like a big cost, silicon-wise, though in execution terms it ends up eating quite a bit of power. Perhaps that's why AMD doesn't mind adding it in Zen 4: all it would take is combining the two 256-bit ports on the FP end when AVX512 code is being executed. On the other hand, perhaps the main burden is actually the decoding of AVX512 in the front end, and that's where RKL is taking such a massive hit. I don't know.

    https://www.anandtech.com/show/14514/examining-int...

    https://en.wikichip.org/w/images/thumb/2/2d/sunny_...
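
    A minimal intrinsics sketch of the 512-bit FMA operation discussed above (an illustration only, not code from the linked articles; it assumes a compiler with AVX-512F support, e.g. gcc -O2 -mavx512f, and an AVX-512-capable CPU):

        // d[i] = a[i] * b[i] + c[i], 16 floats per 512-bit FMA instruction.
        // On a Sunny Cove-class core, this is the operation the comment above
        // describes as being serviced by the fused Port 0 + Port 5 FMA pair.
        #include <immintrin.h>
        #include <stdio.h>

        static void fma512(const float *a, const float *b, const float *c, float *d, int n) {
            for (int i = 0; i < n; i += 16) {          // assumes n is a multiple of 16
                __m512 va = _mm512_loadu_ps(a + i);
                __m512 vb = _mm512_loadu_ps(b + i);
                __m512 vc = _mm512_loadu_ps(c + i);
                _mm512_storeu_ps(d + i, _mm512_fmadd_ps(va, vb, vc));  // one 512-bit FMA
            }
        }

        int main(void) {
            float a[16], b[16], c[16], d[16];
            for (int i = 0; i < 16; i++) { a[i] = (float)i; b[i] = 2.0f; c[i] = 1.0f; }
            fma512(a, b, c, d, 16);
            printf("%f\n", d[15]);   // 15 * 2 + 1 = 31
            return 0;
        }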
  • quadibloc - Monday, March 22, 2021 - link

    And I think they made a good choice. When AVX-512 is used, it almost doubles the chip's performance, which is why I think these chips will look better compared to AMD's chips than people think at the moment. However, since you can get 16 cores from AMD, an 8-core from Intel, even if it were twice as good because of AVX-512, would only match it in throughput. But it might really shine in single-thread performance.
  • Spunjji - Monday, March 22, 2021 - link

    "When AVX-512 is used, it almost doubles the chip's performance"
    No, not when it's used in the way most applications will actually use it - as a small fraction of a large code base.

    Have you even read the review to see how much power these chips suck down in AVX-512 mode? They're fast, sure, but the vast majority of users won't have sufficient cooling to get peak performance running AVX-512-heavy code.
  • peevee - Tuesday, March 30, 2021 - link

    Nobody but testers cares about single-threaded AVX-512 performance. If a developer cared enough to integrate AVX-512 into some performance-critical code, they would have made that part of the code multithreaded 15+ years ago.
  • Duwelon - Saturday, March 20, 2021 - link

    They could have ditched the useless iGPU they keep touting. Higher-end Zen 3 models don't have iGPUs and they're selling like hot cakes. I wish Intel would shake up their marketing department, because they're basically forcing their engineers to add features enthusiasts (mostly) don't want. Some do really want the iGPU for QuickSync etc., but without the iGPU they could have easily had a SKU with 2 to 4 more cores in the same die space.
  • Duwelon - Saturday, March 20, 2021 - link

    Also, before someone says -KF models: those still have the iGPU taking up die space and silicon; it's just disabled.
  • eastcoast_pete - Saturday, March 20, 2021 - link

    Until the recent pricing craze with dGPUs, I would also have agreed; many of us here don't really care for the iGPU. However, nowadays, having an iGPU allows you to at least proceed with a build and have a functioning PC while waiting for the dGPU (or for prices to come down). Plus, there are many users who like a reasonably fast CPU but don't want a dGPU.
  • Qasar - Saturday, March 20, 2021 - link

    eastcoast_pete
    IF you are upgrading and you still have access to the old parts, why can't the previous video card be carried over and used until the new video card can be bought? In some cases, that would make the argument over whether the CPU has an iGPU or not pointless.
  • eastcoast_pete - Saturday, March 20, 2021 - link

    I am currently considering switching from a gaming laptop with a dGPU back to a desktop setup, so reusing an existing dGPU in a desktop is not an option I have.
    The other potential build I am looking into is an HTPC, and using the i5-11400 or 11500 CPU might be a good start for that. Rocket Lake's support for AV1 and 4K 10-bit HDR output is an asset for this. At the higher price levels and for high-end gaming, I don't think the Rocket Lake i9 or i7 are that competitive with the larger Zen 3 Ryzens.
  • Calin - Monday, March 22, 2021 - link

    If you had a low-end/budget GPU on an older-generation system, the integrated AMD GPU will match that or more.
    And maybe the old PC is getting donated (with its GPU in place).
    Or maybe the GPU was a dud, or it has some other issues
    Plenty of reasons for an integrated GPU to be useful (even if only for a while).
  • GeoffreyA - Tuesday, March 23, 2021 - link

    "Plenty of reasons for an integrated GPU"

    For people like me who play games only rarely now, and are on a slender budget, the iGPUs are a blessing.
  • LiKenun - Saturday, March 20, 2021 - link

    Their market is not enthusiasts.
  • Beaver M. - Monday, March 22, 2021 - link

    Most of their customers need it. You can't compare that to Zen 3 numbers, since Zen 3 has massive availability problems and it's mostly enthusiasts who buy high-end AMD. That's why Intel is still in front in terms of sales and market share.
    Even Intel can't produce 2 different versions of dies with all the shortages. You should know that by now.
  • Qasar - Monday, March 22, 2021 - link

    " its mostly enthusiasts who buy high end AMD " same can be said about high end intel, whats your point ?
  • Kamen Rider Blade - Friday, March 19, 2021 - link

    Intel wanted you to have the AVX-512 instruction set available for some reason. Ergo the consumer lost 2 cores because the transistor real estate was given over to AVX-512.
  • rolfaalto - Friday, March 19, 2021 - link

    For those of us who want to run AVX-512 fast, this chip is brilliant -- fastest single-threaded chip in the world and obliterates the competition for a cheap price!
  • Qasar - Friday, March 19, 2021 - link

    but ONLY at AVX-512
  • Hifihedgehog - Friday, March 19, 2021 - link

    And at 300 watts... or more. We’ve not even seen yet just how hot the 11900K gets with AVX-512.
  • dotjaz - Friday, March 19, 2021 - link

    And what would you actually gain from AVX512 as a consumer?
  • Spunjji - Monday, March 22, 2021 - link

    "For those of us who want to run AVX-512 fast"
    Good to know there is a CPU out there for you and the little mouse that lives in your pocket.
  • eastcoast_pete - Saturday, March 20, 2021 - link

    Don't forget that right now, AVX512 is the key remaining differentiator Intel has vs. Zen3. If Intel is smart, they'll promote the hell out of AVX512 use, provide programming tools for it etc. For pretty much all other use scenarios, Zen3 has them beat for now, and Intel knows that.
  • Qasar - Saturday, March 20, 2021 - link

    Even then AVX-512 is pointless; what programs, OTHER than benchmarks, use it? But at the same time, IF AMD also supported it, then maybe more programs would use it. I have also read that AMD doesn't support it because they feel the trade-offs of implementing it (i.e. increased power usage, increased temps, lower clocks) just are not worth it right now.
  • smilingcrow - Saturday, March 20, 2021 - link

    The issue isn't the power consumption, as you can use a larger AVX offset. That way you manage the maximum wattage but still benefit from the superior power efficiency of AVX-512.
    But the lack of software support and the extra die space militate against it.
    Although I recently bought some DAW software that supports it.
  • Qasar - Saturday, March 20, 2021 - link

    " as you can use a larger AVX offset. " then what would be the point ?
    " superior power efficiency of AVX-512. " going by how much power avx 512 uses, it doesnt seem like its very power efficient :-)
  • GeoffreyA - Sunday, March 21, 2021 - link

    I was hoping AMD would boycott AVX512, and be firm about why, but now that they're implementing it---in Zen 4?---it will sadly remain on CPUs for a long time, whether much used or not.
  • Beaver M. - Monday, March 22, 2021 - link

    "Dont implement new instruction sets, because no software can use them".
    You have any idea where we would be if people would listen to such nonsense?
  • LiKenun - Saturday, March 20, 2021 - link

    First they gotta sell AVX-512 to the developers. I see AVX-512 and I think to myself… this only works on Intel, so if I optimize for it, it takes away limited resources that I could dedicate to code that works for 100% of the market. Depending on the customer the software is for (and Intel is found in a lot of business environments), it may or may not be worth it.
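
    One common middle ground here (a hedged sketch, not something from the comment above) is runtime dispatch: ship the portable path for everyone and take the AVX-512 path only when the CPU reports support. GCC and Clang expose the check as __builtin_cpu_supports:

        // Runtime dispatch between a portable kernel and an AVX-512 kernel.
        // Assumes GCC or Clang, which provide __builtin_cpu_supports("avx512f").
        #include <stdio.h>

        static void sum_portable(const float *x, int n, float *out) {
            float s = 0.0f;
            for (int i = 0; i < n; i++) s += x[i];
            *out = s;
        }

        // Placeholder: in a real project this would be a vectorized kernel in a
        // separate file built with -mavx512f; kept scalar here so the sketch compiles anywhere.
        static void sum_avx512(const float *x, int n, float *out) {
            sum_portable(x, n, out);
        }

        static void sum_dispatch(const float *x, int n, float *out) {
            if (__builtin_cpu_supports("avx512f"))
                sum_avx512(x, n, out);    // only CPUs that actually report AVX-512
            else
                sum_portable(x, n, out);  // everyone else, i.e. 100% of the market
        }

        int main(void) {
            float x[8] = {1, 2, 3, 4, 5, 6, 7, 8}, s;
            sum_dispatch(x, 8, &s);
            printf("%f\n", s);  // 36.0
            return 0;
        }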
  • whatthe123 - Sunday, March 21, 2021 - link

    Your compiler can handle an AVX-512 build for you. The onus is on Intel getting AVX-512 into people's hands without crazy-high power draw; hand-optimizing for AVX-512 isn't even part of the equation at this point unless you're writing software for personal use/enterprise.
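
    As a rough illustration of that point: a plain scalar loop like the one below can be auto-vectorized to AVX-512 purely with build flags (for example -O3 -mavx512f -mprefer-vector-width=512 on GCC/Clang, or /arch:AVX512 on MSVC); whether the compiler actually emits 512-bit code still depends on its cost model and tuning defaults.

        // saxpy.c - nothing AVX-512-specific in the source; the vector width is a build decision.
        //   gcc -O3 -mavx512f -mprefer-vector-width=512 -S saxpy.c   (inspect the output for zmm registers)
        void saxpy(float a, const float *x, float *restrict y, int n) {
            for (int i = 0; i < n; i++)
                y[i] = a * x[i] + y[i];   // the compiler may turn this into 512-bit FMAs
        }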
  • Carmen00 - Sunday, March 21, 2021 - link

    AVX-512 can only be used for algorithms that have some non-trivial SIMD (single instruction, multiple data) component. It's not just a case of saying "change the compilers" or "educate the developers". If the algorithm isn't amenable to SIMD (and FYI, many many many algorithms aren't), then AVX-512 is just not applicable. And even when an algorithm DOES have a SIMD alternative, that is often more difficult to understand, debug, and maintain. Since SIMD algorithms and scalar algorithms can be quite different, there's no way for compilers to go automatically from one to the other, except in the most trivial of cases.

    The way I see it, the situation is only getting worse for Intel in this area because there's already a far better SIMD device in most computers - it's called a GPU. One of the big barriers to using that is that the programming model is different and the copy-time might offset any advantage gained. But if those are addressed ... and, by the way, AMD and others are addressing those ... then why would anyone want to use a janky SIMD-lite instead?
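
    A textbook-style illustration of that point (a sketch, not from the comment above): the first loop below is trivially data-parallel and vectorizes cleanly, while the second has a loop-carried dependency - each step needs the previous result - so no compiler flag turns it into wide SIMD as written.

        // SIMD-friendly: every element is independent, so a compiler can process
        // 8 or 16 of them per instruction.
        void scale(const float *in, float *out, float k, int n) {
            for (int i = 0; i < n; i++)
                out[i] = k * in[i];
        }

        // Not SIMD-friendly as written: y[i] depends on y[i-1] (a simple IIR filter).
        // Reformulating this for SIMD is an algorithm change, not a compiler switch.
        void iir(const float *x, float *y, float a, int n) {
            y[0] = x[0];
            for (int i = 1; i < n; i++)
                y[i] = x[i] + a * y[i - 1];
        }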
  • whatthe123 - Monday, March 22, 2021 - link

    There is no need to "change compilers"; AVX-512 flags are available on common compilers. If SIMD were a non-starter, there would be no use for SIMD extensions like SSE at all, yet the vast majority of modern software supports SSE and most of it supports AVX1-2. AMD supports AVX2 and performs very well, and they are most likely going to add AVX-512 support to their enterprise Genoa chips. Please stop spouting nonsense.
  • Beaver M. - Monday, March 22, 2021 - link

    The official explanation for this wasn't enough for you?
    Well, IDK what else one can say then, if you won't accept simple physical limitations.
  • MonkeyPaw - Friday, March 19, 2021 - link

    It wouldn’t be Intel if they did it any other way.
  • YB1064 - Friday, March 19, 2021 - link

    Does it make the RGB LEDs glow brighter as well? Intel, back to the good ol' days of shenanigans. These guys just can't seem to get it together.
  • XacTactX - Friday, March 19, 2021 - link

    My hunch is that Intel is doing this to provide more meaningful segmentation between the 11700K and the 11900K. If Intel were not using features like this and DDR4-3200 Gear 1 to cap the 11700K, the performance of the two would be so close that people wouldn't have any reason to spend extra for the 11900K. There is not a strong reason for the 11900K to exist when both products are 8C16T and one is clocked 10% higher.
  • Zingam - Monday, March 22, 2021 - link

    Why is Apple able to have just a couple of SKUs and Intel needs gazillions? Is it binning? Doesn't Apple do binning? Or is Intel tech so bad and inconsistent?
  • GeoffreyA - Monday, March 22, 2021 - link

    To show they're active, not stagnant, and producing tonnes of stuff.
  • Spunjji - Monday, March 22, 2021 - link

    Once upon a time, it was binning.

    Then some marketing genius at Intel realised they could maximise profitability by forcing people who need certain features to buy the most expensive SKUs that they might not otherwise need.

    Now it seems like they do it as some sort of self-justification for keeping an entire department of people whose job it is to map out mind-crushingly complex matrices of features.
  • GeoffreyA - Tuesday, March 23, 2021 - link

    Well said.
  • Calin - Monday, March 22, 2021 - link

    "Pretty lame to limit this feature to only the most expensive consumer processors"
    Business as usual at Intel.
  • Spunjji - Monday, March 22, 2021 - link

    Yup. It's Intel's SOP, but it makes less than no sense to limit this feature to a CPU that - in the majority of cases - will never see the feature used, due to overclocking being preferred instead.
  • TristanSDX - Friday, March 19, 2021 - link

    So many different turbos; time to reduce them.
  • nandnandnand - Friday, March 19, 2021 - link

    It's time for more underclocking. Save the planet.
  • Pinn - Friday, March 19, 2021 - link

    You mean reduce errors on clean power research?
  • JayNor - Friday, March 19, 2021 - link

    What kind of warranty preserving overclocking would this enable with one of the Cryo Cooling solutions?
  • shabby - Friday, March 19, 2021 - link

    Can't wait for "ultra, who cares about wattage, boost" feature...
  • at_clucks - Friday, March 19, 2021 - link

    So regular Intel CPU operating mode?
  • Makaveli - Friday, March 19, 2021 - link

    "AMD does the same thing, and they call it Precision Boost 2, and it was introduced in April 2018 with Zen+."

    Is this a reference to PBO, since PBO2 is only on Zen 3?
  • drexnx - Friday, March 19, 2021 - link

    PB2 and PBO are completely different.

    I think technically even Raven Ridge processors implemented PB2-like behavior vs the fixed schedule on Summit Ridge.
  • Makaveli - Friday, March 19, 2021 - link

    I'm aware they are different, but PBO2 hasn't been out since 2018, so that is why I believe the statement is referencing the first version.
  • Slash3 - Saturday, March 20, 2021 - link

    PB and PBO/PBO2 are separate variations. The former is inherent boost control and the latter is a much more aggressive, user enabled auto overclock setting.
  • SaturnusDK - Friday, March 19, 2021 - link

    No. Precision Boost 2 introduced the feature with Zen+. PBO and PBO2 are iterations on that.
  • JayNor - Friday, March 19, 2021 - link

    "BIOSes enabling ABT are only being distributed now "

    so, will we see a new benchmark update using this feature with the parts purchased from Germany, prior to the official release?
  • nandnandnand - Friday, March 19, 2021 - link

    Nope. Only the i9-11900K and i9-11900KF get this feature. i7-11700K is unaffected.
  • Qasar - Friday, March 19, 2021 - link

    Why? Chances are this still won't change much; Tiger Lake will still use a lot of power, if not more, and it will still be a dud.
  • Ian Cutress - Friday, March 19, 2021 - link

    Only i9-K and i9-KF are affected, not i7-K.
    Plus, ABT isn't enabled by default. We test at default. But we might run some numbers.
  • Assimilator87 - Saturday, March 20, 2021 - link

    Ian, since you test at default settings, if the motherboard chosen for a CPU review happens to have MCE enabled by default, is that left on?
  • dsplover - Friday, March 19, 2021 - link

    Just build a better CPU. Higher IPC, cooler thermals, etc.
    This marketing crap to justify a decade-old 14nm design that has lost market share to AMD has become tiresome; it's just new ways to sell a chair.
  • GeoffreyA - Monday, March 22, 2021 - link

    When content is lacking, one has to rely on marketing to save the day. We'll see this for a while until Intel truly delivers. Even then, I suppose, the Marketing will still go on. This is what the folk at Intel are good at. Besides, all those millions have got to go somewhere: and so PowerPoint, Impress, and Photoshop work round the clock.
  • GeoffreyA - Monday, March 22, 2021 - link

    One request I have for the Intel Marketing Department, if it falls under their domain. Please come up with some better names next time and get rid of all these Lakes and Coves. Intel used to think of some choice names before (Coppermine, Northwood, Sandy/Ivy Bridge, Clover Trail), so please crack out a map from geography class: what's more, it'll be a great deal of fun.
  • Beaver M. - Monday, March 22, 2021 - link

    I'm taking it. IDK what you want from a CPU, but I only care about real world performance, and only secondarily about power draw. If they give me 200 extra MHz for free, I would be stupid not to take it.
    Some special kind of AMD users seem to hate additional performance... but only if it's for Intel. ;)
  • Spunjji - Monday, March 22, 2021 - link

    "I only care about real world performance"
    Then you'll be buying an AMD CPU, presumably? 😁

    Jokes aside, though, did you read the bit where they pointed out that this is only on the i9 SKU? The only difference between this and MCE is that MCE is guaranteed up until your CPU hits the thermal throttling limit, and this just isn't - so by your own metric, you want MCE and not this "feature".
  • Bagheera - Monday, March 22, 2021 - link

    some special kinds of Intel users think they actually get more performance with Intel chips.
  • GeoffreyA - Tuesday, March 23, 2021 - link

    That is the power of Intel Inside and GenuineIntel.
  • dwillmore - Friday, March 19, 2021 - link

    AMD introduced this back in April 2018 with Zen+, where they had a processor that actually had thermal headroom and could make use of it--especially the lower-end processors, which had more headroom.

    Intel introduces it in March 2021, and only on high-end processors made from a 10nm design backported to a fairly power-hungry 14nm process where they have little thermal headroom--so the feature doesn't end up doing all that much.
  • nandnandnand - Saturday, March 20, 2021 - link

    It presumably helps create a performance gap between the barely distinguishable i9 and i7, and makes sure that the i9-11900K wins at gaming consistently over Zen 3. Intel's fans will pony up the cash, and AMD's prices will drop, so it's a win-win.
  • Qasar - Saturday, March 20, 2021 - link

    " and makes sure that the i9-11900K wins at gaming consistently over Zen 3. "
    how would you know this ? has there been a review of this posted ?
    going by the 11700k review on here, the 11700k looks to lose more often to the 5800x then it wins, even losing in some games to the 9900k. the 11900k may close that gap, but it may not be by much, and, while using more power then the 11700k does.
  • SaturnusDK - Sunday, March 21, 2021 - link

    Not sure how you see the 11900K winning over Zen 3 in games. By CPU MSRP alone, the 11900K competes directly with the 5900X, which is a few percent better overall than the 5800X. Given the preview test of the 11700K, it seems unlikely that the 11900K will bridge the gap up to the 5900X. It may be relatively on par with the 5800X, but then you're looking at a $90 price disparity just going by the MSRP, never mind the added cost for more expensive motherboards, coolers and PSUs.
    So the 5900X will almost certainly continue to be the gaming king, and will slaughter the 11900K in everything else except extremely niche AVX-512 tests that aren't really useful for any consumer, and never will be.
    Availability is really the only thing still holding AMD back. Although the 5600X/5800X now seem to be in general stock everywhere at MSRP, the 5900X/5950X are still hard to find. Two of my friends have picked up one each in the last 14 days at just $20 and $30 over MSRP, so it seems that is improving as well.
  • Spunjji - Monday, March 22, 2021 - link

    The sort of lolcows who buy an i9 K and don't overclock but do bung a chiller on top will love this.

    For everyone else, you're right, it makes painfully little sense.
  • Hulk - Friday, March 19, 2021 - link

    Reminds me of the P4 "Emergency Edition."
  • GeoffreyA - Saturday, March 20, 2021 - link

    Zen 3 is making them shiver just like the FX-51 did.
  • Spunjji - Monday, March 22, 2021 - link

    That was Comet Lake! This is... well, I'd argue that we're off the map and into uncharted territory here.
  • GeoffreyA - Tuesday, March 23, 2021 - link

    Yonder be the land of Gloomy Cove, where dragons, wraiths, and other nameless things roam.
  • WaltC - Friday, March 19, 2021 - link

    Intel's 2021 motto: "If you can't beat 'em, dazzle the crowd with BS and put anti-Apple ads on television"...;)
  • Hifihedgehog - Friday, March 19, 2021 - link

    Gelsinger is surely in charge by now, so if he is going to right the ship, he needs to start by having them stop all these shenanigans, which do nothing to help Intel actually execute. Baseless smack just shows how dumb and weak you are. If Intel keeps this up, they'll become the Detroit Lions of chipmakers.
  • arashi - Sunday, March 21, 2021 - link

    Why do people think Gelsinger has nothing to do with this?

    He okay-ed these.
  • Qasar - Monday, March 22, 2021 - link

    More than likely it was too far along in the pipeline for him to stop or change in any way, so he just had to run with it.
  • Spunjji - Monday, March 22, 2021 - link

    That's extremely unlikely.
  • rolfaalto - Friday, March 19, 2021 - link

    Presumably the main benefit of the i9 is that these are chips that Intel has pre-binned -- they reliably run cooler and faster. So, if you overclock it's probably worth it, otherwise not.
  • vol.2 - Friday, March 19, 2021 - link

    What process node is this new microprocessor on? The last time I looked into it, they were still on 14nm many, many years ago. How far have they made it at this point?
  • meacupla - Friday, March 19, 2021 - link

    Get this... it's 14nm
  • philehidiot - Saturday, March 20, 2021 - link

    Excuse you, that's 14nm+++++++++++++++++++++++++, I'll have you know.

    Don't confuse that with ancient 14nm. That didn't have ANY pluses.
  • Beaver M. - Monday, March 22, 2021 - link

    Yeah, it would be very weird if they could squeeze these kinds of performance numbers out of ancient 14nm.
    But compared to TSMC or Samsung, Intel's 14nm was always more like 12 or 11nm.
  • rocky12345 - Saturday, March 20, 2021 - link

    Well, I guess they had to figure out some way to make the i9 11900K & KF perform somewhat better than the 11th-gen i7 K & KF CPUs, when you factor in that they are pretty much the exact same CPU as the 11th-gen i9's, other than Intel binning the i9's like crazy so they can hit those short bursts of extra MHz over their i7 K & KF counterparts.

    So basically Intel is trying to limit the i7's as much as possible so the i9's look much better than they actually are over the i7's. So what's to stop the actual end user from just overclocking the i7 K & KF to match the i9's or better than them. I'm sure Intel will find some more ways to try and limit the i7's so they overclock worse than they could have without Intel putting in some extra roadblocks.
  • Qasar - Saturday, March 20, 2021 - link

    That is assuming these 11000-series CPUs even overclock at all past what they would normally run at without doing anything. It's probably a safe bet Intel has done, and is doing, everything it can to get 11th gen to be even remotely competitive with Zen 3, let alone show any improvement over 10th gen.
  • Spunjji - Monday, March 22, 2021 - link

    "So what's to stop the actual end user from just overclocking the i7 K & KF to match the i9's or better than them."

    Indications so far are that the i7 bins are really not very good. That makes sense - these aren't going to be around for long, so there won't be much time to fill out the more popular i7 bin with chips they could have sold as a more profitable i9.
  • Oxford Guy - Sunday, March 21, 2021 - link

    This whole thing about not entering BIOS... I'd like to see data on it.

    I can't even imagine building a PC and not entering the BIOS during that process. Then, there are BIOS updates which are often very important.

    Buy an OEM machine (Dell, HP, Lenovo) — yes... I can see not entering BIOS, I suppose. But, 'enthusiast' is a pretty wide category.

    At least Intel isn't voiding warranties for using this floating turbo. I suppose demanding such a low temperature is part of why.
  • Everett F Sargent - Sunday, March 21, 2021 - link

    I bought three of those OEM HP PCs over the past three years, all at about the $500 price point (i5-8400, 3700X and 3400G). Compared to any OC'able motherboard, it is like a desert is to an ocean. There's almost no reason to mess with the BIOS (as there is little in the way of end-user-adjustable settings) except for a must-have critical BIOS update.
  • GeoffreyA - Monday, March 22, 2021 - link

    OEM machines are almost always like that - so little one can alter.
  • Spunjji - Monday, March 22, 2021 - link

    It's hard to speak anything other than anecdotally about this, but even among my friends who self-build, I'm in a distinct minority by being willing to dive into the BIOS to do anything more complex than changing a boot device, setting the system clock, or activating the XMP settings.
  • Oxford Guy - Monday, March 22, 2021 - link

    XMP was one of the main claims Anandtech was making and I don’t buy it.
  • lmcd - Monday, March 22, 2021 - link

    XMP is technically overclocking and theoretically is grounds for not accepting an RMA. Don't think that'd be enforced though.
  • Oxford Guy - Saturday, March 27, 2021 - link

    Oh, right...
  • GeoffreyA - Tuesday, March 23, 2021 - link

    I feel that many don't enjoy going into the BIOS for fear of breaking something, and outside the enthusiast demographic, perhaps no one enters except under guidance from someone in the know, when the computer isn't working. I don't mind going in, and it's lovely to have a rich BIOS, but when I do change something, there's a slight worry if something else was inadvertently changed along the way, through a stray keystroke perhaps, or if the altered setting even saved correctly (BIOS being software after all). Also, updates are nice but it's a bit of a pain having to put in all of one's settings again, and wondering whether some UEFI or CSM option wasn't missed.
  • Oxford Guy - Saturday, March 27, 2021 - link

    What you feel isn't pertinent. Hard data to support Anandtech's claim is, particularly given how much it gimped performance of Zen 1 and 2. Hruska was able to do his tests at 3200 on day 1.
  • GeoffreyA - Sunday, March 28, 2021 - link

    Agreed, but was just giving my opinion. Perhaps a survey or poll might be of use here, but if Ian puts one up, it'll likely be mainly enthusiasts who answer.
  • Hrel - Sunday, March 21, 2021 - link

    Man, how far Intel has fallen. They can't even manage to be competitive anymore when 4 years ago it was like "Intel rules the world!". I honestly can't believe they dropped the ball so hard on manufacturing node, I keep waiting for them to be like "oh shit son, we got 1nm processors all of a sudden!" then be all "but they're actually $200/core so fuck you consumer! Ahahahaha" cause they're Intel :/
  • Beaver M. - Monday, March 22, 2021 - link

    Eh, right now it looks like that's the way AMD is going. A 5950X for $1000... LOL
    Not that the others are better priced. Even in the 6-core department Intel is far superior. With ancient 14nm at that!
    And AMD fans always cried about the prices of Intel, like you do here, yet when AMD does it, it's suddenly OK.
  • Spunjji - Monday, March 22, 2021 - link

    And Intel fans always defended prices of Intel, and yet when AMD does it, it's suddenly bad. 🙄 You're two sides of the same coin.

    In 2016 the 8-core 3Ghz 6950X launched for $1800 - by 2019 AMD were willing to sell you an 8-core 3.6Ghz CPU for $330. Thanks, Intel! /s

    The 16-core 5950X costs *$800* (not 1000) because there's literally nothing else on the market that comes close to touching it. You can clown about that all you want, it's irrelevant to the (fallacious) point you were desperately trying to make.
  • jeremyshaw - Tuesday, March 23, 2021 - link

    > The 16-core 5950X costs *$800* (not 1000) because there's literally nothing else on the market that comes close to touching it. You can clown about that all you want, it's irrelevant to the (fallacious) point you were desperately trying to make.

    Same excuse was made for Intel's lost decade of quad cores. Intel even had a few generations with price hikes, despite single digit performance gains (in both ST and MT).

    Oh, well. I'm sure the same fanbois will be so understanding w.r.t. Nvidia's price/perf staying flat with Turing, right? Right?
  • GeoffreyA - Tuesday, March 23, 2021 - link

    It's a double standard, as they say.
  • Bagheera - Monday, March 22, 2021 - link

    humor me this: how much does a 16-core Intel cpu cost?
  • Qasar - Monday, March 22, 2021 - link

    " And AMD fans always cried about prices of Intel, like you do here, yet when AMD does it, its suddenly ok. " more like intel fans NEVER cried about the prices intel charged, but amd raises their prices, and justifiably so, as the performance is there, and most people complained. go figure.
  • dsplover - Sunday, March 21, 2021 - link

    Never thought I’d see a Ryzen 5600X whoop Intel in single-core scores.
    I don’t need the fastest chips; the i7 4790K/8086K actually have plenty of IPC for my work.
    I was going to wait for Intel to leapfrog AMD before upgrading, but an AMD desktop APU seems to be the chip for me.
    I was hoping Intel would offer a cooler, lower-watt part with 6/8 cores and double the cache. That would’ve kept me in the Intel camp.
    Their endless 14nm nonsense and core-chasing strategy just isn’t for me.
    Guess I’m going with AMD this summer. Alder Lake and Smart Boost or Turbo charge mean nothing to me.
    Low latency, high IPC, low wattage is all I need. Sad Intel can’t deliver...
  • Zingam - Monday, March 22, 2021 - link

    I don't really know how great current AMD is, but their offerings feel like a letdown to me. Their mobile products seem second class: always months later, integrated GPU still on Vega, no USB4 for years to come (what about HDMI 2.1, DP 2.1, 5nm?), still lagging behind NVIDIA. They may be great on desktop, but I don't see myself using a desktop computer ever again. My priorities are desktop-like performance at the lowest possible TDP (which means no fan noise).
    And I really want to give Intel and NVIDIA a kick for their business practices, yet AMD still won't give me a reason to switch/upgrade, at least until Zen 4 APUs come out - and those seem like they will be on RDNA2, well after RDNA3 is out.
    I expect to hook up triple 4K monitors and experience smooth performance in all applications, with no fan noise. Unless of course I am playing Crysis.

    AMD, when will you deliver? Current tech and not rebranded products? Why can Apple bring a mobile product with desktop-like performance and you can't?
  • watzupken - Monday, March 22, 2021 - link

    While I think Rocket Lake is a decent processor, a few things that Intel did just made it DOA:
    1. Announcing that a more exciting Alder Lake will arrive the same year,
    2. Crippling the i7 and lower-range CPUs with a slower memory ratio,
    3. Reducing prices of Comet Lake while increasing Rocket Lake prices, despite the lower core count and Rocket Lake not being convincingly faster than Comet Lake in games.
    I feel the loss of this boost may not matter to most, since a slight increase in clock speed won't improve performance drastically. In addition, it will take a very high-end cooler and motherboard in order to allow the CPU to run at such elevated clock speeds. So I don't believe it will work for most people.
  • bigboxes - Monday, March 22, 2021 - link

    Does it come with a big red button for the front of the case?
  • Beaver M. - Monday, March 22, 2021 - link

    Nah, cases don't have Turbo buttons anymore.
    Turbos work automatically nowadays. Didn't you read the article?
  • Spunjji - Monday, March 22, 2021 - link

    r/woooosh
  • dsplover - Monday, March 22, 2021 - link

    Never thought I’d see the day when Intel makes an announcement, or another review is “leaked” and I don’t care enough to read. I’m bored by Intel. AMD would be exciting if they upgrade their 5000 APUs. Then there’s App£€...NOT.
  • GeoffreyA - Tuesday, March 23, 2021 - link

    "I’m bored by Intel"

    Also Pentium D era.
  • [email protected] - Sunday, March 28, 2021 - link

    Lame marketing article. TB2? TB3 on 1 or 2 cores or those that can support it? Huh? Guess you gotta have some type of news given 7nm is far away. Meltdown and Spectre still being addressed? How do I get a refund on my i7-8700K, which has now lost at least 20% of its performance?
  • dfg2 - Sunday, April 18, 2021 - link

    it is the best for 2 cores of i9,i7 & i5 processor
