30 Comments

  • shabby - Monday, October 16, 2023 - link

    100-200MHz bumps, lol
  • shabby - Monday, October 16, 2023 - link

    Only the 14700K has a decent core bump; the rest are junk.
  • Flunk - Monday, October 16, 2023 - link

    ZZZZ, why bother?
  • kpb321 - Monday, October 16, 2023 - link

    Okay, so they re-released the i9-13900KS as a regular chip in the form of the i9-14900K, and the higher-end i7 chips are now partially binned i9 silicon. Pretty big yawn. No wonder they're fine announcing these ahead of time. Maybe it will push prices of 12th and 13th gen chips down even more, making them even better options, but this is the non-upgrade I think pretty much everyone expected.
  • Duwelon - Monday, October 16, 2023 - link

    What are desktop users (i.e. gamers) supposed to do with 8+ E cores? Do they assist gaming workloads even a little bit? What's the point of them on desktop again?
  • Tom_Yum - Monday, October 16, 2023 - link

    To win multicore benchmark scores.
  • wrosecrans - Monday, October 16, 2023 - link

    The short answer is that 8 extra cores are no particular benefit to somebody playing a video game. No current game requires that many cores, because the market for such a game would be vanishingly small.

    If you have a zillion browser tabs open to crappy pages with JS that burns cycles, it can be helpful. It's kinda sad that so many modern web pages have gotten so bad that looking at the web is a serious driver for faster CPUs. But the main benefit is for folks doing things like content creation. If you are rendering CGI while also editing video, you can't have too many cores. Or if you are a developer working on big projects, being able to build across more cores can be a big help. (Though at this point, some real-world applications have fewer source files than a modern desktop has cores, so if you mainly work on tiny utilities, even developers may start to see a point of diminishing returns with more cores!)
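    To make the "build across more cores" case concrete, here is a minimal sketch of an embarrassingly parallel batch job (the render_frame function and the 64-job batch are made-up placeholders, not from any real pipeline): independent tasks fanned out with Python's ProcessPoolExecutor keep soaking up cores until you run out of tasks or cores.

    import os
    from concurrent.futures import ProcessPoolExecutor

    def render_frame(frame: int) -> int:
        # Stand-in for real work (a CGI render, a compile unit, an export...).
        total = 0
        for i in range(2_000_000):
            total += (i * frame) % 7
        return total

    if __name__ == "__main__":
        frames = range(64)  # hypothetical batch of 64 independent jobs
        # One worker per hardware thread; independent jobs scale almost
        # linearly until you run out of jobs or cores.
        with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
            results = list(pool.map(render_frame, frames))
        print(f"finished {len(results)} jobs on {os.cpu_count()} threads")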
  • Duwelon - Monday, October 16, 2023 - link

    I wonder if we're not just witnessing an AI-driven fad. I built a 16-core machine 3 years ago (Zen 3) thinking I'd get a lot of use out of it with ML-driven photo editing/enhancement tools. The technology is good for niche cases but it's very primitive, not ready for commercial use in many cases, at least not when it's obvious the output is computer-generated. In other words, are the vast majority of these E cores doing absolutely nothing?
  • nandnandnand - Wednesday, October 18, 2023 - link

    We haven't even seen dedicated AI accelerators reach Intel's and AMD's desktop lineups yet (they are already in Phoenix and Meteor Lake); that might happen within the next 2 years.
  • Samus - Tuesday, October 17, 2023 - link

    Exactly this. Bearing in mind that most games are built around an 8-core ecosystem (because the Xbox and PS5 are), it's doubtful PC ports, or PC games that may be ported to consoles, will be designed to utilize more than 8 cores. The fact is most AAA games are still GPU-limited on the most powerful CPUs.

    Elsewhere, in other applications, cores are incredibly useful, but not to most typical desktop users. Compression/decompression benefits tremendously, as does any other multithreaded task. But with rendering, and especially AI plugins for Adobe apps and the like, moving to GPU acceleration, CPU performance is again becoming less critical.

    And this is why Intel can get away with calling 13th gen parts 14th gen parts: nobody cares.
  • Dizoja86 - Monday, October 16, 2023 - link

    Multitasking and productivity. A lot of us use our computers for more than gaming.

    Although few current games would utilize every core, I would definitely make use of these when working with photography and scientific software. I don't regret my 5900X for similar reasons, even though a 5800X would do the same job in most games.
  • aparangement - Monday, October 16, 2023 - link

    I'm wondering what exactly the "more than gaming" workloads are that actually depend on the E cores.

    For laptops I guess a few E cores make sense, since lightweight tasks can be shifted onto them. But even for a laptop, aren't 2 or 3 E cores good enough?

    Servers, on the other hand, might also benefit from a large number of E cores, since most of their tasks are heavily threaded and E cores are more energy efficient.

    But home computers? I really don't get it...
  • Samus - Tuesday, October 17, 2023 - link

    E cores are exceptionally powerful. Intel's E cores fall between Skylake and Coffee Lake cores in performance, and the Zen 2 cores used in the Xbox and PS5 are comparable to Coffee Lake.

    I wouldn't dare say E cores are as powerful as Zen 2 cores, but in the case of a console SoC, I bet they are pretty close due to power budgeting, the lower clock speed of the Xbox and PS5 (~3.5GHz), and the way RDNA2 GPU utilization reduces the package power available to the CPU cores, limiting their ability to clock high.
  • Duwelon - Tuesday, October 17, 2023 - link

    Who in the mainstream are these E cores on desktop helping? Personally, I suspect that whether a mid-tier or higher PC has 8 E cores or 8 million E cores, the vast majority of users wouldn't notice a single difference, because practically speaking almost nothing can scale to more than 4 or 8 cores.
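    Rough numbers on why the extra cores stop showing up: Amdahl's law, sketched below with an assumed 60% parallel fraction for a game-like workload (that fraction is purely an illustration, not a measurement).

    def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
        # Ideal speedup when only `parallel_fraction` of the work scales with cores.
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for cores in (2, 4, 8, 16, 32):
        print(f"{cores:2d} cores -> {amdahl_speedup(0.60, cores):.2f}x")
    # ~1.43x at 2 cores, ~2.11x at 8, and only ~2.39x at 32 -- the serial 40%
    # caps the benefit, which is why piles of E cores rarely move the needle
    # for typical desktop use.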
  • Tilmitt - Monday, October 16, 2023 - link

    Back to refreshing Skylake for 5 years again.
  • lmcd - Monday, October 16, 2023 - link

    Ironically, the Skylake refreshes were more justified: Skylake had a ton of errata; Kaby Lake fixed most of its sleep issues, Coffee Lake supposedly fixed virtualization-based security, and Comet Lake fixed Meltdown issues.

    Raptor Lake is maybe a few fixes and a better spin, and this refresh is even more pointless (and probably isn't even a spin).
  • meacupla - Tuesday, October 17, 2023 - link

    At least this one still works on LGA1700.
    When was the last time Intel supported the same socket for 3 generations of CPUs?
  • duploxxx - Tuesday, October 17, 2023 - link

    So just a new stepping is now called a generation? Yeah, you are so aligned with Intel's marketing tricks... All this is the result of the failing Meteor Lake, which can't fit enough P cores in the package and can't clock high enough (in GHz) to leap past Raptor Lake.
  • PeachNCream - Tuesday, October 17, 2023 - link

    Yeah, whatever. 253W TDP. That's stupid even in an obsolete desktop form-factor case. Managing the cooling, and the power consumption on a monthly utility bill, for something like that plus the inevitable comically gigantic graphics card people will pair it with is beyond reason. While it isn't quite as irresponsible as having children, the impact of entertainment is dreadfully high for all of us living on this dying planet together.
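    For scale, a back-of-envelope power-cost number (every input below is an assumption for illustration; the usage hours, GPU draw, and electricity rate are not from the article):

    package_watts = 253      # i9-14900K maximum turbo power
    gpu_watts = 350          # hypothetical high-end graphics card under load
    hours_per_day = 4        # assumed daily gaming/rendering time
    rate_per_kwh = 0.15      # assumed electricity price in USD

    kwh_per_month = (package_watts + gpu_watts) / 1000 * hours_per_day * 30
    cost = kwh_per_month * rate_per_kwh
    print(f"{kwh_per_month:.1f} kWh/month -> ${cost:.2f}/month")
    # ~72.4 kWh/month -> ~$10.85/month under these assumptions.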
  • Tilmitt - Tuesday, October 17, 2023 - link

    Your genes will be replaced by those of people with no inhibitions about reproducing.
  • charlesg - Tuesday, October 17, 2023 - link

    What a depressing and distorted viewpoint of reality!
    Assuming you actually believe this nonsense, and aren't a bot, maybe it's time to subscribe to a new ideology that is closer to reality?
    You'll be much happier.
  • PeachNCream - Wednesday, October 18, 2023 - link

    So you presume it's a depressing thing to want to limit population growth to well below the carrying capacity of the planet, and to use forms of entertainment that don't tax our available resources as heavily? If nothing else, that's a good example of why we're soiling the collective cage we all share. Thanks for proving the point.
  • charlesg - Wednesday, October 18, 2023 - link

    Your perspective on "max capacity" and "available resources" is awry.
  • PeachNCream - Thursday, October 19, 2023 - link

    Do a little research.
  • nandnandnand - Wednesday, October 18, 2023 - link

    There is an i9-13900T, you know. 24 cores, 35W base TDP, 106W turbo TDP.

    Be the change you want to see in the world and throttle your CPUs.
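    And if you don't want to buy a T-series part, here's a hedged sketch of doing the throttling yourself on Linux via the intel_rapl powercap interface (requires root; the sysfs path and constraint index are assumptions that vary by platform, so verify them before writing anything):

    from pathlib import Path

    RAPL = Path("/sys/class/powercap/intel-rapl:0")   # package-0 domain (assumed path)
    LIMIT_UW = 106_000_000                            # 106 W, expressed in microwatts

    pl1 = RAPL / "constraint_0_power_limit_uw"        # long-term limit (PL1) on most systems
    print("current PL1:", int(pl1.read_text()) / 1_000_000, "W")
    pl1.write_text(str(LIMIT_UW))                     # new sustained package power cap
    print("new PL1:", int(pl1.read_text()) / 1_000_000, "W")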
  • jimbo2779 - Tuesday, October 17, 2023 - link

    I thought they were dropping the "i" branding.
  • ingwe - Tuesday, October 17, 2023 - link

    These power draws just feel wild. I am running a 5600X at 30W and it does just fine. I realize I am leaving some performance on the table, but even in some more demanding games it isn't a problem. I can't imagine running my CPU at 250W.
  • Eletriarnation - Tuesday, October 17, 2023 - link

    This is exciting, not because of anything about the chips themselves but because this is the third generation of processors on a single socket. We haven't seen Intel do this since Socket 775 unless I'm forgetting something. Maybe they'll be a bit more willing to let platforms last 3+ years going forward.
  • Samus - Tuesday, October 17, 2023 - link

    I couldn't help but notice that too, but does 14th gen even count as a generational advancement?
  • hubick - Tuesday, October 17, 2023 - link

    How many people work at Intel? Y'all showed up every day for a year, and THIS is what we get from all those hours?

    This amounts to NOTHING. Nothing of real use anyway. Garbage. Replace everyone involved with someone who can get Thunderbolt 5 out the door faster, at least that will be useful for something.

    Also, DDR5 is still basically a wash over DDR4 due to latency, except maybe at the very high end? Y'all need to push RAM vendors to make a system upgrade worthwhile.

    I feel like I bought my Threadripper 3960X at a hell of a good time. I'm not feeling even slightly compelled to replace it with this garbage. Maybe when TB5 does come out... but that probably won't actually be on shelves until 2025.
