14 Comments

  • FWhitTrampoline - Monday, January 8, 2024 - link

    I'm fine with on-module memory as long as there's PCIe 5.0/CXL support as well for CXL-based DRAM modules! But the on-module DRAM should have at minimum a 128-bit wide channel to each DRAM die on the module, so any iGPU isn't starved for bandwidth. And the lowest-capacity SKUs should start at a minimum of 16GB of on-module DRAM.
  • mode_13h - Tuesday, January 9, 2024 - link

    > fine with on-module memory as long as there's PCIe 5.0/CXL support

    Not in 2024, nor in (most) laptops. In future HX laptops, maybe there'll be a couple of M.2 slots that support the CXL protocol, so you can put memory expansion modules there.
  • Samus - Tuesday, January 9, 2024 - link

    Combine on-package memory with external memory? I don't see it happening, but that would be a neat hat trick.
  • 1_rick - Tuesday, January 9, 2024 - link

    Why not? Even $10-20 microcontrollers can do it.
  • do_not_arrest - Tuesday, January 9, 2024 - link

    By all means, don't miss the big picture. Each of these chips is targeted at a particular MARKET SEGMENT. Lunar Lake is clearly for very thin/light and low-power cases where expansion is not an important part of the picture. Think some kind of low-cost tablet that you essentially throw away after a few years. They have other "Lake" chips that are meant for other segments. These product announcements are more for system builders and OEMs.
  • Diogene7 - Thursday, January 18, 2024 - link

    I wish the same, but as of 2024, I am not sure; I think the CXL protocol was designed primarily for use in servers, and as such, I doubt it has the power-saving specifications that would make it usable in a laptop.

    It is a real pity, because CXL 2.0 and higher is much, much needed to open the door to the integration of innovative 3rd-party hardware like dedicated 3rd-party AI accelerators, or new emerging low-latency Non-Volatile Memory (especially MRAM)… that could bring fresh innovation…
  • meacupla - Tuesday, January 9, 2024 - link

    If they are going to do this, then they should go all in and increase memory bandwidth.
    A 128-bit bus (2 channels) is already insufficient. I can understand being space- and signal-integrity-constrained with traditional designs, but this solves that.

    Yes please to a 192-bit or 256-bit memory bus, especially if they want to push AI/TOPS as the next "big" thing.
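
    For scale, some rough numbers (assuming LPDDR5X-8533 here; the actual speed grade is my assumption, not confirmed):
    128-bit: 8533 MT/s x 16 bytes per transfer ≈ 137 GB/s
    192-bit: 8533 MT/s x 24 bytes per transfer ≈ 205 GB/s
    256-bit: 8533 MT/s x 32 bytes per transfer ≈ 273 GB/s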
  • mode_13h - Tuesday, January 9, 2024 - link

    > Intel still hasn’t even confirmed what process nodes it’s using – the company has
    > continually been reiterating that they intend to get it out the door in 2024.

    I've seen conflicting reports, with some saying 18A and others saying 20A. If I had to bet, I'd go with 20A for something that's due to launch this year.

    > The demo chip has two DRAM packages on one of the edges of the chip
    > (presumably LPDDR5X), making this the first time that Intel has placed regular
    > DRAM on a Core chip package.

    Not true. There was this: https://wccftech.com/asus-intel-supernova-som-chip...

    > Lunar Lake is slated to offer “significant” IPC improvements for the CPU core.

    Lion Cove is said to be a new architecture.

    > Meanwhile the GPU and NPU will each offer three-times the AI performance.
    > How Intel will be achieving this remains unclear,

    Well, it's rumored to use the Battlemage architecture. I thought I saw a rumor that Lunar Lake would have a tGPU with the equivalent of up to 384 EUs (exactly 3x Meteor Lake's), but more recent leaks suggest a much smaller tGPU. Of course, there are probably multiple tGPUs for addressing different markets and product tiers.

    I think the most exciting implication of these announcements is the possibility that they might be following Apple and widening the memory data path to 256 bits. They're going to need more bandwidth to feed a faster GPU & NPU. Also, energy efficiency tends to be better with a wider, slower interface.
  • 1_rick - Tuesday, January 9, 2024 - link

    It doesn't look like those Asus laptops were ever released (at least, if I go to Asus' site, there's no buy link for them, not even one that takes me to 3rd-party vendor websites like with actual, shipping products).
  • JMaca - Wednesday, January 10, 2024 - link

    TSMC N3 is more likely.

    20A is not library-complete. It can't be used for the SoC tile and might not be able to do the CPU tile either (since it includes the GPU). Arrow Lake's compute tile could be 20A.

    18A could work, but it won't be ready for a 2024 launch.
  • lmcd - Monday, January 15, 2024 - link

    Arrow Lake is 3nm, though; we already know that.

    20A might not be library-complete, but if it has sufficient low-power libraries, it's ahead of where Intel 4 shipped for Meteor Lake.
  • ceisserer - Tuesday, January 9, 2024 - link

    Finally - I've been waiting for years to get on-chip memory connected to the APU via a wide interface. I thought AMD would provide this because of their expertise in the console market, where they already have exactly this, but it was Apple who brought it to the "PC" market. Unfortunate for me, because as a Linux user I don't want to buy their closed-down hardware.

    So with Lunar Lake I now finally have something to wait for to replace my AMD Cezanne-based laptop.
  • Diogene7 - Thursday, January 18, 2024 - link

    I wish the R&D were even more advanced and that they would be using Non-Volatile Memory (NVM) like SOT-MRAM, because in theory it could bring new innovations like always-on functionality and lower latency to significantly improve responsiveness…
  • Cd7890 - Sunday, January 14, 2024 - link

    lest, not least
