31 Comments

  • James5mith - Monday, March 27, 2023 - link

    Does any of this help them keep chips consuming less than 1 kW of power? Because it feels like that is where we are headed. Efficiency be damned, ramp up power for performance. It seems like not too long ago that Intel promised no feature would be included unless it offered a 2%+ performance gain for less than a 1% power increase.
  • Eliadbu - Monday, March 27, 2023 - link

    While we will still see improvements in perf/watt, they will be smaller than what we're used to. Scaling has already slowed compared to the past decade; that is why power consumption went up, to allow generational improvements that would not otherwise be possible on current nodes. I fear the day we see no performance/watt improvement from a new node is not far away, or that it will only be staved off by slower releases of new nodes.
    We will need to find a replacement for CMOS technology.
  • Diogene7 - Tuesday, March 28, 2023 - link

    According to some technology benchmarks done by Intel, Intel seems to be researching spintronics-related technology (they used the term MESO) as a successor to CMOS.

    As of 2023, there are already some investments in MRAM manufacturing and a lot of ongoing research into manufacturing tools (for example at the Belgian research institute IMEC), so at this point in time (March 2023) I would think there is growing resource-allocation momentum toward spintronics, which seems to give this technology a good probability of developing further (cost going down with scale…)
  • brucethemoose - Monday, March 27, 2023 - link

    Part of that is a business choice. Clock targets *could* be lower, but most vendors seemingly figured that customers don't really care about power consumption if it makes the end products faster, cheaper to purchase and less supply constrained.

    Apple's A/M series goes against that trend though. I think they really are targeting "minimum task energy."
  • Oxford Guy - Monday, March 27, 2023 - link

    I am typing this on an 'M1 Pro' Macbook Pro. It's 'overbuilt' for the chip. Instead of going for a super thin and light form factor, Apple gave this machine a thick metal case (both in the thickness of the profile and in the solidity of the metal).

    The benefit of that is peace and quiet. Personally, I'd much rather have a laptop that's a bit heavier and thicker in order to not hear fans.
  • mode_13h - Tuesday, March 28, 2023 - link

    What does that have to do with anything in the posts you just replied to? Seems like nothing more than an advert, to me.
  • PeachNCream - Tuesday, March 28, 2023 - link

    Mainly it was written to troll you specifically.
  • Skeptical123 - Tuesday, March 28, 2023 - link

    so it's your alt account...
  • mode_13h - Thursday, March 30, 2023 - link

    I'm sure Peaches has better things to do than troll me.
  • mode_13h - Thursday, March 30, 2023 - link

    I don't think so, because we've hardly ever had any exchanges about Apple products. There would be better ways to troll me, if that were his objective.

    I think he's probably feeling chuffed about his fancy new Mac and struggling to refrain from blabbing about it at every opportunity. Honestly, I'm happy for him if he likes it. I just don't appreciate the digression.
  • web2dot0 - Wednesday, April 12, 2023 - link

    Except it's actually not heavier compared to an older MBP.
  • Amandtec - Tuesday, March 28, 2023 - link

    100%. Data center clients do care about power efficiency which is why you often see products targeting that market having lower clock speeds and better perf/watt - performance scales linearly with clock speed increases while power consumption increases quadratically.
  • mode_13h - Tuesday, March 28, 2023 - link

    Apple can afford to trade die area for better energy-efficiency. Apple's cores are designed for phones & laptops, where battery life & weight are key selling-points.

    Intel, AMD, and even ARM have to balance efficiency against cost and raw performance, to a much greater degree than Apple. Intel and AMD will always lag Apple, on the efficiency front, until efficiency becomes the top priority of their customers.
  • web2dot0 - Wednesday, April 12, 2023 - link

    Because PC customers are people who don't care all that much about performance per watt.

    Apple is lightyears ahead of the game
  • Otritus - Monday, March 27, 2023 - link

    Dennard scaling is dead. The power consumption gains from node shrinks are minimal compared to the density gains. If I can cram in 80% more transistors and only gain a 20% reduction in power per transistor, total power consumption rises by 44%. Just 3 generations of this would push a 100 watt chip to 300, and we are starting well above 100 watts. The only way to gain efficiency would be to go wide and slow: counteract transistor power increases by reducing frequency. Engineers can also use better designs to reduce power consumption, but that will likely come with a performance penalty or would be done anyway.
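    The compounding arithmetic above can be sketched quickly (the 1.8x density and 0.8x per-transistor power figures are taken from the comment, not from any vendor roadmap):

```python
# Each node shrink packs 1.8x the transistors into the same area,
# but only cuts per-transistor power to 0.8x, so total chip power
# at iso-area grows by 1.8 * 0.8 = 1.44x per generation.
density_gain = 1.8
power_per_transistor = 0.8
growth_per_node = density_gain * power_per_transistor  # 1.44

watts = 100.0
for node in range(1, 4):
    watts *= growth_per_node
    print(f"after node shrink {node}: {watts:.0f} W")
# Three generations take a 100 W chip to roughly 300 W.
```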
  • back2future - Tuesday, March 28, 2023 - link

    if a system is running 30% shorter for doing same task, maybe there's the gain (if power limitations aren't an obstacle before),
    Tom's Hardware shows an interesting summary with an animated .gif (https://cdn.mos.cms.futurecdn.net/dMJaMWVxWFKvjjYM...
  • back2future - Tuesday, March 28, 2023 - link

    means: 30x shorter/faster or 1/30th of the time before for the same task
  • back2future - Tuesday, March 28, 2023 - link

    the link without adhesive brackets ( https://cdn.mos.cms.futurecdn.net/dMJaMWVxWFKvjjYM... )
  • mode_13h - Thursday, March 30, 2023 - link

    I think that's in reference to something different. This is about computing the photo masks, whereas that link is referring to a paper they published on chip layout optimization.
  • PeachNCream - Tuesday, March 28, 2023 - link

    There is good news. You can buy a laptop or a phone, both of which for the most part accomplish everyday computing tasks (you probably already own at least one, if not both), and then just do things that are within the limits of your hardware to address power consumption concerns at the consumer level. Otherwise, PC consumption will be limited by electrical systems such as circuit breakers, or by annoyance at HVAC limitations, so we're reaching that point anyway with 1 kW devices dedicated mostly to amusement. I wouldn't worry about it much. We're all going to burn on this miserable planet anyhow, so you may as well be selfish and cost-ineffective and play games.
  • mode_13h - Thursday, March 30, 2023 - link

    > You can buy a laptop or a phone which both for the most part accomplish
    > everyday computing tasks

    You're neglecting the increasing amount of work being relegated to the cloud.

    > PC consumption will be limited by electrical systems such as circuit breakers
    > or by annoyance about HVAC limitations so we're reaching that point anyway
    > with 1kw devices dedicated mostly to amusement.

    Somehow, I think >= 1 kW gaming machines aren't the main driver of climate change. The privileged individuals with such machines are a relatively small but vocal minority. I doubt they even outpace crypto, in energy consumption.

    I do think gaming PCs aren't far from a plateau in power consumption, as the typical home circuit can only support so much. 1.5 kW is going to be the practical peak, for most people. 15A outlets are the most common, and you need to reserve some power for the monitor, speakers, phone charging, etc.
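    The household-circuit budget behind that 1.5 kW figure works out roughly as follows (assuming a typical North American 120 V / 15 A circuit; the 80% continuous-load derating and the 150 W peripheral allowance are illustrative assumptions):

```python
volts, amps = 120, 15
circuit_watts = volts * amps             # 1800 W absolute breaker limit
continuous_watts = circuit_watts * 0.8   # ~1440 W for a sustained load
peripherals_watts = 150                  # monitor, speakers, chargers (assumed)
pc_budget = continuous_watts - peripherals_watts
print(f"breaker limit: {circuit_watts} W")
print(f"continuous limit: {continuous_watts:.0f} W")
print(f"PC power budget: {pc_budget:.0f} W")
```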
  • mode_13h - Thursday, March 30, 2023 - link

    BTW, datacenters are where I'd say we should be most concerned about energy efficiency.
  • web2dot0 - Wednesday, April 12, 2023 - link

    Yet people hate on Apple for having the best performance per watt, and mock them for not having the best raw performance, because others can get better performance at 300 W.
  • Wereweeb - Monday, March 27, 2023 - link

    Culito 🤐
  • FunBunny2 - Monday, March 27, 2023 - link

    it would be nice if ASML or Nvidia or TSMC or IBM or ... could 'prove' that the contemplated node shrinks can actually work with the 1 to 5 atoms per component that are inevitable.
  • Eskimo - Monday, March 27, 2023 - link

    FunBunny, here is a 1 nm functional transistor that a research lab was able to build and test in 2016. With billions more in research and development, and thousands of engineers, I can assure you the parties named (and others) know this can work. The question is making it manufacturable, reliable, and cost effective to build.

    https://newscenter.lbl.gov/2016/10/06/smallest-tra...
  • FunBunny2 - Wednesday, March 29, 2023 - link

    "here is a 1 nm functional transistor a research lab"

    the cynics among us take that to mean there's but ONE (maybe, as you say) more node shrink available before Heisenberg wins.
  • WuMing2 - Monday, March 27, 2023 - link

    AMD, Intel? Are both missing the opportunity? Even Apple should not be removed from the realization that huge Ultra SoCs combined with the Accelerate and Core frameworks should have value beyond pushing pixels.
  • juancn - Tuesday, March 28, 2023 - link

    CuLitho sounds so much like `culito` in Spanish that I cannot believe nobody caught it.

    This article's title reads to English/Spanish readers like "Nvidia's little ass to speed up ..."
  • mode_13h - Thursday, March 30, 2023 - link

    Given this is aimed at computing masks for "small ass" nodes, maybe that was done with a wink and a nod. Nvidia is a California company... it's not as if nobody there speaks Spanish.
  • Wereweeb - Thursday, March 30, 2023 - link

    NVIDIA: Hyper Fast, Hyper Ass
