26 Comments
meacupla - Thursday, July 6, 2023 - link
is that a weenie little fan I see in there?

Soulkeeper - Friday, July 7, 2023 - link
Yes, it'll probably clog up and fail within 1 month of use. At best it'll whine/squeal for several months before it stops spinning.
Tunnah - Friday, July 7, 2023 - link
There's tons of fans that size in loads of places; what sort of POS would break within a month? Nobody would buy them. That is absolute nonsense.

Flunk - Friday, July 7, 2023 - link
Putting a tiny fan on an SSD is like a mule with a spinning wheel. No one knows how he got it and danged if he knows how to use it!

ballsystemlord - Friday, July 7, 2023 - link
You all seem to have forgotten AMD's motherboard chipset fans. These fans are by no means unreliable; they're just unusual in PC hardware in recent years. Back when CPUs (with floating point, no less!) were young, it was fans like this that cooled them.
Soulkeeper - Saturday, July 8, 2023 - link
Yeah, back then the chipset fans were a big complaint from users on forums like AnandTech. They'd get pet hair/dust in them, or they'd make high-pitched noises. It also seems like they always used the cheapest little fans they could slap on products.

TheinsanegamerN - Tuesday, July 11, 2023 - link
Why is it even there? At worst, the controller might pull 10w? MAYBE 12? A passive sink can easily handle that.

PeachNCream - Saturday, July 8, 2023 - link
I've used a wide variety of little fans over the years and most of them have a distressingly short lifespan. If I were to lay down money and time to purchase hardware and then further rely upon it to perform a function, I would not opt for something with a fan when other hardware exists that needs no moving parts to stay cool. Furthermore, one of the key points of SSDs was to eliminate motors that could fail, which were crucial to storage operation. Sticking a fan on a storage device runs against that idea. And yes, it's just as stupid as putting a tiny fan on an AMD chipset.

TheinsanegamerN - Tuesday, July 11, 2023 - link
Yes, those fans are everywhere. Which is why most people hate them, they've dealt with them!

meacupla - Friday, July 7, 2023 - link
If you are clogging this low-impedance heatsink up in 1 month, you should install a shop dust filter.

sonny73n - Wednesday, July 12, 2023 - link
Cooling is a must. I just bought a Gen4 M.2 and have it installed in the same system as a Gen3, both with the same aftermarket heatsinks. The Gen4 runs 10C higher at idle and 15C higher under load; Gen5 is probably much hotter. Hence the stupid little fan.

2TB Gen4 for $85, but I wish I could've gotten a Gen3 instead.
bansheexyz - Friday, July 7, 2023 - link
Ok, we may need to slow down a bit here. Trying to make SSDs faster without shrinking the controller die is resulting in giant heatsinks and fans lol.

nandnandnand - Friday, July 7, 2023 - link
Just put it inside a glacier like that concept art.

watersb - Saturday, July 8, 2023 - link
They want you to think the concept photo features water ice, but on the SSD itself? STEAM

twotwotwo - Saturday, July 8, 2023 - link
I wonder how many of these with big coolers sell; the early-adopter tax has been a thing for prior gens, but mucking up the form factor is a bit more than that. There also seem to be plenty that run fine without a fan or large heatsink!

PeachNCream - Friday, July 7, 2023 - link
The intake vent for the "doomed-to-fail-in-2-months" fan is shaped so that it partially obstructs the fan's ability to move air. I absolutely wouldn't want to buy something that appears to rely on notoriously unreliable active cooling (those little fans really are not durable) to maintain safe operating temps. Also, who has space for all that cooling garbage in a laptop anyway? Those things are rather thin, and this does not look like it would fit without cutting away at a laptop's underside.

nandnandnand - Friday, July 7, 2023 - link
This seems like a good application for that AirJet cooler. Maybe it will be ready by the time ultrafast PCIe 6.0 SSDs hit the market.ballsystemlord - Friday, July 7, 2023 - link
970? Are they making fun of Samsung, which hasn't released a PCIe 5.0 drive yet?

Samus - Saturday, July 8, 2023 - link
It's funny how mainstream (not HEDT/gaming) PCs were trending downward in power consumption for nearly two decades after the Pentium 4; then, in the last 5-6 years, things just went crazy. Typical desktop CPUs boost to 120-150w, platform draw is crazy again, monitors are power hogs (mostly because everyone has something insanely large), and now we have primary storage devices pulling...going out on a limb here...15w? 20w? If it needs a heatsink that large AND a dedicated cooling fan, come on, that's just a bit much for a non-enterprise product. Hard disks and SSDs for decades had 5-7w draw max.

nandnandnand - Saturday, July 8, 2023 - link
CPUs aren't going to hit their maximum power limits very often. Not in gaming, only in productivity tasks that can use all the cores.

For monitors and TVs it seems to be a function of size and the display technology. Lots of brightness needed for HDR to look good, and OLED has some issues. A new technology is needed to get power back down, maybe microLED.
HDDs had physical limitations that stopped them from using too much power. Doubling speeds repeatedly seems to be the problem for SSDs. Is it even possible to make a maxed out PCIe 5.0 SSD use the same power as maxed out SATA/PCIe 3.0? There's only so much that can be done before we need optical interconnects or something.
Overall, the situation is fine for people who actually care about their power consumption. A few tweaks or purchase adjustments and you are consuming much less power. It could get much better once newer APUs and mega APUs come to market. Ditch the dGPU, use previous-gen SSDs, etc.
brucethemoose - Sunday, July 9, 2023 - link
> CPUs aren't going to hit their maximum power limits very often. Not in gaming, only in productivity tasks that can use all the cores.

Maybe they don't max out, but they boost to extremely aggressive, inefficient voltages/clocks at the drop of a hat.
Modern GPUs are like this too: they run at very high clocks under light loads because they just keep boosting until they hit the TDP cap.
Older hardware was not this aggressive. Maybe it's just a coincidence because the clocking/power schemes were more "primitive," but it still seems incredibly wasteful, and it's a hidden cost for whoever is paying the power bill.
nandnandnand - Sunday, July 9, 2023 - link
So a CPU momentarily boosts, to load a webpage or something, but whatever it is trying to get done gets done faster. Less or the same energy is used in the end. I recall Intel and/or AMD boasting about improvements that allow cores to ramp up and go back to idle faster, like 1ms instead of 20ms or whatever it was. Aggressive is not necessarily inefficient.

With many games and applications, only a single core is going to see heavy sustained loads. I don't think you see 1 core using more than 20-30 Watts typically. I know some AnandTech reviews have looked at that.
Personally, I'm ready to ditch discrete GPUs. I won't complain about a 65 Watt APU's boosting behavior, and I'd be willing to get a "mega APU" that could use even more power (e.g. 105-170 Watt TDP).
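The race-to-idle argument above is just energy = power × time. Here is a minimal sketch with made-up numbers (the 25w/10w figures and task durations are illustrative assumptions, not measurements of any real CPU):

```python
def task_energy(power_w: float, seconds: float) -> float:
    """Energy in joules consumed by running at a given power for a given time."""
    return power_w * seconds

# Hypothetical: the same task finished fast at boost clocks vs. slowly at base clocks.
boost_j = task_energy(power_w=25.0, seconds=0.4)  # brief high-power burst
base_j = task_energy(power_w=10.0, seconds=1.2)   # lower power, but runs 3x longer

print(boost_j, base_j)  # the burst can use less total energy
```

Whether the burst actually wins depends on how sub-linearly performance scales with power, which is the crux of the disagreement in this thread.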
Samus - Monday, July 10, 2023 - link
I get what you mean about boosting to get a task done quicker, potentially saving power. But this has proved untrue. AnandTech did a scaling study of Intel and AMD CPUs that shows a modest performance penalty with a massive reduction in power consumption when reducing voltage and clock speed/boost ratios. It turns out peak performance-per-watt was at 65-90w for Intel and 45-65w for AMD.

Neither company sells a mainstream desktop product with these stock figures. And it explains why laptop CPUs are so close to their desktop parts: they are pissing away power on desktop, twice the consumption in some cases, to get 10% more performance.

The analogy I use to explain how this is not momentary power consumption, but actually hits your power bill harder than you think, is the auto market. Cars and trucks now have insane stock performance figures, while fuel economy has been almost stagnant for decades in most segments. Yes, some of it has to do with weight, but most of it has to do with people gunning it more often as their confidence behind the wheel increases. Programs are the same way and will run your system inefficiently to get that extra boost.
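The diminishing-returns point can be sketched numerically. This is a toy model only (cube-root scaling of performance with power is an assumption for illustration, not the AnandTech data):

```python
def perf(power_w: float) -> float:
    """Toy model: performance grows roughly with the cube root of power."""
    return power_w ** (1 / 3)

for watts in (65, 90, 125, 170):
    # perf/W falls steeply as the power limit rises past the sweet spot
    print(f"{watts:>3} W: perf {perf(watts):.2f}, perf/W {perf(watts) / watts:.4f}")
```

Under this model, going from 65w to 170w (2.6x the power) buys under 40% more performance, which matches the shape of the argument above even if the exact numbers differ.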
TheinsanegamerN - Tuesday, July 11, 2023 - link
Such a mega APU would be worthless due to bandwidth limitations.

back2future - Saturday, July 8, 2023 - link
Comparing sequential and random read/write (sequential r/w would be optimized for ~7.3kB/s?): the power requirement at full speed (10GB/s) is ~1.5-2W per GB/s of bandwidth(?), while SATA (600MB/s) SSDs run ~5-8W per GB/s(?), but with lower top power requirements (3-5W).

Are idle power requirements similar on PCIe 5 and SATA SSDs?
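One way to frame the comparison above is watts per GB/s of sequential bandwidth. A quick sketch using rough, assumed ballpark figures from this thread (not measured numbers for any specific drive):

```python
def watts_per_gbps(peak_watts: float, gbps: float) -> float:
    """Peak power draw divided by sequential bandwidth, in W per GB/s."""
    return peak_watts / gbps

# Assumed figures: ~11 W at ~10 GB/s for a Gen5 drive,
# ~4 W at ~0.6 GB/s for a SATA drive.
gen5 = watts_per_gbps(peak_watts=11.0, gbps=10.0)
sata = watts_per_gbps(peak_watts=4.0, gbps=0.6)

print(f"PCIe 5.0: {gen5:.1f} W per GB/s, SATA: {sata:.1f} W per GB/s")
# Higher absolute draw, but fewer joules per byte moved.
```

By this metric the Gen5 drive is the more efficient mover of data, even though its peak draw (and thus its cooling requirement) is much higher.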
Vink - Tuesday, July 11, 2023 - link
It looks excellent: it has the best technology, with the best active (not passive) cooling. I would like to put it in a $100 ATX case with removable dust filters, including the front (not only mesh), on a Z motherboard with PCIe 5.0 & DDR5 and a K CPU ... so I'm looking forward to the price and technical details.