95 Comments
R3MF - Monday, March 16, 2020 - link
Tell me about [desktop] Renoir...?
Ian Cutress - Monday, March 16, 2020 - link
Not yet. As far as I know, AMD is focusing on mobile first, so you'll have to come back later in the year. I suspect they'll time the launch of B550 around the same time.
R3MF - Monday, March 16, 2020 - link
Thank you (including for the article - which is a great read).
Thinking about the possibility of desktop Renoir, it's interesting that:
a) on the media engine side - there is no sign of AV1
b) on the display output side - there is no mention at all (inc hdmi 2.1)
it seems to me that AMD would have a real opportunity to grab the HTPC market if it achieved both of the above.
do you think there is any chance that
1. the lack of AV1 is a function of speed, i.e. a 65W Renoir media engine might have the grunt to do it?
2. the absence of information on the display output side is deliberate, i.e. AMD is holding this back for desktop?
Moizy - Monday, March 16, 2020 - link
Media encoding/decoding capabilities are certainly important, but it seems like the "HTPC market" has dwindled into a really, really small niche, too small to influence design decisions in and of itself. But that's just my perspective, I don't have actual data. It's just so cumbersome compared to streaming.
DanNeely - Monday, March 16, 2020 - link
Yeah. HTPC is dead as far as driving features; streaming (games, etc.) and video editing are the drivers for encoding support on the PC these days.
a5cent - Monday, March 16, 2020 - link
Seems like everything I care about is niche :-/
HTPC barely a blip on the radar and, finally, after years of waiting, maybe, MxGPU support or something GVT-g compatible. Make it happen AMD. Report on it AnandTech!
close - Tuesday, March 17, 2020 - link
Well, HTPC is an unfortunate combination of things that can be delivered in many other ways now that technologies are shifting (mostly to mobile and streaming). Most people no longer want a (relatively) bulky and expensive full PC in their media lineup. They might have a NAS that can do it all (storage for the whole network, streaming, transcoding, output directly to TV, etc.) at a fraction of the cost, size, and effort to build and maintain. Between this and a smart TV, the role of the HTPC is purely to make some enthusiasts happy.
WaltC - Tuesday, March 17, 2020 - link
I love my "expensive" PC, thank you...;) I would not trade it for a device, ever.
Threska - Tuesday, March 17, 2020 - link
HTPC might, but the NAS market is still there, and that's where the functionality has folded. This APU would fit right in and be an improvement over the usual anemic CPUs used in such machines.
Namisecond - Thursday, March 26, 2020 - link
A NAS based on one of these APUs would also be in the price range of "enthusiast" or entry-level "enterprise" appliances. To the point that it might be more cost-effective to roll your own, especially for people who are picky enough to request this level of power and functionality from their NAS devices.
uibo - Monday, March 16, 2020 - link
HTPC market insignificant
Samus - Tuesday, March 17, 2020 - link
I mean, realistically there isn't anything a 10W Atom can't decode anymore... everything is overkill for HTPC.
As far as encoding, for what the general consumer does (Twitch, etc.), any midrange CPU can handle that in the background on top of any other tasks you demand. It won't be a 10-15W part, but certainly a 35W part.
R3MF - Tuesday, March 17, 2020 - link
Even AV1? Bearing in mind that a new HTPC has a ~6 year life and AV1 is the future of streaming video.
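For anyone who wants to sanity-check a specific box, a rough approach is to time a decode-only run through ffmpeg's dav1d software decoder and compare it against the clip's duration. A minimal sketch, assuming ffmpeg is on the PATH and was built with libdav1d; clip_av1_1080p.mkv is a hypothetical local AV1 sample, not a file from the article:

```python
# Rough software-decode benchmark: can this CPU chew through a 1080p AV1 clip
# faster than realtime without any GPU help? Assumes ffmpeg is on the PATH and
# was built with libdav1d; "clip_av1_1080p.mkv" is a placeholder for your own
# local AV1 sample.
import subprocess
import time

CLIP = "clip_av1_1080p.mkv"  # hypothetical test file

start = time.perf_counter()
# -c:v libdav1d (before -i) forces the dav1d software decoder;
# -f null - throws the decoded frames away, so we time decode only.
subprocess.run(
    ["ffmpeg", "-hide_banner", "-loglevel", "error",
     "-c:v", "libdav1d", "-i", CLIP, "-f", "null", "-"],
    check=True,
)
elapsed = time.perf_counter() - start
print(f"Decoded {CLIP} in {elapsed:.1f} s "
      "(well under the clip's duration means software decode keeps up)")
```

If the run finishes comfortably faster than the clip is long, software decode alone keeps up; if not, that is exactly the gap a fixed-function AV1 block would cover.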
close - Tuesday, March 17, 2020 - link
I have an X5-Z8350 Atom tablet at home; I will give it a run with an AV1-encoded full HD YouTube stream and see if it handles it reasonably. I would assume that a box with more adequate cooling would do even better.
PeachNCream - Tuesday, March 17, 2020 - link
Have to agree with this. HTPCs had a very brief glimmer of market presence a few years ago, but they never really took off or made a substantial enough splash. The population at large has little interest in adding the relative complexity of a computer to their media viewing experience, and most home users are purchasing laptops, not even desktops, which are even less well-suited to acting as a fixed system attached to a large display panel. If AMD does grab that market, it will not be a measurable number of sales, to say the least.
Spunjji - Tuesday, March 17, 2020 - link
I love my HTPC and am excited to rebuild it around Renoir, and I fully endorse the sentiment of this post. Most people get by with a Fire stick or the built-in "smart" features of your average modern TV.
PeachNCream - Tuesday, March 17, 2020 - link
If I had time and was more interested in consuming video content, I would probably dive into building an HTPC as well, but it would appeal mainly to a desire to tinker. From a practical standpoint, I would be hard-pressed to find a credible amount of work for a computer dedicated to that task, because watching videos isn't something I do when I'm not on an exercise bike, and my phone is good enough for that chore.
stephenbrooks - Tuesday, March 17, 2020 - link
Laptops make pretty good "HTPCs"... I plugged mine into a projector and sound system just today, in fact.
DanNeely - Monday, March 16, 2020 - link
For power efficiency, media en/decoding is normally done with fixed-function hardware; doing it in software on the GPU's general-purpose cores eats power like crazy. AV1 not being present means Renoir doesn't have a fixed-function block - whether due to not being done yet, taking too much die area, or something else - and it not being there means you're going to have to wait until the 5000 series APUs to get support in an AMD CPU.
Santoval - Tuesday, March 17, 2020 - link
Bear in mind that this year will see the release of no less than *three* new video codecs. MPEG plans to release H.266/VVC (Versatile Video Coding), EVC (Essential Video Coding, i.e. MPEG-5 Part 1) and LCEVC (Low Complexity Enhancement Video Coding, i.e. MPEG-5 Part 2). Each codec is targeted at a different market. For instance, H.266/VVC is the direct successor of H.265/HEVC, while EVC is partly targeted against AV1 (its baseline profile, which is ~30% more bitrate efficient than H.264, will be royalty free).
LCEVC is not so much a new codec as a new technique to combine two layers of any two codecs at any resolution in a "hybrid" (stacked) way, in order to reduce computational complexity. Which apparently works. I place a link at the end of the comment which explains how that works. In other words, the codec market of the next couple of years is going to be quite a bit more loaded and competitive than simply choosing between H.265, VP9 and AV1. This is something chip manufacturers will almost certainly take into account.
By the way, it is not yet fully clear if AV1 is going to be royalty free. Sisvel launched a patent pool for AV1 last year. Whether it has merit or not remains to be seen. However, patent confusion is worse than paying royalties for patents. If chip manufacturers have plans to add decoding and encoding support for VVC and EVC, for instance, they have already accounted for the costs. But if they add AV1 support thinking it was patent free and then Sisvel goes to court to sue, that would be a very unpleasant and unexpected surprise. Sisvel's patent claims are going to stall AV1 support unless they are resolved.
https://www.streamingmedia.com/Articles/ReadArticl...
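Whichever of these codecs wins out, the practical question for an APU is which ones the silicon can decode in fixed-function hardware. On Linux that is easy to inspect, since the VA-API driver advertises its decode profiles; a small sketch, assuming the vainfo utility from libva-utils is installed (the profile names are standard libva strings, the rest is illustrative):

```python
# List the codec profiles the GPU driver exposes for fixed-function decode,
# then check a few codecs of interest. Assumes 'vainfo' (libva-utils) is installed.
import subprocess

out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
# Lines pairing a VAProfile with the VLD entrypoint are hardware decode paths.
decode_profiles = [line.strip() for line in out.splitlines()
                   if "VAProfile" in line and "VAEntrypointVLD" in line]
for profile in decode_profiles:
    print(profile)
for codec in ("AV1", "HEVC", "VP9"):
    supported = any(codec in p for p in decode_profiles)
    print(f"{codec} fixed-function decode exposed: {supported}")
```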
Spunjji - Tuesday, March 17, 2020 - link
Oof. Thanks for sharing that, but I feel like I know more than I ever wanted to about codecs... and still not enough :'D
ikjadoon - Tuesday, March 17, 2020 - link
Codecs live or die by industry & content support. I don't see *anyone* major taking up H.266, much less EVC or LCEVC.
The transition to AV1 is not merely technical, but also political. The MPEG group has sown their own demise.
Two of H.265's patent pools backed away, scared of the enthusiasm around AV1. I don't see Sisvel's snivelling going far, even with the latest batch of patent "disputes". How is their VP9 patent pool going?...
VVC's bit-freeze is when...2021? It'll take years for decoding hardware and *nobody* is eager to run back to MPEG.
A good read: https://www.streamingmedia.com/Articles/ReadArticl...
March 2020 update: https://www.streamingmedia.com/Articles/News/Onlin...
MPEG, Sisvel, etc. missed the boat about a half decade ago.
name99 - Thursday, March 19, 2020 - link
Firstly, don't confuse the world as you wish it to be with the world as it is.
Apple is all in on H.265, and ATSC 3.0, the next US digital TV spec, already in preliminary use in some markets, is based on H.265. When you stream Netflix at 4K UHD, you're probably using H.265.
Secondly, be aware of the schedules.
For h.265, the first version of the spec was 2013. By Sept 2015 Apple released the A9 with 265 decode, and a year later the A10 with 265 encode.
BUT these were only exposed to the public in 2017 with iOS11. The point of the delay,
presumably, was to ensure a large enough critical mass of users already at the point of announcement.
This suggests that Apple will be planning a similar transition of their entire line to VVC, but it will take time -- maybe a decoder chip in 2023, an encoder in 2024, a public commitment to "the new Apple codec is VVC across all devices" in 2025.
The relevance of Apple is that Apple is something of the leader in this space. Sure, sure, there's a tiny world of open source renegades talking about Dirac and Opus and Matroska and suchlike. But in the real world of serious money, Apple generally represents the vanguard.
So these future codecs are interesting, yes. But they're also only going to be mainstream relevant (if you're with Apple) maybe late 2025, for everyone else maybe a year or so later?
name99 - Thursday, March 19, 2020 - link
Oh I realize I left out one point.
We saw exactly the same stories back when h.265 became relevant. Here's a sample thread:
https://forums.anandtech.com/threads/h-264-vs-h-26...
Complain all you like about the existence of patent pools, but they are just not that big a deal to the big boys, especially in hardware. So Apple, or your Samsung/LG/Sony TV, has to pay 1 or 5 dollars per device. That's perfectly feasible in return for a spec that is both very good and offers more or less legal certainty.
There's just no reason to believe the driving forces in 2025 are any different from those in 2015.
grant3 - Tuesday, April 14, 2020 - link
"they are just not that big a deal to the big boys, especially hardware"And yet it's "big boys" like intel, cisco, google, samsung, nvidia etc. and oh ya, APPLE, who are the founding members of the royalty-free Open Media alliance i.e. the developers of AV1.
Hifihedgehog - Monday, March 16, 2020 - link
How much later is later, guesstimate-wise? :)
Kevin G - Monday, March 16, 2020 - link
I was expecting B550 around Computex (which is still a go, last I checked). However, with the recent outbreaks, I wonder how much has been pushed back or will simply be a preview then. Supply chains are disrupted at various points. I'm just curious what we know has been pushed back, what is still on schedule, and what is likely to be affected but hasn't been yet. It'd make for an interesting piece. :)
AshlayW - Monday, March 16, 2020 - link
Absolutely hyped to pick up a 4400G(?) for my Slim system. It currently has a 3400G, but we are edging closer to me being able to play all the games I like at 1080p, on a single-chip system.
peevee - Monday, March 23, 2020 - link
On a desktop APU it would make sense to separate the CPU and GPU dies; there is plenty of space and yields would be higher. Of course the GPU side should be Navi, and the socket should be upgraded to 4 memory channels to allow a truly decent integrated GPU similar to the 5500.
Cooe - Friday, April 23, 2021 - link
Delusional idiot alert. Because creating super expensive HEDT pin-out sized bespoke sockets solely for low-mid end market desktop APUs definitely makes ANY kind of sense... -_-
RamIt - Monday, March 16, 2020 - link
Looking forward to a test unit. Battery life and mild gaming may make me shift to AMD.
EliteRetard - Monday, March 16, 2020 - link
I think a 4800/4900 HS without a dGPU would be the perfect laptop for most people.
I've been wishing for something like this for a very long time.
High core count and TDP for real work, with enough IGP for casual gaming. Still lower power than a 45w chip, especially since they're always paired with at least a 25w GPU (even though most don't need it).
U series for those who care about battery above all else, HS + IGP for the vast majority, and H + DGPU for gamers and mobile workstations.
Please make it so!
deksman2 - Monday, March 16, 2020 - link
Actually, I'd prefer they give the APU the full 45W TDP 'breathing room', otherwise it will 'choke' on the 35W TDP constraints.
Most laptop OEMs don't really pay much attention to designing adequate cooling systems for their designs, which can lead to thermal throttling, overheating and performance losses.
I'd rather they work with the 'maximum allowed TDP' for the chip (say 45W), design a cooling system that's more than enough to handle it, and work from there.
I'd prefer to see a 4900H with a decent dGPU such as the RX 5700M and a proper cooling design that produces limited noise under full load, so that both the CPU/iGPU and the dGPU can reach/sustain their maximum advertised performance indefinitely (or for as long as one needs them).
I need productivity on the go, and I have no time for cooling shenanigans from OEMs.
EliteRetard - Monday, March 16, 2020 - link
35W gives tons of "breathing room"; it's over twice the TDP of U series parts.
It's also half the power of a 45W CPU plus a low end 25W DGPU.
I fully expect it'll maintain a good 3+GHz under full load (vs 2GHz for U series).
For cooling, OEMs can just slap on the cooler from their 45w designs.
The vast majority of people don't need a DGPU, but they are currently forced to buy one if they want any performance CPU. That costs them more money, increases weight, and reduces battery life (two fold, less space for battery and higher draw).
You obviously want a gaming laptop, and that's not at all what I'm discussing.
80% of the market would be incredibly well served with a 35w HS APU.
Far better than the crappy U series they've been forced into.
For the 10% who want battery life over anything, they can get a 15w U series.
For the 10% who want max gaming they can get a 45w H series.
I've been begging for a logical laptop for so long... I seriously can't understand why OEMs have refused to even consider it. In what other industry do manufacturers refuse to serve the needs of 80% to cater only to the odd 20%? "Oh you want something practical? HA screw you!"
Imagine if you could only buy a 2 seat scooter for $15+k, or a massive 4 door 8ft bed truck for $100+k. And there are literally no other options, new or used...there has been only these two choices being sold by every car maker in the entire world. If you want to do anything more than a scooter can handle, you are literally forced to buy a massive overkill option no matter how impractical.
That's literally how the laptop market has been for almost a decade.
wolfesteinabhi - Tuesday, March 17, 2020 - link
I totally see where you are going with this... but for the vast majority, people don't care about CPUs and performance... all they ask for is a cheap laptop with good battery life... they don't even compare or ask for model numbers for CPUs... they just see it as Core i3 or Core i5... or Ryzen 3/5... that's all they "perceive" as performance... and lastly they want it dirt cheap. (Also they expect a dGPU to always outperform an iGPU... even if the dGPU is pants like the MX150.)
erple2 - Wednesday, March 18, 2020 - link
Frankly, the 4800 is much more capable than what most people need in a laptop. I've seen people effectively develop applications on an Atom-based Chromebook with great success, and based on what I see other people that aren't enthusiasts doing, even the bottom-tier 4600H is more than they need. It's exceedingly rare for any normal user to need more than even 4 cores for their normal workload of opening tabs, checking email, and talking on video chat.
Namisecond - Thursday, March 26, 2020 - link
I agree; for general computing tasks, 4 cores or 4 fully powered threads is good enough for most people. In a laptop, for a lot of people, the important factors are size/weight, screen quality, battery life, and design aesthetics, not necessarily in that order.
I used to buy thin-and-light gaming laptops for my own perceived needs... but recently, I've found that I'm carrying the Atom-powered thin-and-light, because the screen is good enough, the size and weight are good, and it has all-day battery life.
Teckk - Monday, March 16, 2020 - link
This supports Thunderbolt 3?
Ian Cutress - Monday, March 16, 2020 - link
If the OEM uses a TB3 controller.
BradEK - Monday, March 16, 2020 - link
Page 2, under What's New: "Compared to the precious generation of Zen mobile processors". Previous?
Fataliity - Monday, March 16, 2020 - link
It's amazing how much higher density they are at now. Hopefully this is a sign of things to come (62.82 for Renoir, vs 52 for Zen 2, vs 40.5 for Navi). Anyone who says you won't gain any "node shrink" advantage staying on 7nm doesn't realize how much room is still left.
maroon1 - Monday, March 16, 2020 - link
I'd rather see some real tests instead of slides.
PeachNCream - Monday, March 16, 2020 - link
You will have to wait until products are available for testing.
senttoschool - Monday, March 16, 2020 - link
You don't say?
Holliday75 - Monday, March 16, 2020 - link
I believe he did.
Zizo007 - Monday, March 16, 2020 - link
4900H, 4.4 GHz boost. That should beat any mobile Intel CPU, including the upcoming 10th gen. I wonder how strong the iGPU is? 1080p medium gaming? If it is, I might buy one for portable gaming and good battery life. I don't like laptop dGPUs; I have one and it KILLS the battery.
milkywayer - Monday, March 16, 2020 - link
I would love to see an XPS 13 based on this CPU. Assuming AMD manages to beat Intel on mobile efficiency, finally.
Namisecond - Thursday, March 26, 2020 - link
Are you sure you don't mean an XPS 15 rather than the 13? The 13s have traditionally used U-class processors rather than H-class.
deksman2 - Monday, March 16, 2020 - link
The iGP should be about 30% faster than Zen+.
AMD improved the Vega iGP uArch in Zen 2 mobile, so that per CU, that Vega is now 56% faster than Zen+.
The Zen 2 mobile Vega iGP does apparently have a lower number of compute units, but because of the improved performance per CU, the iGP should still be about 30% faster than previous iterations.
So, yes, I do think games at Medium settings and 1080p should work fine... more to the point, since these APUs also apparently support LPDDR4X at high frequencies, bandwidth won't be a problem... so hypothetically, at least, we could see even better performance.
Those improvements that went into Zen 2 mobile Vega will also be integrated in RDNA2.
LittlePaul - Monday, March 16, 2020 - link
I am wondering about the top left of the die shot; it looks like a 64-bit GDDR6 controller compared to the Navi 10 die shot (exactly the same, and way different to Picasso).
I am dreaming, right?
Spunjji - Tuesday, March 17, 2020 - link
Isn't that the CPU cores? Pretty sure that's what it is.
uibo - Monday, March 16, 2020 - link
The Renoir APU graph on the first page has a block called "video codec next 2". I thought the engine was called "Video Core Next"?
edzieba - Monday, March 16, 2020 - link
Note that AMD have chosen power-under-load as their look-at-the-big-number-we-have 'slide headline' figure, whilst the killer when it came to battery life for Ryzen Mobile 2xxx/3xxx was idle power.
Fataliity - Monday, March 16, 2020 - link
There are slides on other sites, like notebookcheck.com and the video from the Australians (Hardware Unboxed, I think), that show the actual slides and the data AMD themselves got.
The 1065G7 still wins in some idle scenarios, but Renoir pulls ahead in actual use, it seems. Which evens out to about equal / slightly better for Renoir, depending on how you weigh it.
Spunjji - Tuesday, March 17, 2020 - link
They mention a number of improvements to idle, too - particularly from the Infinity Fabric, which was their big weakness before.
anonomouse - Monday, March 16, 2020 - link
"Compared to the desktop design, the mobile is listed as being ‘optimized for mobile’, primarily by the smaller L3 cache – only 4 MB per quad-core group, rather than the 32 MB per quad-core group we see on the desktop"
Isn't it 16 MB per CCX on desktop?
Farfolomew - Tuesday, March 24, 2020 - link
I saw that too and questioned its validity. 32 MB per 4 cores? That seems really high (I can't remember what it was from reading the Zen 2 desktop review last year), and if it's the case, that's a huge cut down for the mobile line.
eek2121 - Monday, March 16, 2020 - link
This is an intriguing part. I am hoping for laptop designs with a 4800U and 5600M, but also desktop APUs. Hopefully AMD can bring some of the new stuff forward to desktop Zen 3 as well.
heffeque - Monday, March 16, 2020 - link
It would be interesting to see these in fan and fanless AMD versions of the Surface Pro versus fan and fanless Intel versions of the Surface Pro.
I'm especially interested in battery life, since the AMD 3780U Surface Pro has horrible battery life compared to its Intel counterpart.
The_Assimilator - Monday, March 16, 2020 - link
The fact that OEMs are willing to make custom designs for AMD is already a good sign that they're confident in the product. Lisa Su certainly has the right stuff.
Khenglish - Monday, March 16, 2020 - link
I'm pretty unimpressed by the GPU vs the Vega 11 in desktop APUs. The only major advantage Renoir has is higher clocks on the GPU core and higher officially supported memory speeds. They likely got the 56% performance-per-core improvement by comparing to a Zen+ with Vega 11, which will be severely clock constrained on 12nm with a bigger core, where Renoir gets an even higher clock advantage not just from the nominal clock, but also from Picasso APUs hitting their TDP limit hard in a 25W or 35W environment.
On desktop, with much higher TDPs, I expect Renoir to slightly beat the 3400G at stock clocks, but lose when comparing overclocked results. Picasso easily overclocks up to 1700-1800 MHz from the measly 1240 MHz stock clock. I would guess Renoir would hit around 2000 MHz, not enough to compensate for the smaller GPU.
eek2121 - Tuesday, March 17, 2020 - link
There are a lot of problems with your comment, but let's start with the obvious: the TDP of the part you mentioned is at least triple that of the 4800U. Depending on how the chip is configured, it is quadruple.
These are laptop parts; we haven't seen desktop APUs. AMD could add 3X as many Vega cores and still hit a 45-65 watt TDP, or they can go aggressive on the CPU clocks like they did with the 4900H.
Spunjji - Tuesday, March 17, 2020 - link
I'm pretty sure the desktop APU won't have more Vega CUs.
tygrus - Tuesday, March 17, 2020 - link
These days, doubling the GPU cores/units and running at half the speed is more energy efficient. It uses more die space, but I don't understand the focus on GPU MHz over energy efficiency.
Spunjji - Tuesday, March 17, 2020 - link
1) Not sure the evidence I've seen bears out that a 1700-1800 MHz GPU overclock is "easy". That sounds like the higher end of what you can expect. Would welcome evidence to the contrary, as I'm still considering picking one up.
2) RAM speed is the big difference here. The desktop APU should get much higher memory speeds than the 3400G due to the improved Zen 2 memory controller, which ought to relieve a significant bottleneck. GPU core overclocks weren't actually the best route to wringing performance out of the 3400G.
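To put rough numbers on the RAM point: peak bandwidth is just bus width times transfer rate. The figures below use the officially supported memory speeds for each part; what a given board or laptop actually runs will differ, so treat this as a sketch of the scale of the difference rather than a benchmark:

```python
# Peak memory bandwidth = bus width (bytes) * transfer rate, using the
# officially supported speeds for each part (real systems may run faster or slower).
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    return bus_width_bits / 8 * transfer_rate_mts / 1000  # GB/s

configs = [
    ("Picasso 3400G, dual-channel DDR4-2933", 128, 2933),
    ("Renoir, dual-channel DDR4-3200",        128, 3200),
    ("Renoir mobile, LPDDR4X-4266 (128-bit)", 128, 4266),
]
for name, bits, rate in configs:
    print(f"{name:42s} ~{peak_bandwidth_gbs(bits, rate):5.1f} GB/s")
# Roughly 46.9, 51.2 and 68.3 GB/s -- the LPDDR4X configuration is where a
# shared iGPU finally gets meaningfully more bandwidth to work with.
```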
Fataliity - Monday, March 16, 2020 - link
@Ian Cutress, did they say what version of N7 they used for this? The density looks like either an HPC+ variant or an N7 mobile variant, from what I can tell?
Thank you!
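For what it's worth, the density figures quoted earlier in the thread line up with a back-of-the-envelope calculation from the commonly reported transistor counts and die sizes; all of the inputs below are approximate:

```python
# Back-of-the-envelope check on the density figures quoted above, using
# commonly reported (approximate) transistor counts and die areas.
dies = [
    ("Renoir APU",            9.8e9, 156.0),  # ~9.8B transistors, ~156 mm^2
    ("Zen 2 chiplet (CCD)",   3.9e9,  74.0),  # ~3.9B transistors, ~74 mm^2
    ("Navi 10 (RX 5700 XT)", 10.3e9, 251.0),  # ~10.3B transistors, ~251 mm^2
]
for name, transistors, area_mm2 in dies:
    density = transistors / 1e6 / area_mm2   # million transistors per mm^2
    print(f"{name:22s} ~{density:5.1f} MTr/mm^2")
# ~62.8, ~52.7 and ~41.0 MTr/mm^2 -- in line with the 62.82 / 52 / 40.5
# figures quoted earlier in the thread.
```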
abufrejoval - Monday, March 16, 2020 - link
What I want is choice. And the flexibility to enable it.
A 15 Watt TDP typically isn't a hard limit, nor is 35 or 45 Watts for that matter: it's mostly about what can be *sustained* for more than a second or two. Vendors have allowed bursting at twice or more the TDP because that's what often defines 'user experience' and sells like hotcakes on mobile i7s.
We all know the silicon is the same. Yes, there is binning, but a 15 Watt part sure won't die at 35, 45 or even 65 or 95 Watts for that matter: it will just need more juice and cooling. And of course, a design for comfortable cooling of 15 Watts won't take 25 or 35 Watts without a bit of 'screaming'.
But why not give a choice, when noise matters less than a deadline and you don’t want to buy a distinct machine for a temporary project?
I admit to having run machine learning on Nvidia-equipped 15.4" slim-line notebooks for days if not weeks, and to having to hide them in a closet, because nobody in the office could tolerate the noise they produced at >100 Watts of CPU and GPU power consumption: that's fine, really, when you can choose what to do where and when.
Renoir has a huge range of load vs. power consumption: Please, please, PLEASE ensure that in all form factors users can make a choice of power consumption vs. battery life or cooling by setting max and sustained Wattage preferably at run-time and not hard-wiring this into distinct SKUs. I’d want a 15 Watt ultrabook to sustain a 35 Watt workload screaming its head off, just like I’d like a 90 Watt desktop or a 60 Watt NUC to calm down to 45/35/25 Watt sustained for night-long batches in the living room or bed-side—if that’s what suits my needs: It’s not a matter of technology, just a matter of ‘product placement’.
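As a rough illustration of the burst-vs-sustained behaviour being asked for here, the sketch below models a part that runs at a burst limit while a rolling average of package power stays under the sustained limit, then clamps. It is loosely in the spirit of Intel's PL1/PL2/tau scheme, not AMD's actual firmware logic, and every number in it is made up:

```python
# Toy model only: burst while an exponentially weighted average of package
# power is still below the sustained limit, then clamp to the sustained limit.
def step(avg_w, sustained_w=15.0, burst_w=35.0, tau_s=28.0, dt=1.0):
    power = burst_w if avg_w < sustained_w else sustained_w
    avg_w += (power - avg_w) * dt / tau_s  # exponentially weighted average
    return power, avg_w

avg = 0.0
for t in range(121):
    power, avg = step(avg)
    if t in (0, 5, 15, 30, 60, 120):
        print(f"t={t:3d}s  package ~{power:4.1f} W  rolling avg ~{avg:4.1f} W")
# With these (made-up) numbers the part bursts at 35 W for roughly the first
# 15 seconds, then settles to its 15 W sustained limit. Change sustained_w at
# run-time and the same silicon behaves like a different SKU.
```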
watzupken - Tuesday, March 17, 2020 - link
To this point:
"Renoir has a huge range of load vs. power consumption: Please, please, PLEASE ensure that in all form factors users can make a choice of power consumption vs. battery life or cooling by setting max and sustained Wattage preferably at run-time and not hard-wiring this into distinct SKUs. I’d want a 15 Watt ultrabook to sustain a 35 Watt workload screaming its head off, just like I’d like a 90 Watt desktop or a 60 Watt NUC to calm down to 45/35/25 Watt sustained for night-long batches in the living room or bed-side—if that’s what suits my needs: It’s not a matter of technology, just a matter of ‘product placement’."
I doubt they will ever let you do that on a laptop or even a NUC officially. The cooling solution implemented is usually very closely correlated to the TDP of the processor. Even when it is a downgrade from, say, 45W to 35W, these are usually tightly controlled by AMD and Intel. There is no guarantee that all chips will work well at a certain clockspeed across various TDPs. For example, a Ryzen 7 4800H may not be able to run at 4800U speeds when you reduce the TDP from 45W to 15W. U series chips are binned to be able to run at that specific clockspeed and likely also command a higher premium.
Tams80 - Tuesday, March 17, 2020 - link
This. If you want to push your system, with the risk of damaging it, then you should be free to, as long as there's no direct risk of it causing harm.
Namisecond - Thursday, March 26, 2020 - link
Why are you using a laptop for a workstation task? If you have to hide it in a closet, then you really don't get to choose the where and when. Might as well rent some hardware at a data center and do the work remotely. Not judging, just saying your way of doing things doesn't make a whole lot of sense to me.
yankeeDDL - Tuesday, March 17, 2020 - link
I have a question: let's take Ryzen 7 4800U and Ryzen 3 4300U devices.
Both are 15W parts, yet one has 4/4 C/T and the other 8/16.
The 4300U has less GPU CU and lower clock.
How can they have the same TDP? Does this mean that the 4300U is likely to stay at Turbo a lot more consistently than the 4800U?
Spunjji - Tuesday, March 17, 2020 - link
Possibly. It will more likely mean (at least initially) that the 4300U is an inferior specimen and needs more voltage to reach its standard clock speeds.
yankeeDDL - Tuesday, March 17, 2020 - link
It's possible. I guess that's also why they can increase the base clock on the lower core count.
djayjp - Tuesday, March 17, 2020 - link
AMD: "Making the best graphics engine even better -- Vega 7nm". Surely they mean Navi...? Confused.
yankeeDDL - Tuesday, March 17, 2020 - link
No, it is the Vega architecture, ported to 7nm. With some improvements (as described in the article itself), but it is still Vega.
djayjp - Wednesday, March 18, 2020 - link
Right, I get that, but surely Navi is a better architecture than Vega (look at the Radeon VII vs the 5700 XT).
djayjp - Wednesday, March 18, 2020 - link
Especially for mobile.
Namisecond - Thursday, March 26, 2020 - link
Yeah, so is HBM over using system memory, but we're not going to see that on the 4000-series Ryzen either.
realbabilu - Tuesday, March 17, 2020 - link
Will Apple be interested too? Or will Apple do something like license the AMD Zen 2 platform and build its own system?
Cliff34 - Tuesday, March 17, 2020 - link
My only wish is that Dell puts this in their XPS 15. I love how much modding and customization I can do with that model.
yeeeeman - Tuesday, March 17, 2020 - link
Ian, in the review of one of these puppies, can you run a test like Andrei does on phones, that is, compute the efficiency using SPEC? I want to see how Ice Lake stacks up.
ballsystemlord - Tuesday, March 17, 2020 - link
Spelling and grammar errors:
"...with within the right thermal envelope,..."
"but" not "with":
"...but within the right thermal envelope,..."
"If the CPU went too far down this stack, while it would be saving power, each hop down the rabbit hole meant a longer time to get back out of it, diminishing performance and latency but also requiring more power changes at the silicon level."
Badly written:
"If the CPU went too far down this stack, while it would be saving power, each hop down the rabbit hole would mean a longer time to get back out of it, diminishing performance and increasing latency but also requiring more power state changes at the silicon level."
dontlistentome - Tuesday, March 17, 2020 - link
Do you think Intel will put it in a NUC for us? They can go back to 15W and quieten them down again.
Tams80 - Tuesday, March 17, 2020 - link
At last, some competition.
Even if the world hasn't gone into a massive recession, it'll still be this time next year at best, likely winter 2021, before we see a Ryzen 4000 CPU paired with an RDNA 2 GPU though. Ah well.
mattkiss - Tuesday, March 17, 2020 - link
Under "What's New," second paragraph:
"Compared to the desktop design, the mobile is listed as being ‘optimized for mobile’, primarily by the smaller L3 cache – only 4 MB per quad-core group, rather than the 32 MB per quad-core group we see on the desktop."
Should be 16 MB per quad-core group (CCX) on the desktop, not 32.
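For reference, the arithmetic that probably caused the mix-up, using the publicly documented Zen 2 cache layouts:

```python
# Per-CCX versus per-chip L3 totals for the two Zen 2 configurations discussed here.
parts = [
    ("Matisse desktop (e.g. Ryzen 7 3700X)", 2, 16),  # 2 CCXs, 16 MB L3 each
    ("Renoir mobile (e.g. Ryzen 7 4800U)",   2,  4),  # 2 CCXs, 4 MB L3 each
]
for name, ccx_count, l3_per_ccx_mb in parts:
    total = ccx_count * l3_per_ccx_mb
    print(f"{name}: {l3_per_ccx_mb} MB per CCX x {ccx_count} CCX = {total} MB total L3")
# So the desktop figure is 16 MB per quad-core CCX (32 MB per chip), while
# Renoir carries 4 MB per CCX (8 MB per chip).
```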
Gondalf - Wednesday, March 18, 2020 - link
Sooo the Lenovo laptop is confirmation that Renoir is a 25 W SKU, and definitely not a 15 W-class CPU. To achieve the rated clock speeds the SoC needs to increase the TDP by 70%.
This is the reason Intel cares nothing about these AMD SKUs; at 15 W a four-core CPU is much more efficient for laptop responsiveness.
TheMighty - Wednesday, March 18, 2020 - link
Nope, it's a 15W chip configurable up to 25W, just like Intel in most cases. Nice try though.
Qasar - Wednesday, March 18, 2020 - link
oh shut up gondalf
KAMiKAZOW - Saturday, March 21, 2020 - link
Why does the conclusion talk about an ASUS Zephyrus G14 with a Radeon GPU? The intro says GeForce, and so does the ASUS website. Where is that Radeon variant coming from?
arosso - Friday, April 10, 2020 - link
How does moving to a smaller process technology decrease idle power? Or did I misunderstand that sentence 😅
GreenReaper - Sunday, May 24, 2020 - link
Smaller silicon features may require less voltage to activate. For example, there are smaller gaps; think of how big a voltage (not current) you need for a spark across a significant air gap. There might also be unrelated technological advances incorporated.
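To make the voltage point concrete: switching power scales roughly with C·V²·f, so a process (or operating state) that reaches the same clock at lower voltage saves power quadratically. A toy comparison with illustrative, made-up voltages:

```python
# Dynamic (switching) power ~ C * V^2 * f. The voltages below are illustrative
# placeholders, not measured Picasso/Renoir figures.
def dynamic_power(c_eff, volts, freq_hz):
    return c_eff * volts ** 2 * freq_hz

older = dynamic_power(1.0, 0.95, 400e6)  # hypothetical low-power state at 0.95 V
newer = dynamic_power(1.0, 0.80, 400e6)  # same state at 0.80 V on a smaller node
print(f"Relative dynamic power at equal clock: {newer / older:.2f}x")  # ~0.71x
```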