Comments Locked

267 Comments


  • ingwe - Thursday, April 9, 2020 - link

    This is very exciting! Especially those battery life numbers.
  • shabby - Thursday, April 9, 2020 - link

    Wow, yeah, I was hoping it would at least match Intel, not double Intel's battery life lol
  • BigMamaInHouse - Thursday, April 9, 2020 - link

    CB R20 Scores are wrong Again :-)
  • Cooe - Thursday, April 9, 2020 - link

    The scores are right, they are just labeled wrong lol.
  • SolarBear28 - Thursday, April 9, 2020 - link

    @IanCutress The scores should be reversed
  • Samus - Friday, April 10, 2020 - link

    It isn't just exciting. It's shocking. AMD finally has a viable notebook CPU.

    Now they need to come up with something worthy against the U series for ultraportables, because the current crop of Ryzen U parts (like the Pro 3500U) are a tough pill to swallow, since they seem to run closer to 35W under any sort of load.
  • Gondalf - Friday, April 10, 2020 - link

    Here I can see only an AMD 8-core SKU beating a 6-core one. There is nothing to be shocked about.
    Try an 8-core Intel laptop instead, equipped with a new Intel SKU and not with one shipped a year ago.
  • Qasar - Friday, April 10, 2020 - link

    And what if the 8-core Intel laptop couldn't compete with this one? Then what? Would you find a way to make up some lame BS reason as to why it still lost? Come on, Gondalf, get a life.
  • Gondalf - Sunday, April 12, 2020 - link

    Pretty pointless to compare a 6-core CPU with an 8-core one.
    Moreover, the article is rushed, because Intel SKUs have a 15ms delay to go from idle to max turbo.
    Very likely the Intel laptop suffers from bad settings or a wrong/obsolete BIOS. The same applies to the power measurements: with an 80Wh battery, an Intel laptop can last around 10 hours browsing the web with the GPU down. My 38Wh battery gives me 5 hours of the same workload, with the GPU down.
    My bet is the Intel laptop had the GPU up during the test.
    So better to wait for less rushed reviews to judge.
  • Qasar - Sunday, April 12, 2020 - link

    Pretty pointless to try everything you can to give Intel excuses as to why they are losing, Gondalf. Even if Intel had more cores, they would still be losing; I believe Zen 2 desktop vs Intel's desktop CPUs shows this: more cores for Intel, and Intel still loses on most tests. But if it were the other way around, it would be OK, and you would just bash AMD for making an inferior product. Either way, you're just trying to come up with lame BS as to why Intel lost, like most here, I bet, knew you would. Very likely, Intel just has the inferior product right now.
  • Deicidium369 - Sunday, April 12, 2020 - link

    What exactly are they losing? Intel is destroying them in every single metric other than superfluous cores and marketing dishonesty. What Gondalf said was: why not compare latest to latest?

    Intel wins in market share
    Intel wins in Revenue & Profit (the ONLY metrics that matter in business)

    AMD wins in the rabid dedication of their misinformed Fan Boys.

    I can GUARANTEE you that I have bought far more AMD products than you have. Built a 1700X when they were new, built a 2700X when they were new, bought a 3950X a couple weeks ago. Bought the 5700 XT when Gigabyte put out the first Windforce, and pre-ordered 2 of the Radeon VII. Gave away both the 1700X and 2700X; still have the 3950X, the 5700 XT, and 1 of the VIIs. NONE of these are enough to dislodge my i9-9900K / dual 2080 Ti system. All those extra cores are a cute marketing ploy - that's all they are - they are not useful except for benchmarks, and 1% or less will make use of those cores.

    Again, Intel has not and is not losing. If Intel was in such bad shape, why hasn't the Magical Mythical Su been able to capitalize on that and steam roll market share? Next to ZERO uptake in the data center market (Cloud providers will install WHATEVER comes along - not like a single Intel was replaced by AMD in those services - AMD was ADDED). All AMD is excuses and marketing.

    IF Intel has the inferior product in this article - then why not compare Ice Lake to 4x00? I have a Dell 13 2-in-1 with the i7-1065G7 and it destroys my almost 2 year old Dell 13 2-in-1 - but yet that is basically the processor that is being used in this comparison.

    More cores more core more cores - would be GREAT if developers could come up with an easy way to make their code more parallel - simply spawning another thread that is doing the EXACT same thing as the other thread isn't getting anything done. Spawning a thread that works on a different part of the same problem is getting something done. Making code more parallel is a problem that has existed since the very 1st SMP system almost 4 decades ago. No amount of fanboy fervor will change that fact.
  • Qasar - Sunday, April 12, 2020 - link

    Deicidium369, then you must not be reading reviews, or, as you put it yourself, "Intel wins in the rabid dedication of their misinformed Fan Boys."

    Intel wins in market share, which is starting to drop.
    Intel wins in Revenue & Profit (the ONLY metrics that matter in business) because they have been overcharging for all of their products.
    "Then why not compare Ice Lake to 4x00?" Um, maybe because they can't be found. NO stores here have any of the 10nm chips for notebooks in stock, and they are only quad core, which the rabid Intel fanboys would claim is even more unfair than this review is.

    " More cores more core more cores " i guess you believe intels BS still about needing only 4 cores for the mainstream, the amount of power its chips use, or its lies about 10 nm being on track ? the ONLY reason why intel has any performance lead, is because of clock speed. but hey if you want to keep supporting intel, with the lies and BS it does and says, by all means
  • Deicidium369 - Monday, April 13, 2020 - link

    Look little boy, I don't root for any corporation, much less Intel or AMD.

    Where is the evidence of Intel's market share dropping?

    Go play, little boy - with all the lies and whatever else you look to corporations for...
  • Qasar - Monday, April 13, 2020 - link

    Oooo, resorting to insults now, and you call me the little boy?? As most people will say: google it. But I know you won't and will just say more BS, so:
    https://www.tomshardware.com/news/amd-vs-intel-cpu...
    https://www.techradar.com/news/amd-now-has-40-of-p...
    https://www.reddit.com/r/AMD_Stock/comments/exvrkr...

    need more ??
  • jgood13 - Monday, February 22, 2021 - link

    Lmao. Do you see how comments like this can look a bit foolish in retrospect? They've definitely lost market share because of these laptops.
  • cpugod - Friday, April 17, 2020 - link

    "Intel wins in market share
    Intel wins in Revenue & Profit (the ONLY metrics that matter in business)"

    I'm sure some IBM fanboy made the same comment in the '80s and/or '90s.

    Intel is on a downward slope... which hopefully this competition will reverse.

    My oldest friend was a Sr. engineer on Intel's processors and left a few years ago because he got fed up with the politics and bozos that were killing their future... he told me then "watch you'll see us totally f up, but we won't fix it because our battles aren't in the marketplace, but rather with internal groups and upper management"... and sure enough
  • cpugod - Friday, April 17, 2020 - link

    Actually, it would be more like IBM in the '70s and early '80s.
  • Spunjji - Friday, April 24, 2020 - link

    Agreeing with Gondalf is a great way to indicate to anyone with more than half a brain cell that you aren't worth paying attention to. xD
  • AwesomeBlackDude - Tuesday, May 5, 2020 - link

    Oh boy, I wasn't even aware that Intel sales were diminishing so fast. To elaborate: Intel profits are down the toilet. So the new rumors might be real about Intel being back to bribing the OEMs, like HP and Dell.
    https://cdn.wccftech.com/wp-content/uploads/2019/1...
  • Curiousland - Sunday, April 12, 2020 - link

    To me, as for many users, it's just like buying a car: we don't care what engine type or how many cylinders it has; only the bottom-line horsepower, fuel efficiency, and cost count. Given the black box of a "CPU" or GPU, why would a user care? There is no such thing as "pointless to compare" as long as it is more powerful and efficient (in energy and price paid).
  • schujj07 - Tuesday, April 14, 2020 - link

    @Gondalf You are complaining that this is 6c/12t vs 8c/16t, so here is a review of the same laptop with more competition: https://www.tomshardware.com/reviews/asus-rog-zeph... Included in that is an Intel 8c/16t, and it still loses. The reason for the 6c/12t laptop in this review is that they were comparing laptops of similar price. An equivalent laptop with the Intel 8c/16t CPU runs $2650, or $1200 more than this Asus, and that $2650 laptop still loses.

    @Deicidium369 Your rant about the gaming desktop you basically copied and pasted from a forum on tomshardware: https://forums.tomshardware.com/threads/amd-big-na... Just because you post the same thing in two places doesn't give you more credibility. Also, that laptop review I posted from tomshardware does include an Intel Ice Lake laptop configured to 25W; guess what, it still loses.

    Right now there aren't any reviews of laptops with the Ryzen 4000U series. Once those come out, we will be able to see how they do against competing Ice Lake laptops. My best guess is that the Intel will still lose, and it won't matter what the core count is. The reason is that, across the board, the Ryzens will have better base clock speeds regardless of core count. While there are certain tasks on laptops that are bursty, there are others that aren't and take longer to run. Anything that isn't able to burst and has to rely more on base clock will almost certainly be faster on the AMD. Even the 8c/16t 4800U @15W has a higher base clock than the 1065G7 (top of the stack Ice Lake) @25W: 1.8GHz vs 1.5GHz; at 15W the Intel is only 1.3GHz. Looking at boost clocks, the only Ryzen with a lower boost clock than the top-of-the-line Intel is the Ryzen 3 4300U, the bottom-of-the-stack chip: 3.9GHz Intel vs 3.7GHz Ryzen. All the other Ryzens boost to at least 4.0GHz.
  • Korguz - Tuesday, April 14, 2020 - link

    I looked at that post on Tom's; JarredWaltonGPU's post regarding him is awesome.
  • schujj07 - Tuesday, April 14, 2020 - link

    It is nice to see Jarred Walton doing reviews again. I remember reading his reviews here on anandtech many years ago.
  • blkspade - Saturday, August 1, 2020 - link

    @Gondalf - Your argument against such a comparison misses all of the important details. The 8-core outperforms a more expensive 6-core, while also being more efficient with those extra cores. Even if an equivalent Intel 8-core offering were on par or better performance-wise, it would be both dramatically more expensive and less efficient. For the potential consumer, that makes it an absolutely fair comparison, and one that matters.
  • Viilutaja - Saturday, April 11, 2020 - link

    Just check out the 8C vs 8C reviews on YouTube! Be glad it was not compared against the best of Intel's mobile 8-cores, because AMD won most of those comparisons, even against the 80W version of Intel's 8-core CPU... And there is an even faster AMD CPU, the 4900H, which is a 45W part, not the 35W part in this review.
  • sharath.naik - Saturday, April 11, 2020 - link

    With this, there is no Intel product you can buy over AMD. Not in laptops, not in desktops, and not in the server space. Intel is 2 generations behind in performance in all of them, but in laptops buying Intel would be an especially poor choice, given you get half the performance (I will not count the 5-sec turbo boost that Intel gives as legitimate numbers) and poorer battery life.
  • Gondalf - Sunday, April 12, 2020 - link

    Too bad the output of these SKUs will be very low.
    In the end, Intel cares nothing about these CPUs; they will not affect Intel's bottom line. Only Intel can supply the OEM channels. This piece of silicon is an interesting but useless experiment.
    No volume, no money.
  • FreckledTrout - Sunday, April 12, 2020 - link

    I would have thought the world's largest fab, TSMC, could make as many chips as needed. Silly me.
  • Qasar - Sunday, April 12, 2020 - link

    more lame BS from pro intel gondalf
  • Deicidium369 - Sunday, April 12, 2020 - link

    More lame BS from pro AMD qasar.
  • Deicidium369 - Sunday, April 12, 2020 - link

    Thing is, AMD is not their largest customer - they also build for Apple and Nvidia - so NO, TSMC could NOT deliver the same volume as Intel - not even close.
  • Qasar - Sunday, April 12, 2020 - link

    And Intel can't deliver 10nm in volume. Point is?
  • Namisecond - Monday, April 13, 2020 - link

    Until we know actual numbers, Intel's "Can't deliver in volume" may still be more than the volume AMD can. To the point where they win the OEM contracts.
  • Qasar - Monday, April 13, 2020 - link

    And I STILL can't buy any 10nm-based chips from Intel; that kinda points to "can't deliver in volume" to me. Some markets are getting them, but my local computer stores, Best Buy, and other stores that sell notebooks don't have any.
  • Deicidium369 - Monday, April 13, 2020 - link

    I bought 2 Dell 2-in-1s back in October - both 10nm Ice Lake, both i7-1065G7s. The fact you can't seem to locate them at the Goodwill where you shop, doesn't change the fact they are around, and no problem to get.

    try http://www.dell.com
  • Qasar - Monday, April 13, 2020 - link

    And yet more insults... is that all you've got now?? Grow up.
  • Deicidium369 - Monday, April 13, 2020 - link

    I would bet you that Intel has sold more 10nm than AMD has sold, period. The idea that Intel can't deliver silicon is a cute story. Intel 10nm is in like 20 different designs, and there are other 10nm products besides the laptop stuff. So they are delivering it in volume - for close to a year now - and they signaled that by making Cooper Lake 4- and 8-socket only and having 10nm Xeon in 1- and 2-socket only.
  • Qasar - Monday, April 13, 2020 - link

    " I would bet you that Intel has sold more 10nm than AMD has sold period " yea right, prove it.. most some links instead of your intel biased BS
    again.. just be cause YOU were able to by intels 10nm stuff where YOU are, doesnt make it high volume, and to quote dell as being a source, good one, guess what dell as used intel for MOST of the time its been in business, and there for, more then likely gets first dibs.
  • Namisecond - Monday, April 13, 2020 - link

    If TSMC were exclusively working on AMD stuff, possibly, but TSMC has more steady and more lucrative customers like Nvidia, Qualcomm and Apple. To make matters worse, AMD is also tying up valuable fab time with their console SoCs.
  • Namisecond - Monday, April 13, 2020 - link

    World's largest fab or not, You can't just throw some money at TSMC and demand more wafers overnight. Last I checked, TSMC was at capacity and was not accepting new orders for anything less than 18 months into the future. This is how contract suppliers work. AMD also has the problem of game console SoCs tying up their available fab capacity at TSMC. Intel owns their own fabs. If they can get their shit sorted out, they won't have capacity problems like this. Just because Gondalf makes bad fanboi arguments doesn't mean you have to lower yourself to his level.
  • Curiousland - Sunday, April 12, 2020 - link

    The US is adding more restrictions on TSMC shipping chips to Huawei, which is one of TSMC's biggest 7nm and 5nm customers. So, like it or not, TSMC will have a lot more capacity and will rely more on AMD's business. So, yeah, TSMC and AMD will work more closely than ever before.

    https://www.cnbc.com/2020/03/27/us-prepares-crackd...
  • Namisecond - Tuesday, April 14, 2020 - link

    Like your article link says, that's more likely to hurt TSMC and its western customers like AMD, Nvidia and Apple more than it will Huawei. At this point, I don't think it's going to happen. Even if it does, it won't affect contracts already in place and products already in production. Remember: 18-month lead time.
  • JayNor - Sunday, April 12, 2020 - link

    This is partially an Intel product, and they thank you.

    " a 1 TB Intel 660p NVMe SSD, and an Intel Wi-Fi 6 solution."
  • dguy6789 - Tuesday, April 14, 2020 - link

    What do you mean? Intel's 9900K is faster than anything AMD has in gaming including AMD's 4 grand CPU. And that is just 14nm vs 7nm.

    AMD won't have a better gaming CPU than the 9900K 2 years from now.
  • Qasar - Tuesday, April 14, 2020 - link

    And the 9900K only has the performance lead for one reason: clock speed, which is the only reason Intel has any performance lead right now, while using more power to get that performance. Clock the CPUs at the same speed and see what happens.
  • schujj07 - Tuesday, April 14, 2020 - link

    Outside of gaming, the 3700X performs as good or better than the 9900k more often than not, all while the 9900k draws a massive 60% more power. In gaming benchmarks the 9900k is ahead by about 5% at 1080p using a 2080Ti. At that point it is the difference between 300fps & 315fps. No way that you or I will ever be able to tell the difference.
  • schujj07 - Tuesday, April 14, 2020 - link

    Edit: The 9900K has a 3.6GHz base and 5GHz boost clock. The 3700X has a 3.6GHz base and 4.4GHz boost. Even with a 13.5% higher boost clock, seen typically in single-threaded applications, the 9900K is only barely able to beat out the 3700X in some single-threaded applications. The difference is usually 3% on average. Normalized for clock speed, Zen 2 has about an 8% IPC advantage over Skylake and its derivatives.
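    A quick back-of-the-envelope version of that normalization (round numbers from the comment above, not measured benchmark data; the exact IPC delta depends on which benchmarks you average, so this lands near 10% rather than exactly 8%):

```python
# Normalize single-threaded performance by boost clock to estimate
# the per-clock (IPC) difference. Round figures from the comment above.
clock_9900k = 5.0   # GHz boost, i9-9900K
clock_3700x = 4.4   # GHz boost, Ryzen 7 3700X
perf_ratio  = 1.03  # 9900K ~3% faster in single-threaded tests

clock_ratio = clock_9900k / clock_3700x  # ~1.136, Intel's clock edge
ipc_ratio = perf_ratio / clock_ratio     # Intel IPC relative to AMD IPC
zen2_ipc_advantage = 1 / ipc_ratio - 1   # AMD's per-clock advantage

print(f"Zen 2 per-clock advantage: {zen2_ipc_advantage:.1%}")
```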
  • marrakech - Sunday, November 8, 2020 - link

    https://www.dell.com/en-nz/work/shop/workstations/...
    https://www.cpubenchmark.net/compare/Intel-Xeon-W-...
  • marrakech - Sunday, November 8, 2020 - link

    Nice prediction.
    Just as information: I've seen some dude spend $6,000 on an Intel Xeon W workstation with the best 8-core mobile Intel, and it's still slower than my 4800H CPU.
    Total cost of the laptop after a RAM upgrade: $1,180.
  • jgood13 - Monday, February 22, 2021 - link

    I don't think this is going to prove to be true...
  • Gondalf - Friday, April 10, 2020 - link

    I find it crazy to compare an 8-core laptop to a 6-core one.
    Intel has plenty of 45W 8-core SKUs for laptops with turbos at over 5GHz, and here we are shown nothing.
    So basically AMD can only compete with a 6-core Coffee Lake???
  • Irata - Friday, April 10, 2020 - link

    There are reviews that show the eight-core can't keep up either, or only just barely, and that is in a huge and heavy chassis (i.e. the portable-desktop type). And their battery life is even worse, as is their power consumption.

    Oh, and looking at this review, the six core is coupled with a faster RTX 2060 GPU...
  • Idontknowhatosay - Friday, April 10, 2020 - link

    Der8auer did a comparison against the 9980HK, and the 4900HS still came out on top in performance. It even manages to keep up with the 5GHz, 90-watt Core i9 inside the Helios 700, which is a beast of a laptop in terms of size and weight.
  • Omega215D - Friday, April 10, 2020 - link

    goddamn intel fanboys truly want to distort shit so there is no real competition left in this space.
  • Namisecond - Monday, April 13, 2020 - link

    There is no real consumer "competition" in this space. Intel's 10nm production is fubar'd for the time being, to the point where expected products are only now trickling into the marketplace. AMD could capitalize on this, but their production is constrained because they don't have full control of their own production. This is the mobile market, not the desktop CPU market where you can have retail sales. OEMs are key here.

    2020 will be a bad year for laptop options.
  • vozmem - Friday, April 10, 2020 - link

    Yes, you are right.
    It is crazy, because the 6-core Razer laptop is also slightly more expensive than the 8-core Asus laptop.
  • s.yu - Saturday, April 11, 2020 - link

    In all fairness, Razer is usually more expensive than Asus.
  • alufan - Friday, April 10, 2020 - link

    Hmm, OK, so the Intel CPU was supposed to be at 45W but could not maintain that, probably due to thermals; it's supposed to clock higher, but ditto; and the 2060 is a higher-performance version. However, other tests still show the Ryzen ahead of 8-core Intels as well, and it also seems to be a common result that the AMD is a long, long way ahead on battery life. Not sure about other countries, but this week in the UK we have had every online seller running TV ads pushing Intel laptops to the max, no doubt paid for by Intel's rebate scheme. Dirty tricks, methinks, but the truth will out. Gen 10 indeed, lol.
  • Deicidium369 - Sunday, April 12, 2020 - link

    95% of the market vs 5% of the market - that's called playing to the largest potential audience. Sorry but Intel outsells AMD at every single price point - regardless of what the AMD fanboys say - Revenues do not lie.
  • Qasar - Sunday, April 12, 2020 - link

    Well, other sites say differently about who outsells whom; others here have even posted links that show this.
    "Revenues do not lie." Of course they don't, especially when you overcharge for your products.
  • Deicidium369 - Monday, April 13, 2020 - link

    "Revenues do not lie. " of course they dont, specially when you overcharge for your products"

    The market says otherwise, they think the products are well priced, and Intel sells all they can make - so just because YOU can't afford them doesn't mean they are over priced - and if they were sooo overpriced, seems like AMD would be in MUCH better financial situation than they are.
  • Qasar - Monday, April 13, 2020 - link

    Again, prove it. Look at the 3 links I posted farther up. Yeah, right, well priced; overpriced is more like it. Epyc Rome: more cores, in some cases HALF the price, and better performance.
  • schujj07 - Tuesday, April 14, 2020 - link

    Revenue actually does lie. Look at the mid 2000s when the Athlon 64 was king. Intel was still making money hand over fist because of shady business practices. When you are the 800lbs gorilla, you can throw your weight around and make sure that people only buy your product even if it is inferior.
  • alufan - Monday, April 13, 2020 - link

    Let's revisit this comment in 12 months, shall we? As an example, my company has a worldwide base of 60k-plus machines, and they just moved all future buys to AMD. The tide is turning, and frankly it's about time. Intel will return, and frankly I hope they do, because competition is good for us, the consumers. But right now, face it: AMD simply has the better product in all ways, apart maybe from one or two specialist benchmarks or workloads where Intel has funded the software development and provided a chip to do the work.
  • Namisecond - Monday, April 13, 2020 - link

    AMD probably does have the "better" product in just about all the fields. But can they step in and significantly eat into Intel's market share? I don't think so. AMD's production capability is currently limited and not in their control.
  • Qasar - Monday, April 13, 2020 - link

    " But can they step in and significantly eat into Intel's market share? " i think that is slowly starting to happen
  • Deicidium369 - Monday, April 13, 2020 - link

    Yup, it's been happening for like 40 years - so far up to mid single digits. AMD is a two-trick pony, and you almost can't build an AMD laptop/desktop without sending Intel some $$$.
  • Qasar - Friday, April 10, 2020 - link

    ahh Gondalf, trying anything and everything to try to make your god of cpus look better, huh ? i find it crazy that you just cant except amd has the better product. give it up already, pathetic intel fanboy
  • Deicidium369 - Sunday, April 12, 2020 - link

    ahh Qasar, trying anything and everything to try to make your god of cpus look better, huh ? i find it crazy that you just cant except* Intel has the better product. give it up already, pathetic AMD fanboy

    *accept.
  • Qasar - Sunday, April 12, 2020 - link

    ahh Deicidium369 i find it crazy you are the one that cant except it. amd has the better product now, most reviews have shown that. give it up already, pathetic Intel fanboy
  • Deicidium369 - Monday, April 13, 2020 - link

    "Except" means some sort of exclusion - I like your sister, except for her huge buck teeth.

    "Accept" means to allow or to acquiesce...

    Your teachers have dropped the ball with you horribly. Maybe one of those Word-a-Day calendars.
  • Qasar - Monday, April 13, 2020 - link

    Wow... yet more insults. That must be all you have left.
  • cgeorgescu - Friday, April 10, 2020 - link

    The thing is that the very few 8-core Intel mobile CPUs cost about $600 each and, at 45W, they are slower than this CPU at 35W.
    Check on YouTube, there are plenty of comparisons of this AMD CPU with Intel's greatest at 45W and even a few pushed to 90W.
  • Zingam - Saturday, April 11, 2020 - link

    I don't care about battery life. I have a power cord but I care very much about performance, noise, heat and portability.
  • eva02langley - Sunday, April 12, 2020 - link

    I care about battery life, especially since I want a laptop I can use for office work that is not going to die on me after 3-4 hours.
  • Deicidium369 - Sunday, April 12, 2020 - link

    My now almost 6 month old Dell 13 2-in-1 with Ice Lake gets 11 to 12 hours of real use - previous 2 year old Dell 13 2-in-1s that the Ice Lakes replaced were 7-8 hours at most. Battery life was the number one consideration for upgrading - but it's also noticeably much faster.
  • redtail3 - Thursday, May 14, 2020 - link

    Oh stop it you dumb f&ck. You are clearly a paid intel shill with so much persistence.
    "Your" battery claims mean nothing. Post the screenshots or a video link.
  • philehidiot - Thursday, April 9, 2020 - link

    I was really sceptical about the lack of a webcam. I think for the cost it's a simple addition. Then I remembered that I've had several laptops with webcams and I've just covered them up and never used them. Not once. When I have wanted to do anything requiring my face I've used my phone. Anyone I know who uses a webcam for business wants something far better than the integrated ones and so buys a decent standalone one.

    I wonder if this is a decision driven by use data from Windows 10 telemetry?
  • wr3zzz - Thursday, April 9, 2020 - link

    I Skype video on Android most of the time but Windows Hello is very nice and there are times I need the webcam because I am using the phone for tethering. Not having a webcam nowadays is a pretty weird decision, especially in a gaming notebook.
  • RollingCamel - Thursday, April 9, 2020 - link

    You can use DroidCam to operate your Android phone or IP Cam as a webcam.
  • Tams80 - Thursday, April 9, 2020 - link

    It's still not as handy as having a webcam just there, at a decent height and angle with no stands/tripods, etc. to worry about. And of course no Windows Hello.
  • RollingCamel - Thursday, April 9, 2020 - link

    That's for sure.
  • Hardware Geek - Friday, April 10, 2020 - link

    I'm with you on the webcam. I've done the exact same thing and never used mine. The first thing I do is put tape over it. No webcam is a plus for me personally.
  • Namisecond - Monday, April 13, 2020 - link

    Dogma meets reality.

    I had the same moment when I bought my last laptop. I thought I needed massive computing power and the ability to game triple-A titles on the go. But then I remembered how I actually used my computer: a lot of note-taking, media consumption, web browsing. For all the times I actually had to do any "heavy lifting" on the laptop, it was more likely I'd save it for when I got to a desktop or workstation with a big screen. What does this have to do with webcams? I've never had to use the ones that came with my laptops, but I see others who do. I'd like for my laptop to have the capability, but rationally speaking, I don't need it. It shouldn't be a deal-breaker for me.
  • R3MF - Thursday, April 9, 2020 - link

    Would love to see a cheaper model with:
    No dGPU
    R9 4900H
    16GB+512GB
    1080p 120hz screen
    same battery
  • joaolx - Thursday, April 9, 2020 - link

    Would also love something similar, although not necessarily this model. I have no need for a dGPU on a laptop, but would love to have one of these new chips. My dream machine right now would be:

    No dGPU
    Any of the 8 Core/16 thread options - 4900H/HS or 4800H/HS or 4800U
    32GB + 1TB
    Any decent 1080p or higher display really, indifferent on higher refresh
    Similar battery life - if even better with the 4800U it'd be my choice
  • twotwotwo - Thursday, April 9, 2020 - link

    Speculation, but sounds likely you'll get exactly what you're asking for when the 4800U-based thin-and-lights come out.
  • R3MF - Thursday, April 9, 2020 - link

    If they're set to 25w with cooling to match then yes.
  • neblogai - Thursday, April 9, 2020 - link

    ~same for me: H(/HS), 16GB of LPDDR4, in a 13-14" ultraportable with bright screen.
  • mocseg - Friday, April 10, 2020 - link

    Lenovo Yoga slim.
  • neblogai - Friday, April 10, 2020 - link

    Yes, it is probably not too bad performance wise- U-series set to 25W + LPDDR4X. But, I understand, it is in the upper price range, so I'll wait to see the overclocking results of ~200g lighter and ~€200 cheaper Swift 3, which might fit my needs better.
  • twotwotwo - Thursday, April 9, 2020 - link

    Similar--I would also love 13-14" + all the CPU + enough battery. Minimal graphics is fine and I can't use a high refresh rate. Wouldn't mind a better-than-1080p screen, but that's icing. And I like the matte screen, user-upgradeability, and good keyboard here. (So, like, move a little towards the MacBook Pro kind of market but not too far.)

    Think I read they didn't expect anyone to build with a 4900H(S) and no dGPU. If that's how it is, it'd still be cool to see a small laptop that cheaps out on the dGPU/refresh rate but not on everything else, for those of us that aren't hardcore gamers. Maybe an AMD dGPU? Their stuff to shift power budget between CPU and GPU seems neat.
  • lightningz71 - Thursday, April 9, 2020 - link

    I like what you're putting down, but, I want the following:
    No dGPU
    R9-4900H with generous cooling
    two SODImm sockets
    1 X 2.5 inch SATA bay
    1 X NVME M.2 slot
    15inch form factor
    1440p screen with freesync (High res for productivity, 720p RIS upscaling from the iGPU for gaming)
    95watt battery

    That would be everything that I need in a laptop. I'm not looking for bleeding edge gaming, but, I do like having a lot of screen pixel area when I need to do something useful.
  • eva02langley - Sunday, April 12, 2020 - link

    Waiting for something similar, however in an slim ultrabook factor.
  • Zingam - Saturday, April 11, 2020 - link

    USB4, HDMI 2.1, more PCI lanes, PCI 4.0, AV1 4K encoding, decoding, etc. insignificant stuff...
  • Zingam - Saturday, April 11, 2020 - link

    RJ45
  • u600213 - Thursday, April 9, 2020 - link

    I almost ordered an ASUS Zephyrus G14 but no webcam so no go.
  • quantumshadow44 - Thursday, April 9, 2020 - link

    Lack of RJ45 is also a no-go.
  • Dahak - Thursday, April 9, 2020 - link

    Yep, same. I could go without a webcam, but lack of RJ45 is a big no-no for me. For home users or mobile pros it's probably fine, but as an IT pro, I need Ethernet.
  • philehidiot - Thursday, April 9, 2020 - link

    I'm no IT pro, I'm a garden variety nerd and I have to say I need at least one laptop in the house with an RJ45. I have two laptops in the house, one being a Macbook Air (2011, now obsolete) which has no RJ45 and it means I have to use the Missus's laptop for any network diagnostics where I need to connect directly to the router. I see RJ45s as kind of like an optical drive for most people. You can get away without one, but they're damned useful to have around. My Macbook will be moving to Linux shortly for the rest of its life and I recently popped an SSD into the Wife's ageing laptop which turned it from unusable to awesome. I expect they'll both need replacing at a similar time and when that time comes, part of the buying decision will be ensuring one of the machines has an RJ45 on it or we buy an adaptor for when it's required.
  • schujj07 - Thursday, April 9, 2020 - link

    There are USB to Ethernet adapters that can be used.
  • GreenReaper - Monday, April 20, 2020 - link

    I imagine you won't get the fastest performance, but USB 3.x is a lot better than 2.0 (or wireless), and you can also get 2.5Gbps which you wouldn't get built-in.
  • liquid_c - Thursday, April 9, 2020 - link

    “IT pros” know that you can get just as stable of an internet connection via wireless. Something tells me you’re either a troll or just plain noob. I’d wager both.
  • RSAUser - Thursday, April 9, 2020 - link

    Not sure if you're trolling or not, assuming you believe it so, but no, Ethernet is always going to be more stable as there's less interference, which means no chance of signal drops. My laptop is about 40cm from the router, but I still use an Ethernet cable as it's more consistent, in case my laptop decides the other access point is nicer.

    At work, there's no chance I'd consistently work on WiFi; having over 120 machines connected to it would just destroy the bandwidth (remember, WiFi splits it according to how many devices are connected). Of those 120, around 80 are wired in; the rest get around 15 Mbps on WiFi.
  • Makaveli - Thursday, April 9, 2020 - link

    lmao great troll post liquid_c
  • sonny73n - Saturday, April 11, 2020 - link

    “ “IT pros” know that you can get just as stable of an internet connection via wireless. Something tells me you’re either a troll or just plain noob. I’d wager both.”

    You’re the troll and also a noob here. If you know anything about networking, you wouldn’t be spouting nonsense.
  • shady28 - Saturday, April 11, 2020 - link

    I work in a multi-billion dollar company's IT department as a developer, 25 years now.
    It may not be strictly correct to say WiFi is equally as stable as wired, however I do not know a single developer that I work with (out of dozens) who uses their laptop wired for connection stability. 99.9% of the time it is not an issue at all - maybe once a year we see a wireless hiccup from a failing AP. This is in an office that houses thousands of people who almost universally use wireless.
  • schujj07 - Tuesday, April 14, 2020 - link

    Whenever I get a ticket at work from someone complaining about their VPN connection not working, my first question is if they are using WiFi to connect to the internet. When they say yes, I ask them to try connecting over a wired connection. Not once have I had them say it didn't work after that. WiFi might work well for most connections, but it is more prone to signal loss and random latency spikes, and that affects VPNs for sure.
  • hehatemeXX - Thursday, April 9, 2020 - link

    Umm.. I doubt you two were going to, as you can easily just buy a usb to ethernet adaptor for a few $$
  • Agent Smith - Friday, April 10, 2020 - link

    Asus put a free LAN dongle in the box
  • alufan - Friday, April 10, 2020 - link

    As an IT pro you should know better

    https://www.amazon.co.uk/Anker-Ethernet-Including-...
  • Cooe - Thursday, April 9, 2020 - link

    Because sticking a cheap USB adapter on the end of the Ethernet cable you plug into is just too much work? That problem is really minor to fix tbh.
  • Icehawk - Sunday, April 12, 2020 - link

    Agreed, for a home user - which this is aimed at - I think it’s NBD, but for enterprise machines I much prefer an integrated NIC so I don’t need to rely on a customer having a dongle (they won’t) or remembering to bring one. Sadly they are hard to find these days in laptops this size.

    At least this machine has a DIMM slot instead of soldered only.
  • GreenReaper - Monday, April 20, 2020 - link

    Sure, but having to mod your memory because they didn't enable XMP profiles is not super-convenient. I'm sure Asus would like you to buy their RAM, but still. (Or perhaps it's an issue of Intel not giving the necessary data?)
  • 1_rick - Thursday, April 9, 2020 - link

    What do you need a webcam for? I've seen a bunch of people here and at other sites call the lack of a webcam a hard pass.

    I use teleconferencing software extensively at my day job, both for meetings among people in different offices (and at home) and for meetings with clients, and nobody uses a webcam, although we're all far more interested in screen sharing, either to show someone how to do something, or to show a document of some kind, or whatever.
  • schujj07 - Friday, April 10, 2020 - link

    ^This
    Totally agree with this. I do the same thing and screen sharing is far more important for me in the IT world than a webcam. If you need a webcam go out and get a good one from Logitech instead of the included garbage on most laptops.
  • 1_rick - Friday, April 10, 2020 - link

    Yeah, that's the other thing--why wouldn't you want a better camera than the potato 720p you'll get with a laptop, if you do need one?
  • haukionkannel - Saturday, April 11, 2020 - link

    Webcams are useful for personal contacts. At work I keep the webcam mostly closed.
    And there are/will be also models with one, so people can choose what they pay for. So no worries if one gaming laptop doesn't have it :)
  • Deicidium369 - Sunday, April 12, 2020 - link

    People making imaginary purchases....
  • sonny73n - Saturday, April 11, 2020 - link

    That, and the privacy issue too. I have a piece of black electrical tape covering my laptop webcam. I would not know when it turns on by itself and snoops on me like those Samsung TVs a while back.

    The aholes would say I've got something to hide. I would like to let them know that I’d rather break the law or break their faces than let myself be caught in my most embarrassing moments.
  • Deicidium369 - Sunday, April 12, 2020 - link

    Nah you and I know it's not about hiding something - I have those Samsung TVs and they get the same round piece of black tape that all of my Webcams get unless they have a shutter.
  • shady28 - Saturday, April 11, 2020 - link

    ^^
    That. Outside of playtime, nobody uses webcams.
    That's the moment you realize 90% of the people here haven't worked much more than burger flipping. Where I work, none of our vendors use them when doing presentations or meetings. It is all screen sharing, file and folder sharing, team or skype chat, etc. In fact, most developers I know actually put a piece of tape over their webcam, just in case.
  • Icehawk - Sunday, April 12, 2020 - link

    With our push to WFH due to C-19 everyone and their mother is asking us to enable their cameras but I agree, in actual meetings it’s maybe 10% at best that use it. Hell 1/3rd of the people don’t even login properly so you can see who they are.

    If you are going to include a camera at least integrate a shutter.
  • shady28 - Sunday, April 12, 2020 - link

    I don't even see 10%. I see near 0%, when someone turns on a web cam they instantly get messages saying 'Hey, you know your webcam is on?' - because it serves no purpose in most settings except to distract, annoy, disrupt, and lag.
    The only place I see it having value are people who want to 'face chat' 'facetime' etc type of scenarios with friends / family. Those are entirely social, and I would say 90% of people using the laptop for personal use don't care about that either (I don't, nor do many others I know). That's what the 2nd cam on your phone / tablet is for - that's where I see it being used, for family and friends.
  • erple2 - Sunday, April 26, 2020 - link

    Unless there's more than about 15 people in a telecon, we _usually_ always turn on the webcam. In a WFH situation where there's really only 1-3 other people that you see on a daily basis, I find it helpful to continue to "see" my coworkers. I'm older than most of my coworkers (work in software), and I prefer in-person talking to slack or text-based communication, and I find that webcams help keep me more engaged in the particular meeting. Note - most of what we do with a "telecon" includes screensharing, too, but I find it much easier to gauge reactions and the other non-verbal communication if you can also see each other. So I would agree that a webcam is important.

    That having been said, I find basically all of the webcams that exist on laptops to be pretty crummy, and thus I use a separate webcam rather than the one that comes on my work computer (Macbook Pro). Though that doesn't stop my coworkers from using the terribad ones that come on their laptops.
  • Irata - Friday, April 10, 2020 - link

    Even on laptops that have one built in, I used a USB webcam. Much better quality and I know when I can be filmed and when not.

    The downside is that you are losing a USB port.
  • Qasar - Friday, April 10, 2020 - link

    which could be gained back, plus a few more, by getting a small USB hub like I did from Kensington: 4 ports, 1" x 3", with a short 6" cable. A nice little portable USB 3 hub, and it works just fine :-)
  • yeeeeman - Thursday, April 9, 2020 - link

    How is the fan noise situation?
  • 1_rick - Thursday, April 9, 2020 - link

    According to a review elsewhere noticeable but not obnoxious (no coil whine etc., just air whooshing). YMMV of course.
  • rrinker - Thursday, April 9, 2020 - link

    Impressive machine - but I have to laugh. OK, they get the (S)pecial 10 watt lower CPU - and then shove about 10 watts of LEDs on the cover....
  • ses1984 - Thursday, April 9, 2020 - link

    All those leds are probably 1w or actually a fraction of a watt.
  • ingwe - Thursday, April 9, 2020 - link

    Definitely fractions of a watt. If they were 1 W, I doubt manufacturers would include them. Though I might be wrong on that one given how things seem to be going.
  • N8SLC - Thursday, April 9, 2020 - link

    The LEDs are an option if so concerned.
  • sonny73n - Saturday, April 11, 2020 - link

    Yeah, LEDs are the trend nowadays, for dumb kids.
  • Deicidium369 - Sunday, April 12, 2020 - link

    The market for this is people who are really into the performance enhancing RGB LEDs - and once they can actually buy and drive a car they will have the performance enhancing stickers on their 6th hand Gold Honda
  • GreenReaper - Monday, April 20, 2020 - link

    Hopefully they only light up by default if plugged in. Sure, they would still decrease charge rate, but I imagine that would be an acceptable cost for the target audience.
  • BigMamaInHouse - Thursday, April 9, 2020 - link

    CB R20 results are wrong, Great Review like always :-)
  • Ian Cutress - Thursday, April 9, 2020 - link

    Good catch, I think I typed in the PCMark numbers by mistake there.
  • Retycint - Thursday, April 9, 2020 - link

    Probably mixed up the numbers for the Intel and AMD. From what I've seen the AMD should be getting 4000+ for Cinebench R20
  • anactoraaron - Thursday, April 9, 2020 - link

    That's exactly it. My I7-9750h gets around 2200 at 35w and near 3000 at 65w. These are flipped.
  • Slash3 - Thursday, April 9, 2020 - link

    Yeah, that graph immediately threw a red flag.
  • yeeeeman - Thursday, April 9, 2020 - link

    How about some gaming battery life? Given this is a gaming laptop this is the most interesting scenario.
  • Tams80 - Thursday, April 9, 2020 - link

    About what you'd expect from a gaming laptop, so under two hours. And that's not really surprising as it's the GPU that's the main culprit there.
  • haplo602 - Thursday, April 9, 2020 - link

    The Intel CPUs boost to 60W when not on battery until they exhaust the thermal headroom. So removing their power cable is the only way to force them into a comparable power envelope.

    Just check the Civ 6 AI test and compare power from the wall for both ... there'll be a huge difference (even if you count in the GPU power difference).
  • TheCrazyIvan - Thursday, April 9, 2020 - link

    Hanlon’s Razor - X-D
    Had to look that one up as I am no native speaker and did not know the phrase before - great reference!
  • The Von Matrices - Thursday, April 9, 2020 - link

    What is the point of having a single DIMM? It doesn't make the laptop any better than having non-upgradable memory because if you do upgrade the module then you have an unbalanced memory configuration. Either make both channels use DIMMs or have soldered down memory.
  • TheCrazyIvan - Thursday, April 9, 2020 - link

    Well, there will be a time when DIMMs have the appropriate specs and when e.g. 40GB of total RAM is needed - never mind the last 24GB being only single channel.
  • GreenReaper - Monday, April 20, 2020 - link

    Not sure it will ever be the standard, I suspect people will move to DDR5 first.
  • Orkiton - Thursday, April 9, 2020 - link

    There's a typo at last section ;-) S/b "Ryzen mobile 4000: a divine win for amd"
  • ianmills - Thursday, April 9, 2020 - link

    LOL I was wondering about that as "divine wind" means kamikaze. Not quite what AMD is hoping for this line!
  • Rudde - Friday, April 10, 2020 - link

    Zephyrus is the god of the west wind in Greek mythology.
  • mode_13h - Monday, April 13, 2020 - link

    I also immediately thought of kamikaze, which literally translates to "divine wind".

    Not saying a play on the name wasn't warranted, just not that one.
  • Deicidium369 - Sunday, April 12, 2020 - link

    well, maybe subliminal - AMD has a history of Crash and Burn... so maybe intended, maybe not
  • morello159 - Thursday, April 9, 2020 - link

    I appreciate the in-depth review, particularly around the odd battery life inconsistency. Some other sites were reporting suspiciously low battery life as well - sounds like they may have had the same issue.
    I would buy this laptop at launch if it had a brighter screen. I can add a USB webcam, or use a USB dock for ethernet. But after using the XPS 13's glorious 500nit screen in a store, 260 nits is just not bright enough.
  • Reflex - Thursday, April 9, 2020 - link

    Honestly this is it for me, I need a XPS13 competitor or preferably variant. Perhaps a Surface Book 4 would work if they offer a 32GB version.
  • Agent Smith - Saturday, April 11, 2020 - link

    With all the great press this laptop has received i suspect Asus will address both the screen and keyboard brightness issues soon.

    Maybe even during the replenishment stages as sold out everywhere.
  • s.yu - Sunday, April 12, 2020 - link

    260nits is definitely not enough, because you have to account for backlight degradation a couple years into use. It'll become miserable anywhere but the most controlled ambience.
  • PeachNCream - Monday, April 13, 2020 - link

    Possibly for some users, yes 260 is too low, but not everyone is like that. Even with the windows open or in bright office lighting conditions, I usually run my laptop panel brightness at or close to as low as possible to reduce strain on my eyes. 200 nits feels way too bright so even a degraded panel that once pushed 260 is probably going to be bouncing around 10-30% brightness for a lot of us.
  • JayNor - Sunday, April 12, 2020 - link

    Anyone seen the MSI Creator 17 1000 nits display?
  • jaskij - Thursday, April 9, 2020 - link

    I'm surprised that the quick ramp-up of turbo surprised you. Remember those crazy issues with desktop Zen 2 frequency switching too fast to actually measure? I wouldn't be surprised if AMD actually intentionally lowered those ramp-up times so as not to get bashed again.
  • XabanakFanatik - Thursday, April 9, 2020 - link

    There's something off about the Cinebench R20 scores. The 4900HS is scoring around 50% higher than it should be, and the 9750 around 35% higher than it should be.
  • XabanakFanatik - Thursday, April 9, 2020 - link

    It looks like the scores from the PCMark 10 Creation graph ended up in the CB15 graph as well.
  • XabanakFanatik - Thursday, April 9, 2020 - link

    The new graph showing the 4900HS scoring 2000 points doesn't seem right, either. There's no way the 9750 scored over 4000.
  • T1beriu - Thursday, April 9, 2020 - link

    Ian, you might wanna correct the CB R20 graph.
  • ballsystemlord - Thursday, April 9, 2020 - link

    @Ian there's only one CB R20 chart. It doesn't say ST or MT.
  • ballsystemlord - Thursday, April 9, 2020 - link

    Its here! Its finally here! An AT AMD 4000 series laptop review!
  • Slash3 - Thursday, April 9, 2020 - link

    Great article.

    Any idea if there are 4900U (or 4900HS) models built with LPDDR4X-4266? Would be very interesting to see any gains to the integrated graphics vs the traditional DDR4 subsystem, especially for a 4900U without a separate discrete GPU.

    It would also be interesting to know whether an LPDDR4X-4266 platform would feature a 1:1 FCLK of 2133MHz, something well above the current capabilities of desktop Zen 2 chips (typically topping out at 1900MHz).

    Lenovo's forthcoming Thinkpad X13 might be another good one to look into, as it should feature both the 4000-series mobile Ryzen as well as Thunderbolt, a feature that seems to be missing on the majority of this current release wave.
  • neblogai - Thursday, April 9, 2020 - link

    TB was just reported on Notebookcheck to be actually absent in Lenovo Renoir laptops.
  • Slash3 - Thursday, April 9, 2020 - link

    Well that's pretty disappointing if true. The PDFs in their article link to older T14 variants but it should be pretty easy to get confirmation on the X13 when they end up in people's hands over the next few weeks.

    Bummer. :(
  • Fataliity - Friday, April 10, 2020 - link

    The LPDDR4X-4266 is actually worse than the DDR4 3200. It consumes way less power, but has way higher latency. The 3200 is best for the IGP.
  • neblogai - Friday, April 10, 2020 - link

    It would be best to get it tested. Usually iGPUs care more about the bandwidth, not the latency. I hope TechEpiphany gets the tests done.
  • Cooe - Thursday, April 9, 2020 - link

    The Renoir die doesn't just have "two DDR4-3200 memory channels". It has dual mode DDR4-3200/LPDDR4X-4266 controllers. Up to OEMs to decide what kind of memory best suits a given design.
  • TEAMSWITCHER - Thursday, April 9, 2020 - link

    There is something decidedly old school about those laptop designs. They look like laptops from the 2000's. The Dell XPS 13 and MacBook Air look like devices that are appropriate for the 2020's.
  • WaltC - Friday, April 10, 2020 - link

    Hmmmm...brings to mind the old adage about the inadvisability of judging a book by its cover...;)
  • Hul8 - Saturday, April 11, 2020 - link

    I say bring back the 2000's - with 4:3 AR laptop screens.
  • s.yu - Sunday, April 12, 2020 - link

    Because it's thick. Sub-15" devices are no longer built like this.
  • Azix - Thursday, April 9, 2020 - link

    Did you also lock the razer to power save mode? Or did you test it in that mode and see that it didn't affect the battery life?
  • yeeeeman - Thursday, April 9, 2020 - link

    It was probably in a non optimized mode because comet lake u with 6 cores gets over 12 hours of battery life too in those scenarios.
  • Fataliity - Friday, April 10, 2020 - link

    This is a comet lake H, not a U.
  • The_Assimilator - Thursday, April 9, 2020 - link

    Very impressive, AMD. Very impressive. Well done.
  • Cooe - Thursday, April 9, 2020 - link

    Your Cinebench R20 scores are labeled backwards lol. It's the Asus/Ryzen that scored 4394.
  • maroon1 - Thursday, April 9, 2020 - link

    How the hell does the Max-Q RTX 2060 beat the regular 2060 in 2 out of the 4 games?? This does not sound like a GPU-bound test. You should test Borderlands 3 and FFXV at higher settings.
  • yeeeeman - Thursday, April 9, 2020 - link

    Higher clock speeds, more cores, higher ipc.
  • Irata - Friday, April 10, 2020 - link

    The Razer laptop is most likely power limited - too much hardware for the form factor. The CPU not reaching its max advertised boost clock is another indicator of this.
  • amrc9 - Thursday, April 9, 2020 - link

    What's the deal with the Low Power results for the i7? Artificial restriction on framerate or is the performance really that bad?
  • R3MF - Thursday, April 9, 2020 - link

    Any news on how much system memory the iGPU is allowed to access?

    I think 2000/3000 were limited to 2GB...

    Still pining for a 35w 8cu ryzen 9 laptop with a compact metal chassis and decent battery.
  • SolarBear28 - Thursday, April 9, 2020 - link

    That battery life is seriously impressive. I will definitely be purchasing a Ryzen 4000 laptop, just not sure which one yet.
  • psyclist80 - Thursday, April 9, 2020 - link

    Bravo AMD, take a bow...Leadership performance AND Efficiency, Renoir will win OEM designs and marketshare Left, Right and Centre! Blown away by the battery life

    Looking forward to the apples to apples comparisons against 10th Gen, Intel has got the fight of its life on its hands now
  • corinthos - Thursday, April 9, 2020 - link

    Intel sinking.. AMD Ryzen!
  • PeachNCream - Thursday, April 9, 2020 - link

    Nice laptop. Kudos on omitting a creeper-view webcam that I end up covering with a blackout sticker anyhow. Thumbs down for BIOS-embedded junkware that offers to install on a fresh, clean OS and I'd have to cover the power button with tape or something to prevent it from finger scanning invasive junk when I want to turn my system on or off. So unfortunately at any price it's a privacy problem and a no thank you.
  • Qasar - Thursday, April 9, 2020 - link

    peachncream, i actually have opened the notebooks up after i have had them for a while, just unplugged the ribbon cable, and then removed the tape i put over it :-)
    "if the Armoury Crate option is enabled in the BIOS it will ask to install it." Problem solved, just disable it in the BIOS. did you even read that part? :-) :-) i bet the finger scanner could be disabled as well.... but you would probably stick with the ancient notebook you have anyway, so no difference :-)
  • PeachNCream - Thursday, April 9, 2020 - link

    I did miss the part where the installer could be disabled. Thanks for catching that. As for disabling finger readers, that's a setting I don't really trust to work. A physical barrier is really the only sure way to keep yourself safe.

    In the end, you are right. I will likely use older hardware, however, as time moves forward that older hardware ends up being pretty useless so I get newer older hardware. Security holes like these tend to percolate down to the secondary market over time so I hope that integration of print scanners remains a niche, but I see falling costs and slow yet steady spread so it may one day be a problem for even information security professionals like us to avoid this sort of hole.
  • eastcoast_pete - Thursday, April 9, 2020 - link

    I actually share your dislike for a built-in webcam that doesn't have a slider integrated in it. Unfortunately, that seems to be the last thing on the mind of many laptop designers. I would like a webcam in my laptop, as I often have to videoconference with clients, even when we're not under a "shelter in place" order
  • Fataliity - Friday, April 10, 2020 - link

    Just use a phone or buy a decent webcam for 20-50 bucks. The quality on built in laptop cameras is horrendous. literally anything is better.
  • Kamen Rider Blade - Thursday, April 9, 2020 - link

    Dr. Ian Cutress, your Inter-Core Latency table might have a few mistakes on it!!!!

    How is it that a 3900X can have consistent latency when it crosses CCX/CCD boundaries:
    https://i.redd.it/mvo9nk2r94931.png

    Yet your 3950X has Zen+ like latency when it crosses CCX/CCD boundaries?

    Did you screw up your table when you copied & pasted?
  • mattkiss - Thursday, April 9, 2020 - link

    For Zen 2 desktop CPUs, the CCD/IOD link for memory operations is 32B/cycle for reading and 16B/cycle for writing. I am curious what the values are for the Renoir architecture. Also, I would be interested in seeing an AIDA64 memory benchmark run on the review system with both DDR4 3200 and 2666.
  • Khato - Thursday, April 9, 2020 - link

    The investigation regarding the low web browsing battery life result on the Zephyrus G14 is quite interesting. One question though, was the following statement confirmed? "With the Razer Blade, it was clear that the system was forced into a 60 Hz only mode, the discrete GPU gets shut down, and power is saved."

    Few reasons for that question. The numbers and analysis in this article piqued my curiosity due to how close the Razer Blade and 120Hz Asus Zephyrus numbers were. Deriving average power use from those run times plus battery capacity arrives at 16.3W for the 120Hz Asus Zephyrus, 14W for the Razer Blade, and 6.1W for the 60Hz Asus Zephyrus. So roughly a 10W delta for increased refresh rate plus discrete graphics. Performing the same exercise on the recent Surface Laptop 3 review yields 6.1W for the R7 3780U variant and 4.5W for the i7 1065G7 variant. Note that the R7 3780U variant shows same average power consumption as the 60Hz Asus Zephyrus, while the Razer Blade is 9.5W/3x higher than the i7 1065G7 variant. It makes no sense for Intel efficiency to be that much worse... unless the discrete graphics is still at play.

    The above conclusion matches up with the only laptop I have access to with discrete graphics, an HP zbook G6 with the Quadro RTX 3000. On battery with just normal web browser and office applications open the discrete graphics is still active with hwinfo reporting a constant 8W of power usage.
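    The back-of-the-envelope derivation in the comment above is just capacity divided by runtime. A minimal sketch of that arithmetic is below; the 76 Wh (Zephyrus G14) and 80 Wh (Razer Blade) capacities and the runtimes are assumed illustrative figures consistent with the wattages quoted in the comment, not values taken from this review's tables.

    ```python
    # Sketch: average platform power over a battery rundown is
    #   average draw (W) = battery capacity (Wh) / measured runtime (h).
    # Capacities and runtimes below are assumptions for illustration.

    def avg_power_w(capacity_wh: float, runtime_h: float) -> float:
        """Average power drawn over a full battery rundown."""
        return capacity_wh / runtime_h

    if __name__ == "__main__":
        # A 76 Wh battery lasting ~12.5 h implies ~6.1 W average draw
        print(round(avg_power_w(76.0, 12.5), 1))  # -> 6.1
        # An 80 Wh battery lasting ~5.7 h implies ~14 W average draw
        print(round(avg_power_w(80.0, 5.7), 1))   # -> 14.0
    ```

    Comparing those per-platform averages is what isolates the roughly 10 W delta the comment attributes to the high-refresh panel plus active discrete graphics.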
  • Fataliity - Friday, April 10, 2020 - link

    That's partly because a Zen 2 core uses much less power to ramp up to turbo. Intel can hit its 35W budget from one core going up to 4.5-4.8GHz; Ryzen can hit 4.4GHz at about 12W. And it turbos faster too. So it uses less power, finishes the job quicker, is more responsive, etc.

    For an example, look at the new Galaxy S20 review on here with the 120Hz screen. When it's turned on it shaves off over 50% of the battery life.
  • Khato - Friday, April 10, 2020 - link

    Those arguments could have some merit if the results were particular to the web browsing battery life tests. However, the exact same trend exists for both web browsing and video playback, and h.264 playback doesn't require a system to leave minimum CPU frequency. This is clear evidence that the difference in power consumption has nothing to do with compute efficiency of the CPU, but rather the platform.

    Regarding the comparison to the S20. Performing the same exercise of dividing battery Wh by hours of web browsing battery life run time for the S20 Ultra with Snapdragon 865 arrives at 1.37W at 60Hz and 1.7W at 120Hz. Even if you assumed multiplicative scaling that would only increase the 6.1W figure for the 60Hz Asus Zephyrus up to 7.6W... and it's not multiplicative scaling.

    As far as I can tell from my own limited testing, Optimus simply isn't working like it should. It's frequently activating the discrete GPU on trivial windows workloads which could easily be handled by the integrated graphics. My guess is that this is the normal state for Intel based windows laptops with discrete NVIDIA graphics. Wouldn't necessarily affect AMD as driver setup is different, which is definitely a selling point for AMD unless Intel/NVIDIA take notice and fix their driver to match.
  • eastcoast_pete - Thursday, April 9, 2020 - link

    Thanks Ian, glad I waited with my overdue laptop refresh! Yes, it'll be Intel outside this time, unless the i7 plus dGPU prices come down a lot; the Ryzen 4800/4900 are the price/performance champs in that segment for now.
    The one fly in the ointment is the omission of a webcam in the Zephyrus. I can (prefer) to do without the LED bling on the lid cover, but really need a webcam, especially right now with "shelter in place" due to Covid. However, I don't think ASUS designed the Zephyrus with someone like me in mind. Too bad, maybe another Ryzen 4800 laptop will fit the bill .
  • guachi - Thursday, April 9, 2020 - link

    I was looking at the very Razer you had in this review. Ended up preordering the Asus.

    So I thank you for the review and the comparison choice.
  • Mat3 - Thursday, April 9, 2020 - link

    I know that's not a real die shot, but even so, that's the worst "fake" die shot I've ever seen.
  • StevoLincolnite - Thursday, April 9, 2020 - link

    I have a Ryzen 2700u notebook which I will happily toss out the window for this.
  • ballsystemlord - Thursday, April 9, 2020 - link

    Spelling and grammar errors:

    "If Intel has a lower frequency, fewer cores, and a lower frequency, all for the same power envelope as AMD, then it looks like a slam dunk for AMD."
    Double "lower frequency":
    "If Intel has a fewer cores and a lower frequency, all for the same power envelope as AMD, then it looks like a slam dunk for AMD."

    "When the system does the battery life done right, it's crazy good."
    Badly worded:
    "When the system balances performance and battery life, it's crazy good."
  • mkozakewich - Thursday, April 9, 2020 - link

    On these kinds of systems, battery life can tank when something goes wrong. I wonder if there was also a reason that the Intel system was showing such poor battery life.

    When AnandTech reviewed the Surface Book, I remember them giving it a really low battery life score. It turns out there's some kind of problem with the GPU connection, and the device will get twice the life if you disconnect and reconnect the tablet portion multiple times. I actually get 12 hours on my 2017 Surface Book 2. The system can run on 4 W. So, finally! I've never even considered an AMD system, because they would run closer to 8 or 12 watts, and that meant they'd either have a massive and heavy battery, or they'd only last a few hours.
    But the caveat: As you see here, and with the Surface Book, that efficiency can go out the window if one thing goes wrong.
  • zodiacfml - Thursday, April 9, 2020 - link

    Thanks for the AnandTech-quality review. I hope we see cheaper 8-core laptops without discrete graphics. Seeing the performance of the iGPU at two memory speeds, I hope to see AMD with integrated memory in its future products like what they did in the PS5/Series X - great for mobile and compact desktop PCs.
  • plonk420 - Friday, April 10, 2020 - link

    thanks for the latency numbers! huuuuge help for emulation fans (or at least PS3 emulation fans)!
  • deil - Friday, April 10, 2020 - link

    A bit high on idle speed, but those response times are what we want most for a "snappy" laptop that feels fast. I really want one now...
  • Haawser - Friday, April 10, 2020 - link

    Hey Ian, could you do some iGPU game tests with a 720p render target, but running full screen with RIS ? I'd be interested to see if you can get much higher frame rates but without a massive loss of subjective IQ ? Cheers.
  • Fulljack - Friday, April 10, 2020 - link

    I'm interested with AMD 25x20 initiative. Could you please make an update for it? Last time you did was two years ago back in 2018. It would be an interesting piece of article to show how much AMD has grown. Thanks!

    https://www.anandtech.com/show/13326/amd-updates-i...
  • twtech - Friday, April 10, 2020 - link

    That picture really illustrates just how gigantic the 64 core TR/Epyc really is.
  • Keyboard1701 - Friday, April 10, 2020 - link

    Going by the performance of the laptops on battery, I'm curious to see if something like a gtx 1650 might actually perform better than a rtx 2060 on battery.
  • Keyboard1701 - Friday, April 10, 2020 - link

    I apologise for the double post, but I've also observed that the iGPU actually performs just as well as the rtx 2060 when the laptop is running on battery. Would this mean that the discrete gpu is redundant for someone who mainly intends to use the laptop on battery power?
  • RollingCamel - Friday, April 10, 2020 - link

    Did you run any thermal analysis on the laptop?
  • DanNeely - Friday, April 10, 2020 - link

    IF you do any more tests on this, I'd be interested in seeing how gaming performance is affected by using a 65W USB-C charger instead of the 180W barrel one. The smaller charger would be nice while traveling; but the only time I'd be gaming on a laptop is when I'm away and don't have access to my desktop.
  • willgart - Friday, April 10, 2020 - link

    the performance gain by changing the RAM is incredible.
  • DiHydro - Friday, April 10, 2020 - link

    Thank you for taking the time to look at and comment on the thickness, and how it allows a bigger battery in the same footprint. I have been saying for the past couple of years that the race for the thinnest laptop is pretty futile when OEMs keep putting higher-wattage parts in them.
  • velatra - Friday, April 10, 2020 - link

    There's a problematic sentence in the second paragraph of the "2016: A Historic Low for AMD in Notebooks" section of the first page. It reads "OEMs knew this crippled performance, but in enabled the headline processors...." Perhaps "in" should be "it."
  • Techie2 - Saturday, April 11, 2020 - link

    Nice to see AMD continuing to lead in performance. One thing that makes no sense to me is laptop pricing. I just don't see the basis for >$1000 laptops. It's not like the hardware cost can justify the retail prices; it's more like collusion by laptop makers IMO. I guess when they finally saturate the laptop market, prices will come down to reality.
  • watzupken - Saturday, April 11, 2020 - link

    AT has done a great job with this review. Where I think AT did a lot better than the rest of the reviewers (besides the usual technical analysis) is that you nailed the issue with the short battery life that left many perplexed, since some review sites are getting 10+ hours of battery while others are getting 4+. I think it's a good testament to the knowledge of the reviewers here. Thank you AT.
  • Zingam - Saturday, April 11, 2020 - link

    How is the triple monitor 4K support? Is triple 4K monitor setup viable? Is it smooth? Does it lag? Does it overheat? Does it make the fans howl all the time even while idling? Does it have driver issues?
    Can I connect this laptop (or any other modern laptop) to two or more 4K external monitors for a three-monitor setup and type, edit text, and compile code without ever experiencing overheating, fan noise, lag, or stuttering? Is this APU a good work driver? These are the tests I am interested in!
    I'd also like to know how it compares to older CPUs, not just the current ones: to Sandy Bridge, Skylake, the Kaby Lake i7-7700HQ, etc., with and without a discrete GPU (1050 Ti), to know if an upgrade is worth it.

    I don't care about battery life very much but I care about performance, heat and fan noise and how portable that setup is. I don't work in coffee shops but I need to carry my laptop from my office to my home and back on my back - so I care about it being light with a small power brick too.

    It is very rarely that reviews provide that information - it is all about gaming and flashiness.
  • Zingam - Saturday, April 11, 2020 - link

    @Ian it would be great if you compare these new CPUs to older for real work professional use and even with other small form factor PCs like the NUCs and the Mac Minis.
  • Zingam - Saturday, April 11, 2020 - link

    Can it run a 2-3 hour compilation or static analysis without throttling, while watching YouTube, running an emulator and browsing the web, which is just as important as not throttling while gaming or running a game in the background while debugging it in the summer season. :)
  • Viilutaja - Saturday, April 11, 2020 - link

    I have never used any of my laptops' webcams! And I have used work laptops (Lenovo ThinkPads) for some time now. Right now I have a cover on my laptop webcam and have not opened it since installation. I have weekly meetings with colleagues and many other meetings with clients, and the webcam was never once on. Overrated part in laptops. Anyone who really needs to do video conferences buys a separate 4K60 external webcam anyway.
  • nils_ - Saturday, April 11, 2020 - link

    I would like this very much in a mobile workstation, but I do need Thunderbolt 3 at least for my Docking Station. I can do without the dGPU.
  • dk404 - Saturday, April 11, 2020 - link

    +1 AMD for their focus on perf per watt, more designs per socket, plus perf per $ (value). They are definitely leading the innovation in the laptop, desktop, and server markets...
    Now time for blue to wake up, even though it's too late...
  • SeanFL - Saturday, April 11, 2020 - link

    Wondering how long before we see some ultra tiny desktops using the new AMD laptop APU's, similar to the NUC, Lenovo Thinkcentre, or the HP Elitedesk. The aforementioned systems are great for almost anything except video editing. The new 4000 series chips would be fantastic in a tiny desktop. Please AMD.
  • realbabilu - Saturday, April 11, 2020 - link

    Is it hackintoshable ?
    I wish apple also see amd as a switch too
  • Dodozoid - Saturday, April 11, 2020 - link

    Awesome review Dr. Cutress. There are two points that interest me. First - GPU Z shows the iGPU connected via PCIe 4.0 16x. Are there any power/performance implications to that or is it simply misinterpretation of infinity fabric?
    And another one is regarding the on-battery performance. There is an important piece of information missing (I am aware you show part of it in the battery life section): how long does it maintain that performance?
    (And maybe how much influence do the various power/performance settings have on the framerate/endurance tradeoffs?)
  • phoenix_rizzen - Sunday, April 12, 2020 - link

    Renoir doesn't have PCIe 4. That should say PCIe 3 x8 for the GPU.
  • Dodozoid - Monday, April 13, 2020 - link

    Yes, to the external one. Vega 8 gpu-z screen says pci-e 4.0 16x.
  • realbabilu - Saturday, April 11, 2020 - link

    I noticed the absence of a 2.5-inch HDD tray. Maybe that is why Asus could use a 14-inch form factor instead of the usual 15-inch for an i7-9750H class machine. The weight is almost the same as my cheap MSI 9RCX i7-9750H notebook, around 1.6-1.7 kg. Still, 10 hours of battery life and that raw power nail it.
  • oleyska - Saturday, April 11, 2020 - link

    On the iGPU page (the first one) you say L3 cache, but it has a bigger L2 cache, not L3 :)
    Also, what IF/memory frequency does the 3950X run at in the latency test?
    Could it be compared at 1600 vs 1600? What if IF and memory are at 1800?
    Really good write-up!
  • Hrel - Saturday, April 11, 2020 - link

    I wonder how much of this battery life gain that's being attributed to AMD is actually thanks to Nvidia. The RTX 2060 shouldn't be compared to the RTX 2060 Max-Q; that's pretty ridiculous. I'm willing to bet that's most of that extra 5 hours. Still, good on AMD, but why misrepresent it?
  • SolarBear28 - Sunday, April 12, 2020 - link

    You think 12 hours of battery life is possible using the discrete GPU? It's using the integrated Vega graphics for the video playback and web battery tests.
  • Santoval - Sunday, April 12, 2020 - link

    It is extremely disappointing that a laptop of the caliber of Zephyrus G14 has a QLC based SSD. Who cares about the 1 TB of storage when as a trade-off you get piss-poor performance and the piss-poorest of endurance? QLC was originally employed in largely read intensive servers and now it shows up in SSDs for ... gaming laptops. Seriously Asus?
  • Deicidium369 - Monday, April 13, 2020 - link

    I have one of Samsung's first QLC drives; it has been used continuously since new and has zero issues. I also have one of the early OCZ 120GB drives - bought the day it dropped, used continuously since new - pulled it, reformatted it, and tested: still 0 bad cells. So the whole endurance thing is WAY overblown...
  • AntonErtl - Sunday, April 12, 2020 - link

    Thank you for the review. A very good showing for the Ryzen 4xxxH series, now awaiting U reviews:-).

    There is one thing that I find disappointing: if I understand your review correctly, both memory controllers run at the same clock, so putting in a slower DIMM also slows down the soldered-in RAM. With the memory controller and Infinity Fabric clocks locked together I should have expected that, but hasn't AMD loosened that lock recently? In any case, what I would love to see in future laptops with partially soldered memory is soldered-in LPDDR4X for better bandwidth (not sure if it saves power) plus an independently clocked DDR4 DIMM. Maybe we will see it in Ryzen 5xxx APUs.
  • Curiousland - Sunday, April 12, 2020 - link

    A general comment: most people missed the point. Ryzen 4000 shows the potential of optimization work on the 7nm process as well as on the Zen 2 design, and how much it improves on the early Zen 2 of desktop Ryzen 3000!
    This implies further potential performance improvements that Zen 3 can bring to the market even before they jump to TSMC 5nm for Zen 4.
  • Curiousland - Sunday, April 12, 2020 - link

    I guess this is why Dr. Su said they are focusing on chip design and architecture now rather than rushing to the next 5nm node. They must have seen plenty of potential for improvement on 7nm already.
  • Kishoreshack - Monday, April 13, 2020 - link

    @Ian Cutress
    Why so harsh on AMD?
    It's outdoing Intel processors which draw 2 or 3 times more power.
    It literally smokes any Intel processor in the same power envelope.
    You should be giving praise and awards to AMD;
    instead, the tone of the article doesn't do justice to AMD.
  • Deicidium369 - Monday, April 13, 2020 - link

    Facts hurt, huh? It's impressive but not life altering. What sort of awards? "Award for not going bankrupt" or "Award for FINALLY putting out something worth purchasing" or "Award because you hurt AMD's fee-fees"

    The tone is fair, Ian is a skilled writer and reviewer - I never felt he has leaned more to one side or the other.
  • destorofall - Tuesday, April 14, 2020 - link

    can I give intel an award for "masterfully delaying a node ramp-up for almost half a decade."
  • Korguz - Tuesday, April 14, 2020 - link

    They didn't delay it; they screwed up somehow. Some say Intel was too aggressive with what it was trying to do with the node, and it didn't work.
  • Wineohe - Monday, April 13, 2020 - link

    This does potentially offer OEMs some breathing room with features, if the CPU and Chipset can lower their costs by a few hundred dollars. They can offer a Notebook with similar or better performance and battery life, with more features. Say a 1TB SSD versus a 512GB, 32GB vs 16GB, better display, or better dedicated GPU. This would easily sway me as a consumer toward the AMD option.
  • Deicidium369 - Monday, April 13, 2020 - link

    Nope. Most people would still buy Intel - price isn't most people's primary metric - some buy Intel because why buy something that is "like Intel" when you can, you know, "Buy Intel". OEMs build what people want - and this new CPU will be a major rarity - and will not sell in large numbers even by AMD's standards.
  • Qasar - Monday, April 13, 2020 - link

    "OEMs build what people want" - or they buy Intel because Intel bribes or threatens them. Remember the Athlon 64 days? Guess what: Intel got taken to court over their shady dealings, and they lost. But I know you won't remember that, because you're an Intel fanboy, and Intel can't do anything wrong.
  • watzupken - Monday, April 13, 2020 - link

    It's true that most consumers will still prefer Intel chips due to the reputation they have built for themselves over the years. However, in this age where people can find information readily, that advantage may not be firm. With more enthusiasts leaning towards AMD, this may also filter down to those who are not tech savvy through positive word of mouth. For example, it is not uncommon for someone who is not tech savvy to get a recommendation from a technology-savvy person when buying a computer/laptop. Moreover, AMD is very aggressive when it comes to the cost of the chips, further adding another carrot for consumers to switch camps. One other key problem was the poorer battery life of AMD mobile chips, i.e. Ryzen 2xxx and 3xxx for mobile devices, which is no longer the case here unless the manufacturers deliberately gimp the battery capacity.
  • sleeperclass - Monday, April 13, 2020 - link

    No webcam is a big blow. In these times, when everyone is using online chat platforms such as Teams, Zoom, etc. for communicating, this is something that should have just been there.

    All said and done, good progress by AMD in the mobile cpu space.
  • Qasar - Monday, April 13, 2020 - link

    Both of the notebooks I have have webcams; both are unplugged, and I picked up a Logitech C920, I think it is. I just didn't like having to tilt the screen so the person I was talking to could see me better; the separate webcam allows that no matter how the screen is tilted. But to each his own :-)
  • Deicidium369 - Monday, April 13, 2020 - link

    The C920 is nice unit - next to impossible to find at the moment - bought mine a while back - a C920S - which has the privacy shutter.
  • Deicidium369 - Monday, April 13, 2020 - link

    It is most definitely progress, without a doubt.
  • lightsout565 - Tuesday, April 14, 2020 - link

    Battery life is incredible. I really wish Asus had an option to lock the display to 60Hz if the dGPU was off like you mention Razer does. Or even better lock the display to 60Hz if the dGPU is off *AND* on battery power. The latter of the two seems like the best solution. It's too bad we aren't seeing more OEMs put these in higher end laptops. This along with new Nvidia Max-Q tech seems to be the perfect recipe for thin and light 13-14" laptop designs that the 45w H chips could never quite deliver on.
  • x86koala - Tuesday, April 14, 2020 - link

    Excellent performance!
  • Felfazeebo - Wednesday, April 15, 2020 - link

    Should you expect a performance hit if you were to add another 3200 MHz stick? I was planning on adding a 16 GB 3200 MHz stick in the hopes of running 24 GB all at 3200 MHz. I'm curious whether it will even run at the right speed, and whether it will be as fast as, or very close to, the stock 16 GB at 3200.
  • Qasar - Wednesday, April 15, 2020 - link

    I think this was addressed in the article.
  • GreenReaper - Monday, April 20, 2020 - link

    By the look of it, it could, but it may not by default. You might well have to re-program the module (if you can find a way to do that; it may involve a hardware hack) or else wait and hope for ASUS to offer an update enabling XMP.

    I think there should have been more pushing of the manufacturer on this point, as many buyers are likely to install aftermarket memory sooner or later.
  • carcakes - Wednesday, April 22, 2020 - link

    If the Infinity Fabric version in the Ryzen Mobile 4000 series is newer than the previous one, how much will it impact performance?
  • carcakes - Wednesday, April 22, 2020 - link

    'The 2nd Gen Infinity Architecture will allow for 4-8 Way GPU connectivity in a singular node'
  • mi1400 - Saturday, April 25, 2020 - link

    I don't know why this and similar processors are not packaged in a decent streamlined chassis like the one Lisa Su was holding to introduce this CPU. Dear ASUS and MSI: stop making chassis for drag queens; most of us are decent corporate employees who don't want to flash and swing our "assets" out and about. Current-day programming/cloud/VM/container demands are making people like me desperate for such processors, BUT in decent/sleek/non-intrusive chassis designs, and you are holding us back. I understand heat dissipation and other factors may push you toward this, but it is still not an excuse. Lenovo etc. are making mobile workstations which are slimmer and less flashy than the Asus one above. Take a few pages from phone designs like the Samsung Note or Xperia Z Ultra... the world has changed... don't remain stuck in the mid-to-late 2000s. Just phone Lisa Su and ask where she got that chassis.
  • Silma - Wednesday, May 6, 2020 - link

    Correct me if I'm wrong, but unfortunately none of the AMD laptops are compatible with Thunderbolt 3.
    Very sad for musicians wanting to connect a TB3 audio interface.
  • mobutu - Thursday, May 7, 2020 - link

    Nice machine, some thoughts:
    -instead of that big chin they should have fitted a 16:10 1920x1200 screen
    -those exterior leds (AniMe Matrix) are atrocious, it's a very very good thing that they are optional
    -not having a (shitty anyway) webcam is a BIG PLUS
    -anyone buying it should buy it with maxxed out RAM (32GB DDR4 3200MHz)
    -the SSD inside can be replaced with a faster/better one
    -I don't see it mentioned in the review, but you can buy it with a 14" Non-glare WQHD 2560x1440@60Hz IPS 100% sRGB Pantone Validated adaptive sync optional screen
  • SyCoREAPER - Monday, June 28, 2021 - link

    Jesus, the comment section is cringe. Came to look for some additional insight and see a bunch of children defending multi-billion-dollar companies who owe them nothing, acting like they are their messiah. Would be nice if it wasn't crapped up by the same two Intel shills. Now I have to skip over every other comment to read anything relevant.

    It's been a year; have you gotten over it yet?
