46 Comments

  • shabby - Thursday, April 25, 2019 - link

    Is it really a 2080 when the base clock is cut in half?
  • Daeros - Thursday, April 25, 2019 - link

    No, but that's NVIDIA's game - they can honestly say it's the same chip, even though performance is a few steps down in the hierarchy. Just like the i7-8750H's TDP is nowhere near 45W - probably closer to 120W under load.
  • Opencg - Thursday, April 25, 2019 - link

    You can see in the CPU benchmarks that draw real power for a significant portion of time that it loses a good deal of performance. All in all, it's about where it should be for a laptop this thin. I would be surprised if it's really designed to handle more than 45W; personally, I would bet it can start to throttle on sustained 45W loads.
  • philehidiot - Thursday, April 25, 2019 - link

    I saw the 2080 and then the screen resolution. In reality, you'd probably want 4K + G-Sync for a shortish-lifespan machine, or 1440p for one with a good few years on it. 1080p says the performance is compromised and they HAD to drop it down. You'd never run a desktop 2080 at 1080p. I bought a gaming laptop once when I had a real need for it, back in the P4 days. The thing had about 6 fans and chucked out 50°C hot air. I required it at the time, but I'd never buy one now unless I absolutely needed it. That had 1050 lines, so 1080 isn't really a step up; it's a marketing ploy ("FULL HD!").

    This GPU cannot be considered alongside a real 2080, and whilst I appreciate the screen size means resolutions greater than 1440p would be silly (and arguably even that, though you must remember you're usually closer to a laptop screen, and even a 6" mobile can benefit from the upgrade from 1080 to 1440), to me a gaming laptop is generally 17" anyway. If you go down this path, you're rarely looking for real portability; more likely you (in my experience) live in two or three different places and want to take a full gaming PC with you with your suitcase and so on.
  • wintermute000 - Thursday, April 25, 2019 - link

    Exactly - a 2060 would have been perfect for 1080p at 144Hz, and then maybe the cooling would have coped.
    Must be a marketing decision to shove the biggest number into the spec sheet...
  • PeachNCream - Friday, May 3, 2019 - link

    I would not mind in the slightest pushing 1080p resolutions with a 2080 GPU, but not with this particular laptop, given the network adapter selection. It just isn't worth messing with Killer NICs at all when there are other options out there.
  • wintermute000 - Thursday, April 25, 2019 - link

    a.) The TDP as defined by Intel (i.e. at base clock) IS 45W.
    b.) Power under boost is much higher for sure, but 120W is a total exaggeration. I can get it to run steady at 3.6GHz (thermal throttling is a different question LOL) at around 60W with an undervolt.
    c.) It would take a mean cooler and power delivery/VRMs in a laptop chassis to let it boost anywhere near its paper specs for long durations. I haven't looked at the built-like-a-tank laptops in depth, but none of the mainstream designs have managed it so far.
  • wintermute000 - Thursday, April 25, 2019 - link

    By "it" I mean the i7-8750H (in an XPS 9570, if that matters).
  • Retycint - Saturday, April 27, 2019 - link

    Intel's definition of TDP is very much meaningless, because they can change the base clock to fit within the TDP envelope. The i7-8750H kept the 45W TDP despite having 2 more cores than the 7700HQ, not because it made a huge leap in efficiency, but because Intel dropped the base clock from 2.8 to 2.2GHz.

    In other words, Intel could theoretically claim that the 9750H has a 10W TDP at a base clock of 0.8GHz, for instance. Which is why TDP numbers are bull.
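
    A back-of-the-envelope sketch of why that lever works: dynamic CPU power scales roughly linearly with core count and frequency, and quadratically with voltage, so a lower base clock (which also permits a lower voltage) can absorb the cost of extra cores. The ~8% voltage reduction below is an assumed figure for illustration, not an Intel spec:

    ```latex
    % Illustrative only: P ~ cores * C * V^2 * f, assuming ~8% lower V at the 2.2GHz base clock
    \[
    \frac{P_{\text{8750H}}}{P_{\text{7700HQ}}}
    \approx \underbrace{\frac{6}{4}}_{\text{cores}}
    \times \underbrace{\frac{2.2}{2.8}}_{\text{base clock}}
    \times \underbrace{(0.92)^{2}}_{\text{voltage}^{2}}
    \approx 1.0
    \]
    ```

    On those assumed numbers, six cores at 2.2GHz land in roughly the same 45W envelope as four cores at 2.8GHz.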
  • jordanclock - Thursday, April 25, 2019 - link

    Welcome to Max Q! Where the models are made up and the clocks don't matter!
  • MrRuckus - Thursday, April 25, 2019 - link

    This... Do yourself a favor and compare: 2080 Max-Q to standard 2080, you're looking at an average 30% drop in performance. No thanks. To me it's a niche market that doesn't make a lot of sense. You want portability, but these are not meant to game on battery; they're meant to be plugged in all the time. Thinner and thinner while trying to keep the performance (or the naming scheme of performance-oriented parts) seems like an ever-increasing losing battle. I personally wouldn't pay the premium 2080 price for a 30% hit in performance.
  • DanNeely - Thursday, April 25, 2019 - link

    Thin and light enough to pass as a normal laptop, while still able to game significantly better than on an IGP, is a valid market segment, and at the x60 level it's probably what I'll be getting for my next laptop in a year or two. OTOH, for people who want something that's a normal laptop first and a gaming system second, I suspect Optimus (for battery life) over G-Sync is probably the better choice until/unless NVIDIA can integrate the two.
  • Rookierookie - Thursday, April 25, 2019 - link

    Yeah, I wish they had a version that offered a FHD panel with Optimus. No 4K, no GSync, and lasts at least 6 hours on battery surfing the web. The Gigabyte Aero is still more or less the only viable option for thin, light, powerful, and decent battery.

    The ConceptD 7 that comes out later this year looks really attractive, but it has a 4K panel so again I'm not optimistic about battery time.
  • Opencg - Thursday, April 25, 2019 - link

    Optimus is great, with the exception of when they wire the iGPU to the external ports. IMO you should be able to plug into a VR headset or an external G-SYNC monitor. You might give up the ability to run a presentation on a projector in low-power Optimus mode, but IMO when you're plugging into external displays, you're probably plugging into the wall anyway.
  • Brett Howse - Thursday, April 25, 2019 - link

    Apparently this laptop has a MUX to allow you to choose between G-SYNC and Optimus. I've updated the article and am re-running the battery life tests right now and will add them in when done. So really this is the best of both worlds.
  • jordanclock - Thursday, April 25, 2019 - link

    The biggest problem is that Max-Q can mean everything from specs that match the desktop down to half the TDP rating. So a 2080 Max-Q can be below a 2070 Max-Q depending on how each OEM configures the TDPs and clocks.
  • Tams80 - Saturday, April 27, 2019 - link

    You're clearly not the market for it.
    There is a significant market for one machine that is as good as possible for gaming when plugged in, but decent as an everyday machine when not. I'd say that market also values the machine being light enough to transport between places.

    You may pooh-pooh such a use case, but it can be very tiresome maintaining more than one machine. People are willing to pay a premium for lower specs for that. Then there are those who just want the 'best' numbers in everything, but they'll buy anything.
  • Fallen Kell - Monday, April 29, 2019 - link

    "You're clearly not the market for it. There is a significant market for one machine that is as good as possible for gaming when plugged in, but decent as an everyday machine when not. I'd say that market also values the machine being light enough to transport between places."

    Except that he is the market. The complaint is that these companies are marketing the device on features it has no chance of actually delivering. This device isn't the best possible for gaming when plugged in: it can't cool itself well enough, even when plugged in, to fully use the 2080 GPU or even the CPU for more than a few seconds before thermal throttling. These shenanigans should ALWAYS be called out. If they want to make a thin laptop, they should put in hardware it can actually run at full capability, without thermal or power throttling, when plugged in. In this case, that would probably mean stepping down the CPU and GPU.

    The marketing guys know that would affect sales, because they couldn't claim top-of-the-line components. But the whole Max-Q scheme NVIDIA released, along with reliance on Intel's thermal throttling, lets companies say they have all this hardware in a thin laptop. People buy them thinking they have that hardware, not knowing it possibly runs 30-40% slower than it should, and that they could have saved hundreds of dollars and had the exact same performance if the "slower" parts that fit the laptop's thermal and power budget had been used in the first place.
  • plewis00 - Saturday, April 27, 2019 - link

    Totally agree. If you want performance, expect a big machine to cart around. Making things thinner doesn't always work; Apple's MacBook range is a great example of how to ruin machines by making them obsessively thin.
  • not_anton - Friday, April 26, 2019 - link

    Apple did the same with the MacBook Pro 15: my Radeon 460 runs at a mere 960MHz, with decent performance and ridiculously low power for a 1000-core GPU chip.
    The downside is the value for money you'll get - about the same as Apple's :D
  • Ethos Evoss - Saturday, April 27, 2019 - link

    It is about features, not just clocks, man.
    It supports new features: 4K or 8K at 60Hz, three 4K displays, and plenty of new memory.
  • DanNeely - Thursday, April 25, 2019 - link

    With all the power-related limitations of mobile GPUs these days, I'd love to see a nominally equivalent desktop system added to the tables as a reference baseline.
  • Oyeve - Thursday, April 25, 2019 - link

    A $3k laptop. I'll stick with my 17" Lenovo laptop from 3 years ago with a 980M that I got for $900. It plays everything I throw at it very well.
  • MrRuckus - Thursday, April 25, 2019 - link

    You can find diamonds in the rough. I bought an Alienware 17 R5 with what they call a "1070 OC", and can get it within 10-15% of my friend's Asus G703 with an overclocked 1080. That laptop was $3500; mine was $1500 base, plus a 1TB NVMe Evo and a 1TB SSD Evo I added, coming out to around $1800. While I don't care much for Alienware in general, I knew this chassis could take an i9/1080 combo, so I knew it would handle an i7/1070 easily, which it does with no throttling. With laptops, it's all about finding a chassis that can handle the hardware without throttling.
  • Jedi2155 - Thursday, April 25, 2019 - link

    Did you do an iUnlock and liquid-metal the R5? I had to do it on my 17 R4, but it's still working great after 2 years of ownership in my backpack.
  • WagonWheelsRX8 - Thursday, April 25, 2019 - link

    Pretty impressive amount of power for a portable device.
    Would love to see a review or two of more mid-range laptop hardware sprinkled in, too.
  • Gunbuster - Thursday, April 25, 2019 - link

    Good-looking laptop, until we get to the last page and there's that big old "I'm a grown man in the basement messaging a 13-year-old" Predator logo. :p Marketing, Acer, marketing. Research it.
  • MrRuckus - Thursday, April 25, 2019 - link

    This! I don't know why it's so hard to find an aesthetically pleasing high-end laptop, haha. The MSI GS75 Stealth is an amazing-looking laptop, if only it weren't a Max-Q design! I only wish its screen lid came on the MSI Raider with the full-fledged 2080. But no, they want to stamp that with "Dragon scale" LED strips. What?? Teenage looks at seriously adult prices.
  • patel21 - Thursday, April 25, 2019 - link

    Gigabyte Aero
  • MrRuckus - Thursday, April 25, 2019 - link

    Ahh, it's nice, but I'd like a full non-Max-Q 2080 and a 17-inch display; 15-inch is just too small for my liking. I have owned a lot of Asus ROG laptops (I get a new one every 2 years through my work as my treat to myself). 15-inch laptops, even some of the better ones, have issues with throttling; that extra 2 inches of space helps with cooling in the small form factor of a laptop. The GS75 Stealth is a nice-looking laptop, but I just can't get over the 30% hit from Max-Q. I would look at another G703 from Asus, but that laptop is now pushing $4k with a 2080. I had the G703 w/1080 before I sold it to a friend because I needed money after some unfortunate events. It was a really solid laptop, but it was really pushing it at $3500. I just can't justify spending over $3500; even that was a stretch.
  • Spunjji - Friday, April 26, 2019 - link

    You're looking at something similar to me; only I'm not interested in the 2080 specifically as it's comically overpriced. I'd prefer a 2060 and enough thermal headroom to get it running at something close to actual desktop 2060 performance.

    nVidia really dropped a bollock this generation. After having rough performance parity between desktop and notebook with Pascal (Max-Q snake oil excluded) they quietly dropped it for Turing but kept the same naming convention. The performance disparity is egregious now, while prices have been out of control since Maxwell.
  • vicbee - Thursday, April 25, 2019 - link

    Guess there are enough 16 to 25 year olds with $3k+ to spend on gaming laptops who love the bling. Beyond my understanding.
  • Junz - Thursday, April 25, 2019 - link

    I have the Triton 500 and it doesn't have Optimus, but there is an option to turn on MSHybrid in the BIOS, and in the Predator Sense software settings (gear wheel) there is an option for dGPU-only which, if turned off, I believe does the same thing.

    Also, I would never have bought the laptop at full price, but I managed to get the $2500 model for $2100, tax-free, from Best Buy.
  • Brett Howse - Thursday, April 25, 2019 - link

    Hi Junz. Thanks for the tip. I see there is an option for Optimus so I've enabled it (disabling G-SYNC) and updated the article text. Re-running the battery life tests as well.
  • Junz - Friday, April 26, 2019 - link

    No problem. I feel like I get around 6-7 hours from a full charge while running something like Dev-C++ with music/YouTube playing in the background. I can't wait to see your results, though.
  • PeachNCream - Friday, April 26, 2019 - link

    I wish Acer would just use Intel-branded network adapters in these systems. It feels like a frisking rather than a premium experience to buy at the highest end but get saddled with Killer NICs.
  • Brett Howse - Friday, April 26, 2019 - link

    Killer uses Intel as their base adapter now, and this laptop uses the Killer 1550, which is based on the Intel 9260.
  • PeachNCream - Friday, April 26, 2019 - link

    Yes, I'd heard that was the case. The trouble is that, as with any rebranding effort, a company that purchases and resells has to apply some sort of markup in order to turn a profit. That's where Rivet Networks (RN) sits, as a middleman between Intel and the OEMs. Normally these in-between companies offer the prospect of added value, but RN's additional software doesn't generally improve on vanilla Intel adapters with useful features.

    A lot of us with networking backgrounds, and people who have picked up the basics of how packets find their way to the destination and back, remain unconvinced that software prioritization at the NIC makes a measurable difference, and there is a dearth of supporting numbers to prove otherwise. Meanwhile, features like ethernet adapter teaming (market-speak: DoubleShot) are not new and have little reason to be implemented at an endpoint node that mainly performs consumer computing.

    Rivet has worked at stabilizing their software, so at least that problem is not as pronounced as it was in the past, and the switch to buying from Intel was probably a good move from a driver standpoint. Yet Killer NICs' selling points appear to prey on a lack of knowledge and have that snake-oil flavor. I'd hope Rivet finds a different, more meaningful way to add value so they can earn the premium the company is hoping for. Before that can happen, something fundamental needs to change about what they're offering and how they're offering it... or someone needs to post some numbers that put the proof in the pudding about the claims they're making.
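
    For what it's worth, the kind of traffic prioritization being marketed can already be requested by any application through standard socket options, no special NIC required; whether it does anything depends on every hop along the path honoring the marking. A minimal Linux-oriented sketch (the address and DSCP choice are illustrative):

    ```python
    import socket

    # DSCP "Expedited Forwarding" (46) lives in the upper six bits of the
    # IP TOS byte, hence the shift by two.
    DSCP_EF = 46 << 2

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    # Mark this socket's outgoing packets as high priority. This is just a
    # field in the IP header; any NIC can send it.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

    # The marking only matters if the switches/routers between here and the
    # destination are configured to honor it -- a home router or ISP will
    # usually ignore or rewrite it, which is the point about endpoint
    # prioritization having limited reach.
    sock.sendto(b"latency-sensitive payload", ("192.0.2.1", 9999))
    ```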
  • Hrel - Saturday, April 27, 2019 - link

    Acer has known reliability issues; what I'd really like to see is stress testing, since you are apparently going to keep advertising their products. I've never had an Acer anything last more than 2 years. With that said, it has been a while, exactly for that reason. So, I say: abuse the keyboard, open the screen 1000 times, slide the thing off couches onto tile and carpet. Throw it in a backpack and act like you're a train commuter - pick it up, shuffle it around, toss it back down 1000 times.

    Until this kind of testing is done on Acer, I'll never give them another cent. I just don't trust anything they make. Regardless of the components inside, assembly and build quality matter.
  • Junz - Saturday, April 27, 2019 - link

    I've had mine for 2 weeks, and what you're describing is pretty much how mine is treated; so far it seems pretty sturdy. I even dropped my backpack on the floor once and freaked out when I heard the loud metallic thunk, but it held up pretty well. I don't know how it'll be in 2 years, but I haven't had a laptop last me 2 years yet (I'm pretty rough with my electronics); only time will tell how this one holds up.
  • timecop1818 - Friday, May 3, 2019 - link

    Garbage 1080p screen and not one but TWO Killer network shits? Hard pass.
    What's the fucking point of a 2080 if the screen isn't even 4K on this thing?
  • Rookierookie - Saturday, May 4, 2019 - link

    Optimus battery time goes from 5 hours to 3, for starters.
  • Loic - Tuesday, May 7, 2019 - link

    Are you sure it's a VM? It looks like User Mode Linux to me. Some bugs were fixed in WSL less than a year ago to let UML run, so it would make sense for it to work that way. You'd get Docker and FUSE out of the box without having to manage a VM.
  • Loic - Tuesday, May 7, 2019 - link

    Wrong article, sorry!
  • ballsystemlord - Thursday, May 9, 2019 - link

    Spelling and grammar corrections:
    "The most recent installment of the Lara Croft series really bumps of the graphical fidelity,"
    "up", not "of":
    "The most recent installment of the Lara Croft series really bumps up the graphical fidelity,"

    "Luckily GeForce Experience makes this process pretty easy,"
    Missing comma:
    "Luckily, GeForce Experience makes this process pretty easy,"