pierrot - Thursday, July 17, 2014 - link
Where's that desktop 860? Preferably something that will fit in an ITX case!
koekkoe - Thursday, July 17, 2014 - link
It's called 750 Ti.
Anonymous Blowhard - Thursday, July 17, 2014 - link
It also comes in low-profile for those really tight spaces.
odell_wills - Thursday, October 9, 2014 - link
I love it! /Odell from http://www.consumertop.com/best-laptop-guide/
Samus - Thursday, July 17, 2014 - link
I run a 750 Ti in my Haswell i5 gaming PC. Pretty impressive for a card that runs completely off the PCIe bus power. It runs BF4 at 1920x1200, always above 60 FPS, at a mix of Medium and High. I'd say it's somewhere between a 560 Ti and a 660.
Frenetic Pony - Thursday, July 17, 2014 - link
The problem with Maxwell on the desktop is that Maxwell is to GPUs what Haswell was to CPUs: an almost entirely mobile/battery-life-focused update. Which means if you're plugging something into the wall, just buy whatever, because Maxwell isn't going to do much better for you.
smorebuds - Thursday, July 17, 2014 - link
Except for the mobile variants like the 860M, which is the entire point of the article you're commenting on. Just like with Haswell, for the same or better performance devices can be thinner, lighter, and last longer.
smorebuds - Thursday, July 17, 2014 - link
And I realize you're not necessarily referring to mobile, but the analogy still applies to desktop. I think everyone would prefer a smaller, quieter, and less power-hungry box in their room.
DanNeely - Monday, July 21, 2014 - link
With gaming desktops I suspect more people would turn around and spend all the increased performance/watt on more performance instead of fewer watts.
Antronman - Thursday, July 17, 2014 - link
Well actually, the new mobile Maxwells have far better performance, because they were able to put more cores into the GPUs and clock them slightly higher due to the much lower TDP.
Who's to say the same won't happen with desktop GPUs?
CZroe - Saturday, July 26, 2014 - link
You must not have noticed that heat and power limitations are seriously limiting GPU options for SFF PCs (most miniITX builds that are significantly smaller than a microATX build). Better-performing single-slot, half-height, or 17cm-length cards are what we are after. The Haswell difference is exactly what we need here to push better GPUs into this market space.
damianrobertjones - Thursday, July 17, 2014 - link
Wouldn't it be awesome to have a picture of the internals after removing the bottom cover!
BPM - Friday, July 18, 2014 - link
Yeah, they normally include one in their reviews.
romba - Thursday, July 24, 2014 - link
Google "MSI GE60 mSATA". I found the images in a forum somewhere, although I can't pinpoint it at the moment. It has 2 spare mSATA slots.
EzioAs - Thursday, July 17, 2014 - link
I'm a bit surprised to see there aren't any thermal numbers. Isn't it important to know the temperature of the GPU, CPU and chassis for a gaming notebook?
Anonymous Blowhard - Thursday, July 17, 2014 - link
This right here. I'm in the market for a Maxwell laptop and I'd like to know which is the least likely to self-immolate on my desk while I'm in the middle of a game.
JarredWalton - Thursday, July 17, 2014 - link
They are all listed on the General Performance page at the bottom, including a gallery showing stress test results.
JarredWalton - Thursday, July 17, 2014 - link
Update: I moved the temperature discussion to a separate page now, page 5.
ramj70 - Thursday, July 17, 2014 - link
I have this laptop and it can get fairly warm when playing FPS games. I bought a laptop cooler and that helped out quite a bit.
evilspoons - Thursday, July 17, 2014 - link
Why do they insist on cramming numpads on 16" laptops? Who really uses them? I'd much rather have my keyboard and trackpad centred so I don't constantly have my wrists bent to the side or the screen off-centred. Bah.
Flunk - Thursday, July 17, 2014 - link
I totally agree with you. I'd rather have a better experience typing than extra number buttons I'm not going to use all that often. If this was a business PC for accounting, perhaps the number pad would make sense.
DanNeely - Thursday, July 17, 2014 - link
For everything work related except spreadsheeting, I'd rather have the arrows and navigation keys in the standard 104 key positions instead of smashed into the edge of the main area and scattered at random as fn-combos.
JarredWalton - Thursday, July 17, 2014 - link
I use a 10-key enough that I'm happier with a full 104-key arrangement rather than 2-inch gaps on the right and left where keys could have been but aren't in the interest of centering the keyboard. YMMV.
nathanddrews - Friday, July 18, 2014 - link
Indeed. I use the 10-key a lot for work and some games.
Antronman - Thursday, July 17, 2014 - link
It's a gaming laptop, and some very popular games, or their mods, take advantage of the numpad.
Nagorak - Wednesday, July 23, 2014 - link
I use my numpad all the time. If the GE60 did not have a numpad I definitely would have had a hard time justifying my purchase of it.
Khenglish - Thursday, July 17, 2014 - link
Have you looked at the W230ss? It's a 13.3" Clevo with the 860M and a 3200x1800 screen option.
While it has a smaller screen, it's fatter than the MSI, since Clevo packed in the giant 12.7 CFM fan that they use for GPU cooling on their bigger laptops.
emarston - Thursday, July 17, 2014 - link
I have the GE70 version... popped in 2 840 EVOs in RAID (1TB drive as mass storage only) and it is quite nice. It can get warm, but the bigger chassis and a laptop cooler have totally removed any issues with that for me. With the SSDs, the performance difference is dramatic.
MooseMuffin - Thursday, July 17, 2014 - link
I realize you need to use certain configurations so you have points of comparison with other machines you've reviewed, but framerates at the display's native resolution are the only ones that really matter.
evilspoons - Thursday, July 17, 2014 - link
It's weird, the lead-in for the page with gaming results talks about 1920x1080 (the panel's native res) but then I don't see any charts for that resolution. It's like they were left out by accident.
On the other hand, if you dive into the control panel for your video card and enable "aspect ratio scaling on GPU" instead of the default scaling on screen, the jaggies tend to be MUCH less horrible when operating below native resolution.
JarredWalton - Thursday, July 17, 2014 - link
The "Mainstream" results are high enough that bumping to 1080p isn't a problem at the settings we use, but then we wouldn't have anything to compare performance against as it's a non-standard setting. If we run one "non-standard" setting, it opens the door to all sorts of other possibilities. Maybe we should use the GFE recommended settings (or AMD's recommended settings) as another item to include?In fact, I'll go ahead and run those and update the Gaming page in a bit with results (as well as details on the precise settings used by GFE). If nothing else, it will be an interesting experiment. :-)
JarredWalton - Thursday, July 17, 2014 - link
Page three is updated with 1080p GFE results, if you're interested.
nathanddrews - Friday, July 18, 2014 - link
Thanks for the update, it's very enlightening. A review of GeForce Experience vs Gaming Evolved would be very cool. I know that up until v2.0 or 1.8, GFE automatically targeted 40-60fps with no option to prefer quality (30fps) or performance (60fps) like it does now.
https://forums.geforce.com/default/topic/525176/is...
http://www.geforce.com/geforce-experience/faq
Hopefully they update the applications for 120Hz or 144Hz users... or maybe have it target your monitor's refresh rate by default instead? Speaking of which, where are the variable refresh 4K 120Hz monitors? ;-)
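The link-bandwidth arithmetic behind that 4K 120Hz question can be sketched in a few lines. This is a rough check rather than an authoritative calculation: the DisplayPort figures below are the nominal 4-lane payload rates after 8b/10b encoding, and real links also spend bandwidth on blanking intervals, so treat the required rate as a lower bound.

```python
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel-data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Usable payload over 4 lanes after 8b/10b encoding overhead:
DP12_HBR2 = 17.28  # DisplayPort 1.2, 5.4 Gbit/s per lane raw
DP13_HBR3 = 25.92  # DisplayPort 1.3, 8.1 Gbit/s per lane raw

needed = pixel_rate_gbps(3840, 2160, 120)  # 8 bits per channel
print(f"4K120 needs ~{needed:.1f} Gbit/s")   # ~23.9 Gbit/s
print("Fits DP 1.2:", needed <= DP12_HBR2)   # False
print("Fits DP 1.3:", needed <= DP13_HBR3)   # True
```

Even before blanking overhead, a 4K 120Hz 8-bpc stream overflows DisplayPort 1.2's payload, while DisplayPort 1.3 just barely has room for it; by the same arithmetic, 4K 60Hz needs only ~11.9 Gbit/s, which is why it already fits within DisplayPort 1.2.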
DanNeely - Friday, July 18, 2014 - link
First we'd need GPUs to implement DisplayPort 1.3 to have the outbound bandwidth. Then, until we get another generation of 2x-as-fast decoder/LCD panel controllers, we'll be back to the looks-like-two-monitors-over-MST setup we enjoyed with the first generation of 4K60 panels.
xenol - Thursday, July 17, 2014 - link
If I may make a suggestion, please add thermals to laptop reviews. Not just how hot the components get, but how hot each area gets. My primary concern with these thin gaming laptops is that not only would they run really hot inside, but they'll create pockets of hot spots where I don't want them.
For example, I had a Dell XPS 15z. Not quite a gaming laptop, mind you, but if I fired up a game, the left side of the keyboard would get uncomfortably warm, to the point where I had to get an 84-key keyboard so I could play something comfortably.
JarredWalton - Thursday, July 17, 2014 - link
My digital thermometer stopped working properly a while back so I haven't been able to provide numbers. I can order a new one but considering we haven't included surface temperatures for years it didn't seem necessary.
JarredWalton - Thursday, July 17, 2014 - link
I was able to use a kitchen thermometer from my wife to do some testing. :-) Page 5 has surface temps now if you're interested.
LeapingGnome - Thursday, July 17, 2014 - link
Thank you Jarred. I think surface temps are very important since it is a laptop that many people use in their laps. I appreciate you including them.
xenol - a good site is notebookcheck dot net, they do a lot of laptop reviews and always include surface temps from 18 areas of the laptop. Their temps for this MSI look to be 3-4 degrees higher than Jarred saw.
shtldr - Thursday, July 17, 2014 - link
I have the laptop. It has a solid CPU, GPU, and display. I actually thought the display was some exceptionally good TN.
I bought an SSD together with it to replace the HDD... only to find out that opening the laptop voids your warranty!!!
Coming from an Acer laptop which had no such c(r)ap, this was a huge letdown.
JarredWalton - Thursday, July 17, 2014 - link
I've heard (but can't personally verify) that such stickers can't actually be enforced by law, but it's definitely annoying when they try to prevent end users from upgrading things like the RAM and storage. You could always email/call MSI and ask them for confirmation that you can upgrade the RAM/storage first and see what they say -- get it in writing, though! :-)
thesavvymage - Thursday, July 17, 2014 - link
In writing is not even always necessary, depending on local laws :) In Washington state, a verbal contract is a VALID contract. No need to write and sign it. There does need to be a third party to verify it, though.
ReedTFM - Friday, July 18, 2014 - link
For Americans, the Magnuson-Moss Warranty Act applies. Just modifying/servicing your product doesn't void a warranty guarantee, unless the producer can show the modification caused the defect. This came about because car manufacturers were pulling crap like, "Oh look, you added an aftermarket exhaust, you voided your warranty and we will not cover the windshield wipers failing."
With static discharges, however, it will be hard to refute user error, but still, it's always worth making the argument.
ramj70 - Thursday, July 17, 2014 - link
I contacted MSI and they said that opening the laptop to upgrade will not void the warranty. I also bought an SSD and replaced the HDD.
ruthan - Thursday, July 17, 2014 - link
Without a built-in 3G modem... so not for real life.
Novaguy - Saturday, July 19, 2014 - link
There are phone plans that come with data tethering or wifi hotspot options, and I find those work well.
Tanclearas - Thursday, July 17, 2014 - link
No DisplayPort and no G-Sync. This is exactly the class of machine that would benefit the most from it. Even if the claim is that it's too expensive to integrate into the laptop panel (though Nvidia themselves talked about how it's easier on laptops), DisplayPort would have at least allowed for the possibility of using an external G-Sync monitor.
JarredWalton - Friday, July 18, 2014 - link
No, G-Sync is actually a pain in the butt on laptops, for one reason: Optimus. To do G-Sync, you need to have the GPU and display communicate with each other, so the only way NVIDIA can do it is if they get rid of Optimus. But doing that means you just killed battery life as well. There are potentially ways around that I'm sure, but it's the reason there haven't been any G-Sync notebooks yet. I actually asked NVIDIA about it at CES and they basically said as much: "We're looking at ways to implement it, but right now we don't have anything we can talk about."
Tanclearas - Friday, July 18, 2014 - link
Wow. Interesting. Nvidia's response to AMD's "free sync" demo implied that laptops were easier to implement.
"However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand."
All of this just raises more questions.
In AMD's demo, did they have to disable DSG for free sync to work? If not, how does an AMD GPU communicate with the display? If so, then what AMD showed was even more impressive if Nvidia is still "looking at ways to implement it".
Are there laptops with free sync (officially) coming?
I am not trying to take things off topic. I just want to reiterate that the reviewed laptop is exactly the class of machine that would benefit the most from G-Sync/free sync. If anything, it is more important for a laptop because you do not ever have the option of replacing the GPU once your framerates start dropping.
One final (two-part) question. Could a laptop not have the DP connected directly to the Nvidia GPU? Is it not safe to assume that a person connected to an external DP monitor would have access to external power (and therefore not need Optimus)?
JarredWalton - Friday, July 18, 2014 - link
So basically on a laptop if you plug directly into the dGPU, yes, it's easier -- but I'm not sure how much easier we're really talking about. Obviously it can be done with desktop displays with enough effort, the main benefit of laptops being you have multiple inputs into desktop displays with scalers and such. An interesting corollary is that AMD might have an advantage with laptops using AMD APUs -- both the APU and dGPU would be AMD software, so there's no "Intel iGPU" in the way.
As to the question of when laptops with G/Free Sync are coming, I don't believe any have been announced yet, so your guess is as good as mine. I told NVIDIA that laptops would be great for this as getting >60 FPS on a laptop is rather difficult but ~40 FPS with G-SYNC would still be achievable. We might see this in the next year, or perhaps even earlier with non-Optimus solutions (e.g. ASUS has the G750 without Optimus, so it might be a target for G-SYNC in an update). Of course, the number of displays with G-Sync support is still very small (one or two ASUS are officially available, another ASUS display can be modded by the user; Acer and BenQ have displays coming but they're not out yet.)
As for the second question: sure, an external G-Sync display could easily be driven by a laptop. But that sort of defeats the purpose of a laptop in large measure. :-)
Tanclearas - Saturday, July 19, 2014 - link
At first it might appear that it defeats the point of a laptop, but there is (what I assume a growing) group of aging gamers that require a laptop for business/work, but still want to game at home. I have a Lenovo Y580 with 16GB of RAM, a 240GB mSATA SSD, and the Nvidia GTX 660m. It's great for work, and OK for games, but would be better with G-Sync or free sync.
Khenglish - Friday, July 18, 2014 - link
SLI-class laptops already do not support Optimus, because NVIDIA does not allow it with SLI, so those could do G-Sync just fine.
As for strong single-card laptops, then yes, the iGPU is in the way. Alienware laptops allow you to run in Optimus/Enduro mode or in dGPU-only mode, which connects the dGPU directly to the display, but ASUS, Clevo, and MSI do not. Alienware actually has every display output directly drivable by the dGPU, so I feel it would not be very difficult for other makers to at least route the internal display with a BIOS option for iGPU/dGPU mode, or dGPU-only mode.
yankeeDDL - Friday, July 18, 2014 - link
This is a good machine. I am looking for an upgrade and this looks fast, with a spacious HDD, which is a plus.
That said, I wish we'd start seeing laptops (especially in this size and price range) using an SSD paired with an HDD (less than 1TB means that you need to find some creative way to keep music, photos, and videos with you, and a 1TB SSD is still way too expensive).
Maybe some of the Kaveri parts could compromise on the CPU while providing similar GPU power, all combined for a much lower cost. Let's see: so far Kaveri has been MIA.
Hrel - Friday, July 18, 2014 - link
I picked up the GE60 with the i7 and GTX 765M GPU for $800 off neweggflash a few months back. Couldn't be happier with it. Really amazing laptop; for that money I wouldn't expect a backlit keyboard, yet it has one, along with a 720p camera, stereo mic, Optimus, and fantastic keyboard feel, though the layout could use some work. Some things are in odd places and there are some completely unused keys that could be replaced with useful ones, like a Windows key to the right of the spacebar. I'd also like the spacebar moved left some, as it sits directly under my right palm. The Del key is also oddly far to the right, so that should be put back where it belongs.
Nit-picky stuff though, it's an amazing laptop. Seems completely wasteful to get anything more expensive considering what is offered in the GE series. I guess if you need something .5" thinner, for some unknown reason, the GS series makes sense. But I cannot fathom a function for the GT series besides just wasting money.
Taurus229 - Friday, July 18, 2014 - link
Mainstream at $1200.00. Who's kidding who?
grayson360 - Saturday, July 19, 2014 - link
The PhysX issue is because Metro Last Light installs an older version. If you install Metro, then install the drivers, it runs beautifully. I had the same issue with my SLI'd 780s. The FPS would plummet to single digits. After the fix, I had perfectly smooth gameplay with zero slowdowns.
GreenMeters - Sunday, July 20, 2014 - link
Is it possible to remove the tacky badge from the top cover?
DaveLikesHardware - Sunday, July 20, 2014 - link
I prefer using a touchscreen to navigate Metro UI.
Are GTX-class mobile GPUs and touch screens mutually exclusive?
I was looking forward to the Y50 - thinking it'd be touch screen.
How about the GS60 Ghost or GS60 Stealth? Neither are, are they?
I'll keep waiting...
Dave
CommandoCATS - Monday, July 21, 2014 - link
I think there's a Lenovo Y50 Touch with a GTX 860M and touchscreen. Was that what you meant?
DaveLikesHardware - Monday, July 21, 2014 - link
Yes, Commando, it is. Thanks - I missed it and only saw the Y50 non-Touch.
That is the fastest GPU paired with a touch screen that I've seen so far.
Dave
romba - Thursday, July 24, 2014 - link
On storage, on the other hand, this is the only laptop at this price with 2 mSATA ports. My GE60 is equipped with 2 x 1TB Samsung SSDs and 2 x 1TB HDDs (one in a caddy in the optical drive). So if you are willing to get your hands dirty and self-upgrade, this baby is for you.
I dare you to find a laptop at this price that can house 4TB of storage.
<a href="http://imgur.com/AeBQRR3"><img src="http://i.imgur.com/AeBQRR3m.jpg" title="Hosted by imgur.com"/></a>
romba - Thursday, July 24, 2014 - link
I messed up the imgur link. Here it is:
http://imgur.com/AeBQRR3
romba - Thursday, July 24, 2014 - link
And for battery, I purchased a 9-cell battery and it keeps the laptop running for 5-6 hours on office duty.
petwho - Monday, August 11, 2014 - link
Not very lightweight, though the disk storage looks promising.
http://www.tuicoding.com