91 Comments
nutgirdle - Wednesday, March 12, 2014 - link
Maxwell 860M seems to be a good fit for 15" rMBP.
WinterCharm - Wednesday, March 12, 2014 - link
Assuming its TDP is close to that of the 750m, yes, it could end up in the new rMBPs.
However, what Apple did with the 750m in the rMBP is that they overclocked it, and used its lower TDP compared to the 765m to keep heat down. In that sense, we may very well see an OC'd 850m in the new rMBP.
Only time will tell. Razer did manage to put the 765m in their Blade 14" laptop, so if Apple was willing to take the slight hit on battery life, they may decide to give the 860m a shot :)
tipoo - Wednesday, March 12, 2014 - link
Well, not so much "overclocked" as allowed the maximum boost clock through firmware and cooling.
fteoath64 - Thursday, March 13, 2014 - link
Apple will likely buy the cheapest 850M, firmware-clock it 10% higher, and that's it. They are real cheapskates in terms of getting good parts, due to their monopoly in the Mac market.
Zoolookuk - Saturday, March 15, 2014 - link
Yeah, Apple are known for being 'cheap'...
Antronman - Saturday, March 29, 2014 - link
As far as what they want to pay, yeah.
As far as what they ask you to pay, never in a million years.
Antronman - Saturday, March 29, 2014 - link
And it would be a waste, because only hipsters, who don't even play games that would need a GPU for high-quality settings and good fps, use Macs.
tipoo - Wednesday, March 12, 2014 - link
But we'd still have to pay 700 extra for it, presumably :/
The Iris Pro is good enough and all, but for some work (even non-gaming, and not quite so high-end that a workstation card would be worth it) a GeForce or Radeon is just required, so you have to pony up that huge extra fee.
Connoisseur - Wednesday, March 12, 2014 - link
Ha. I just started a thread on the AT forums about the MSI GS60 saying that the only other laptop I'd look forward to is a refreshed Razer Blade 14, and here we are. Very opportune timing. If that Blade has a significantly better screen than the last generation, I may finally buy a laptop again.
JarredWalton - Wednesday, March 12, 2014 - link
3K I believe, so yes -- much better. :-)
Connoisseur - Wednesday, March 12, 2014 - link
Jarred, in regards to the ultra-high-res displays, have you already put together (or do you plan on putting together) an article regarding gaming and upscaling on 3K and 4K screens? More and more laptops are coming out with these "retina"-style displays where the graphics card won't be able to run games at native res. I'd like to see how the games would look/perform with upscaling. For example, running the game at 16x9 or 1080p with various settings on a 3K screen.
I think it'd be an important article because it would affect my buying choice when offered multiple screen options with the same laptop.
JarredWalton - Wednesday, March 12, 2014 - link
I commented on this in the Dell XPS 15 review. Basically, upscaling (or running at non-native) is much less noticeable in many respects as the pixels are so small. The other side of the coin is that there are way too many Windows apps that I use that don't work well with high DPI. In a few years, I think it will all get sorted out, but right now high DPI is "bleeding edge" stuff.
rxzlmn - Thursday, March 13, 2014 - link
4K is much better than 3K though, due to the fact that you can natively scale to both 720p and 1080p.
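To put numbers on that point, here's a minimal Python sketch; the 3840x2160 and 3200x1800 figures are the usual "4K" and "3K" laptop panel resolutions and are assumed here rather than taken from the comment:

```python
# A lower resolution scales "natively" (integer scaling) only when the panel
# dimensions are whole multiples of the target resolution.
panels = {"4K": (3840, 2160), "3K": (3200, 1800)}
targets = {"1080p": (1920, 1080), "720p": (1280, 720)}

for p_name, (pw, ph) in panels.items():
    for t_name, (tw, th) in targets.items():
        sx, sy = pw / tw, ph / th
        clean = sx.is_integer() and sy.is_integer()
        print(f"{p_name} -> {t_name}: {sx:g}x scale, integer scaling: {clean}")
# 4K -> 1080p (2x) and 720p (3x) are clean; 3K -> 1080p (~1.67x) and 720p (2.5x) are not.
```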
Dustin Sklavos - Wednesday, March 12, 2014 - link
I wonder if Vivek had a hand in that. ;)
JarredWalton - Wednesday, March 12, 2014 - link
Possibly, but I thought he was primarily concerned with making the chassis and waxing eloquent on the joys of the material choices and such. Speaking of which, what are you working on these days? Hahaha....
WinterCharm - Wednesday, March 12, 2014 - link
Just checked out Razer's website... That blade has an IGZO (Indium Gallium Zinc Oxide) IPS touchscreen, at a resolution of 3200x1800.
Looks like you're in luck! :) just... uh... prepare your wallet.
Connoisseur - Thursday, March 13, 2014 - link
Wallet is never prepared. I just club it on the head and drag it out. This will be one hell of an expensive purchase if I end up buying it. I'm really curious about the heat and noise generated by this beast. Looks like it's shipping in April? I'm hoping AT gets a review sample.
talos113 - Wednesday, March 12, 2014 - link
This is great news! I look forward to seeing laptops become thinner and more powerful. I know they will never fully reach desktop performance, but it makes me smile seeing them gain. I recently purchased a Sager NP9150 with a 7970M, and I am honestly using my desktop less and less while I enjoy the benefits of mobility. It's an awesome time for technology!
warezme - Wednesday, March 12, 2014 - link
I'm still rocking a big Alienware 17x R3 with a 580M, so I have been eyeing the upgrade paths. But I can't help but be very disappointed that the "880M" is not Maxwell technology but just re-branded old stuff. I don't care if it's 20% faster. I don't mind carrying around a large laptop. I'm not a frail girl.
JarredWalton - Wednesday, March 12, 2014 - link
My guess is we'll see Maxwell GTX 890M or 885M or something during the summer months -- just in time for the back-to-school shopping.
Freakie - Wednesday, March 12, 2014 - link
Not only in time for back-to-school, but in time for Broadwell. The power improvements of Broadwell + Maxwell at the very high end will be an incredibly refreshing change in gaming-laptop battery life.
willis936 - Wednesday, March 12, 2014 - link
What's the timeframe for laptops showing up with these? A P34G with a GTX 860M (and hopefully a 16GB option) would be great before summer.
bakedpatato - Wednesday, March 12, 2014 - link
A couple of outlets are running pieces on the "New Razer Blade 14"; it has a 3200x1800 touch screen and indeed does have an 870M, while the Blade Pro has an 860M.
jasonelmore - Wednesday, March 12, 2014 - link
wtf the pro has a less powerful GPU in a much bigger chassis?
jcompagner - Wednesday, March 12, 2014 - link
So now we will get many nice and light and thin 17" laptops?
Currently the only one I can buy in The Netherlands is the MSI Stealth. I don't know any other. So I guess I just have to wait until the refresh of the Stealth (which I guess will have an 8xxM inside it).
But it would be nice to have way more options... (for example a 17" with a resolution between 2K and 4K)
Aikouka - Wednesday, March 12, 2014 - link
Hm, I think the only disappointing factors are the use of Kepler and Maxwell parts (although not surprising) and the lack of distinguishing between the two architectures at the 860M level. However, I am looking forward to coverage of 860M (Maxwell) parts, as I think that will finally be a good sweet spot for me. I have an aging gaming laptop (Dell M1530 with an 8600M GT), and I've grown very used to my ASUS Zenbook... especially its weight (or the lack thereof). So, I'm hoping for a (relatively!) light-weight laptop that has some decent gaming power under the hood.
willis936 - Wednesday, March 12, 2014 - link
Kepler? How about that Fermi SKU? They have to be blowing the dust off of some old chips to still be selling those. Really the most disappointing thing about the 800 series so far is the fact that there still is no word on desktop parts. As most people are saying they're likely waiting for TSMC 20nm. That is a long gap. The entire desktop gaming GPU cycle has been shifted half a year because of it. I can't imagine nvidia and amd expect to get "back on track" by releasing the 900 series parts a few months after the 800 series.
schizoide - Wednesday, March 12, 2014 - link
The 750Ti (and thus 860m) still doesn't match the 7870/660 class GPUs in current-gen consoles and thus isn't a really viable enthusiast gaming platform. The 870m is a closer fit, and the 880m definitely does it, but those will be much more expensive. We're so dang close to xbone/PS4-class performance in mainstream laptops, it hurts.
JarredWalton - Wednesday, March 12, 2014 - link
You can't just look at the CUDA core counts -- the consoles are using AMD chips for one, and AMD shader cores have almost always been "slower" individually than NVIDIA shader cores. I suspect the GTX 750 Ti will give the PS4 and XBOne a run for the money in GPU performance. Couple it with a decent CPU and the only thing really holding us back on PCs is the OS and the need to support "infinity" different hardware configurations.
schizoide - Wednesday, March 12, 2014 - link
I'm not looking at core counts alone; all the various game tests have shown the 750Ti to be slower than the 7870 and the 660, which are roughly equivalent to current-gen consoles.
MrSpadge - Wednesday, March 12, 2014 - link
Only if you equate the next gen consoles with the PS4. The XBone (~HD7790) is about matched by GTX750Ti.
rish95 - Wednesday, March 12, 2014 - link
Are you sure? As far as I know, the 750 Ti outperforms the PS4 in Battlefield 4 and the XBone in Titanfall.
jasonelmore - Thursday, March 13, 2014 - link
It plays Titanfall on medium settings @ 1080p 60FPS. The Xbox One is using similar settings but only rendering at 760p, then upscaling. So yes, it's faster if you dismiss variables like CPU and platform performance.
No real way to compare unless someone pops in a Steamroller CPU with a 750 Ti.
rish95 - Thursday, March 13, 2014 - link
While that would be nice to see, I don't think that's the point. The OP said the 750 Ti can't make for a viable gaming platform compared to current gen consoles, because it is "slower."
Clearly it can. It's outperforming both consoles with some CPU/platform setups. The 860Ms in these laptops would likely be paired with i5s and i7s, which will yield much better CPU performance than the Steamroller APUs in the consoles.
sheh - Wednesday, March 12, 2014 - link
860M and 860M. That's nice. A new low in obfuscation of actual specs?
schizoide - Wednesday, March 12, 2014 - link
Agreed, I also found that disgusting.
MrSpadge - Wednesday, March 12, 2014 - link
Exactly. Especially since the "860M (Kepler)" is crippled by its memory bus. What's the point of putting such a large, powerful, and expensive chip in there when many shaders are deactivated and not even the remaining ones can use their full performance potential? You also can't make a huge chip as energy efficient as a smaller one by disabling parts of it.
GK106 with full shaders, a 192-bit bus, and moderate clocks would have been more economical and at least as powerful, probably a bit faster depending on game and settings.
Yet regarding power efficiency "860M (Maxwell)" destroys both of these configurations, which makes them rather redundant. Especially since it should be cheaper for nVidia to produce the GM107. Do they have so many GK104 left to throw away?
En1gma - Wednesday, March 12, 2014 - link
> 800M will be a mix of both Kepler and Maxwell parts
and even Fermi: 820m is GF108-based
En1gma - Wednesday, March 12, 2014 - link
or GF117..
r3loaded - Wednesday, March 12, 2014 - link
Why does Nvidia persist with the GT/GTX prefixes? They're largely meaningless as people just go off the 3-digit model number.
JarredWalton - Wednesday, March 12, 2014 - link
They did drop the "GT" on the 840M/830M/820M.
JarredWalton - Wednesday, March 12, 2014 - link
And adding to that, keeping GTX allows them to say, "GTX is required for Battery Boost, GameStream, and ShadowPlay."
jeffbui - Wednesday, March 12, 2014 - link
Is there a reason why the GPUs have been stuck at the 28nm node for so long? IIRC the 600 series was also a 28nm part.
TheinsanegamerN - Wednesday, March 12, 2014 - link
Simple: TSMC's 20nm process STILL isn't ready for mass use. Until they are finished, both AMD and NVIDIA are stuck at 28nm. It's too bad, as laptops are screaming for 20nm GPUs.
Guspaz - Wednesday, March 12, 2014 - link
GeForce Experience is pretty awesome. It used to be that whenever I got a new game, I'd have to spend a lot of time trying to figure out the right settings for it; the default settings would usually either run way too slow or way too fast. But with GeForce Experience, they've already tested the game with the same CPU and GPU that I've got, and its defaults are generally a good balance of quality and performance. So what used to be an involved process of play-tweak-play-tweak-play is now just a "mash button and go".
That said, it could still use some polish. I don't know if it's still in beta, but it feels like it. It's not uncommon for it to start reporting "game cannot be optimized" for games that it DOES support (and that you have previously optimized), which usually requires a reboot. And a few months ago nVidia did a self-update that caused it to go completely nuts, locking up the machine (a trip to their forums indicated it happened to everybody who got the update before they fixed it).
Concillian - Wednesday, March 12, 2014 - link
"Where things get a little interesting is when we get to the GTX 860M. As we’ve seen in the past, NVIDIA will have two different models of the 860M available, and they’re really not very similar (though performance will probably be pretty close)."no... performance won't be close. This is a laptop, where power efficiency is part of performance, and you me, and the author all know power consumption of the "Maxwell 860M" is going to be less than the "Kepler 860M" the article should be absolutely SLAMMING nVidia for calling two very different parts the same thing. The video card numbering schemes are confusing enough to laymen (I get asked to try to explain it regularly, since I'm the go to hardware guy in my circle of friends, relatives, co-workers, casual acquaintances and all their friends...) It's going to be impossible to tell them that it depends on which GTX860 they get and they probably can't tell which they'll get until they get the computer...
JarredWalton - Wednesday, March 12, 2014 - link
We've complained many times about overlapping names in the past. There were for instance two completely different versions of the GT 555M (which later became the GT 635M I believe). And performance and battery life are not "the same" -- particularly since the GPU is usually off when you're on battery power. If you want to play games while unplugged, well, there it could be a different story. Anyway, we pointed it out, said it was a dumb overlap more or less (the "interesting" was meant as a sarcastic interesting, not a "wow, this is really interesting"; perhaps that wasn't properly conveyed though I'd suggest the rest of the text supports that), and moved on. If the Kepler variant is widely used, we'll certainly complain about it.
Runamok81 - Wednesday, March 12, 2014 - link
Best news here? nVidia finally dropped the silly GTX suffix. Bump that stock price. Sanity is prevailing.
Death666Angel - Wednesday, March 12, 2014 - link
Nope.
Anders CT - Wednesday, March 12, 2014 - link
GTX 860M and GTX 860M are two different GPUs using different architectures?
That is a pretty lame naming scheme.
MrSpadge - Wednesday, March 12, 2014 - link
Yeah, 4 letters and 3 numbers are just not enough to get this point across.
lordmocha - Wednesday, March 12, 2014 - link
"and while I can’t verify the numbers they claim to provide better performance with a 840M than Iris Pro 5100 while using less than half as much power."i think you mean 5200
---
the iris pro in the macbook retina 15" is actually quite amazing for the casual gamer:
dota2: 1920x1200 maximum FXAA 67fps
csgo: 1920x1200 high FXAA TRI 82fps (107fps inside office in corridor)
sc2: 1920x1200 texture = ultra| graphics = med = 78fps
gw2: 1920x1200 - Best Appearance 21fps | Autodetect 50fps (60fps on land) | Best Performance 91fps
diablo3: 1920x1200 high = 56fps
blzd - Wednesday, March 12, 2014 - link
While using 2x as much power as a low end dedicated GPU. Intel just threw power efficiency out the window with Iris Pro.
IntelUser2000 - Thursday, March 13, 2014 - link
With Cherry Trail, they will be able to put the HD 4400 level of performance in Atom chips.
Both Nvidia and Intel have secret sauce to tremendously improve performance/watt in the next few years or so to push HPC.
Broadwell should be the first result for Intel in that space, while Nvidia starts with Maxwell. The eventual goal for both companies is 10 TFLOPS DP at about 200W in the 2018-19 timeframe. Obviously the efficiency gains get pushed down into graphics.
lordmocha - Sunday, March 16, 2014 - link
Yes, that is true, but with any gaming laptop you'd only get 2 or 3 hours of battery while gaming.
AKA most laptop gamers play plugged in, so it's not a massive issue, but it will affect the few who game when not near a power point.
HighTech4US - Wednesday, March 12, 2014 - link
Jarred: "Actually, scratch that; I'm almost certain a GT 740M GDDR5 solution will be faster than the 840M DDR3, though perhaps not as energy efficient."
Someone seems to have forgotten the 2 MB of on-chip cache.
JarredWalton - Wednesday, March 12, 2014 - link
No, I just don't think 2MB is going to effectively hide the fact that you're using 2GB of textures and trying to deal with most of those using a rather tiny amount of memory bandwidth. Does the Xbox One's eSRAM effectively make up for the lack of raw memory bandwidth compared to the PS4? In general, no, and that's with far more than a 2MB cache.
HighTech4US - Thursday, March 13, 2014 - link
So then please explain how the GTX 750 Ti with its 128-bit bus comes very close to the GTX 650 Ti with a 192-bit bus?
JarredWalton - Friday, March 14, 2014 - link
It can help, sure, but you're comparing a chip with a faster GPU and the same RAM: a chip with 640 Maxwell shaders at 1189MHz to a chip with 768 Kepler shaders at 1032MHz (plus Boost in both cases). Just on paper, the GTX 750 Ti has 4% more shader processing power. If bandwidth isn't the bottleneck in a game -- and in many cases it won't be with 86.4GB/s of bandwidth -- then the two GPUs are basically equal, and if a game needs a bit more bandwidth, the 650 Ti will win out.
Contrast that with what I'm talking about: a chip with less than 20% of the bandwidth of the 750 Ti. It's one thing to be close when you're at 80+ GB/s, and quite another to be anywhere near acceptable performance at 16GB/s.
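As a quick sanity check on the bandwidth figures in this exchange, here is a minimal Python sketch; the 5400 MT/s GDDR5 and 2000 MT/s DDR3 data rates are assumptions chosen to reproduce the 86.4GB/s and 16GB/s numbers quoted above:

```python
# Peak memory bandwidth = data rate (transfers per second) x bus width (bytes).
def peak_bandwidth_gbs(data_rate_mtps: float, bus_width_bits: int) -> float:
    return data_rate_mtps * 1e6 * (bus_width_bits // 8) / 1e9

gddr5_128bit = peak_bandwidth_gbs(5400, 128)  # GTX 750 Ti class: 86.4 GB/s
ddr3_64bit = peak_bandwidth_gbs(2000, 64)     # 840M DDR3 class: 16.0 GB/s
print(f"{gddr5_128bit:.1f} GB/s vs {ddr3_64bit:.1f} GB/s "
      f"(~{gddr5_128bit / ddr3_64bit:.1f}x gap)")
```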
Death666Angel - Wednesday, March 12, 2014 - link
"Speaking of which, I also want to note that anyone that thinks “gaming laptops” are a joke either needs to temper their requirements or else give some of the latest offerings a shot."You realize that you are speaking to the "PC gaming master race", right? :P
ThreeDee912 - Wednesday, March 12, 2014 - link
Very minor typo on the first page.
"The second GTX 860M will be a completely new Maxell part"
I'm assuming "Maxell" should be "Maxwell".
/nitpick
gw74 - Wednesday, March 12, 2014 - link
Why do I live in a world where Thunderbolt eGPUs for laptops are still not a thing, and astronomically expensive, mediocre-performing gaming laptops are still a thing?
willis936 - Wednesday, March 12, 2014 - link
Because outward-facing bandwidth is scarce and expensive. Even with Thunderbolt 2.0 (which has seen a very underwhelming adoption from OEMs) the GPU will be spending a great deal of time waiting around to be fed.
lordmocha - Sunday, March 16, 2014 - link
The few videos on YouTube show that it is possible to run graphics cards over TB1 (x4 PCIe) and achieve 90%+ of the card's performance when connected to an external monitor, and 80%+ when feeding back through the TB cable to the internal monitor.
So basically eGPU could really be a thing right now, but no one is making a gaming-targeted PCIe TB enclosure. (Sonnet's one needs an external PSU if you are trying to put a GPU in it.)
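For context on the link bandwidth being discussed, a rough comparison of nominal interface rates; these are general spec figures and an assumption for illustration, not measurements from the videos mentioned above, and real-world throughput is lower after protocol overhead:

```python
# Nominal one-direction bandwidth of links commonly involved in eGPU setups, in GB/s.
links_gbs = {
    "ExpressCard (PCIe 2.0 x1)": 0.5,
    "Thunderbolt 1 (10 Gbit/s)": 10 / 8,
    "Thunderbolt 2 (20 Gbit/s)": 20 / 8,
    "PCIe 2.0 x4": 4 * 0.5,
    "PCIe 3.0 x16 (desktop slot)": 16 * 0.985,
}
for link, gbs in links_gbs.items():
    print(f"{link:30s} ~{gbs:5.2f} GB/s")
# Even Thunderbolt 2 carries only a small fraction of a desktop x16 slot's bandwidth,
# which is why some performance is lost, especially when frames travel back over
# the same link to the internal display.
```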
rhx123 - Wednesday, March 12, 2014 - link
Because GPU makers can sell mobile chips for a huge premium over desktop chips.
A 780M, which is roughly comparable to a desktop 660, nets Nvidia a hell of a lot more cash.
A 660 can be picked up for around £120 these days, whereas on a Clevo reseller site in my country a 770M-to-780M upgrade costs £158.
Nvidia knows that a large percentage of very high-end gaming laptops (780M) just sit on desks and are carried around very infrequently -- a market which could easily be undercut piecewise with an eGPU.
My 13-inch laptop, 750 Ti ExpressCard eGPU, and PSU for the eGPU (an Xbox 360 power brick) can still easily fit into a backpack for taking round to a friend's, and it cost much less than any similarly performing laptop available at the time; and when I don't need that GPU power, I have an ultraportable 13-inch at my disposal.
CosmosAtlas - Wednesday, March 26, 2014 - link
Because Intel does not give permission for making Thunderbolt eGPUs. I was waiting for a Thunderbolt-based ViDock; however, it will never happen because of this.
willis936 - Wednesday, March 12, 2014 - link
So I take it NVIDIA hasn't hinted at the possibility of G-Sync chips being included in laptop panels? I think they'd make the biggest impact in laptops, where sub-60 fps is practically a given on newer titles.
JarredWalton - Thursday, March 13, 2014 - link
I asked about this at CES. It's something NVIDIA is working on, but there's a problem in that the display is being driven by the Intel iGPU, with Optimus working in the background and rendering the frames. So NVIDIA would have to figure out how to make the Intel iGPU drive the LCD properly -- and not give away their tech, I suppose. I think we'll see a solution some time in the next year or two, but G-Sync is still in its early stages on desktops, so it will take time.
Hrel - Wednesday, March 12, 2014 - link
I'm surprised you guys didn't say anything about the 850M not supporting SLI. I was expecting a paragraph deriding that decision by Nvidia. I'm really upset. I would have loved to see how energy-efficient SLI could get. Lenovo has had that laptop with 750M in SLI for about a year now, and I've thought that was kinda stupid.
But considering how power efficient Maxwell is, maybe that could actually be a good idea now.
Maybe they'll still do it with underclocked GTX 860Ms.
Hm, I bet that's what Nvidia wanted: to discourage OEMs from buying 2 cheaper GPUs per laptop instead of ONE hugely expensive one. Prevent them from SLI'ing the best GPU in their lineup.
Yep, pissed. I'm pissed.
Hrel - Wednesday, March 12, 2014 - link
best GPU for SLI* in their lineup.
JarredWalton - Thursday, March 13, 2014 - link
I've never been much of a fan of SLI in laptops. Scaling is never perfect, often there is more difficulty with drivers and game compatibility, and battery life can take a hit as well. I mentioned it didn't support SLI, but I can't say it bothers me much. 860M Maxwell supports it (apparently) and will use the same chip, so really it's only going to be a small bump in price to go from two 850M to two 860M -- assuming an OEM wants to do SLI 860M that is.
Harmattan - Wednesday, March 12, 2014 - link
I think the thesis of this article is right on: it's a great time for gaming notebooks. The idea just four years ago that I could play the newest games at the highest settings at 2560x1600 at more than respectable FPS was unthinkable. My Sager with dual 780Ms does that with legs to spare, and I should be able to do this with a SINGLE GPU once high-end Maxwell mobile chips come later this year -- simply amazing.
IntelUser2000 - Thursday, March 13, 2014 - link
"we’re talking about a feeble 16GB/s of memory bandwidth – that’s lower than what most desktops and laptops now have for system memory, as DDR3-1600 with a 128-bit interface will do 25.6GB/s."Actually the shared aspect in iGPU systems effectively makes the bandwidth equal to about half that. That is, 12.8GB/s.
JarredWalton - Thursday, March 13, 2014 - link
Only if you're using the iGPU to do something, but since I'm discussing system bandwidth vs. GPU bandwidth I didn't get into that. I suppose something like Kaveri will end up with about the same 16GB/s of bandwidth from the system RAM (with the remaining bandwidth going to the CPU), but really Kaveri will still only be a "moderate" GPU performance level.
lmcd - Thursday, March 13, 2014 - link
Since when has Fermi been die-shrunk? The 820M needs to be fact-checked.
Since 2012. They released a 28nm version of GF108: GF117.
http://www.anandtech.com/show/5697/nvidias-geforce...
jasonelmore - Thursday, March 13, 2014 - link
First Maxwell SLI-enabled GPU!!! And it's in a LAPTOP, WTF.
fteoath64 - Thursday, March 13, 2014 - link
Does anybody see why the Maxwell part is the starting point while Kepler holds the higher end of the range? They are squeezing blood out of Kepler before a complete switch to Maxwell. It is a very clever trick to pull and buys NV time to carefully craft the performance of higher-end Maxwell parts to suit the performance/price model they wanted. This release alone seems enough to maintain their discrete GPU market on laptops while AMD struggles with their mobile market. It keeps Intel IGP at bay except for non-gamers who do not care about discrete graphics.
hero4hire - Thursday, March 13, 2014 - link
I am one of those people who still think laptop gaming "is a joke." One hour of gaming on battery is still limiting, but more importantly, most gamers are going to want to use a mouse, not a trackpad, and therefore use a table. At the point where you're gaming at a table, why not build a solid desktop and buy an ultraportable for less? Are people unable to spend a 1k to 2k budget better in this regard?
I am grateful to see real performance hitting laptops -- the lower 850m = 580m comparison in the illustration, for example. But coming from a purely cost sense, I am unconvinced that an upgradable desktop plus a cheap, slim, low-power laptop is not better for the vast majority, while also being cheaper. Especially in the long run, as you can upgrade a desktop, while a $200-$300 laptop won't depreciate like a 1k-2k one will. For example, who is going to buy that 580m laptop for even half its list price now? A cheap laptop is almost always worth $100.
lordmocha - Sunday, March 16, 2014 - link
Most play plugged into power and with an external mouse.
Yet the reason to get a gaming laptop (not a gaming desktop + laptop) is that it can be cheaper (though performance will be worse), and that it is very portable to take to a friend's house.
adityarjun - Thursday, March 13, 2014 - link
How would an 860m compare to a 580m?
I have a 580m currently and find it fast enough. So if I upgrade again, I would only go for an 860m or so and make savings on the battery life.
JarredWalton - Friday, March 14, 2014 - link
NVIDIA says the 850M is 30% faster than the 580M, so the 860M would be another 15% over that.
adityarjun - Friday, March 14, 2014 - link
Wow!! To think that my M17x R3 is already 'slow'. I find the 580m to be quite awesome. The 880m must really be something.
I know there are a lot of other variables, but how long do you think a good FHD system with an 860m and a processor like the i7-4800MQ could last on battery?
jtd871 - Thursday, March 13, 2014 - link
Jarred: Having a gaming laptop with Optimus, I'm not convinced of any benefit of Optimus. Since I leave my machine plugged in all the time, it really doesn't give me any benefit. On the contrary, I have the following main issue with Optimus: sometimes software doesn't use the dGPU or the dGPU fails to kick in. I've experienced this with at least 2 titles: AutoCAD 2013 and Zen Pinball 2. In the first case, AC displays an error dialog saying it can't find a real GPU and exits to the desktop. I've actually resorted to running AC2013 on my old ThinkPad that has a discrete GPU. In the second, Zen Pinball 2 (via Steam) apparently finds and runs on the HD4000 iGPU, but is horribly laggy. (Other Steam-based titles seem to run fine.) Optimus for big, power-hungry laptops is probably a half-baked idea given how well the recent major GPUs drop to idle anyway, and I will be shopping for a discrete GPU only in the future.
JarredWalton - Friday, March 14, 2014 - link
And you can't get either to work by creating a custom profile in NVIDIA's control panel? I have Zen Pinball 2 so I can at least try that, but I haven't ever used AutoCAD so I'm no help there.
iwod - Thursday, March 13, 2014 - link
No wonder Apple is designing their own GPU, with the mess GPU makers seem to have with naming things.
Novaguy - Friday, March 14, 2014 - link
Anybody know when the benchmarks come out? I'd love to see the Maxwell 860M benchmarked against the 770M and the 750M/755M SLI (i.e., the Lenovo Y510p).
kaellar - Friday, March 14, 2014 - link
Hey Jarred, I'm confused a little with this quote:
"One nice benefit of moving to the GTX class is that the 850M will require the use of GDDR5"
It doesn't seem to be true, since geforce.com states both DDR3 and GDDR5 are possible for the 850M.
ilkhan - Sunday, March 16, 2014 - link
So nothing all that interesting to replace my GTX770M card with.
Anders CT - Sunday, March 16, 2014 - link
The RAM clocks for the GDDR5 cards are specified as 2.5 GHz. Is that a mistake? Shouldn't it be in the neighbourhood of 5 GHz, with 80-ish GB/s memory bandwidth?
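One way the quoted spec can still be self-consistent, assuming the 2.5 GHz figure is the memory (write) clock rather than the effective data rate, and assuming a 128-bit bus:

```python
# GDDR5 transfers data on both edges of its write clock, so the effective
# data rate is double the quoted memory clock.
memory_clock_ghz = 2.5                                # figure from the spec sheet
data_rate_gtps = memory_clock_ghz * 2                 # ~5.0 GT/s effective
bus_width_bits = 128                                  # assumed bus width
bandwidth_gbs = data_rate_gtps * bus_width_bits / 8   # ~80 GB/s
print(f"{data_rate_gtps:.1f} GT/s effective -> {bandwidth_gbs:.1f} GB/s on a {bus_width_bits}-bit bus")
```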
willis936 - Sunday, March 16, 2014 - link
Good grief, OEMs aren't listing which 860M is in which laptop SKU, which leaves me believing that they're actually mixing Maxwell and Kepler bins. That pisses me off greatly because one chip has a higher value than the other and they're leaving it literally up to luck. If I get a laptop this summer, I will return it without hesitation if I get a Kepler chip and try a different site to buy from.
Also, the quoted max of 2GB for the 860M is interesting, because Gigabyte's "P34G v2" product page shows the 860M with 4GB GDDR5.