46 Comments

  • A5 - Tuesday, August 16, 2016 - link

    Huh. The 1060 seems to be the most interesting part for people who actually use their laptops as portable devices - double the performance (I assume) of the old 960M is nothing to sneeze at.

    Also explains where all the 1080 and 1070 supply went. /s...maybe?
  • Xajel - Tuesday, August 16, 2016 - link

    Indeed, I'm still waiting to see if the 1060 is the real successor to the 960M. NV never released the TDP of the 960M, but most say it's between 60 and 80 W, and the 1060 seems to need more power than that... I don't know how they will manage the power and thermal limits if it's really as high as the desktop part's 120 W. The gap from 60~80 W up to 120 W is too big to expect a mobile-optimized version of the 1060 to bridge...
  • dsumanik - Tuesday, August 16, 2016 - link

    Only $299.99 for the platinum Founders Edition.

    You heard it here first: next-gen NVIDIA GPUs will have not one but two tiers of Founders Edition, and the sheep will eat it up.
  • Michael Bay - Tuesday, August 16, 2016 - link

    What exactly is wrong with paying more for getting the thing you want faster?
  • HollyDOL - Tuesday, August 16, 2016 - link

    Indeed, that's the same as complaining that the post office charges you more for express mail. Nobody forces you to use it either.
  • D. Lister - Tuesday, August 16, 2016 - link

    C'mon fellas, be fair. What else is left for AMD fans to complain about besides pricing and GameWorks?
  • Morawka - Tuesday, August 16, 2016 - link

    Well, your mail example is only a difference of a few days. With the shortage of 16nm GPUs, most people will be waiting 4-6 months for prices to hit MSRP.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    So no different than last gen?
  • emn13 - Tuesday, August 16, 2016 - link

    What's certainly a little misleading is to then also focus on the base price of the product, even though that price is to this day not achievable where I live (the Netherlands). Part of the problem is the press, for repeating those fictitious prices in reviews and using them in perf/$ comparisons, but I certainly would have appreciated a little less sleaziness from the uncontested market leader.
  • medi03 - Tuesday, August 16, 2016 - link

    Or why shouldn't a manufacturer price gouge? No, really, why?
  • D. Lister - Tuesday, August 16, 2016 - link

    Why should a business be allowed to charge what they want for their product? That sounds an awful lot like *gasp* capitalism!
  • cknobman - Tuesday, August 16, 2016 - link

    Nvidia can and should charge whatever they want for their product.
    As long as there are suckers out there who will pay for it, then I say squeeze them for every penny you can get.

    I'm perfectly fine with not being one of those suckers and waiting patiently for reasonable prices.
  • BrokenCrayons - Tuesday, August 16, 2016 - link

    Agreed...companies should seek out the price point that gives them the best compromise between per-unit profit and total sales volume. It's simple economics.

    This does look like another mobile GPU generation I'm not going to purchase though. I haven't been able to justify a laptop upgrade to myself for a long time and these new GPUs don't really do much of anything to change that. Maybe the TDP is lower or something, but since NV is unwilling to publish numbers on TDP, I'm guessing there's nothing impressive in the power savings department or thermal performance over the 900 series. Overall the 1000-series launch has been sort of uninteresting and routine despite the smaller transistors.
  • beck2050 - Wednesday, August 17, 2016 - link

    An 80% performance jump in a smaller power envelope is not impressive? Yeah, right.
  • BrokenCrayons - Wednesday, August 17, 2016 - link

    "80% performance jump in a smaller power envelope is not impressive. Yeah right"

    When you're looking for a 0% performance increase and 50% less power, then yes, the 1000-series has been a very, very lackluster launch so far. NV opted for pushing higher clock speeds and increasing core count instead of reaping the potentially massive power savings and heat output reductions a new process could have provided. It strikes me as a waste of perfectly good new technology.
  • LukaP - Saturday, August 20, 2016 - link

    I, on the other hand, think that with the advent of VR, being able to squeeze as much performance from the chips as they can is the preferable solution... Two different points of view here. But what is good is that they offer both of these things, or at least will in time, when the 1050 and below hit the market. Pretty sure those will offer limited performance, almost no power consumption, and be even cheaper for the performance than the last generation.

    They did something similar with the 750(Ti) and those cards were amazing and sold well. Not that much faster than the 650 Ti Boost, but smaller, cheaper, more efficient and damn near silent. Be sure they are not going to miss out on those sales.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    This is a move from 28nm to 16nm. The performance per watt of Pascal is a big jump from Maxwell.
  • Murloc - Tuesday, August 16, 2016 - link

    This is like complaining about hardcover books.
  • MrSpadge - Tuesday, August 16, 2016 - link

    120 W for the GTX 1060 is essentially running the card at full throttle (~1.05 V). It should easily reach 80 - 90 W at slightly reduced clock speeds and significantly reduced voltage. GTX 1080 tuning results show the chips reaching 1.5 - 1.6 GHz at ~0.8 V, which provides massive efficiency improvements over high-voltage operation.
  • emn13 - Tuesday, August 16, 2016 - link

    Interesting - do you have a link examining that in more detail?
  • RaichuPls - Wednesday, August 17, 2016 - link

    If you check out the Tom's Hardware 1060 review, they had a power vs. clock speed graph. At 1.5 GHz, it only consumed around 60 W.
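    That ~60 W figure can be sanity-checked with the usual dynamic-power approximation, P ∝ f·V². Assuming the stock card runs near its ~1.7 GHz rated boost at ~1.05 V (the exact clock is an assumption; GPU Boost varies), dropping to 1.5 GHz at ~0.8 V gives roughly:

    P' \approx 120\,\mathrm{W} \times \frac{1.5}{1.7} \times \left(\frac{0.8}{1.05}\right)^{2} \approx 61\,\mathrm{W}

    which lines up with the graph. It's only a back-of-the-envelope estimate, since memory power and VRM losses don't scale with core voltage the same way.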
  • Xajel - Tuesday, August 16, 2016 - link

    Edit: just came from pcperspective.com; they listed the 1060M as having a 75W TDP (just an assumption, as usual). So the 1060M is the successor to the 970M and not the 960M, as the 970M also has a 75W TDP, while the 960M has a 60W TDP.
  • jabbadap - Tuesday, August 16, 2016 - link

    If you look at chip tier, it would be the successor of the GTX 965M (GM206 version), although that was soldered to the motherboard like the GTX 960M, which used a different chip (GM107). If you are waiting for a real GTX 960M successor, then you have to wait for the GP107 mobile variants. Was there any information given about the mobile GTX 1060: is it an MXM card or soldered? It didn't catch my eye when skimming the article.

    Notebookcheck review of gtx965m:
    http://www.notebookcheck.net/Nvidia-GeForce-GTX-96...
  • Morawka - Tuesday, August 16, 2016 - link

    Yeah, everyone wants to see the 1060M performance numbers, and we'd also love to know the TDP.

    The 1070M and 1080M are interesting, but it seems NVIDIA is worried about performance more than TDP. They could have given us GTX 980M performance at 50% of the power required. The normal 980M was a 100W part, so it could be a 50W part with the node shrink.

    They really need to release TDP numbers to the public. The secrecy is kind of silly; the OEMs have to know.
  • MrSpadge - Tuesday, August 16, 2016 - link

    > They could have given us GTX 980M performance at 50% of the power required.

    That's approximately the GTX 1060 in its mobile trim, or, further down the stack, a future slightly cut-down GTX 1050.
  • SunLord - Tuesday, August 16, 2016 - link

    "To follow: GTX 1060 Review (hopefully Friday), RX 480 Architecture Writeup/Review, and at some point RX 470 and RX 460 are still due."

    So, given that it took 14 days beyond the "hopefully" Friday for the GTX 1060 review to show up, can we expect the RX 480 writeup some time in September?
  • WorldWithoutMadness - Tuesday, August 16, 2016 - link

    This makes me wonder: what's the trigger for notching up their game?
    Pressure on mobile GPUs from Thunderbolt external GPUs?
  • nightbringer57 - Tuesday, August 16, 2016 - link

    The very low peak power consumption of this generation, coupled with the fact that the "consumer" high end tends not to be the absolute highest end of the range anymore (see the "Titan" trend). It's mostly just because they can, I guess.
  • MrSpadge - Tuesday, August 16, 2016 - link

    That's because modern laptops can handle higher power levels and can be sold according to their performance.
  • Jon Tseng - Tuesday, August 16, 2016 - link

    Laptops are also the majority of PC units sold, so in the long term this is the growth market (despite the lack of upgradability - I guess Thunderbolt helps!). Makes sense to want to give them a bit more love.
  • MrSpadge - Tuesday, August 16, 2016 - link

    I'd like to have that "Battery Boost" available on the desktop as well. Set the display refresh rate as the frame rate target for action games, or half of it for slower ones, and have the rendering "perfectly" synced to the display output without G-Sync. With the added benefit of automatically running in the most energy-efficient configuration and without worrying about frame rate dips.

    That's not for "hardcore" gamers, but for everyone else I'd consider this a sensible setting.
  • kn00tcn - Wednesday, August 17, 2016 - link

    What? Adaptive VSync has an option for half refresh; otherwise it's locked to your refresh rate. So why would you still want Battery Boost? It's going to alter your clock speeds, reducing performance.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    It's more that it limits your FPS to 30 than that it reduces performance. My guess is it's there to keep heat/power consumption down. There is no really easy way of doing that on the desktop.
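    The frame-rate cap MrSpadge describes above boils down to a limiter synced to the display refresh. A minimal sketch of the concept in Python (not NVIDIA's implementation; render_frame is a hypothetical placeholder for the game's per-frame work):

    import time

    def frame_limited_loop(render_frame, refresh_hz=60, half_rate=False):
        # Cap rendering at the display refresh rate, or half of it for
        # slower-paced games. Sleeping out the unused part of each frame
        # interval is the simple host-side version of what the driver does
        # by lowering GPU clocks and voltage until each frame takes about
        # the target time to render.
        target = 1.0 / (refresh_hz / 2 if half_rate else refresh_hz)
        next_deadline = time.perf_counter() + target
        while True:
            render_frame()                    # placeholder per-frame work
            remaining = next_deadline - time.perf_counter()
            if remaining > 0:
                time.sleep(remaining)         # idle instead of racing ahead
            next_deadline += target           # fixed cadence avoids drift

    Real drivers save power by downclocking rather than sleeping, but the pacing logic is the same idea.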
  • psychobriggsy - Tuesday, August 16, 2016 - link

    1060 Laptop Spec - 80W TDP
    1070 Laptop Spec - 120W TDP
    1080 Laptop Spec - 150W TDP

    I found one review of a 1060-equipped laptop - $1800 - and it managed 85 minutes on battery in Witcher 3.
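    That runtime roughly checks out. Assuming a ~90 Wh battery, typical for this class (the capacity is an assumption; the review isn't quoted on it), 85 minutes implies an average whole-system draw of

    P_{avg} \approx \frac{90\,\mathrm{Wh}}{85/60\,\mathrm{h}} \approx 64\,\mathrm{W}

    which is well under the GPU's 80 W TDP plus CPU, screen, and the rest, so Battery Boost is evidently throttling things hard on battery.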
  • D. Lister - Tuesday, August 16, 2016 - link

    "NVIDIA is also tying quality settings into Battery Boost, allowing the technology to dial down game settings to further save power."

    I'm guessing that means dual 3D settings profiles, one for AC power and the other for battery. Otherwise it would be pretty daft of Nvidia if the quality got reduced without the end user having any choice in the matter.
  • kn00tcn - Wednesday, August 17, 2016 - link

    That depends on whether you use GFE and set your game to "NVIDIA optimized".
  • Stuka87 - Tuesday, August 16, 2016 - link

    How on earth are they going to keep a GPU that consumes that much power cool in a laptop, when even the reference desktop parts had issues with cooling? Yes, these will be lower power, as their clocks are slightly lower, and I am sure they are the best chips of the crop. But even if they got a 1080 down to a real-world consumption of 145W, that's a ton of heat to dissipate out of a laptop.
  • Spoogie - Tuesday, August 16, 2016 - link

    So can these newfangled notebooks drive an external G-Sync enabled monitor at 120Hz?
  • Ryan Smith - Tuesday, August 16, 2016 - link

    If the dGPU is directly wired to a DisplayPort, yes.
  • rxzlmn - Tuesday, August 16, 2016 - link

    What I'm most curious about is whether these cards will still support Optimus, and whether that Optimus support (if it is there) will still prevent them from powering proper VR. I don't remember the specifics, but I recall that the Optimus re-routing through the integrated graphics was cited as a technical hurdle for laptops running VR.
  • rev3rsor - Tuesday, August 16, 2016 - link

    Looking forward to this new generation, especially the better efficiency. To be honest, I'm more interested in what someone said earlier: a smaller performance bump but with a considerable drop in TDP, aimed at something like the XPS 15 or (less likely) the MacBook Pro 15; the former currently uses a 960M, whose TDP is (apparently) much lower.
  • beck2050 - Wednesday, August 17, 2016 - link

    Game changer for notebook gaming.
    I have a Sager with a 780 and can't wait to upgrade to a 1070.
  • damianrobertjones - Wednesday, August 17, 2016 - link

    Why buy any model with the 6700HQ when we KNOW that the new version is nearly here? The people who purchased the 5700 just before the 6700 came out know what I'm talking about.
  • TheinsanegamerN - Monday, August 22, 2016 - link

    Because the performance difference is like 2%?
  • Notmyusualid - Saturday, August 20, 2016 - link

    So no numbers / testing then?

    I'm more and more surprised by this site as the days pass.

    For the rest of you:

    http://www.notebookcheck.net/Nvidia-Pascal-for-Not...
  • Gunbuster - Monday, August 22, 2016 - link

    So who has my dream laptop?
    1070, thin-ish (not 3 inches), 15" 4K, no ugly-ass dragon, dominator, or penetrator in a huge light-up font on the lid, accessible M.2, and Thunderbolt 3 so it can use a graphics dock.
