27 Comments

  • Ro_Ja - Tuesday, November 28, 2017 - link

    TL;DR
    MX110 is a rebranded 920MX and the MX130 is a rebranded 940MX
  • peevee - Wednesday, November 29, 2017 - link

    Exactly. And mister Nate Oh decided to put this information only at the very end of the article, wasting everybody's time. Not cool.
  • Nate Oh - Thursday, November 30, 2017 - link

    Neither of them were confirmed as exact rebrands of the 920MX and 940MX, and we are not in the business of publishing rumors and stating unconfirmed single-sourced reports as facts. The GPUs being confirmed as "Maxwell-based" and specifications in the Best Guess table should tell you all you need to know, and for a ~280 word article, I think it's rather unlikely I wasted more than 30 seconds of anyone's time :)
  • Phylyp - Tuesday, November 28, 2017 - link

    Why do they tarnish the good name of the MX150 by releasing similarly named GPUs built on older Maxwell technology? Sigh. I'd be happier to see an option of something above the MX150 instead.
  • tipoo - Tuesday, November 28, 2017 - link

    Yeah, that's particularly silly, the MX150 was racking up a lot of good joojoo.
  • ImSpartacus - Tuesday, November 28, 2017 - link

    These cheap OEM-friendly parts will have their name tarnished eventually, so Nvidia probably dgaf.

    The OEMs have too much control. They will inevitably gimp it to abuse the goodwill that "discrete" graphics have earned.
  • tipoo - Tuesday, November 28, 2017 - link

    >Maxwell
    >nearly 2018

    butwhy.gif. They're low performance parts, but what disadvantage would there be for them to be Pascal?
  • bug77 - Tuesday, November 28, 2017 - link

    Not to mention they're 28nm parts that will suck your battery dry compared to newer parts.
  • PeachNCream - Tuesday, November 28, 2017 - link

    There's nothing that says technological advancement like releasing a rebranded 28nm GPU in 2017! </sarcasm>
  • DanNeely - Tuesday, November 28, 2017 - link

    Cost. The bottom of the market is extremely price sensitive; if GP108 yields are good enough that there aren't enough partially flawed parts available to fit the next step down in performance something made on a legacy process will be cheaper (cost/transistor stopped falling a process or two ago).

    If you look at previous generations the bottom of the market GPUs have often been 1 or even 2 generation old designs for this reason. It's nothing new for either Nvidia or AMD.
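    The cost argument above can be sketched with some quick arithmetic. This is a hypothetical illustration, not real foundry pricing: all wafer costs, die counts, and yields below are made-up figures chosen only to show how a mature legacy process can produce a cheaper die once cost per transistor stops falling.

    ```python
    # Hypothetical sketch of DanNeely's point: when cost/transistor no longer
    # falls between nodes, a legacy-process die can be the cheaper option.
    # All figures are invented for the sake of the arithmetic.

    def die_cost(wafer_cost, dies_per_wafer, yield_rate):
        """Cost of one good die: wafer price spread over the good dies."""
        return wafer_cost / (dies_per_wafer * yield_rate)

    # Mature 28nm: cheap wafers, larger die, excellent yields (hypothetical).
    cost_28nm = die_cost(wafer_cost=3000, dies_per_wafer=500, yield_rate=0.95)

    # Newer 14nm: pricier wafers, smaller die, yields still maturing (hypothetical).
    cost_14nm = die_cost(wafer_cost=6000, dies_per_wafer=800, yield_rate=0.85)

    print(f"28nm die: ${cost_28nm:.2f}")  # ~ $6.32
    print(f"14nm die: ${cost_14nm:.2f}")  # ~ $8.82
    ```

    With these (invented) numbers, the old-node die comes out cheaper even though the newer node fits more dies per wafer, which is the economics behind shipping 1-2 generation old designs at the bottom of the market.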
  • Phylyp - Tuesday, November 28, 2017 - link

    Fair enough - the MX150/GP108 is small enough that the yields are probably good. Is there any reason they're still fabbing the MX130/MX110 on the 28nm process - can't they migrate at least the fab process to 14 nm?
  • Ariknowsbest - Tuesday, November 28, 2017 - link

    It is expensive to migrate an existing design to 14nm process, and 14nm is more expensive than 28nm. 28nm is probably the most cost-effective process at the moment.
  • Phylyp - Tuesday, November 28, 2017 - link

    Thanks, Ariknowsbest
  • jabbadap - Tuesday, November 28, 2017 - link

    Yeah, GM108 lacks NVENC, so no hardware encoding, and its NVDEC has neither HEVC nor VP9 decoding. I would much rather have seen these be GP108, although performance would still be quite close to Intel's iGPUs.
  • Ro_Ja - Tuesday, November 28, 2017 - link

    Where did you get that information?

    Starting from Kepler, every architecture up to the current one has NVENC.

    Remember when GT 730 users bypassed shadowplay? It's because it had NVENC.
  • ET - Wednesday, November 29, 2017 - link

    See https://developer.nvidia.com/nvidia-video-codec-sd...
  • trane - Tuesday, November 28, 2017 - link

    That's very disappointing; I was expecting the MX130 to be a lower-clocked GP108.

    Vega 8 in Ryzen 5 2500U is faster than 940 MX across the board. Source: Notebookcheck and Tech Report - benchmarks line up as expected. So Vega 8 is going to beat MX 130 handily. Pretty insane for a 15W TDP CPU+GPU combo! Vega 10 in Ryzen 7 2700U might just come close to MX150 + Core i5, which is overall a 30W-35W combo.

    To everyone lampooning Vega for its inefficiencies - at low clocks it's quite stunning perf/W. Well ahead of Pascal.
  • lefty2 - Tuesday, November 28, 2017 - link

    That's because the 940MX comes with DDR3 memory. Theoretically OEMs can use GDDR5, but they never do.
  • trane - Tuesday, November 28, 2017 - link

    Vega 8 uses shared DDR3 too. Either way, the point is Ryzen 5 2500U laptops will undercut Core i5 85xxU + MX130 laptops on price, while consuming way less power, for better gaming performance. Heck, it undercuts Core i5 85xxU laptops without a GPU!
  • Ro_Ja - Tuesday, November 28, 2017 - link

    I've seen more 940MXs with GDDR5 than DDR3.

    I wonder why the 945 is so rare?
  • Flunk - Tuesday, November 28, 2017 - link

    Not much of a point in these GPUs, not worth the power draw.
  • Pork@III - Tuesday, November 28, 2017 - link

    https://images.nvidia.com/geforce-com/internationa...

    "the UHD 620, while the MX130 is cited as 2.5x faster"

    This is not true. It's the MX150 that is cited as 2.5x, and against the older HD 520, based on the graph on Nvidia's own website... The MX130 is slower than the MX150.

    Maybe you should redo your calculations so you don't write lies.
  • extide - Tuesday, November 28, 2017 - link

    No, they are using the correct information. The slide you posted is specifically talking about accelerating photo editing.

    Nvidia claims the stated numbers in the article on their own page:
    https://www.geforce.com/hardware/notebook-gpus/nvi...
    https://www.geforce.com/hardware/notebook-gpus/nvi...
  • Pork@III - Wednesday, November 29, 2017 - link

    La, la, la! Nvidia says "up to", but below the line: "relative" :D
    Einstein, HELP me understand the truth!
  • vasubandu - Wednesday, November 29, 2017 - link

    To everyone who has commented just want to say thanks. Very enlightening.
  • CapricornOne - Wednesday, November 29, 2017 - link

    Maybe it's just because it's early in the morning, but it's nice to see a set of critical comments with some good points without being caustic.
  • HStewart - Wednesday, November 29, 2017 - link

    One thing that would be cool is for Intel to work with Nvidia and let people use Nvidia GPUs in packaging similar to what they did with the Radeon GPUs. With Intel's packaging, there is no reason this could not happen.
