Having spoken with nVidia technical engineers as part of my job, I can tell you that nVidia does not handle drivers for OS X. They "advise", but don't do any of the actual driver writing; Apple does that in-house. Boot Camp Windows, however, follows the same driver update path as everyone else using Windows.
Yeah right, nVidia is giving the specs of their GPUs to Apple developers so they can write GeForce drivers for OS X. nVidia is not crazy enough to share that knowledge with a competitor, because writing drivers requires knowing how the GPU works internally.
To my understanding it used to be Apple who wrote the drivers, but Nvidia has possibly taken the reins back. There have been some Nvidia driver releases that are newer than what is found in Apple's updates.
I'll never, ever, buy another laptop with a discrete GPU. The extra heat and power drain, together with the inflated prices and dishonest marketing just aren't worth the modest performance increase on a machine that will never really provide the same level of gaming performance that even a dirt cheap desktop machine will.
If a pair of 680M cards in SLI performs worse than a single 660TI, then it's just plain dishonest for NVidia to keep branding them thusly. I don't see onboard graphics overtaking desktop video boards any time soon, but for laptops the time is near and it can't come soon enough.
There are a number of laptops that let you switch between discrete and integrated graphics on demand so you can save power when you're on-the-go and still have that extra power when you're plugged in.
As for value versus desktops, yes there's a premium for mobility and the value of that mobility depends greatly on your lifestyle and job conditions.
You have a point when it comes to high-end "gaming" laptops that weigh 20+ pounds, cost a fortune and perform poorly. But there is a place for mid-range discrete GPUs in smaller systems that allow you to play games at moderate settings if you're on the go.
I think the best option would be a small laptop that connects to an external GPU but it appears that the industry disagrees with me.
I totally agree with you. All laptops in general, and Win8 tablets too, should be able to connect to an external GPU; that would be the solution to many problems. If you want to play heavy-duty games, just plug in the external GPU, and if you want or need portability, use the ultrabook alone. I have also read that this is not possible with current technology, though.
Sony shipped a laptop that supported a (low end) external dGPU. Another company showed a generic enclosure that could be used to connect a GPU to a computer via Thunderbolt (I'm not sure if it ever actually shipped, though). It certainly is possible, even if there's currently no link that could provide enough bandwidth to let a top-of-the-line GPU run full tilt.
I would think nVidia and/or Intel would want to push that market more, but it doesn't seem like anyone really cares, unfortunately. It would be nice to be able to 'upgrade' a laptop's GPU without having to replace the entire thing.
What possible reason would Intel have for pushing a product like that? In fact, if some sources are correct, they are trying to do the exact opposite by bottlenecking even internal dGPUs, limiting the available PCIe lanes in Broadwell.
AFAIK the claim that Thunderbolt has already solved this is not entirely correct, since even Thunderbolt is too slow to properly utilize a modern graphics card. This is not surprising: Thunderbolt is based on 4x PCIe 2.0 (~2GB/s), while current desktop-class graphics cards use 16x PCIe 3.0 (~16GB/s), which is about eight times as fast.
So I wouldn't say the problem is completely solved throughput-wise, but Thunderbolt sure was an important step in the right direction.
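For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope comparison. It's only a rough sketch in Python using the nominal per-lane PCIe rates and ignoring protocol overhead; the helper function is just for illustration.

# Rough link bandwidth: lanes * usable GB/s per lane (nominal, overhead ignored)
def pcie_gb_per_s(lanes, gb_per_lane):
    return lanes * gb_per_lane

thunderbolt = pcie_gb_per_s(4, 0.5)   # ~4x PCIe 2.0 at ~0.5 GB/s per lane = 2 GB/s
desktop_x16 = pcie_gb_per_s(16, 1.0)  # 16x PCIe 3.0 at ~1 GB/s per lane = 16 GB/s
print(thunderbolt, desktop_x16, desktop_x16 / thunderbolt)  # 2.0 16.0 8.0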
Ugh. You're either ignorant or reaaaaally generous with the hyperbole. "20+ lbs notebooks"? Really?
In real life, mid-range notebooks/GPUs do fine for gaming, and high end notebooks/GPUs do...REALLY fine. When you can max out today's games at 1080p, that isn't "performing poorly", and is orders of magnitude better than Intel's video.
If YOU guys don't want high end notebooks, fine, but I don't see how they're hurting you.
I had a laptop with discrete graphics that lasted for over 9 hours on battery while surfing the web. It was a laptop with an early form of Optimus (you had to switch manually), but still, you can have graphics performance xor battery life: fall back to the integrated GPU when you don't need the performance. But asking for both at the same time? Now that's silly.
As for your issue with marketing the 680M as it is when it can't outperform a midrange desktop card... you do realize that this is a different market segment? Also, you should tell "shame on you" to all the display companies who mislead customers into thinking they're buying a panel that can do 16 million colors (last I checked, 18 bits is not 16 million) or that has a 1,000,000:1 contrast ratio (which you need to be in a pitch-black room looking at a black/white checkerboard pattern to see).
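The color-depth point is simple arithmetic. A quick sketch, assuming a plain n-bits-per-pixel panel with no dithering tricks:

# Colors a panel can natively display with n bits per pixel
def native_colors(bits_per_pixel):
    return 2 ** bits_per_pixel

print(native_colors(18))  # 262,144 colors for a 6-bit-per-channel (18-bit) panel
print(native_colors(24))  # 16,777,216 colors for a true 8-bit-per-channel (24-bit) panel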
"Modest performance increase"? I wouldn't call my GTX 680m a "modest performance increase" over Intel video lol
Are you KIDDING?!? Notebook hardware is ALWAYS worse than desktop. This applies obviously to CPUs too, which you're inexplicably not complaining about. You always pay more to get the same performance. That doesn't mean it's "dishonest" or the like.
And quite obviously integrated video can never catch up with a discrete part so long as they make high-end discrete parts, so the time is "never", not "near".
**** Regarding the article...Optimus...eh, Nvidia's driver team is impressive as always, but literally the first program I ran that I wanted to run on the GPU wouldn't run on the GPU...thankfully my notebook lets you turn off Optimus.
And this is why I don't buy into such claims; I had some issues when Optimus first launched, but I haven't had any I can specifically pinpoint in the last year or more. If anyone has something under Windows that doesn't play right with Optimus, please post it here so I can verify the issue. Otherwise, with no concrete evidence, it's just FUD.
VLC for starters. I seriously have no idea how you can be unaware of issues with it; any big notebook forum will have threads about it all the time, with people trying to disable Optimus.
Speaking of dishonest, you are not sticking to the facts. You obviously don't own a nice laptop with a 680M inside. I do, and it performs amazingly well. I output games from my laptop to my TV via HDMI and they look spectacular.
You also obviously don't understand that desktop PCs (and their components) cannot be directly compared to laptops. I also highly doubt that a "dirt cheap" PC can run The Witcher 2 at almost Ultra settings at a playable framerate.
You're missing out then if you like to game. I've got a 560M that still performs admirably, running many of today's games with max settings (no AA) at 1600x900, 60fps. I'm spoiled by fast frame rates and decent graphics settings; I can't imagine using even the upcoming Haswell graphics to play games like BioShock Infinite.
I will never, ever buy a laptop without a discrete card. A 7770M/650M or above can play any game at 1920x1080 on high if there is a good enough CPU as well. Mobile graphics are starting to become powerful enough.
Look at Intel CPUs: my i7-2600K at home is slightly slower than my i7-3720QM clock for clock.
Sure, maybe most mainstream users who facebook and email all day on laptops don't need anything more than integrated graphics, but guys like me (engineering students) who actually do stuff on computers will still rely on discrete GPUs. I can't properly run any of my CAD software without a discrete GPU.
The slide gallery is there now... our gallery system is set to post things by day only, and since the NDA was at 9AM Eastern I didn't want the gallery going live nine hours early. So I set it for 4/2 and then just changed it to 4/1.
I was hoping to get a normal-size 15-inch laptop with a fully enabled GK106 this summer, but this just crushed my hopes. They will just keep using those highly clocked GK107 parts and calling them mid-to-high end...
A fully enabled GK106 is almost certainly in the works, but it will be something like a GTX 770M. GTX 780M is likely to be GK104, but it will be interesting to see if NVIDIA can do a fully enabled GK104 while keeping it within the typical 100W power envelope that gaming notebooks target. GTX 680MX is a full GK104, but I understand its TDP is ~125W or so, and that's why the only product using it is the 27-inch iMac. We'll likely see the high-end 700M parts launch in June or July.
Jarred, a question for you regarding the 700M line out this year: are these going to be backwards compatible with our current HM laptops from 2012, or are they only for the new Haswell design? Due to the issues that we have had with AMD, there are quite a few of us who want to get rid of AMD and upgrade to the newest Nvidia card, and this 780M is quite a bit faster than the 680M. I would like to swap out my 7970M GPU for the upcoming 780M card, and was wondering if this is doable or not.
Keep in mind that I'm guessing at 780M -- I could be off. As for backwards compatibility, given that we're essentially looking at minor tweaks and revisions to Kepler, they should all work fine in existing laptops. Getting a replacement GPU might be a bit difficult/expensive, though. Quick question: did the latest 13.3 beta drivers help you at all with the 7970M? I haven't had a chance to test them yet.
The new Catalyst 13.3 beta drivers are fine; I have no problems with them yet. My concern, though, is that these last few driver releases from AMD have not been dealing with increasing FPS in current games, while Nvidia's have. Even though new games like Bioshock are AMD-coded, Nvidia is winning the battle on gaming performance.
While Nvidia is releasing its standard (and better) drivers for newer games at a faster rate than AMD (as usual), AMD is falling further and further behind in other areas as well in terms of hardware specs -- in this case, their 8000 series gaming GPU for this summer, which has been very disappointing in its specs, to say the least.
This is why I want to upgrade to the new 780M GPU. From what was said on another forum, the preliminary specs for the 780M are equivalent to the 680MX in the iMac released late last year. It is said to offer a 20%+ improvement over the 680M, and if it's backwards compatible with the HM series motherboards that we currently have, it's worth thinking about upgrading this year for our existing Sager laptops.
You're right about the cost, though. If the price is prohibitive, then it would make sense to wait for the Maxwell release next year with whatever Nvidia has out then. It really depends on cost at this point.
A fully enabled GK106 may be asking too much. Maybe one with 1 SMX and a 64-bit memory controller disabled, leaving it with 3 SMX and a fast 128-bit memory controller, called the GT 755M or GTX 760M. It would be the GT 555M of this generation and a sweet spot for a standard-performance 15-inch laptop.
I'm about 99% sure we'll see a full GK106 on mobile this year; the only questions are what they'll call it, how much power it will use, and what its clocks will be. For memory, 2500MHz (5000MHz effective GDDR5) seems likely, and core clocks will probably be in the 600-700MHz range with Boost taking it up to a max of around 800MHz. That's my guess anyway. TDP will be ~70W, though, so this will be a part for larger gaming notebooks only.
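As a rough sketch of what those memory numbers would imply, assuming a full GK106 keeps its 192-bit bus (the formula is just bus width times effective data rate; the 128-bit line is a hypothetical cut-down configuration for comparison):

# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GT/s
def mem_bandwidth_gb_s(bus_width_bits, effective_gt_s):
    return (bus_width_bits / 8) * effective_gt_s

print(mem_bandwidth_gb_s(192, 5.0))  # 120.0 GB/s for 192-bit GDDR5 at 5 GT/s
print(mem_bandwidth_gb_s(128, 5.0))  # 80.0 GB/s if a 128-bit bus were used instead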
Based on the products announced today, it looks like GK106-based mobile GPUs will start with the GTX 760M, and will surely be available in a 15" form factor.
What I'd like to see is an ExpressCard version of the low-end parts. I've been working with numerous business-class laptops with this expansion slot, and I've run into situations where I could use an additional display. I've used USB adapters, but they've been less than ideal. I imagine a low-clocked GK208 chip with a 64-bit wide memory bus could be squeezed into an ExpressCard form factor. I'd expect it to perform around the level of Intel HD 4000, but that'd still be far superior to USB solutions.
While some ExpressCard slots give access to the PCI-E bus, the problem is that the laptop's BIOS/UEFI has to support the device in its whitelist. In almost every situation where people have modded their laptops and attached them to external GPUs, they had to flash a custom ROM to remove compatibility restrictions put in place to limit the amount of compatibility testing the vendor had to conduct.
A surprisingly small number of laptops needed modification to remove the whitelist on the ExpressCard slot, and it is possible to work around it with software pre-Windows if there is whitelisting. I did not have to deal with a whitelist on my Lenovo X220T.
Cooling would require the majority of the GPU to exist outside of the slot if you go this route. I don't think you could properly route heat-pipes through the relatively thin slot opening with a radiator/fan on the outside. Once you go external, the number of people really interested in the product drops quite a bit, and you'd still need to power the device so on most laptops without a dGPU I expect the external ExpressCard option would also require external power. At that point, the only real value is that you could have an external GPU hooked up to a display and connect your laptop to it for a semi-portable workstation.
It would be crazy to put any of these chips into an ExpressCard form factor without reducing power consumption. I was thinking of dropping the clock down to 400 MHz and cutting power consumption further with a corresponding drop in voltages. It wouldn't have to break any performance records, just provide full acceleration and drive an external display.
In hindsight, the GK208 may be too power hungry. The 28 nm Fermi parts (GF117?) should be able to hit the power and thermal allocations for ExpressCard without resorting to an external chassis.
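The reasoning behind downclocking plus undervolting can be sketched with the usual first-order rule that dynamic power scales roughly with frequency times voltage squared. The base wattage, clocks and voltages below are invented illustrative values, not real GK208 figures:

# First-order dynamic power scaling: P is roughly proportional to f * V^2 (leakage ignored)
def scaled_power(base_power_w, freq_ratio, volt_ratio):
    return base_power_w * freq_ratio * (volt_ratio ** 2)

# Hypothetical: a ~15W part dropped from 900MHz to 400MHz with a 1.00V -> 0.85V undervolt
print(scaled_power(15.0, 400 / 900, 0.85 / 1.00))  # ~4.8W, a large cut versus stock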
I like the IDEA of a connection to an external dock that allows ANY video card to be used (heck, why not go for SLI?), but notebooks would have to be able to support it; sounds like lots don't, plus tons of notebooks don't have ExpressCard slots anymore (and I'm not sure whether the bandwidth would become a bottleneck or not). Or obviously Thunderbolt could theoretically pull this off too... IF you could just boot with any GPU installed and have the external GPU active by the time Windows boots, at least.
Disappointing, this is a really small bump. Mostly a re-labelling of existing parts. Although I suppose it is to be expected seeing as almost all Geforce GT 640m LE-650ms can be clocked up to 1100Ghz with a little bit of bios hacking.
Besides the fact that nothing runs at 1100GHz (or Ghz, whatever those are), I dare say you've exaggerated quite a bit. Many laptops with even moderate dGPUs run quite warm, and that's with the dGPUs hitting a max clock of around 900MHz (GT 650M with DDR3 and a higher clocked core as opposed to GDDR5 with a lower clocked core). If you manage to hack the VBIOS for a laptop to run what is supposed to be a 500MHz part at 1GHz or more, you're going to overload the cooling system on virtually every laptop I've encountered.
In fact, I'll go a step further and say that with very few exceptions, overclocking of laptops in general is just asking for trouble, even when the CPU supports it. I tested old Dell XPS laptops with Core 2 Extreme CPUs that could be overclocked, and the fans would almost always be at 100% under any sort of load as soon as you started overclocking. Long-term, that sort of thing is going to cause component failures far more quickly, and on laptops that cost well over $2000 I think most would be quite angry if it failed after a couple years.
If you understand the risks and don't really care about ruining a laptop, by all means have at it. But the number of laptops I've seen running stock that have heat dissipation issues urges extreme caution.
Not really. Overclocking is fine if you know what you're doing. Years ago I had a Pentium M 1.6GHz notebook with a Mobility Radeon 9700 Pro. I overclocked that processor to 2.0GHz+ and the graphics card core clock was almost doubled. It ran fine for years; eventually the screen died due to sheer age, but I'm still using it as a file server hooked up to an old monitor to this day, with about a half dozen external drives hanging off it.
Hence the "with very few exceptions". You had a top-end configuration and overclocked it, but that was years ago. Today with Turbo Boost the CPUs are already pushing the limits most of the time in laptops (and even in desktops unless you have extreme cooling). GPUs are doing the same now with GPU Boost 2.0 (and AMD has something similar, more or less). But if you have a high-end Clevo, you can probably squeeze an extra 10-20% from overclocking (YMMV).
But if we look at midrange offerings with GT 640M LE...well, does anyone really think an Acer M5 Ultrabook is going to handle the thermal load or power load of a GPU that's running twice as fast as spec over the long haul? Or what about a Sony VAIO S 13.3" and 15.5" -- we're talking about Sony, who is usually so worried about form that they underclock GPUs to keep their laptops from overheating. Hint: any laptop that's really thin isn't going to do well with GPU or CPU overclocking! I know there was a Win7 variant of the Sony VAIO S that people overclocked (typically 950MHz was the maximum anyone got stable), but that was also with the fans set to "Performance".
Considering the number of laptops I've seen where dust buildup creates serious issues after six months, you're taking a real risk. The guys who are pushing 950MHz overclocks on 640M LE are also the same people that go and buy ultra-high-end desktops and do extreme overclocking, and when they kill a chip it's just business as usual. Again, I reiterate that I have seen enough issues with consumer laptops running hot, especially when they're over a year old, that I suggest restraint with laptop overclocking. You can do it, but don't cry to NVIDIA or the laptop makers when your laptop dies!
Totally agreed. I had a Clevo/Sager laptop with the 9800M GTX in it, and after only two years it died, due to the Nvidia GPU getting fried to a crisp. The heat build-up from internal dust accumulation was what destroyed my $2700 laptop after only 2 years of use. Ironically, I was thinking about overclocking it prior to it dying on me. Looking back, good thing I didn't do it. Overclocking is risky, and the payoffs are just not worth it unless you are ready to take the expensive financial risks involved.
I've got a Clevo x7200 and I just cleaned out a wall of dust after discovering it was thermal throttling hard core. I've got to hand it to the internals and cooling of this thing though, it was still running like a champ.
This thing's massive cooling is really nice.
I can stably overclock the 485M GPU from 575MHz to 700MHz without playing with voltages. No significant difference in temps, especially compared to when it was throttling. It runs at 61C.
It depends, really. As long as you don't touch the voltage, the temperature does not rise much. I have a 660M and it reaches 1085/2500 without any problems (ASIC quality of 69%). Overclocked vs. non-overclocked is basically a 2 degree difference (72 vs. 74 degrees). Better than a stock desktop 650.
Also, considering virtually every 660M I have seen boosts up to 950/2500 from 835/2000, I don't think the 750M is going to be any upgrade. Many 650Ms have a boost of 835 on the core, so there really is no upgrade there either (maybe 5-10%). GK107 is fine with 64 GB/sec of bandwidth.
Funny thing is that in reading comments on some of the modded VBIOS stuff for the Sony VAIO S, the modder says, "The Boost clock doesn't appear to be working properly so I just set it to the same value..." Um, think please, Mr. Modder. The Boost clock is what the GPU is able to hit when certain temperature and power thresholds are not exceeded; if you overclock, you've likely inherently gone beyond what Boost is designed to do.
Anyway, a 2C difference for a 660M isn't a big deal, but you're also looking at a card with a default 900MHz clock, so you went up in clocks by 20% and had a 3% temperature increase (and no word on fan speed). Going from 500MHz to 950MHz is likely going to be more strenuous on the system and components.
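To make the Boost point concrete, here's a heavily simplified sketch of the per-interval decision GPU Boost makes. The thresholds are invented for illustration and the real algorithm also weighs power history, per-board limits and more; the clocks just mirror the 660M numbers mentioned above:

# Simplified GPU-Boost-style clock selection (illustrative thresholds only)
def pick_clock(temp_c, power_w, base_mhz=835, boost_mhz=950, temp_limit=80, power_limit=45):
    if temp_c < temp_limit and power_w < power_limit:
        return boost_mhz  # headroom available: run at the boost clock
    return base_mhz       # a threshold reached: fall back toward the base clock

print(pick_clock(70, 40))  # 950 -> boosting
print(pick_clock(85, 40))  # 835 -> pulled back by temperature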
So if the "core hardware" is the same from Boost 1 and 2, then nVidia should go on and make Boost 2.0 be something we all can enable in the driver.
Or... are they trying to get me to upgrade to new hardware to activate a feature my card is already fully capable of supporting? Haha, nVidia, you so crazy.
There may be some minor difference in the core hardware (some extra temperature or power sensors?), but I'd be shocked if NVIDIA offered an upgrade to Boost 1.0 users via drivers -- after all, it looks like half of the performance increase from 700M is going to come from Boost 2.0!
Yeah. I kinda figured. Still, if it's the same, then it'd be foolish not to ask.
I knew when I heard about Boost 2.0 in Titan that all that time spent discussing it was going to mean it would show up in Kepler refresh products relatively soon afterward. I wouldn't be surprised to see nVidia refresh even the desktop high end with something like that. Minor changes, including slightly higher clocks and a newer Boost.
Even a "minor change" would probably be enough to ruin AMD's next six months.
I've been using GeForce Experience, and have some comments. It's definitely a timesaver, and it's nice to be able to "just click a button" and not have to worry about tweaking the detailed settings (although it's nice to still be able to if I want to override something). I find that the settings it picks generally do run at a good framerate on my rig. It also makes driver updates easier, since it presents you with notification of new drivers (even betas), gives you a nice list of changes in each version, and makes install a one-click affair (it downloads/installs inside the app).
Downsides? First, it doesn't support very many games. This is understandable since supporting a game means they need to have setting profiles for every one of their cards, but also a whole lot of other configurations such as different CPUs and different monitor resolutions. Unless there is some sort of dynamic algorithm involved, that would be an enormous number of potential configs per game. Still, the limited game support is unfortunate. Second, the app will continually notify you that new optimized settings are available, even when the new settings it downloaded are not for any game you have installed. So it keeps telling me there are new settings, but when I go into the app to check, there are no updates for any of my games.
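To give a feel for why per-game profiles blow up so quickly, here's a tiny estimate. The counts are invented placeholders, not NVIDIA's actual support matrix:

# Rough count of hardware combinations a single game's settings profile might need to cover
gpus = 30         # hypothetical number of supported GeForce models
cpu_tiers = 20    # hypothetical number of CPU performance tiers
resolutions = 8   # hypothetical number of common display resolutions
print(gpus * cpu_tiers * resolutions)  # 4800 potential configurations per game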
I hadn't heard of this program, and have to say it's kind of a cool idea. Heck, *I* don't always like messing around with sometimes vague settings in games. I think for the average user this could be really cool, and it does indeed help make the PC more console-like.
I like that they went in and started supporting prioritizing resolution. So instead of just abstractly telling me to change my 2560x1600 to 1920x1200/1080, they leave it at 2560x1600 now. That's good.
Plus, their latest release notes also said they were adding SLI support, which is great.
The thing I think this program lacks is the option to lock in certain settings that you always want, and then have the program build its recommendations around those "givens" that you won't ever change.
For example, I'm not a big fan of AA unless absolutely every other performance setting can already be turned all the way up. I can imagine some people might want AA at all costs because jaggies just bug them.
I think we should both have the option to prioritize for the setting we want. I'd also love it if we had a program like Geforce Experience that let us alter the settings for a game before we entered it and also served as a launcher (much as Geforce Experience does), but I think instead of just doing the defaults, we should have the option to select the settings, choose the "Optimal" as determined by nVidia, and also the option to do the tweaking from right inside the Geforce Experience interface.
And if I'm adding wish-list items, I'd love it if nVidia would integrate SMAA and FXAA into the program. Hell, I think I'd really prefer it if GeForce Experience served a similar function to SweetFX, except in an official, supported kind of way. So we could tweak the game from GeForce Experience in addition to it just serving as a simple optimizer.
It could come with an "Advanced" mode. I think a quick launch and/or settings for the control panel might be nice, to help us move between the different UIs: from adjusting the game, to adjusting the profiles, to adjusting the settings of the video card. Maybe it should be the same interface with different tabs to interact with each element.
Hmm, I don't like it; it seems to push the pretty settings so hard that it hurts performance no end. Now it might be that it's only looking at the GPU, in which case... duh, pointless.
Nice idea, but needs to be a bit more balanced in its options!
Jarred, can you confirm whether these parts have the same (or very similar) power envelope as the like-named 600 series parts they're replacing?
My understanding is that they do have the same approximate power envelopes. However, keep in mind that NVIDIA doesn't disclose notebook power TDPs -- they simply say that they work with each OEM to provide what the OEM desires. Thus, two laptops with GT 750M could potentially have TDPs as much as 5-10W apart (or not -- I don't know how much of a difference we're likely to see).
I hate this "up to" crap for specs. It leaves way too much wiggle room for OEMs to underclock the chips to fit a certain cooling profile, which messes with performance way too much. There should be clearly defined specifications for each GPU model. The typical consumer doesn't understand that the bigger number doesn't always mean faster. It doesn't make sense that you pay more for a higher-end part only to have it nerfed down to the OEM's cooling solution, etc.
NVIDIA's policy is that a GPU has to score within 10% of the "stock" GPU in order to have the same model name, so a GT 650M with DDR3 can't be more than 10% off the performance of a GT 650M with GDDR5. Of course, there's a catch: the 10% margin is measured with 3DMark Vantage "Performance" defaults, which aren't nearly as meaningful as using a suite of games for the testing. So basically, I'm with you: it sucks.
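A minimal sketch of what that naming rule amounts to, assuming the check really is just a single benchmark score comparison (the scores below are made up):

# NVIDIA's stated rule: a variant can share a model name if it scores within 10% of the stock part
def same_name_allowed(variant_score, stock_score, margin=0.10):
    return variant_score >= stock_score * (1 - margin)

print(same_name_allowed(9100, 10000))  # True: 9% slower, still allowed the same name
print(same_name_allowed(8500, 10000))  # False: 15% slower, should need a different name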
Haven't you heard? Windows 8's desktop is apparently unusable, you are forced into the Start screen for everything. And, it's really, really hard to click the tiles with a mouse.
Obviously I'm referring more to the OEMs' feelings than to what people really need; I probably should have put "requires" in quotes. :-) But I will say that I find the Start Screen to be much more useful with a touchscreen than with a mouse. I will also say that I'm still running Windows 7 on all my personal laptops and desktops, and on most Windows 8 laptops I install Classic Shell.
Almost certainly thanks to pricing, but don't forget to note that the 8670M is only in the AMD A10 based model -- oddly, at least right now, the Intel model doesn't support a dGPU?
Does it mean that the GT 730M which appeared earlier this year, in laptops such as the Dell Inspiron 14R, still uses GK107 (128-bit memory interface) rather than the just-announced 64-bit GK208?
I think so... but you'd have to see if it's a 64-bit or 128-bit interface. To my knowledge, only GK208 supports 64-bit interface configurations. (Well, along with Fermi of course.)
Nobody bothers any more to complain about Fermis still being used in the "700 series", let alone the same Kepler chips as in the "600 series". Maybe they should start to simply attach years to the model numbers, to make it clear that they say nothing anymore about the technological generation or capabilities of the chips.
I used to shake my head when I read comments about supposed 20+ lbs laptops, noise, heat, cost, etc.
They are not as noisy, nor as heavy as you think. Sure, if you run wPrime all day, you are going to hear it...if you OpenCL compute, you'll hear the GPUs too.
But nobody is forcing you to buy it, and there is clearly a market out there.
Now I smile at the ignorant comments. Like another poster said, making these big laptops is not hurting you, so why hate?
My M18x R2 gets 5+ hours on discrete graphics (nice for typing reports in airports, when you've no lounge access and are unsure of the battery's current capacity).
On its dual graphics cards I push 33,500 in 3DMark06 and 11,250+ in 3DMark11, and under Linux, Pyrit crunches 130,000 PMK/s - all with NO GPU OVERCLOCKING.
Can YOUR desktop do that? I know of many that can't.
What is a valid complaint here is the confusing GPU naming game that both AMD & Nvidia play, forcing many uninformed or incorrect purchases the world over.
If the average consumer knew what they'd get with GDDR3 and a 128-bit bus width, they might run a mile. Let alone what architecture might reside beneath... I'd welcome a more consistent naming approach, like BMW's, for example (you can be sure your 550i is gonna smoke a 316i). And I'm not saying that is a perfect system either.
Anyway, like they say on Youtube, "Haters are gonna hate".
Hi Jarred. I have confirmation that the Acer V3-571G laptop has a 730M with a 128-bit memory interface. What do you think about other OEMs like Dell: will they implement the same 128-bit interface or the 64-bit one? Specifications tend to differ among OEMs, and Notebookcheck.net lists the interface as 128/64 bit.
The only thing I am pissed off about from Nvidia is their technical specifications for their mobile dGPUs. I have a hard time finding the difference between some of their cards by looking at the spec sheet. They all appear to be almost the same (unless you consider the clock variations)!
Overclocking is safe, but only as long as you don't mess with the voltages and step up the clock speed slowly. And it's not at all a good idea to push the dGPU around its upper limits. I don't know much about the newer 700M series, but I used to have two models from the Kepler 600M series. I overclocked the GT 640M to around DDR3 GT 650M clocks using a modded vBIOS. The GPU temperature never exceeded 76C at 99% load for several minutes, with the help of a powerful cooler. Performance was on par with a GT 650 DDR3! :)
However, for me the major issue was the blazing hot cores of the 3610QM. Under 60-75% load for 30 minutes, it reaches 90C, just like that! It's probably poor thermal paste.
And to those who think laptops with dGPUs are poor performers and overpriced: "are you new to this world, baby?!" Because here on Earth, I have never heard of a notebook which performs better than a desktop at the same price point. You have to pay the price for mobility!
You see, for many consumers like myself, we appreciate your efforts to interpret what is going on with the specs of these chips. However, even with this information I am wary of trying to decipher what and when I should actually purchase, so I don't end up with a machine that is just a rebrand or a fail chip. I got burned on a Toshiba Satellite with SLI'd 8600Ms and it was a piece, one that failed in just over two years. Anyway, that was 2007 and I said I'd never buy another "gaming laptop", but it's time to try again. Would any of you recommend a laptop that is already available or will be soon? I am really looking at a 680M as it should be relevant for a couple of years. However, I really don't have more than $1.5k to burn, especially on another fail laptop. I also looked at the MacBook, but they are only running a 650M and I'm not sure when their next gen is forthcoming.
I realize this likely won't be read or replied to, but it would be GREATLY appreciated to have a few laptops with 700 series nVidia GPUs reviewed prior to the Haswell launch.
In my case it would be much easier to compare an Ivy Bridge/660M to an IB/750M and finally to a Haswell/750M in a midrange gaming system, rather than skipping the IB/750M step and wondering how much of the change is CPU vs. GPU based.
Lenovo had/has some IB/750M gaming laptops for sale (replacing an IB/660M offering) and will likely have a Haswell/750M available shortly after the Haswell launch. MSI also has lines with IB/660M and is likely to be at the Haswell mobile launch party.
Torrijos - Monday, April 1, 2013 - link
Hope they'll carry on giving mac user drivers quickly.Jorgisven - Monday, April 1, 2013 - link
Having spoken with nVidia technical engineers as part of my job, nVidia does not handle drivers for OSX. They "advise", but don't do any of the actual driver writing. Apple does that in-house. Boot Camp Windows, however, follows the same driver update path as everyone else using Windows.Boland - Tuesday, April 2, 2013 - link
nVidia's job descriptions page says otherwise. They're actually looking at expanding their mac driver team.http://www.nvidia.com/page/job_descriptions.html
cpupro - Tuesday, April 2, 2013 - link
Yeah right, nVidia is giving specs of their GPU's to Apple developers so they can write GeForce drivers for OSX. nVidia is not crazy to share their knowledge to competition, because for writing drivers you need to know how GPU internally work.kasakka - Thursday, April 4, 2013 - link
To my understanding it used to be Apple who wrote the drivers but Nvidia has possibly taken the reins back to themselves. There have been some Nvidia driver update releases that are newer than what is found in Apple's updates.TerdFerguson - Monday, April 1, 2013 - link
I'll never, ever, buy another laptop with a discrete GPU. The extra heat and power drain, together with the inflated prices and dishonest marketing just aren't worth the modest performance increase on a machine that will never really provide the same level of gaming performance that even a dirt cheap desktop machine will.If a pair of 680M cards in SLI performs worse than a single 660TI, then it's just plain dishonest for NVidia to keep branding them thusly. I don't see onboard graphics overtaking desktop video boards any time soon, but for laptops the time is near and it can't come soon enough.
geniekid - Monday, April 1, 2013 - link
There are a number of laptops that let you switch between discrete and integrated graphics on demand so you can save power when you're on-the-go and still have that extra power when you're plugged in.As for value versus desktops, yes there's a premium for mobility and the value of that mobility depends greatly on your lifestyle and job conditions.
Flunk - Monday, April 1, 2013 - link
You have a point when it comes to high-end "gaming" laptops that weight 20+ pounds, cost a fortune and perform poorly. But there is a place for mid-range discrete GPUs in smaller systems that allow you to play games at moderate settings if you're on the go.I think the best option would be a small laptop that connects to an external GPU but it appears that the industry disagrees with me.
nehs89 - Monday, April 1, 2013 - link
I totally agree with you.... all laptops in general and also win8 tablets should connect to an external GPU....that would be the solution to many problems.... you want to play heavy duty games just plug in the external GPU and If you want or need portability then use the ultrabook alone....I have also read that with the current technology this is not possibleKitsuneKnight - Monday, April 1, 2013 - link
Sony shipped a laptop that supported a (low end) external dGPU. Another company showed a generic enclosure that could be used to connect a GPU to a computer via Thunderbolt (I'm not sure if it ever actually shipped, though). It certainly is possible, even if there's currently no link that could provide enough bandwidth to let a top-of-the-line GPU run full tilt.I would think nVidia and/or Intel would want to push that market more, but it doesn't seem like anyone really cares, unfortunately. It would be nice to be able to 'upgrade' a laptop's GPU without having to replace the entire thing.
crypticsaga - Monday, April 1, 2013 - link
What possible reason would intel have for pushing a product like that? In fact if some sources are correct they are trying to do the exact opposite by bottlenecking even internal dGPU by limiting available PCIe lanes in Broadwell.shompa - Monday, April 1, 2013 - link
That problem have been solved for over a year with Thunderbolt. Use a Thunderbolt PCIE with graphic card.fokka - Monday, April 1, 2013 - link
afaik this is not entirely correct since even thunderbolt is too slow to properly utilize a modern graphics card.this is not surprising, since thunderbolt is based on 4x pci-e 2.0 (2GB/s) and current desktop class graphics are using 16x pci-e 3.0 (~16GB/s) which is about eight times as fast.
so i wouldn't say the problem is completely solved throughput-wise, but thunderbold sure was an important step in the right direction.
MojaMonkey - Monday, April 1, 2013 - link
No, shompa is correct, it has been solved with Thunderbolt and I'm personally using a GTX 680 connected externally. Works great.Wolfpup - Monday, April 1, 2013 - link
Ugh. You're either ignorant or reaaaaally generous with the hyperbole. "20+ lbs notebooks"? Really?In real life, mid-range notebooks/GPUs do fine for gaming, and high end notebooks/GPUs do...REALLY fine. When you can max out today's games at 1080p, that isn't "performing poorly", and is orders of magnitude better than Intel's video.
If YOU guys don't want high end notebooks, fine, but I don't see how they're hurting you.
lmcd - Tuesday, April 2, 2013 - link
My cheap A8m (Trinity) can play Rage at high texture res at panel res (1366x768), just for starters. And that's $400 level I think right now.superjim - Wednesday, April 10, 2013 - link
I can confirm this. An A8-4500M does really well for $400 or below on 1366x768. Now if the A10 would come down to $400 we'll really be in good shape.xenol - Monday, April 1, 2013 - link
I had a laptop with discrete graphics that lasted for over 9 hours on battery, while surfing the web. It was a laptop with an early form of Optimus (you had to manually switch), but still, you can have graphical performance xor battery life if you don't need the performance. But asking for both? Now that's silly.As for your issue with marketing the 680M as it is when it can't outperform a midrange desktop card... You do realize that this is a different market segment? Also you should tell "shame on you" to all the display companies who mislead customers into think they're buying a panel that can do 16 million colors (which last I checked, 18-bits is not 16 million) or have a 1000000:1 contrast ratio (which you need to be in a pitch black room and being shown a black/white checkerboard pattern to see).
Wolfpup - Monday, April 1, 2013 - link
"Modest performance increase"? I wouldn't call my GTX 680m a "modest performance increase" over Intel video lolAre you KIDDING?!? Notebook hardware is ALWAYS worse than desktop. This applies obviously to CPUs too, which you're inexplicably not complaining about. You always pay more to get the same performance. That doesn't mean it's "dishonest" or the like.
And quite obviously integrated video can never catch up with a discreet part so long as they make high end discreet parts, so the time is "never", not "near".
****
Regarding the article...Optimus...eh, Nvidia's driver team is impressive as always, but literally the first program I ran that I wanted to run on the GPU wouldn't run on the GPU...thankfully my notebook lets you turn off Optimus.
JarredWalton - Monday, April 1, 2013 - link
Which program did you run that you wanted on the GPU? Please be very specific.JarredWalton - Monday, April 1, 2013 - link
*Crickets* (again)And this is why I don't buy into such claims; I had some issues when Optimus first launched, but I haven't had any I can specifically pinpoint in the last year or more. If anyone has something under Windows that doesn't play right with Optimus, please post it here so I can verify the issue. Otherwise, with no concrete evidence, it's just FUD.
Wolfpup - Tuesday, April 2, 2013 - link
VLC for starters. I seriously have no idea how you can be unaware of issues with it-any big notebook forum will have threads about it all the time, people trying to disable it.mrhumble1 - Monday, April 1, 2013 - link
Speaking of dishonest, you are not sticking to the facts. You obviously don't own a nice laptop with a 680M inside. I do, and it performs amazingly well. I output games from my laptop to my TV via HDMI and they look spectacular.You also obviously don't understand that desktop PCs (and their components) cannot be directly compared laptops. I also highly doubt that a "dirt cheap" PC can run The Witcher 2 at almost Ultra settings at a playable framerate.
tviceman - Monday, April 1, 2013 - link
You're missing out then if you like to game. I've got a 560m that still performs admirably, running many of today's games with max settings (no AA) at 1600x900 60fps. I'm spoiled by fast frame rates and decent graphics settings, I can't imagine using even the upcoming haswell to play games like Bioshock Infinite on.xTRICKYxx - Monday, April 1, 2013 - link
I will never ever buy a laptop without a discrete card. Video cards 7770M/650M or above can play any game on 1920x1080 on high if there is a good enough CPU as well. Mobile graphics are starting to become powerful enough.Look at Intel CPU's. My i7-2600K at home is slightly slower than my i7-3720QM clock for clock.
jsilverstreak - Monday, April 1, 2013 - link
680m sli is about 10% faster than a 680and the 780m's leaked benchmark was on par with a 660ti
jsilverstreak - Monday, April 1, 2013 - link
oops it's actually the 680 the 780m is on par withalthough its leaked
nerd1 - Saturday, April 13, 2013 - link
You obviously dont game, do you?Mr. Bub - Wednesday, April 17, 2013 - link
Sure, maybe most mainstream users who facebook and email all day on laptops don't need anything more than integrated graphics, but guys like me (engineering students) who actually do stuff on computers will still rely on discrete GPUs. I can't properly run any of my CAD software without a discrete GPU.nerdstaz - Monday, April 22, 2013 - link
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-68...Your comparison is a bit off base <3
moep - Monday, April 1, 2013 - link
The slide deck does not work (404).Technical difficulty or did you have to pull it?
JarredWalton - Monday, April 1, 2013 - link
It's there now... our gallery is set to post things by day only, and since the NDA was at 9AM Eastern I didn't want the gallery going live nine hours early. So I set it for 4/2 and then just changed it to 4/1.nunomoreira10 - Monday, April 1, 2013 - link
I was hoping to get an 15 inch normal size laptop with a gk106 fully enabled this summer, but this just crushed my hopes.they will just keep using those highly clocked gk107 and call it mid-high end...
JarredWalton - Monday, April 1, 2013 - link
A fully enabled GK106 is almost certainly in the works, but it will be something like a GTX 770M. GTX 780M is likely to be GK104, but it will be interesting to see if NVIDIA can do a full enabled GK104 while keeping it within the typical 100W power envelope that gaming notebooks target. GTX 680MX is a full GK104, but I understand TDP is ~125W or so and that's why the only product using it is the iMac 27. We'll likely see the high-end 700M parts launch in June or July.transphasic - Monday, April 1, 2013 - link
Jared, a question for you regarding the 700m line out this year- are these going to be backwards compatible with our current HM laptops from 2012, or are they only for the new Haswell design?Due to the issues that we have had with AMD, there's quite a few of us that want to get rid of AMD, and upgrade to the newest Nvidia card, and this 780m is quite a bit faster than the 680m.
I would like to swap out my 7970m GPU for the upcoming 780m card, and was wondering if this is do-able or not.
JarredWalton - Monday, April 1, 2013 - link
Keep in mind that I'm guessing at 780M -- I could be off. As for backwards compatibility, given that we're essentially looking at minor tweaks and revisions to Kepler, they should all work fine in existing laptops. Getting a replacement GPU might be a bit difficult/expensive, though. Quick question: did the latest 13.3 beta drivers help you at all with 7970M? I Haven't had a chance to test then yet.transphasic - Monday, April 1, 2013 - link
The new Catalyst 13.3 beta drivers are fine, and have no problems yet with them. My concern, though, is that these last few driver releases from AMD have not/are not dealing with increasing the FPS rates on current games, while Nvidia's has.Even though the new games that are out like Bioshock are AMD-coded, Nvidia is winning the battle on better gaming performance. While Nvidia is releasing their standard (and better) drivers for newer games at a faster rate than AMD is (as is usual), AMD is falling further and futher behind in other areas as well in terms of hardware specs- in this case, their 8000 series gaming GPU for this summer which has been very disappointing in it's specs, to say the least.
This is why I want to upgrade to the new 780m GPU. From what was said on another forum, the preliminary specs for the 780m is the equivalent of the 680MX for the Imac released late last year.
It is said to offer a 20%+ improvement over the 680m, and if it's backwards compatible to the HM series motherboards that we currently have, it's worth thinking about upgrading to this year for our existing Sager laptops.
You're right about the cost, though. If the price is prohibitive, then it would make sense to wait for the Maxwell release next year with whatever Nvidia has out then.
It really depends on cost at this point.
nunomoreira10 - Monday, April 1, 2013 - link
A fully enabled gk106 may be asking to muchmaybe with 1 smx and a 64-bit memory controler disabled
leaving it with 3 smx a fast 128-bit memory controler and call it gt755m or gtx760m
it would be the gt555m of this generation and a sweet spot for a standart perfomance 15 inch laptop.
JarredWalton - Monday, April 1, 2013 - link
I'm about 99% sure we'll see a full GK106 on mobile this year; the only question is what they'll call it. Well, that and how much power will it use and what will its clocks be. For clocks, 2500MHz (5000MHz effective GDDR5) seems likely, and clocks will probably be in the 600-700MHz range with Boost taking it up to a max of around 800MHz. That's my guess anyway. TDP will be ~70W, though, so this will be a part for larger gaming notebooks only./speculation
tviceman - Monday, April 1, 2013 - link
Based on the products announced today, it looks like GK106 based mobile GPU's will start with the gtx760m, and will surely be available in a 15" form factor.Kevin G - Monday, April 1, 2013 - link
What I'd like to see is an ExpressCard version of the low end parts. I've been working with numerous business class laptops with this expansion slot and I've run into the situation where I could use an additional display. I've used USB adapters but they've been less than ideal. I fathom a low clock GK208 chip and a 64 bit wide memory bus could be squeezed into an ExpressCard form factor. I'd expect it to perform around the level of Intel HD4000 but that'd still be far superior to USB solutions.arthur449 - Monday, April 1, 2013 - link
While some ExpressCard slots give access to the PCI-E bus, the problem is that the laptop's BIOS/UEFI has to support the device in its whitelist. In almost every situation where people have modded their laptops and attached them to external GPUs, they had to flash a custom ROM to remove compatibility restrictions put in place to limit the amount of compatibility testing the vendor had to conduct.rhx123 - Monday, April 1, 2013 - link
A surprisingly low amount of laptops needed modification to remove the whitelist on the express card slot, and it is possible to do it with software pre-windows if there is whitelisiting.I did not have to whitelist on my Lenovo X220T.
JarredWalton - Monday, April 1, 2013 - link
Cooling would require the majority of the GPU to exist outside of the slot if you go this route. I don't think you could properly route heat-pipes through the relatively thin slot opening with a radiator/fan on the outside. Once you go external, the number of people really interested in the product drops quite a bit, and you'd still need to power the device so on most laptops without a dGPU I expect the external ExpressCard option would also require external power. At that point, the only real value is that you could have an external GPU hooked up to a display and connect your laptop to it for a semi-portable workstation.Kevin G - Monday, April 1, 2013 - link
It would be crazy to put any of these chips into an ExpressCard form factor without reducing power consumption. I was thinking of dropping the clock down to 400 Mhz and cutting power consumption further with a corresponding drop in voltages. It wouldn't have to break any performance records, just provide full acceleration and drive an external display.In hindsight, the GK208 may be too power hungry. The 28 nm Fermi parts (GF117?) should be able to hit the power and thermal allocations for ExpressCard without resorting to an external chassis.
Wolfpup - Tuesday, April 2, 2013 - link
I like the IDEA of a connection to an external dock that allows ANY video card to be used (heck, why not go for SLI?) but notebooks would have to be able to support it-sounds like lots don't, plus tons of notebooks don't have ExpressCard slots anymore (plus not sure if the bandwidth would start being a bottleneck or not). (Or obviously Thunderbolt could theoretically pull this off too...IF you could just boot with any GPU installed and have the external GPU active by the time Windows boots at least).rhx123 - Monday, April 1, 2013 - link
You can make an external graphics card if you want, I have a 650Ti desktop card attached through ExpressCard.It's powered by an XBox PSU.
http://imgur.com/239skMP
rhx123 - Monday, April 1, 2013 - link
It can drive the internal laptop display through Optimus.Flunk - Monday, April 1, 2013 - link
Disappointing, this is a really small bump. Mostly a re-labelling of existing parts. Although I suppose it is to be expected seeing as almost all Geforce GT 640m LE-650ms can be clocked up to 1100Ghz with a little bit of bios hacking.JarredWalton - Monday, April 1, 2013 - link
Besides the fact that nothing runs at 1100GHz (or Ghz, whatever those are), I dare say you've exaggerated quite a bit. Many laptops with even moderate dGPUs run quite warm, and that's with the dGPUs hitting a max clock of around 900MHz (GT 650M with DDR3 and a higher clocked core as opposed to GDDR5 with a lower clocked core). If you manage to hack the VBIOS for a laptop to run what is supposed to be a 500MHz part at 1GHz or more, you're going to overload the cooling system on virtually every laptop I've encountered.In fact, I'll go a step further and say that with very few exceptions, overclocking of laptops in general is just asking for trouble, even when the CPU supports it. I tested old Dell XPS laptops with Core 2 Extreme CPUs that could be overclocked, and the fans would almost always be at 100% under any sort of load as soon as you started overclocking. Long-term, that sort of thing is going to cause component failures far more quickly, and on laptops that cost well over $2000 I think most would be quite angry if it failed after a couple years.
If you understand the risks and don't really care about ruining a laptop, by all means have at it. But the number of laptops I've seen running stock that have heat dissipation issues urges extreme caution.
StevoLincolnite - Monday, April 1, 2013 - link
Not really. Overclocking is fine if you know what you're doing.Years ago I had a Pentium M 1.6ghz notebook with a Mobility Radeon 9700 Pro.
Overclocked that processor to 2.0ghz+ and the Graphics card core clock was almost doubled.
Ran fine for years, eventually the screen on it died due to sheer age, but I'm still using it as file server hooked up to an old monitor still to this day, with about a half dozen external drives hanging off it.
JarredWalton - Monday, April 1, 2013 - link
Hence the "with very few exceptions". You had a top-end configuration and overclocked it, but that was years ago. Today with Turbo Boost the CPUs are already pushing the limits most of the time in laptops (and even in desktops unless you have extreme cooling). GPUs are doing the same now with GPU Boost 2.0 (and AMD has something similar, more or less). But if you have a high-end Clevo, you can probably squeeze an extra 10-20% from overclocking (YMMV).But if we look at midrange offerings with GT 640M LE...well, does anyone really think an Acer M5 Ultrabook is going to handle the thermal load or power load of a GPU that's running twice as fast as spec over the long haul? Or what about a Sony VAIO S 13.3" and 15.5" -- we're talking about Sony, who is usually so worried about form that they underclock GPUs to keep their laptops from overheating. Hint: any laptop that's really thin isn't going to do well with GPU or CPU overclocking! I know there was a Win7 variant of the Sony VAIO S that people overclocked (typically 950MHz was the maximum anyone got stable), but that was also with the fans set to "Performance".
Considering the number of laptops I've seen where dust buildup creates serious issues after six months, you're taking a real risk. The guys who are pushing 950MHz overclocks on 640M LE are also the same people that go and buy ultra-high-end desktops and do extreme overclocking, and when they kill a chip it's just business as usual. Again, I reiterate that I have seen enough issues with consumer laptops running hot, especially when they're over a year old, that I suggest restraint with laptop overclocking. You can do it, but don't cry to NVIDIA or the laptop makers when your laptop dies!
transphasic - Monday, April 1, 2013 - link
Totally agreed. I had a Clevo/Sager Laptop with the 9800m GTX in it, and after only two years, it died, due to the Nvidia GPU getting fried to a crisp. The heat build-up from internal dust accumulation was what destroyed my $2700 dollar laptop after only 2 years of use.Ironically, I was thinking about overclocking it prior to it dying on me. In looking back, good thing I didn't do it. Overclocking is risky, and the payoffs are just not worth it, unless you are ready to take the expensive financial risks involved.
Drasca - Tuesday, April 2, 2013 - link
I've got a Clevo x7200 and I just cleaned out a wall of dust after discovering it was thermal throttling hard core. I've got to hand it to the internals and cooling of this thing though, it was still running like a champ.This thing's massive cooling is really nice.
I can stably overclock the 485m GPU from 575 Mhz to 700Mhz without playing with voltages. No signifigant difference in temps, especially compared to when it was throttling. Runs at 61C.
I love the cooling solution on this thing.
whyso - Monday, April 1, 2013 - link
It depends really. As long as you don't touch voltage the temperature does not rise much. I have a 660m and it reaches 1085/2500 without any problems (ANIC rating of 69%). Overclocked vs non overclocked is basically a 2 degree difference (72 vs 74 degrees). Better than a stock 650 desktop.Also considering virtually every 660m I have seen boost up to 950/2500 from 835/2000 I don't think the 750m is going to be any upgrade. Many 650m have a boost of 835 core so there really is no upgrade there either (maybe 5-10%). GK107 is fine with 64 GB/sec bandwidth.
whyso - Monday, April 1, 2013 - link
Whoops sorry didn't see the 987 clocks, nice jump there.JarredWalton - Monday, April 1, 2013 - link
Funny thing is that in reading comments on some of the modded VBIOS stuff for the Sony VAIO S, the modder say, "The Boost clock doesn't appear to be working properly so I just set it to the same value..." Um, think please Mr. Modder. The Boost clock is what the GPU is able to hit when certain temperature and power thresholds are not exceeded; if you overclock, you've likely inherently gone beyond what Boost is designed to do.Anyway, a 2C difference for a 660M isn't a big deal, but you're also looking at a card with a default 900MHz clock, so you went up in clocks by 20% and had a 3% temperature increase (and no word on fan speed). Going from 500MHz to 950MHz is likely going to be more strenuous on the system and components.
damianrobertjones - Monday, April 1, 2013 - link
"and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip!"Wouldn't that be the HD 4600? Also it's a shame that no-one really states the HD4000 with something like Vengeance ram which improves performance
HisDivineOrder - Monday, April 1, 2013 - link
So if the "core hardware" is the same from Boost 1 and 2, then nVidia should go on and make Boost 2.0 be something we all can enable in the driver.Or... are they trying to get me to upgrade to new hardware to activate a feature my card is already fully capable of supporting? Haha, nVidia, you so crazy.
JarredWalton - Monday, April 1, 2013 - link
There may be some minor difference in the core hardware (some extra temperature or power sensors?), but I'd be shocked if NVIDIA offered an upgrade to Boost 1.0 users via drivers -- after all, it looks like half of the performance increase for the 700M is going to come from Boost 2.0!
HisDivineOrder - Monday, April 1, 2013 - link
Yeah, I kinda figured. Still, if it's the same, then it'd be foolish not to ask.
I knew when I heard about Boost 2.0 in Titan that all the time spent discussing it meant it would show up in Kepler refresh products relatively soon afterward. I wouldn't be surprised to see nVidia refresh even the desktop high end with something like that: minor changes, including slightly higher clocks and a newer Boost.
Even a "minor change" would probably be enough to ruin AMD's next six months.
Guspaz - Monday, April 1, 2013 - link
I've been using GeForce Experience, and have some comments. It's definitely a timesaver, and it's nice to be able to "just click a button" and not have to worry about tweaking the detailed settings (although it's nice to still be able to if I want to override something). I find that the settings it picks generally do run at a good framerate on my rig. It also makes driver updates easier, since it notifies you of new drivers (even betas), gives you a nice list of changes in each version, and makes installation a one-click affair (it downloads and installs inside the app).
Downsides? First, it doesn't support very many games. This is understandable, since supporting a game means they need setting profiles for every one of their cards, plus a whole lot of other configurations such as different CPUs and different monitor resolutions. Unless there is some sort of dynamic algorithm involved, that would be an enormous number of potential configs per game. Still, the limited game support is unfortunate. Second, the app continually notifies you that new optimized settings are available, even when the new settings it downloaded aren't for any game you have installed. So it keeps telling me there are new settings, but when I go into the app to check, there are no updates for any of my games.
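To put the "enormous number of potential configs" point in perspective, here's a quick back-of-the-envelope count; every number below is a made-up round figure, not NVIDIA's actual catalog.

```python
# Back-of-the-envelope count of per-game setting profiles without a dynamic algorithm.
# All counts are made-up round numbers, purely to show how fast the space grows.

gpus = 40          # distinct GPU models
cpus = 30          # CPU tiers that bottleneck differently
resolutions = 8    # common display resolutions
games = 50         # supported titles

profiles = gpus * cpus * resolutions * games
print(profiles)  # 480000 combinations if each were profiled individually
```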
Wolfpup - Monday, April 1, 2013 - link
I hadn't heard of this program, and I have to say it's kind of a cool idea. Heck, *I* don't always like messing around with sometimes vague settings in games. I think for the average user this could be really cool, and it does indeed help make things more console-like.
HisDivineOrder - Monday, April 1, 2013 - link
I like that they went in and started supporting prioritizing resolution. So instead of just abstractly telling me to change my 2560x1600 to 1920x1200/1080, they leave it at 2560x1600 now. That's good.
Plus, their latest release notes also said they were adding SLI support, which is great.
The thing that I think this program lacks is the option to set certain settings that you want to be true regardless and then have the program adjust to recommend specs around certain "givens" that you won't ever change.
For example, I'm not a big fan of AA unless absolutely every other performance-related setting can already be turned all the way up. I can imagine some people might want AA at all costs because jaggies just bug them.
I think we should both have the option to prioritize the settings we want. I'd also love a program like GeForce Experience that lets us alter a game's settings before we launch it and also serves as a launcher (much as GeForce Experience does). But instead of just applying the defaults, we should be able to select the settings ourselves, choose the "Optimal" ones as determined by nVidia, or do the tweaking right inside the GeForce Experience interface.
And if I'm adding wish-list items, I'd love it if nVidia would integrate SMAA and FXAA into the program. Hell, I'd really prefer it if GeForce Experience served a similar function to SweetFX, except in an official way, so we could tweak the game from GeForce Experience in addition to it serving as a simple optimizer.
It could come with an "Advanced" mode. I think a quick launch and/or settings shortcut for the control panel might be nice to help us move between different UIs, from adjusting the game to adjusting the profiles to adjusting the settings of the video card. Maybe it should be the same interface with different tabs to interact with each element.
And... nah. That's it.
cjb110 - Tuesday, April 2, 2013 - link
Hmm, I don't like it; it seems to push the pretty too much and hurts performance no end.
Now it might be that it's only looking at the GPU, in which case... duh, pointless.
Nice idea, but needs to be a bit more balanced in its options!
tviceman - Monday, April 1, 2013 - link
Jarred, can you confirm whether these parts have the same (or very similar) power envelope as the like-named 600 series parts they're replacing?
JarredWalton - Monday, April 1, 2013 - link
My understanding is that they have approximately the same power envelopes. However, keep in mind that NVIDIA doesn't disclose notebook TDPs -- they simply say they work with each OEM to provide what the OEM desires. Thus, two laptops with a GT 750M could potentially have TDPs as much as 5-10W apart (or not -- I don't know how much of a difference we're likely to see).
Einy0 - Monday, April 1, 2013 - link
I hate this "up to" crap for specs. It leaves way too much wiggle room for OEMs to underclock the chips to fit a certain cooling profile, which messes with performance way too much. There should be clearly defined specifications for each GPU model. The typical consumer doesn't understand that the bigger number doesn't necessarily mean faster. It doesn't make sense to pay more for a higher-end part only to have it nerfed down to the OEM's cooling solution, etc.
JarredWalton - Monday, April 1, 2013 - link
NVIDIA's policy is that a GPU has to score within 10% of the "stock" GPU in order to carry the same model name, so a GT 650M with DDR3 can't be more than 10% off the performance of a GT 650M with GDDR5. Of course, there's a catch: the 10% margin is measured with 3DMark Vantage "Performance" defaults, which aren't nearly as meaningful as using a suite of games for the testing. So basically, I'm with you: it sucks.
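For what it's worth, here's a trivial sketch of what that 10% policy amounts to in practice; the scores are hypothetical, not measured results.

```python
# Sketch of NVIDIA's "within 10% of stock" naming rule as described above.
# The scores are hypothetical 3DMark Vantage "Performance" numbers, not measured results.

def same_name_allowed(stock_score, variant_score):
    """A variant may keep the model name if it scores within 10% of the stock config."""
    return variant_score >= 0.90 * stock_score

print(same_name_allowed(10000, 9100))  # True  -- 9% slower, name allowed
print(same_name_allowed(10000, 8500))  # False -- 15% slower, would need a different name
```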
random2 - Monday, April 1, 2013 - link
"... but again considering Windows 8 almost requires a touchscreen to really be useful that's expected ..."
Really? Once this OS is set to boot to the desktop, it's a great little OS for those of us who don't run tablets or touch panels.
kyuu - Monday, April 1, 2013 - link
Haven't you heard? Windows 8's desktop is apparently unusable; you're forced into the Start screen for everything. And it's really, really hard to click the tiles with a mouse.
JarredWalton - Monday, April 1, 2013 - link
Obviously I'm referring more to the OEMs' feelings than to what people really need; I probably should have put "requires" in quotes. :-) But I will say that I find the Start Screen to be much more useful with a touchscreen than with a mouse. I will also say that I'm still running Windows 7 on all my personal laptops and desktops, and on most Windows 8 laptops I install Classic Shell.
lightsout565 - Monday, April 1, 2013 - link
I wonder why Vizio chose to go with AMD for their dGPUs, putting the 8670M in their new 15.6" thin and light.
JarredWalton - Monday, April 1, 2013 - link
Almost certainly thanks to pricing, but note that the 8670M is only in the AMD A10-based model -- oddly, at least right now, the Intel model doesn't support a dGPU?
lightsout565 - Monday, April 1, 2013 - link
Exactly. Samsung's new Series 7 Chronos offers a quad-core i7 (identical to the Vizio's) with an 8870M. Looks nice.
willy54 - Monday, April 1, 2013 - link
The 670MX in my G75VX is a GK104. Will that make any difference in overclocking, or does the memory bus width hold it back?
karbom - Tuesday, April 2, 2013 - link
Does that mean the GT 730M, which appeared earlier this year in laptops such as the Dell Inspiron 14R, still uses GK107 (128-bit memory bus) rather than the just-announced 64-bit GK208?
JarredWalton - Wednesday, April 3, 2013 - link
I think so... but you'd have to check whether it's a 64-bit or 128-bit interface. To my knowledge, only GK208 supports 64-bit interface configurations. (Well, along with Fermi, of course.)
karbom - Saturday, April 13, 2013 - link
Thanks. I guess I have no other option than to buy the Inspiron 14 first and then check for the 128/64-bit interface myself, LOL!
MrSpadge - Tuesday, April 2, 2013 - link
Well played, nVidia! Nobody bothers anymore to complain about Fermi chips still being used in the "700 series," let alone the same Kepler chips as in the "600 series." Maybe they should start simply attaching years to the model numbers, to make it clear that they no longer say anything about the technological generation or capabilities of the chips.
JarredWalton - Wednesday, April 3, 2013 - link
They're so low-end that I hardly worry about them. Anyone buying a 710M or GT 720M ought to know what they're getting, and at least they're 28nm.
Notmyusualid - Wednesday, April 3, 2013 - link
I used to shake my head when I read comments about supposed 20+ lb laptops, noise, heat, cost, etc. They are not as noisy, nor as heavy, as you think. Sure, if you run wPrime all day you are going to hear it... and if you do OpenCL compute, you'll hear the GPUs too.
But nobody is forcing you to buy it, and there is clearly a market out there.
Now I smile at the ignorant comments. Like another poster said, making these big laptops is not hurting you, so why hate?
My M18x R2 gets 5+ hours on discrete graphics (nice for typing reports in airports when you've got no lounge access and you're unsure of the battery's current capacity).
On its dual graphics cards I push 33,500 in 3DMark06 and 11,250+ in 3DMark11, and under Linux, Pyrit crunches 130,000 PMK/s -- all with NO GPU OVERCLOCKING.
Can YOUR desktop do that? I know of many that can't.
What is a valid complaint here is the confusing GPU naming game that both AMD and Nvidia play, forcing many uninformed or incorrect purchases the world over.
If the average consumer knew what they'd be getting with GDDR3 and a 128-bit bus width, they might run a mile -- let alone if they knew what architecture resides beneath. I'd welcome a more consistent naming approach, like BMW's, for example (you can be sure your 550i is gonna smoke a 316i). And I'm not saying that's a perfect system either.
Anyway, like they say on Youtube, "Haters are gonna hate".
Rishi. - Monday, April 22, 2013 - link
Ummm... yes, it's the confusing naming schemes they follow, coupled with an even more confusing spec sheet!
nerd1 - Saturday, April 13, 2013 - link
Samsung released the Series 7 Chronos with an 8870M; it lasts 10+ hours, is 20mm thick, is more powerful than a 670M, and weighs only slightly more than the 15" rMBP. Mobile gaming is really getting awesome.
Rishi. - Monday, April 22, 2013 - link
Yeah, it certainly does! I just wish I had a portable beast with a GTX 675M or something around that.
Desktop users are gonna hate, though!
karbom - Sunday, April 14, 2013 - link
Hi Jarred. I have confirmation that the Acer V3-571G has a 730M with a 128-bit memory bus interface. What do you think about other OEMs like Dell -- will they implement the same 128-bit interface or the 64-bit one? Specifications tend to differ among OEMs, and Notebookcheck.net lists the interface as 128/64-bit.
Mr. Bub - Wednesday, April 17, 2013 - link
And here begins the obsolescence of my six-month-old laptop with a GT 640M.
Rishi. - Monday, April 22, 2013 - link
The only thing I am pissed off about with Nvidia is their technical specifications for their mobile dGPUs. I have a hard time finding the difference between some of their cards by looking at the spec sheet; they all appear to be almost the same (unless you consider the clock variations). Overclocking is safe, but only as long as you don't mess with the voltages and you step up the clock speed slowly. And it's not at all a good idea to push the dGPU near its upper limits.
I don't know much about the newer 700M series, but I used to have two models from the Kepler 600M series. I overclocked the GT 640M to roughly DDR3 GT 650M levels using a modded vBIOS. The GPU temperature never exceeded 76C at 99% load for several minutes, with the help of a powerful cooler.
Performance was on par with a GT 650M DDR3! :)
However, for me the major issue was the blazing hot cores of the 3610QM. Under 60-75% load for 30 minutes, it reaches 90C just like that! It's probably poor thermal paste.
And to those who think laptops with dGPUs perform poorly and are overpriced: are you new to this world? Here on Earth, I have never heard of a notebook that performs better than a desktop at the same price point. You have to pay a price for mobility!
"Haters gonna Hate.!"
sdubyas - Wednesday, April 24, 2013 - link
You see, for many consumers like myself, we appreciate your efforts to interpret what is going on with the specs of these chips. However, even with this information I am wary of trying to decipher what and when I should actually purchase, so I don't end up with a machine that is just a rebrand or a failure-prone chip. I got burned on a Toshiba Satellite with 8600M SLI; it was a piece of junk that failed in just over two years. Anyway, that was 2007, and I said I'd never buy another "gaming laptop," but it's time to try again. Would any of you recommend a laptop that is already available or will be soon? I am really looking at a 680M, as it should be relevant for a couple of years; however, I don't have more than $1.5k to burn, especially on another failed laptop. I also looked at the MacBook, but they only run a 650M and I'm not sure when their next generation is coming. Here is something else I am considering: http://www.villageinstruments.com/tiki-index.php?p... Has anyone looked into the ViDock?
Menetlaus - Monday, May 6, 2013 - link
I realize this likely won't be read or replied to, but it would be GREATLY appreciated to have a few laptops with 700-series nVidia GPUs reviewed prior to the Haswell launch.
In my case it would be much easier to compare an Ivy Bridge/660M to an IB/750M and finally to a Haswell/750M in a midrange gaming system, rather than skipping the IB/750M step and wondering how much of the change is CPU vs. GPU based.
Lenovo has (or had) some IB/750M gaming laptops for sale (replacing an IB/660M offering) and will likely have a Haswell/750M available shortly after the Haswell launch. MSI also has lines with the IB/660M and is likely to be at the Haswell mobile launch party.
Trollinglikepong - Friday, May 17, 2013 - link
If only Nvidia would ditch DDR3 for GPU RAM. Rebadging existing Kepler chips with higher clock speeds is what Nvidia has always done; nothing to see here, folks.