29 Comments
Kaboose - Wednesday, January 5, 2011 - link
Can't wait for a 2630QM and a GT 555M in a 15.6-inch 1080p notebook. Now let's see who is first to market with that gem, and what price we're looking at.

Willhouse - Thursday, January 6, 2011 - link

I too would like to see this laptop. I've been waiting for months. A 14" model with similar hardware would be nice too.

Pneumothorax - Wednesday, January 5, 2011 - link
"Of course, while notebook manufacturers are doing the above, please quit with the lousy LCDs. Tablets are now shipping with IPS displays; can the laptops and notebooks get some luvin' as well? Also, stop with the glossy plastics, give us decent keyboards, and stop using 48Wh batteries in 15.6" and larger laptops!"

Please do this so I can stop paying an extra $1000 of Apple tax to get a decent screen and battery in a laptop?!
Dug - Wednesday, January 5, 2011 - link
I 2nd that!!!

Wizzdo - Thursday, January 6, 2011 - link
Remember that with the so-called "Apple Tax" you also get a wonderfully modern "Ultimate"-class OS and a slew of really excellent, useful applications in iLife, minus the pile of crippled, CPU-hogging crapware that usually comes preinstalled.

If you want to do the real math (including the seemingly infinite headaches and hours of productivity lost by all my Windows clients, who are constantly paying me to fix their machines), Apple MacBooks are a steal!
There's a lot more to a laptop than the upfront cost.
Ironically, my MacBook Pro is the best Windows laptop (using Boot Camp) that I've ever owned, and believe me, I've owned many!
inaphasia - Friday, January 7, 2011 - link
3rd! (Except for the Apple part)

And I'll say again what has already been said many times before: 16:9? Only good for video! And on almost all sub-$1000 notebooks... can you say 0% viewing angle? Because that's effectively what you get on my 1215N.
Karammazov - Wednesday, January 5, 2011 - link
Jarred, I very much like your take on reviewing notebooks; you provide the angle that most of us who are interested in buying these laptops actually want.

However, I'm still not sold on Optimus. I can't see the reason for all the hype; switchable GPUs are a reality on the AMD side as well. Granted, you need a board designed to coordinate the switching, but it doesn't suffer the same driver issues that Optimus does: the switch is not as seamless as it looks, and the drivers are plagued with bugs.

Not only that, it's clear that Optimus was designed to provide high performance only when needed, thereby improving battery life. But is there any laptop out there that uses Optimus and also has a mid-to-upper-range GPU? I'm not even going into high-end territory.

I like the idea too, but the way Optimus is now, I don't consider it a good thing, unless you're comparing it to the AMD side of things.
JarredWalton - Wednesday, January 5, 2011 - link
The big benefit of Optimus is that driver updates are available. If you get something with switchable graphics, you only get new drivers when the laptop manufacturer puts together a package that includes both your GPU and IGP drivers. In practice, that usually means you're stuck with whatever the laptop initially shipped with.

GPUs like the 335M (which I'd call midrange for the 300M series) have done Optimus before. I don't think anyone did higher than a 435M with Optimus up until now, but with Sandy Bridge you can now get quad-core as well as high-end Optimus. That's what I want to see, but we'll have to wait for someone to actually make it.
Optimus does have occasional compatibility glitches, but being stuck with months-old drivers while trying to run a new game can be even worse. So the combination of driver updates and better battery life is a win for me.
LtGoonRush - Wednesday, January 5, 2011 - link
While I am excited about the possibility of a GTX 560M using an uncut GF106 die, the fact that the GF108 only has 4 ROPs basically makes it worthless for gaming. A 96-shader card could have made a decent low-end gaming option, but the ROP count limits performance in ways that are simply insurmountable. It's true that we're probably looking at laptops with sub-1080p displays, where the ROP count matters less, but I still can't see the card being competitive enough to justify the cost. On the other hand, nVidia did make the right choice with the GTX 485M; that's the card the original GTX 480M should have been (much like the GTX 580 vs. the GTX 480).

rjc - Wednesday, January 5, 2011 - link

On page 2 you have listed the GT 520M as a cut-down GF108. The part is up on the nvidia site and it really does not look like GF108; it looks more like a new chip, the GF119. See here:
http://www.nvidia.com/object/product-geforce-gt-52...
Judging from the pictures, the chip is physically much smaller and a different shape than the GF108.
JarredWalton - Wednesday, January 5, 2011 - link
It looks like the 520M has four fewer pins, and the top of the chips is quite different. NVIDIA didn't disclose any chip names to me when I asked, unfortunately. I have intermittent Internet access, so I can't really do much other than respond to posts right now, but I'll try to look into it later.

rjc - Thursday, January 6, 2011 - link
Thanks Jarred for updating the article.

One small thing though: I think the pinouts on the GF108 and GF119 are identical. For some reason the website shows the GF119 pads rotated by 90 degrees; notice the triangle is in a different position than in the equivalent shot of the GF108. The pinouts are very similar to (and the same size as) those of the older GF215 and GF216 chips, with 4 additional pins on the 2 new parts.

With regard to GF119 performance, be a bit careful estimating it. As the codename indicates, it is rumored to have some notable advances over the other Fermi chips. For instance, the step up from 420M to 520M performance apparently only consumes 2W more, according to nvidia's figures.
Dug - Wednesday, January 5, 2011 - link
Thanks for taking the time to go through the new chips and explain the differences. Even so, nVidia has made it confusing again. I really wish they would just stick with increasing the model number with increased performance.

bennyg - Wednesday, January 5, 2011 - link

At this rate Fermi will be in the 600s, maybe the 700Ms, and their next chip design will take them well beyond the 900s. G80 and derivatives were the 8000s, 9000s, 100Ms, 200Ms, and 300Ms.

Nvidia's whole point of new model numbers is to HIDE that there is no (or minimal) increase in performance.

I won't even begin to consider shedding a fraction of a tear that all the "1GB Dedicated Graphics" cards will have no reason to exist. They are only ever targeted and marketed at the uninformed, and it's clear a company has reached an ethical low when it deliberately seeks to confuse the marketplace in order to sell its products.
MrSpadge - Thursday, January 6, 2011 - link
I find it really funny how the same chip with lower clocks suddenly gets a higher sub-number once it "improves" by a generation :p

MrS
RyanVM - Wednesday, January 5, 2011 - link
Firefox 4 doesn't support hardware acceleration with Optimus configurations. It was causing lots of instability.

https://bugzilla.mozilla.org/show_bug.cgi?id=59732...
Can't speak for Chrome or IE9.
Ed051042 - Wednesday, January 5, 2011 - link
Amen to the closing statement on laptop LCD quality. 1366x768 is NOT an acceptable resolution on a 14/15/17" notebook. My 10" netbook has that resolution! The current inability to deliver quality displays is troubling. Dell has "supply issues" with their 1080p panel for the XPS 15, while HP pulled their 1600x panel for the Envy 14. Only the Sony Vaio Z is delivering the goods, with 1080p on a 13" panel.

Hrel - Wednesday, January 5, 2011 - link
"Clock speeds are also up, in this case it's a 14% increase for the 550M vs. 435M, 20% for 540M vs. 425M, and 20% for the 525M over the 520M—not too shabby" I think you meant "525M over the 420M" (not 520M).

The GT 425M runs at 560 core / 1120 shader / 800 RAM clocks. That's only 7%, 7%, and 12.5% increases. I'm glad for the extra memory bandwidth, and I'd love to see if giving this GPU GDDR5 would let it handle 1080p, at least on the lowest settings, in games like Crysis or Stalker. I just hope the "new" GT 525M costs less than the GT 425M, because realistically it won't be much faster, if at all, so the price should drop. I just really want a 15.6" laptop with a decent-quality (Compal/Clevo) 1080p screen: i5-2520M, GT 525M, 4GB DDR3-1333 for a thousand bucks or less. I REALLY don't care if it includes a Blu-ray drive, or even a DVD drive. And I REALLY REALLY want it to come with Seagate's 500GB Momentus XT hybrid drive.
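The percentage math in the comment above is easy to check with a short script. Note this is just a sanity check: the GT 525M clocks used here (600/1200/900) are the values implied by the quoted increases over the GT 425M, not specifications taken from the article itself.

```python
# Sanity check of the quoted clock increases, GT 425M -> GT 525M.
# GT 425M clocks are from the comment; GT 525M clocks (600/1200/900)
# are the values implied by the stated 7%/7%/12.5% increases.
gt425m = {"core": 560, "shader": 1120, "ram": 800}
gt525m = {"core": 600, "shader": 1200, "ram": 900}

for domain in gt425m:
    pct = (gt525m[domain] / gt425m[domain] - 1) * 100
    print(f"{domain}: +{pct:.1f}%")
# core: +7.1%, shader: +7.1%, ram: +12.5%
```

Which matches the commenter's figures, confirming the quoted article percentages only hold against the 420M's lower base clocks.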
JarredWalton - Thursday, January 6, 2011 - link
Finally got enough internet speed to update a few bits in the text. The 525M replaces the 420M, which had 500/1000 clocks. The 535M replaces the 425M, which has the 560/1120 clocks you mention. But you're right if you compare the 525M to the 425M. The overlapping names are more than a little confusing!

I also updated the information regarding the 520M/410M, which use the GF119 core.
EliteRetard - Wednesday, January 5, 2011 - link
If we can currently buy a 1GB GTX 460 for $150, is there any chance we'll get the GTX 485M at a reasonable price? Sure, it's fully unlocked (why don't we have an unlocked desktop GTX 460?), so make it $200. I want to see laptops with that chip for under $1,000.

If Nvidia wants $500 for a card like that, what's stopping companies like Asus from just making their own mobile variants? Couldn't they just take the mobile PCI card blank, drop in an actual GTX 460 chip, and downclock it? I've been saying all this time: if they can do GTX 480 SLI in a laptop with an i7-980X, why the heck can't they just use a GTX 460 and pair it with a much cheaper, more reasonable mobile CPU to make a low-cost but effective gaming laptop?

That's what I want: a midrange Sandy Bridge CPU and a switching GTX 460-level GPU. Call it an all-in-one PC if you have to and stick it behind a decent 17" LCD. Keep it at 6 lbs or less. I'd buy that for around a grand. Oh, and let us use those larger mobile HDDs... I do need 1TB of storage in a laptop and don't want to use two drive bays to do it. Keep the second bay for an SSD.
5150Joker - Thursday, January 6, 2011 - link
Jarred, has the TDP of the 485M gone down vs. the 480M, or is it still 100W+?

JarredWalton - Thursday, January 6, 2011 - link

NVIDIA didn't disclose a specific TDP, but they said they're competing to get into the same designs as AMD's high-end parts, so it appears they're looking at around 75W to 80W.

nitrousoxide - Thursday, January 6, 2011 - link

nVidia and AMD are using different TDP definitions. The TDP of a Mobility Radeon covers only the power consumed by the GPU itself, while the TDP of a GeForce M is the total MXM module power consumption.

JarredWalton - Thursday, January 6, 2011 - link

Regardless, the point is that the notebook manufacturers are the ones who have to handle the cooling, so the premise is that NVIDIA and AMD mobile GPUs are targeting roughly the same power/thermal requirements.

marraco - Thursday, January 6, 2011 - link
Finally, competition will force nVidia to bring the goods.

But what nVidia needs is HydraLogix technology. That way discrete graphics would add to integrated graphics instead of competing with it. The speedup would be 3X instead of 2X, and a good integrated GPU would help nVidia sell its hardware instead of hindering it.
Ed051042 - Thursday, January 6, 2011 - link
As of 2:19PM EST the Dell XPS 15 has the 1080p panel available through online ordering!

nitrousoxide - Thursday, January 6, 2011 - link
The 384 shaders are expected, but the 575MHz frequency is just unbelievable! Keep in mind that the GTX 480M's 352 shaders are clocked at only 425MHz (which even lets it be overtaken by the higher-clocked, lower-shader-count GTX 470M), so it's easy to figure that the 485M theoretically offers around 1.5x the performance of the 480M. Keeping the same TDP with a 50% performance boost is a really great job by nVidia.

There is one error in the article: the RAM frequency of the 485M is 750MHz (3GHz effective). Given that it uses 256-bit GDDR5, the memory bandwidth should be 96GB/s instead of 76.8GB/s (3000 * 256 / 8 = 96).
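Both figures in the comment above can be verified with a quick back-of-the-envelope script (a sketch using only the shader counts and clocks quoted in the comment; theoretical throughput is taken as shader count times clock):

```python
# Check 1: theoretical shader throughput ratio, 485M vs 480M.
gtx480m = 352 * 425   # shaders x shader clock (MHz)
gtx485m = 384 * 575
print(f"485M / 480M throughput ratio: {gtx485m / gtx480m:.2f}x")  # ~1.48x

# Check 2: memory bandwidth of the 485M.
# GDDR5 transfers 4 bits per pin per clock, so 750MHz -> 3000MT/s effective.
effective_mts = 750 * 4                                    # mega-transfers/s
bus_width_bits = 256
bandwidth_gbs = effective_mts * bus_width_bits / 8 / 1000  # GB/s
print(f"485M memory bandwidth: {bandwidth_gbs:.1f} GB/s")  # 96.0 GB/s
```

The throughput ratio comes out to roughly 1.48x, supporting the "around 1.5x" estimate, and the bandwidth works out to 96GB/s as the commenter says.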
JarredWalton - Thursday, January 6, 2011 - link
Whoops... I think I copied/pasted the old data and missed updating that one cell.

halcyon - Thursday, January 6, 2011 - link

This is what matters in the mobile space. What are these parts capable of? Are they still 35W+ parts?