27 Comments
v12v12 - Monday, October 6, 2008 - link
Bah, I couldn't care less about a card that gets 5-10 fps more, blah blah... my main issue with both of these GPU manufacturers is the constant barrage of meager technological updates, which end up costing a ton while, on paper and in reality, you barely get what you paid for... WTF would anyone spend another $150-300 to upgrade from their current card when the newest model gets maybe 10-20 fps more on average? WHO spends that kind of money for 10-20 fps? Fanboys and/or their parents, that's who. The GPU industry long ago figured out that if they just continually meter out cards with near-imperceptible FPS improvements (aside from review-site nit-picking on paper, which sends the zealous and OCD types into a posting fury about who's "the best?!") and let the review sites do the advertising for them with those neat little colored charts stretching the lines of "superiority," the fanboys will clear the shelves of meager-ware products.

That's the whole problem with newer cards and people who incessantly upgrade every year, or sometimes even sooner. A lot of you are flag-waving and whining about GDDR3/4/5. Look at the grand scheme of things: nit-picking about a few FPS and a few dollars here and there, when you should be complaining about the lack of serious upgrades needed to play the newest games. Some titles practically REQUIRE the user to spend upwards of $300 just to play the damn game?! Shoddy coding + technological metering are the winning combo against people who cannot refrain from impulse shopping, b/c for some illogical reason they just HAVE to have this new card and/or HAAAAVE to own at Crysis... at a cost to all of us, b/c the more you guys buy up this metered-out garbage, the more the industry adapts its price/metering model to you. If people stopped buying this crap so often, the industry would take notice and would HAVE to pander to YOU instead of vice versa: "Umm, PLEASE release us another new and meager upgrade so I can claim 13 fps ownage over those ATI idiots! I'll pay ya another $200, I swear!" ... Exercise some shopping restraint. There are PLENTY of gaming titles out there that aren't brand spanking new that you can play at near-max settings with whatever modern card you have. But they are overlooked b/c the review sites have to keep up with the times, and apparently so do most "gamers," instead of looking at less recent titles.
Ever since the 9800 Pro era faded and the custom card-to-game 6800 series was introduced... games have long been ahead of cards, and people have been pandering to the manufacturers instead of the other way around. What a shame... a waste of time and money. A testament to that: SLI returning. SLI(?), that old sh!t "technology" that has been dated since I first remember it surfacing with 8-12MB Voodoo cards for Quake 1... Double the price, double the heat, double the noise, double the power requirements, double the dust... Geesh, do the sheep own the market, or does the market own the sheep... negatives away!
MarioJP - Tuesday, October 7, 2008 - link
you do realize that monitors are getting larger, right? The larger the monitor, the more powerful your card needs to be. It's not about achieving faster frames here; it's about not having the frame rate drop when the battle gets intense on a large monitor or hi-def TV with the exact same settings.

v12v12 - Wednesday, October 8, 2008 - link
Come on... Yes, your issue is legit, BUT do you really think most of these people, when they go out to get a card, are concerned about "monitor size?" No, they are still yammering on about FPS: ATI vs. Nvidia. Not once has anyone said to me on IRC or any forum: "Oh man, I NEED SLI / a 4870-X2 b/c I'm getting a 24" monitor." They are getting those cards in hopes of maxing out at 1600x. No way in hell is anyone anywhere close to maxing anything with a $300 card on a 24" monitor. Maybe on a 19-21", sure, but larger, forget it... Especially with the newest games. I hate to say it, but most shoppers are sheep and make decisions accordingly. Keep the manufacturers' pockets fat and your wallets lean... Ciao!

qwertymac93 - Friday, October 24, 2008 - link
so the GPU market is milking us dry? What about CPUs?! The RV770 core has a staggering 1,000,000,000 transistors; how about the E8600 CPU, which costs about the same as an ENTIRE card? A meager 410,000,000! Sure, its clock speed is way higher, but come on, it's also 45nm, AND its price doesn't include RAM, PCB, OR fan. The E8600 uses 65W of power; the 4870 uses twice that. Think about the fact that GPUs are WAAAAY more powerful than CPUs, and I say boycott Intel. BTW, I am currently using an Nvidia FX 5200; I cannot even play Halo maxed out. Best Buy still sells the 5200, btw; boycott them too. More than doubling the amount of compute power whilst reducing the number of transistors required for such performance is NOT a minor upgrade; please open your eyes.
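For what it's worth, the ratios being gestured at here are easy to check. A quick Python sketch using only the figures quoted in the comment above (the ~130W for the 4870 is the poster's "twice that" claim, not a measured TDP; real counts and power draws differ somewhat):

```python
# Figures as quoted in the comment above (not independently verified here).
rv770_transistors = 1_000_000_000   # "staggering 1,000,000,000"
e8600_transistors = 410_000_000     # "a meager 410,000,000"
e8600_watts = 65
hd4870_watts = 2 * e8600_watts      # the poster's "uses twice that"

print(f"Transistor ratio (GPU/CPU): {rv770_transistors / e8600_transistors:.2f}x")
print(f"Transistors per watt, RV770: {rv770_transistors / hd4870_watts:,.0f}")
print(f"Transistors per watt, E8600: {e8600_transistors / e8600_watts:,.0f}")
```

By these rough numbers the GPU carries about 2.4x the transistors and still comes out slightly ahead on transistors per watt, which is the core of the poster's point.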
cokelight - Thursday, October 9, 2008 - link

Actually, resolution plays a huge role in how people buy cards. Notice how most benchmarks have different data for different resolutions. It's not simply an FPS game regardless of context: the FPS numbers are in the context of the resolution, AA, and other settings.

In fact, it's not uncommon for people to ask if the GPU they want to buy is "overkill" for their displays.
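To put that in numbers: pixel count grows quickly with resolution, which is roughly why the same card posts very different FPS at different resolutions. A minimal illustrative Python sketch (the resolutions are common panel sizes of that era, not benchmark data):

```python
# Rough illustration: GPU per-frame workload scales with pixel count.
# Relative load is pixels per frame, normalized to 1280x1024.
resolutions = {
    "1280x1024": (1280, 1024),
    "1680x1050": (1680, 1050),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

base = 1280 * 1024
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels, {pixels / base:.2f}x the work of 1280x1024")
```

2560x1600 pushes roughly 3x the pixels of 1280x1024, so a card that is "overkill" on a 19" panel can be merely adequate on a 30" one; other settings (AA in particular) multiply the cost further.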
TheMakron - Saturday, October 4, 2008 - link
I've never considered buying an ATI/AMD product because of the poor drivers (something that perhaps should have been mentioned in the OP). The specs and benchmarks can say whatever they want, but unless they have magically started writing drivers at the same level as Nvidia, I'm not interested.

Goty - Sunday, October 5, 2008 - link
ATI has a bad rep for producing bad drivers that has existed since the Radeon 8500 and has been entirely undeserved since the beginning of the Catalyst era. I hate to do it, but let me just dredge up that "40% of early Vista crashes were attributed to NVIDIA hardware." NOW let's talk about horrible drivers.

TheMakron - Monday, October 6, 2008 - link
Well, if what you say is true, then I will give their cards a shot next time I buy one. Thanks.
TheCountess - Sunday, October 5, 2008 - link
Honestly, Nvidia's drivers are slipping somewhat, especially with new cards and some older games.

And I've been using ATI's since the 9700 Pro (and just got my 4th ATI card, an HD4870), and I've never had any major issues with any ATI drivers.

Basically, your argument went out the window when ATI released the 9700 Pro and its accompanying drivers. There is no difference in overall driver quality.

And if you'd been talking about Linux drivers, I'd have agreed with you... until the latest drivers and ATI's commitment to making completely open-source drivers possible.
AnnonymousCoward - Monday, October 6, 2008 - link
Let's talk about ATI scaling options: does it support everything Nvidia supports? If not, which ones? (For reference, a rough sketch of what each mode does geometrically follows the list.)

-Stretch to fit the screen
-Pillar box (maximize with fixed aspect ratio)
-Let monitor scale
-1:1
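These four modes are standard display-scaling behaviors. Below is a minimal, illustrative Python sketch of the geometry each mode implies (my own sketch, not ATI's or Nvidia's actual driver logic; the mode names are made up for the example):

```python
def scale_rect(src_w, src_h, panel_w, panel_h, mode):
    """Return (width, height) of the displayed image for each scaling mode."""
    if mode == "stretch":           # fill the panel, ignoring aspect ratio
        return panel_w, panel_h
    if mode == "aspect":            # pillar/letter box: maximize at fixed aspect ratio
        scale = min(panel_w / src_w, panel_h / src_h)
        return round(src_w * scale), round(src_h * scale)
    if mode == "1:1":               # centered, unscaled; black borders all around
        return src_w, src_h
    if mode == "monitor":           # GPU outputs the source mode; the monitor decides
        return src_w, src_h
    raise ValueError(f"unknown mode: {mode}")

# Example: a 4:3 game resolution on a 16:10 panel.
for mode in ("stretch", "aspect", "1:1", "monitor"):
    print(mode, scale_rect(1280, 960, 1920, 1200, mode))
```

The pillar-box case is the interesting one: the driver picks the largest scale factor that fits both dimensions and centers the result, which is why a 4:3 source on a 16:10 panel gets bars on the sides.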
TheCountess - Monday, October 6, 2008 - link
Check, check, check, and check. Tried and tested. They've got it all.

AnnonymousCoward - Tuesday, October 7, 2008 - link

Thanks for the info; I was wondering.

Jorgisven - Friday, October 3, 2008 - link
What's this about exiting developments? Who is leaving? Or are we having exciting developments? Err, sorry. I don't know a polite way to point out spelling issues.

NullSubroutine - Thursday, October 2, 2008 - link
You spoke of the rumor that AMD was going to spin off its manufacturing business and become 'fabless'. This is a rumor that has been circulating for quite some time, and the fact is, it is false.

AMD cannot go 'fabless' because their x86 license requires them to hold at least a 51% share of any such business, and that is where the rumor comes from.

Because of some agreement, AMD supposedly cannot produce other people's silicon in their fabs, and currently they want to sell off some fab space in order to bring in additional revenue. The idea is to spin off a new company, which AMD would own 51% of, that could then produce chips for outside customers.

Even at best, AMD could do this with one fab, but certainly not all of them, and the whole rumor that AMD is selling off its fabs is just not a reality. At best, it is a rumor based on the situation above, which really only affects the company's accounting pages. AMD will continue to make processors and chipsets in-house, with the exception of some ATI or IGP work that may be produced by TSMC.
X1REME - Wednesday, October 1, 2008 - link
"We do have to give credit where it is due to NVIDIA though: after seeing the RV770 hit, NVIDIA very quickly and adequately reduced the pricing on their GT200 hardware"look no one gives credit to nvidia apart from you for dropping their prices as it was now obvious they had to to compete with ATi`s GPU Cards for performance and price (which they still cant manage). Nvidia prices have always been high as they possibly can be until ATi comes to the scene with new cards. The nvidia motto is to milk them while you can (or until ATi is down) for all they are worth with the same card from last year (they will prolly change the name again next year as it is been suggested now for these current cards to the names that correspond to the new cards they bring out in 2009).
Without AMD (The Under Dawg) we are truly fcuk*d from nVidia and Intel.
mathew7 - Thursday, October 2, 2008 - link
Isn't that the description of competition? Don't you think ATI/AMD would have done the same thing if the roles were reversed?

Every company wants to profit as much as it can. But when you enter with a new product, you have to make sure that somebody buys it, so you offer it at a better price for the performance. After a short time, the competition is forced to lower the prices of existing hardware so that people still buy it. They may also respond with another product offering even better price/performance. This is the case for 90% of businesses. It's just that the GPU war is much more intense than the others (CPU, memory, HDD, etc.). Or maybe GPUs are just watched more closely. Actually, I think it's because they are independent of the rest of the system (the only link to the rest is PCI, AGP, or PCIe, each of which took a very long time to be replaced).

PS: everything in the world is gray. Good guy/bad guy is just a matter of perception (OK, OK, some extremes are obvious).
Spivonious - Wednesday, October 1, 2008 - link
nVidia needs help from the government so their stock prices can remain artificially high.

perzy - Wednesday, October 1, 2008 - link

Right now the pendulum is swinging. Not between petty things like whose card is better atm... I'm talking about the big pendulum that decides whether the current generation of young people games on PC or console. They will probably stay on that course forever.

From what I hear, the consoles have pretty much won already.
The greedy manufacturers pretty much killed their own market!
Now they can stick to making a chip for some console, once every 5 or 10 years... good luck :-)
Spoelie - Wednesday, October 1, 2008 - link
Think about that for a second. If this were true, we would see the same progress between console updates as between two generational GPU updates.

To allow the progress and huge leaps in functionality and graphics that consoles have enjoyed between each generation, continual development of GPUs (and sales of those GPUs, to fund more development) is needed.
Without a PC gaming market, the console gaming market would be a lot more boring.
jeffrey - Tuesday, September 30, 2008 - link
Specifically in two areas:

#1 Memory Technology
-- ATI/AMD has now implemented and shipped numerous boards with GDDR4 and GDDR5 memory. NVIDIA has been stuck at GDDR3 during the time ATI/AMD has led the way with GDDR4 and GDDR5. This is a huge disadvantage for NVIDIA, considering they have to route a 512-bit memory bus to provide bandwidth that ATI/AMD can provide with a 256-bit memory bus.

#2 Process Technology
-- ATI/AMD now has two generations of products at 55nm, during which time NVIDIA has been stuck at 65nm. This is a huge disadvantage for NVIDIA considering the enormous transistor count of their chips.
anonymous x - Saturday, October 4, 2008 - link
About your #2:

My GTX+ is 55nm...

Anyway, it's faster than its main competitor, the 4850, so I don't really care if it's an old architecture or not.
Goty - Sunday, October 5, 2008 - link
It would be a good idea to qualify your statement by saying "fast in certain scenarios."

hemmy - Wednesday, October 1, 2008 - link
Because they were complacent in their dominance after the release of G80 at the end of 2006.

And saying that using GDDR3 is a 'huge' disadvantage is a retarded statement.

You could go back a few years and ask why ATI was behind in multi-GPU technology, lacked PS3.0 support, etc.

It is just a cycle, with each company having its ups and downs.
jeffrey - Wednesday, October 1, 2008 - link
"And saying using GDDR3 is a 'huge' disadvantage is a retarded statement."512-bit bus due to using GDDR3. Big expensive chip on relatively older process technology (ATI has used 55nm for two generations) with a huge bus to route is a huge disadvantage vs. smaller process technology, more efficient design, with a bus size that requires 1/2 the routing.
I can see passing over GDDR4 due to limited benefit, but passing over GDDR5 just amazed me for their gtx200 chips. Brand new billion transistor chip on a generation old process technology with two generation old memory?
jmurbank - Wednesday, October 1, 2008 - link
A 512-bit bus is just a bus size. It has nothing to do with which memory technologies are used; just about any memory technology can work. To put it in perspective, a 512-bit bus uses eight (8) memory chips, each with a 64-bit data bus. You could call it an eight-channel memory bus. The use of GDDR5 compared to previous GDDR generations just provides an illusion of performance to the customer. Does GDDR5 perform better than previous generations? I do not think so, since each generation introduces new latency specs. Graphics cards need the lowest-latency memory technology to provide the best performance.

nVidia did not fall behind. They just picked the wrong stuff to take their products further. AMD/ATI picked the right stuff to take their graphics cards further, and at the same time they picked the right stuff to make their high-end models cheaper than we as consumers expected.
jeffrey - Wednesday, October 1, 2008 - link
"512-bit bus is just a bus size. It has nothing to do with memory technologies that are used. Just about any memory technology can be used. Just put it in perspective a 512-bit bus uses eight (8) memory chips that has 64-bit data bus."My point is exactly what you have stated. NVIDIA needed to route 8 memory chips with a 64-bit data bus. If GDDR5 was used eight 32-bit data buses would provide approximately the same bandwith.
ATI Radeon HD4870: GDDR5, 256-bit bus, 900MHz, 115.2GB/s
NVIDIA GTX 260: GDDR3, 448-bit bus, 999MHz, 111.9GB/s
ATI is able to achieve higher bandwidth with a smaller bus width and a lower memory clock than NVIDIA.
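The arithmetic behind those numbers checks out once you account for transfers per clock: GDDR3 is double data rate (2 transfers per clock), while GDDR5 moves 4 transfers per clock. A quick Python sanity check (the transfer rates per clock are standard for these memory types; everything else comes from the figures quoted above):

```python
def bandwidth_gbps(bus_bits, clock_mhz, transfers_per_clock):
    """Peak memory bandwidth in GB/s: bus width (bytes) * effective transfer rate."""
    bytes_per_transfer = bus_bits / 8
    effective_mt_s = clock_mhz * transfers_per_clock   # mega-transfers per second
    return bytes_per_transfer * effective_mt_s / 1000  # -> GB/s

# HD4870: 256-bit GDDR5 at 900MHz, 4 transfers/clock
print(bandwidth_gbps(256, 900, 4))   # 115.2
# GTX 260: 448-bit GDDR3 at 999MHz, 2 transfers/clock
print(bandwidth_gbps(448, 999, 2))   # ~111.9
```

Because GDDR5 moves twice as many transfers per clock as GDDR3, a 256-bit GDDR5 bus matches roughly a 512-bit GDDR3 bus at similar clocks, which is exactly the routing savings jeffrey is describing.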