Dfere - Monday, December 17, 2007 - link
For re-routing video back through the PCI Express bus to a shared video output?

ThoroSOE - Sunday, December 16, 2007 - link
Actually, I'm writing this comment from a notebook that already runs in a kind of hybrid mode (Uniwill 259 EN3). It has its IGP and a fully fledged Go 6600 256MB built in. Granted, this one works a little differently than the Hybrid CrossFire mentioned in the article. But even though I can only switch between the IGP and the 6600 when my laptop is turned off (there is a little switch at the front), the increased battery lifetime with only the IGP running is fantastic! Since I'm not gaming that often, I run the IGP most of the time, which translates to lower overall temperatures, so I sometimes even forget that this thing has a cooling fan - the machine simply doesn't have to spin it up that often. Sometimes it's not running for hours when it would run quite regularly with the 6600 turned on.

With this very concept working that well, I've been wondering for quite some time why this hasn't become mainstream already. It doesn't seem that hard to come up with a solution that features a way to turn off the dedicated GPU in desktop mode and run on the IGP, does it? Especially when the IGP and dedicated GPU are from the same manufacturer and running the same driver set.

My laptop has to be turned off to switch GPUs, because the IGP is Intel and the GPU nVidia, and thus they run two fairly different drivers which cannot run simultaneously. But the benefits of this basic concept should be enough to convince the big names in the game to produce interoperable drivers...

And if not... make some more notebooks with the little switch mine has. Usually I know whether I intend to do a little gaming when I turn my computer on. Gaming time - switch on the full GPU. Non-gaming time - switch to the IGP before booting up. Until the more seamless hybrid mode from the article works, I can perfectly live with that solution :-)
razor2025 - Friday, December 14, 2007 - link
This technology looks pretty good on paper. I think many of us miss the point of it. Most retail computers ship with an IGP; even $800 Dell or HP PCs are often equipped with IGP-only graphics. With this hybrid Xfire, your average Joe can buy a $70-80 GPU and improve the frame rate on his $800 retail computer.

JarredWalton - Friday, December 14, 2007 - link
Can't they just as easily buy a $110 GPU and improve their performance by a much greater amount? CrossFire scaling usually gives 80% better performance in a best-case scenario, and in many cases it will be less than 50%. Two HD 2400 Pro cards very likely will not match the performance of a single HD 2600 Pro, unless AMD somehow gets much better scaling out of this than they have in the past.
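To put rough numbers on that scaling argument, here is a minimal back-of-envelope sketch in Python. The single-card scores are invented relative units purely for illustration; only the 50-80% scaling range comes from the comment above.

```python
# Back-of-envelope CrossFire scaling check.
# NOTE: the single-card scores are made-up relative units for
# illustration; only the 50-80% scaling range is from the comment.
HD_2400_PRO = 100   # hypothetical single HD 2400 Pro score
HD_2600_PRO = 250   # hypothetical single HD 2600 Pro score

for scaling in (0.50, 0.80):          # typical vs. best-case gain
    paired = HD_2400_PRO * (1 + scaling)
    verdict = "beats" if paired > HD_2600_PRO else "trails"
    print(f"2x HD 2400 Pro @ {scaling:.0%} scaling = {paired:.0f}, "
          f"{verdict} a single HD 2600 Pro ({HD_2600_PRO})")
```

Even at the best-case 80% scaling, the pair lands well short of the single faster card under these assumptions.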
Schugy - Thursday, December 13, 2007 - link

Energy is too expensive to waste. Having both integrated and discrete graphics is great for laptops, and for HTPCs too.

I would like to build something like a 2.6GHz Phenom with a not-too-complex, robust SiS chipset and an optionally running ATI HD 3870 in a beautiful aluminium DTX HTPC case. Then I would have my power-saving file server, MythTV backend and frontend, and a nice gaming machine.
Virusx86 - Friday, December 14, 2007 - link
Not to mention that an IGP+SlowGPU would be a hell of a lot quieter than the equivalent... NotSoSlowGPU...

proflogic - Thursday, December 13, 2007 - link
This sounds like an excellent idea for laptops, once you can turn off the discrete card. Put both an integrated and a discrete card in the laptop so you can play your games, but turn off the discrete card to save battery life for less demanding applications.

Virusx86 - Thursday, December 13, 2007 - link
It seems like this might be beneficial for laptops once they implement the aggressive power-saving features... Two gimpy GPUs for 3D gaming, but extended battery life for desktop work?

Other than that, I'm not sure how many people with integrated graphics actually buy new video cards these days. You pretty much either game or you don't, it seems... and a lower-midrange card doesn't do a lot for gaming these days.

Still, it's a good thought. Over time I think it could develop into something useful. If they got it to tag along with slightly faster cards (either make the IGP faster or raise the maximum number of supported cards)... if you could get a 15-20% performance increase (basically for free) on a midrange card, that would actually be somewhat helpful, and would be a good selling point for the AMD line of products.
Justin Case - Thursday, December 13, 2007 - link
"Making Two Slow GPUs, Not So Slow" would mean "making two slow GPUs (is not) so slow". You probably meant "Making two slow GPUs not so slow" (i.e., how to make two slow GPUs run faster).It's like the difference between "making a car, faster" and "making a car faster".
Justin Case - Saturday, December 15, 2007 - link
I see this has been changed. GJ.

superflex - Thursday, December 13, 2007 - link
Moar AMD bashing from the Intel fanbois at Anandtech. Thanks for the responsible reporting, Anand.

tayhimself - Thursday, December 13, 2007 - link
AMD is a PowerPoint company??

I bet you all remember the shitty PowerPoint slides that AMD kept releasing before the Phenom (ooh, is that like PHENOMENALLY good?) turned out to be a turd? Same with the ATI 2900XTXXXTXTXTX cards when the R600 was coming out.

Also notice how the curve is flattening out from exponential, and that the white arrow in the background goes up higher than the foreground graph.
DO NOT WANT!!
Griswold - Sunday, December 16, 2007 - link
Dumbass.

Justin Case - Thursday, December 13, 2007 - link
You do realise that's an Intel slide, right? It's always hilarious when the fanbois fail to see the logo.

Olaf van der Spek - Thursday, December 13, 2007 - link
Isn't that at IDF?

Looks more like Intel to me.
eetnoyer - Thursday, December 13, 2007 - link
MOAR???

Didn't happen to notice that that was an Intel slide from IDF?
ninjit - Thursday, December 13, 2007 - link
I think that's an Intel slide, not an AMD one.

drebo - Thursday, December 13, 2007 - link
Uhm, that's a shot from the Intel Developer Forum...

Egg, meet face.
Maroon - Thursday, December 13, 2007 - link
I doubt AMD or Nvidia are worried about any Intel onboard graphics solution or Intel's supposed commitment to improve. AMD and Nvidia will compete with each other while Intel drags along way behind.

shing - Thursday, December 13, 2007 - link
11 years after 3dfx released the Voodoo 1, we're going back to a two-card concept... granted, one of the cards is integrated on the motherboard, but nevertheless.

murphyslabrat - Thursday, December 13, 2007 - link
If the video output model utilized DisplayPort's chainable device features, you could use a single bus to feed all of your monitors. This would provide easy switching for a scenario like the aggressive power-saving features listed in the article.

Furthermore, this would allow different devices to power different workspaces (in Linux), effectively allowing you to utilize two different video cards with one monitor.

I guess it just comes down to having an integrated video switch, which is already done on laptops... or at least, the discrete accelerator is routed through the already-present port.
JAKra - Thursday, December 13, 2007 - link
"The RS780 will ship with an integrated RV610 graphics core, the heart of the ATI Radeon HD 2400 Pro" - so this is a DX 10 compliant card, right?"Unfortunately, the first incarnation of Hybrid Crossfire with the RS780 will only really work with the upcoming Radeon HD 3400 series GPUs. If you stick a Radeon HD 3400 card (the 3450 and 3470 will be announced early next year)..." - so the 3450 and 3470 are DX 10.1 compliant cards. Or at least they should be, that is why they belong to the 3XXX family, mainly.
And now, my question...
What DX version will it support in Hybrid mode?
In my opinion it will be just DX 10, but that way you lose a big plus of the 3XXX family. Anyway, I am eager to see how it really works.
Cheers!
JAKra - Thursday, December 13, 2007 - link
It's me again :P

I know the answer! M$ will push out DirectX 10.05 :D
Frumious1 - Thursday, December 13, 2007 - link
This whole release is even less impressive when you consider what performance is being offered. AMD is claiming four times the 3DMark06 performance of the 690G. Great! Except, does anyone realize that with the latest drivers, the Intel GMA X3100 is actually faster than the 690G in many games, and particularly in 3DMark06? The reason is that the 690G lacks SM3.0 support, and that's a large part of the 3DMark06 score.

Anandtech's numbers (http://www.anandtech.com/mobile/showdoc.aspx?i=311...) back this up, with even an older driver version of the GMA X3100 scoring twice as high as the 690G. What AMD is announcing, then, is that they will have DX10 support in an IGP and twice the performance of the fastest current Intel IGP. It remains to be seen what the GMA X3500 can do, as that's supposed to support DX10 among other things. If drivers are better this time around, Intel might be within striking distance of AMD's IGP. Double the performance of the X3100 would come close to the HD 2300, and I'm guessing the IGP version of the HD 2400 Pro will be pretty similar to the mobile HD 2300.
I'd much rather see an IGP that's close to the HD 2600. Oh, and get the hybrid power stuff to work! That's FAR more important than slow, flaky CrossFire! The power side, incidentally, is what nVidia appears to be working on: they've at least mentioned work towards powering down discrete GPUs when in non-3D apps. Maybe they can get it right with the first release, as it appears AMD won't.
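The relative-performance reasoning above reduces to simple ratios. Here is a minimal sketch assuming a normalized 690G baseline; the 4x and 2x multipliers are the figures cited in the comment, while the baseline unit itself is arbitrary.

```python
# Normalize the cited IGP 3DMark06 claims to a 690G baseline of 1.0.
# The 4x and 2x multipliers come from the comment above; the
# baseline unit is arbitrary, so only the ratios are meaningful.
score_690g = 1.0
score_rs780 = 4.0 * score_690g   # AMD's claimed 3DMark06 uplift
score_x3100 = 2.0 * score_690g   # GMA X3100 roughly doubles the 690G

print(f"RS780 vs. GMA X3100: {score_rs780 / score_x3100:.1f}x")
# -> 2.0x: the claim amounts to twice the fastest current Intel IGP.
```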
SilthDraeth - Thursday, December 13, 2007 - link
I believe AMD/ATI chipsets started high-definition onboard audio, though I may be mistaken. And their 690G, with the built-in ATI Xpress 1200, featured HD decoding, didn't it?

I am pretty sure that in the long run this will be a good thing for systems. Instead of your onboard video becoming completely useless when you put in a dedicated graphics card, the dedicated card will simply utilize the onboard one for a bit more processing power.
OrSin - Thursday, December 13, 2007 - link
Most motherboards I buy have integrated graphics (IG). The reason being, when I upgrade I sometimes keep my graphics card, and this way I can still pass on the motherboard and CPU. Even with them always having integrated graphics, I don't see this as being a great thing. Even a 100% increase (and we will not get that) will not make integrated graphics good enough for the games people want. In my experience a mid-range card is 4-5 times faster than integrated graphics.

Personally I don't play FPS games, and those are the most graphics-intensive. The RTSes I play don't push mid-range cards too hard, but even a 100% increase over IG would not do it. Not sure what this small increase would do for anyone that wanted to game.
murphyslabrat - Thursday, December 13, 2007 - link
However, it will offer the added value of paying $50 for the equivalent of a $90 graphics accelerator. While you certainly don't get an enthusiast-class graphics experience, you go from PoS to lower-mid-range.