20 Comments
World_without_madness - Wednesday, October 9, 2013 - link
Do you know what is madness? This is madness! The R9-290X doesn't have a CrossFire connector anymore. It is designed for PCIe 3.0 if you DO CrossFire it.
The Shift doesn't even use a mobo with PCIe 3.0 support, and it went mad with a dual-GPU configuration.
The only PCIe 3.0 mobo (CMIIW) is the Sabertooth 990FX/GEN3.
JarredWalton - Wednesday, October 9, 2013 - link
If the R9-290X doesn't have a CF connector, I'll be quite surprised. As for CF, it works perfectly well on PCIe 2.0 -- it might be a bit slower than on PCIe 3.0. But then, CF 290X will also be slower on AMD platforms than on Intel, I'd wager.
Stuka87 - Wednesday, October 9, 2013 - link
AMD stated during the webcast that they will be using the PCIe bus for CF on the 290X, and their photos did not show a CF connector on it.
Penti - Wednesday, October 9, 2013 - link
It's pretty much confirmed that the 290/290X uses PCIe only and doesn't have a bridge/connector. Look at the slides available from AMD. Plus, AMD platforms are really too weak anyway. You won't really be able to drive ultra-high-res games in CF with AMD CPUs.
JarredWalton - Wednesday, October 9, 2013 - link
Ryan's the guy who follows the GPU side more, and he's read a bunch of NDA stuff that I'm not privy to, and thus I can discuss news like this. As for what the R9-290X is and how it does CF, we can't say until the NDA lifts. If it is indeed PCIe only for CF data, I suspect PCIe 3.0 will indeed be a substantial benefit. Of course, I don't think AMD or NVIDIA ever publicly stated how much data is transferred over their SLI/CF connections, so maybe there isn't really a lot of overhead. Even at 2560x1600 at 60FPS, we're talking about 500MB/s for screen content (because only half of the frames need to be rendered/transmitted). With PCIe 2.0, an x16 slot has 8GB/s available in each direction, of which about 6% would be used for transmitting frames.

Tri-CF might be a bit more of a problem, but I don't know many real gamers who use more than two GPUs -- my experience is that 3- and 4-way GPU setups are more for benchmarking or showing off money than because they're needed.
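A quick back-of-the-envelope check of those numbers, as a sketch only -- the 32-bit-per-pixel figure and the "half the frames cross the link" AFR model are assumptions, not anything AMD has published:

```python
# Rough AFR frame-transfer estimate for 2-way CrossFire at 2560x1600, 60FPS.
# Assumes uncompressed 32-bit pixels and that only the secondary GPU's half
# of the frames has to cross the link to the display GPU.
width, height, fps = 2560, 1600, 60
bytes_per_pixel = 4                  # 32-bit color (assumption)
frames_over_link = fps / 2           # secondary card renders every other frame

frame_bytes = width * height * bytes_per_pixel
link_traffic = frame_bytes * frames_over_link      # bytes per second
pcie2_x16 = 8e9                                    # ~8 GB/s per direction

print(f"Per frame:          {frame_bytes / 1e6:.1f} MB")
print(f"Over link:          {link_traffic / 1e6:.0f} MB/s")
print(f"PCIe 2.0 x16 share: {link_traffic / pcie2_x16:.1%}")
```

Under those assumptions this lands at roughly 492MB/s and 6% of a PCIe 2.0 x16 slot, in line with the figures above.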
Penti - Wednesday, October 9, 2013 - link
http://www.sweclockers.com/image/red/2013/10/09/Am... which is probably from the wccftech article, but it looks real enough as it corresponds to prior information.

With AMD I think the CPU would be the limiting factor anyway. So it doesn't matter much, and most CF systems will be 2x PCIe 3.0 x8 anyway, on Intel.
Principle - Saturday, October 12, 2013 - link
Principle - Saturday, October 12, 2013 - link
I do not know why people keep saying the AMD CPUs are the limiting factor. You have no idea if that is true. New games are likely to be highly parallel, and will most likely start using the GPU more than the CPU. Older programming actually did more of the rendering work on the CPU, because GPUs weren't always as strong. And AMD's 8-core processors also do very, very well in multi-threaded loads. They scale a lot better than Intel's: Intel's single-core performance is great and AMD's is not, yet the two end up very similar when both are using 4 to 8 threads.
Principle - Saturday, October 12, 2013 - link
Oh, and the Steamroller update early next year will also drive AMD's single-thread performance up and improve their scaling, so expect to see some AMD CPUs that are superior to Intel's in multi-threaded applications in 2014. I do not see any reason AMD couldn't keep them compatible with older motherboards as well, even though some decent new ones would be nice, as MSI seems to have issues with the FX CPUs. I would go with Asus, and hopefully they will offer a microATX mobo with x16 CrossFire capability, like my older MSI mobo.
Kevin G - Wednesday, October 9, 2013 - link
That assumes that only frame buffer information is transmitted over the CrossFire connector. I would suspect a few other things are transmitted, like sync information.

Also, effects like motion blur would require each frame to be transferred over the CrossFire connector, as the previous frame would need to be blended with the current frame buffer. It is even worse for the secondary Radeon card, as it has to receive the previous frame, blend it with the new frame buffer, and then send the final frame buffer to the primary Radeon card (the one with the displays connected) for output.
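Extending the earlier sketch to that scenario: if a temporal effect means the secondary GPU must both receive the previous frame and send its finished frame, its link traffic roughly doubles versus plain AFR. The numbers below are illustrative assumptions about the data flow, not measured figures:

```python
# Illustrative only: link traffic for the secondary GPU in 2-way AFR at
# 2560x1600, 60FPS, 32-bit pixels, if frame blending forces it to both
# receive the prior frame and send its finished frame each time.
frame_mb = 2560 * 1600 * 4 / 1e6     # ~16.4 MB per frame (assumption)
secondary_fps = 60 / 2               # frames rendered by the secondary GPU

plain_afr  = frame_mb * secondary_fps        # send finished frames only
with_blend = frame_mb * secondary_fps * 2    # receive previous + send result

print(f"Plain AFR:        {plain_afr:.0f} MB/s")
print(f"With frame blend: {with_blend:.0f} MB/s")
```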
The Von Matrices - Wednesday, October 9, 2013 - link
I know you're under NDA, so you're trying to talk down speculation so as not to break the contract. But AMD is targeting 4K, not 2560x1600; they mentioned 4K so much in their presentation that there is no question about this. If AMD was not having trouble with its current CrossFire interconnect, then why do the latest frame pacing drivers not support single-screen resolutions over 2560x1600? Obviously a different method of transmitting data is needed when you have more than 4MP of display, like a single 4K screen or Eyefinity with three 1920x1080 screens. It's clear there's an interconnect issue.

The current CrossFire interconnect bridge is equivalent to PCIe 1.0 x1 (i.e. 250MB/s). It was designed when 2560x1600 was the largest desktop resolution and Eyefinity wasn't in their product plans. AMD is thinking of the future and will probably design the new standard with 4K Eyefinity in mind for future proofing.
However, the bandwidth for driving 4K Eyefinity doesn't seem to work out even over the current PCIe bus. At 12MP with two cards you're suddenly at 3GB/s; with four-way CrossFire that increases to 4.5GB/s over the PCIe bus. That is a significant portion of the PCIe 3.0 bandwidth. I doubt any card they release this year can drive 4K Eyefinity at a reasonable framerate, but the bandwidth constraints are worth considering given that PCIe 4.0 is not going to be released for a few years.

I agree that it looks unlikely that the current CrossFire bridge configuration will remain for the 290X. The question is what will replace it when the PCIe bus doesn't have unlimited bandwidth.
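For scale, here is the same back-of-the-envelope method applied to a roughly 12MP Eyefinity target. The 32-bit-pixel, 60FPS, and plain-AFR assumptions are mine; they come out lower than the figures quoted above, which presumably assume extra per-frame traffic (for example, frames crossing the bus more than once on the way to the display GPU):

```python
# Frame-transfer estimate for a ~12 MP Eyefinity target under plain AFR.
# Assumes 32-bit pixels at 60 FPS and that only frames rendered by the
# non-display GPUs cross the PCIe bus; sync and temporal data would add more.
megapixels = 12e6
bytes_per_pixel = 4
fps = 60
frame_gb = megapixels * bytes_per_pixel / 1e9    # ~0.048 GB per frame
pcie3_x16 = 15.75                                # ~15.75 GB/s per direction

for gpus in (2, 4):
    share = (gpus - 1) / gpus          # fraction of frames rendered off the display GPU
    traffic = frame_gb * fps * share   # GB/s crossing the bus
    print(f"{gpus}-way CF: {traffic:.2f} GB/s "
          f"({traffic / pcie3_x16:.0%} of PCIe 3.0 x16)")
```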
zanon - Wednesday, October 9, 2013 - link
Might be sheer coincidence, but Apple is supposed to be holding their next event on October 22, which is interesting in light of the Oct 23 ship date here. Apparently they're staying pure AMD for their new Trashcan Pro; I wonder if some R9-290X variant could feature there along with the FirePro stuff that's been touted.
Kevin G - Wednesday, October 9, 2013 - link
The specs from the Mac Pro announcement point toward Tahiti-based FirePro cards, though Apple could easily slip in a higher-grade card as a build-to-order option.
sylar365 - Wednesday, October 9, 2013 - link
I, like most of us, am excited to soon see some real-world benchmark numbers for the 290X, but I feel that using an AMD CPU - even if it is the 9590 - in a system designed to be a "limited edition Battlefield 4" rig is just unfortunate. After playing the BF4 beta and noticing just how CPU-intensive it is in multiplayer (I needed to turn CPU-intensive settings off/down with an i5-2500K @ 4.2GHz to make it smoother), it brings a tear to my eye thinking someone would decide to mate an AMD CPU to an unreleased, high-end graphics system like that. What a damn shame.
Novaguy - Thursday, October 10, 2013 - link
AMD CPUs. Interesting. The only connection I can think of for why this might work is that "Perfect Parallel Rendering-perfectly utilize all 8 cores" CPU thing in the Frostbite/Mantle API presentation, but I'll have to see benchmarks first. If Mantle works better with 8 real AMD cores, it might squeeze something extra out of the FX 8-core architecture that would be lacking on Intel's faster 4 cores.

Even then, you'll still be stuck with weaker performance in the DirectX stuff.
willis936 - Thursday, October 10, 2013 - link
I don't think there's a hardware/software configuration on earth that would have an FX chip outperform an LGA 115x chip.
parkerm35 - Thursday, October 10, 2013 - link
As you can see, Battlefield 4, alongside most future games, is or will be optimized for 8 cores, so the FX chips will be perfectly fine for future gaming.
http://gamegpu.ru/images/remote/http--www.gamegpu....
http://gamegpu.ru/images/remote/http--www.gamegpu....
[-Stash-] - Thursday, October 10, 2013 - link
The Intel 3770K and 4770K are conspicuously absent from these results…
piroroadkill - Thursday, October 10, 2013 - link
What an absolute waste of what is no doubt an incredible card (referring, of course, to the fact that it's paired with an AMD CPU).
chizow - Thursday, October 10, 2013 - link
Interesting to see CyberPowerPC is also bundling these cards only with AMD CPUs. Sounds like AMD is forcing this upon their OEMs: if you want their latest GPU, you need to sell it with their CPU. Not a bad strategy; they probably should've done this long ago.
Nikhilanand - Saturday, October 12, 2013 - link
I think AMD's higher core counts will benefit from this generation of games (as there are 8 Jaguar cores in both the Xbox One and the PS4).