"Lastly, we were sent updated spec numbers on the Xbox's numbers, and we spoke with Microsoft's Vice President of hardware, Todd Holmdahl, about the Xbox 360's final transistor count.
Another bit of information sent our way is the final transistor count for Xbox 360's graphics subset. The GPU totals 332 million transistors, which is spit between the two separate dies that make up the part. The parent die is the "main" piece of the GPU, handling the large bulk of the graphics rendering, and is comprised of 232 million transistors. The daughter die contains the system's 10MB of embedded DRAM and its logic chip, which is capable of some additional 3D math. The daughter die totals an even 100 million transistors, bringing the total transistor count for the GPU to 232 million."
OK, I did a bit more hunting around on the transistor count in the Xbox 360's GPU, and the only thing I've found comes from Microsoft's Major Nelson.(Xbox Live Director of Programming)
According to his blog, the Xbox 360's GPU has 330 million transistors.
Just remember that video chipsets developers have multiple teams working on different generations - they have one chip in early development and one in late development/near production. This means two teams at least (to keep up with the 6 month product launches that were the norm starting from some 3 years ago)
With the memory controller functions ALSO on the 360 GPU, 150 million is pretty darn out of reach, im thinking. The 360 GPU almost sounds like an integrated single chip north bridge. It will be interesting to crack an Xbox360 open and get a peak at the insides when it ships. Should be interesting.
I couldnt help but wonder if perhaps this is a tile based arch... any info on that? If so, it would answer a few of my questions about the system.
Jarred is correct, the IGN figures have to be wrong. Remember that R420, a 16 pipe design, was already around 160M transistors. The Xbox 360 GPU has 48 pipes, although they are unified shader pipes. Add in the additional logic to deal with handling both vertex and pixel instruction streams and you are already dealing with a GPU that is larger than the R420.
Not to mention the 10MB of embedded DRAM, which will not be tiny.
Jarred, I thought ATI made the XBox 2 GPU specifically for the console, and wasn't incorporating any of its features into the R520? I'm not sure I agree that spending most of your R&D on a "dead-end" GPU is the best tactic; nVidia's approach of optimizing an existing desktop GPU architecture seems to be the more efficient way to spend R&D capital.
It also allows nVidia to take any lessons learned from the PS3 GPU and add/modify them when they finally release the G70 (hopefully with fully functional PureVideo, not just "sort of functional" PureVideo--I'm paying for the transistor real estate in price and heat, I better be able to use it this time!)...
Low Roller - I wouldn't put too much stock in that figure for the X360 GPU. The way the chip is designed (split in two pieces), I wouldn't be surprised to find that one piece is 150 million and the other is maybe 150 to 200 million.
My DRAM knowledge is a bit fuzzy, and the "Embedded DRAM" is something we don't have specifics on, but 10MB of RAM represents 83,886,080 bits, and best case scenario you're using 1 transistor per bit. SRAM uses 6, and perhaps DRAM is 2? 2 transistors per bit would already put just the embedded RAM at 167,772,160 transistors. Heh. 150 million is WAY too small, no matter what IGN says.
As a separate thought, I wouldn't be surprised to see the Xbox 360 GPU end up the more powerful of the two graphics chips. The reason is based on inference: R4xx is very similar to R3xx, meaning ATI didn't spend as much resources creating R4xx as NVIDIA spent on NV4x. If their R&D teams are about equal in size, where did ATI's extra efforts end up being spent? That's right: the Xbox 360 GPU. This is simply a deductive guess, and it could be wrong, but it's something to consider. NVIDIA spent a lot of effort recovering from the NV3x (FX) fiasco.
If anything, it seems like the PS3 GPU is more of a PC design with less "future technology". In other words, everything said by MS and Sony is complete hype and should be taken with a massive helping of salt. :)
Which graphics processor will be more powerful?
The XBOX 360 or the PS3? The Nintendos future gaming console also uses ATI's GPU codenamed "Hollywood".
nVidia's apparently pulling out the 16" battle cannons with RSX/G70--136 shader ops per clock is damn impressive, regardless of whether the GPU is console or desktop...
And if nVidia says the desktop G70's going to be even more powerful than RSX, I'm willing to bet that there will be at least 10 shader units pumping to 24+ pipes in full FP32 quality. Nice. :)
Very very interesting article. ATi and NVIDIA seem to have diverging paths. All things considered, this tells me that ATi has the more advanced GPU. Does that mean a faster GPU though.
"This year's E3 has been, overall, a pretty big letdown"
Major new hardware announcements and details from the big three console manufacturers. This is the most exciting E3 that I can remember (I just read about it; never actually been to the show).
Plenty of upcoming titles have been announced and discussed as well. Were the previous E3's way more exciting or is the conference in general just not much more exciting than all the info you can read on the Internet?
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
22 Comments
finbarqs - Saturday, May 21, 2005 - link
perhaps that's why they were very *hush* *hush* regarding their PS3 GPU. For all we know, the Sony demos are very possible on the XGPU.

Illissius - Saturday, May 21, 2005 - link
It'd have been nice to have some specs for current desktop GPUs in there for comparison purposes... how many shader ops/sec can they do? 16 + 6 = 22?

ksherman - Saturday, May 21, 2005 - link
Damn, they can't count! They just explained what the transistors are doing, and last I checked 232 million + 100 million != 232 million!

Low Roller - Saturday, May 21, 2005 - link
IGN posted an update to their Xbox 360 specs:"Lastly, we were sent updated spec numbers on the Xbox's numbers, and we spoke with Microsoft's Vice President of hardware, Todd Holmdahl, about the Xbox 360's final transistor count.
Another bit of information sent our way is the final transistor count for Xbox 360's graphics subset. The GPU totals 332 million transistors, which is spit between the two separate dies that make up the part. The parent die is the "main" piece of the GPU, handling the large bulk of the graphics rendering, and is comprised of 232 million transistors. The daughter die contains the system's 10MB of embedded DRAM and its logic chip, which is capable of some additional 3D math. The daughter die totals an even 100 million transistors, bringing the total transistor count for the GPU to 232 million."
http://xbox360.ign.com/articles/617/617951p3.html
Low Roller - Friday, May 20, 2005 - link
OK, I did a bit more hunting around on the transistor count in the Xbox 360's GPU, and the only thing I've found comes from Microsoft's Major Nelson (Xbox Live Director of Programming). According to his blog, the Xbox 360's GPU has 330 million transistors.
http://www.majornelson.com/2005/05/20/xbox-360-vs-...
I'm not sure how credible either IGN's or Major Nelson's figures are on this, as they're not even close to each other.
ksherman - Friday, May 20, 2005 - link
WOAAA, so Sony is manufacturing nVidia's GPU? Really weird... hope Sony got a discount on the price of the core then...

Calin - Friday, May 20, 2005 - link
Just remember that video chipset developers have multiple teams working on different generations - they have one chip in early development and one in late development/near production. This means at least two teams (to keep up with the 6-month product launches that have been the norm for some 3 years now).

Cygni - Friday, May 20, 2005 - link
With the memory controller functions ALSO on the 360 GPU, 150 million is pretty darn out of reach, I'm thinking. The 360 GPU almost sounds like an integrated single-chip north bridge. It will be interesting to crack an Xbox 360 open and get a peek at the insides when it ships.

I couldn't help but wonder if perhaps this is a tile-based architecture... any info on that? If so, it would answer a few of my questions about the system.
Anand Lal Shimpi - Friday, May 20, 2005 - link
Low Roller,

Jarred is correct, the IGN figures have to be wrong. Remember that R420, a 16-pipe design, was already around 160M transistors. The Xbox 360 GPU has 48 pipes, although they are unified shader pipes. Add in the additional logic to handle both vertex and pixel instruction streams and you are already dealing with a GPU that is larger than the R420.
Not to mention the 10MB of embedded DRAM, which will not be tiny.
Take care,
Anand
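Anand's pipe-count scaling argument can be sanity-checked with a rough back-of-envelope model. The split between per-pipe and shared logic below is an illustrative assumption (not a published figure); only R420's ~160M transistors and 16 pipes come from the thread:

```python
# Crude sanity check of the pipe-count scaling argument.
# Assumption: shader pipes account for roughly half of R420's
# ~160M transistors; the rest is shared logic (memory controller,
# ROPs, etc.). These splits are guesses for illustration only.
R420_TRANSISTORS = 160_000_000
R420_PIPES = 16
SHARED_FRACTION = 0.5  # assumed share of non-pipe logic

per_pipe = (R420_TRANSISTORS * (1 - SHARED_FRACTION)) / R420_PIPES
shared = R420_TRANSISTORS * SHARED_FRACTION

# Scale to 48 unified pipes, keeping shared logic constant.
xbox360_estimate = shared + 48 * per_pipe
print(f"{xbox360_estimate / 1e6:.0f}M transistors")  # prints "320M transistors"
```

Even under this crude model the rendering logic alone lands around 320M transistors before counting the 10MB of embedded DRAM, which is consistent with the point that IGN's 150M figure can't be right.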
IceWindius - Friday, May 20, 2005 - link
I think I'm gonna give up on the upgrade race for PCs, I'm so tired of it. Think I'll go buy an Xbox 360 and a PS3 and come out money ahead.

Shinei - Thursday, May 19, 2005 - link
Jarred, I thought ATI made the XBox 2 GPU specifically for the console, and wasn't incorporating any of its features into the R520? I'm not sure I agree that spending most of your R&D on a "dead-end" GPU is the best tactic; nVidia's approach of optimizing an existing desktop GPU architecture seems to be the more efficient way to spend R&D capital.

It also allows nVidia to take any lessons learned from the PS3 GPU and add/modify them when they finally release the G70 (hopefully with fully functional PureVideo, not just "sort of functional" PureVideo--I'm paying for the transistor real estate in price and heat, I better be able to use it this time!)...
JarredWalton - Thursday, May 19, 2005 - link
Low Roller - I wouldn't put too much stock in that figure for the X360 GPU. The way the chip is designed (split in two pieces), I wouldn't be surprised to find that one piece is 150 million and the other is maybe 150 to 200 million.

My DRAM knowledge is a bit fuzzy, and the "Embedded DRAM" is something we don't have specifics on, but 10MB of RAM represents 83,886,080 bits, and best case scenario you're using 1 transistor per bit. SRAM uses 6, and perhaps DRAM is 2? 2 transistors per bit would already put just the embedded RAM at 167,772,160 transistors. Heh. 150 million is WAY too small, no matter what IGN says.
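Jarred's embedded-RAM arithmetic can be reproduced directly. The cells-per-bit figures are the hedged assumptions from his comment (for what it's worth, a standard DRAM cell is one transistor plus one capacitor, so 1T is the realistic case):

```python
# Transistor estimates for 10MB of embedded RAM under different
# cell designs: 1T ~ a standard DRAM cell (the capacitor isn't a
# transistor), 6T ~ a typical SRAM cell. 2T is Jarred's guess.
bits = 10 * 1024 * 1024 * 8  # 10MB in bits
print(bits)  # prints 83886080, matching the comment

for transistors_per_bit in (1, 2, 6):
    total = bits * transistors_per_bit
    print(f"{transistors_per_bit}T cell: {total:,} transistors")
```

Even the 1T best case (~84M transistors) leaves very little room in a claimed 150M budget once the actual rendering logic is added on top.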
As a separate thought, I wouldn't be surprised to see the Xbox 360 GPU end up the more powerful of the two graphics chips. The reason is based on inference: R4xx is very similar to R3xx, meaning ATI didn't spend as much resources creating R4xx as NVIDIA spent on NV4x. If their R&D teams are about equal in size, where did ATI's extra efforts end up being spent? That's right: the Xbox 360 GPU. This is simply a deductive guess, and it could be wrong, but it's something to consider. NVIDIA spent a lot of effort recovering from the NV3x (FX) fiasco.
What makes this all really entertaining to me is the following:
http://www.anandtech.com/news/shownews.aspx?i=2427...
If anything, it seems like the PS3 GPU is more of a PC design with less "future technology". In other words, everything said by MS and Sony is complete hype and should be taken with a massive helping of salt. :)
Iftekharalam - Thursday, May 19, 2005 - link
Which graphics processor will be more powerful? The XBOX 360's or the PS3's? Nintendo's future gaming console also uses an ATI GPU, codenamed "Hollywood".
Low Roller - Thursday, May 19, 2005 - link
AnandTech's article says they were not able to get a transistor count out of ATI for the Xbox 360.

According to IGN, the Xbox 360's GPU only has 150 million transistors, compared to the G70's 300 million.
http://xbox360.ign.com/articles/612/612995p1.html?...
araczynski - Thursday, May 19, 2005 - link
Nice info. Too bad I couldn't care less which GPU is used in which console; I'm more interested in which console will have some original quality games...
R3MF - Thursday, May 19, 2005 - link
Sounds good, shame about the non-unified shader model tho. Maybe nVidia are right, but I like advanced tech. :p
Shinei - Thursday, May 19, 2005 - link
nVidia's apparently pulling out the 16" battle cannons with RSX/G70--136 shader ops per clock is damn impressive, regardless of whether the GPU is console or desktop...

And if nVidia says the desktop G70's going to be even more powerful than RSX, I'm willing to bet that there will be at least 10 shader units pumping to 24+ pipes in full FP32 quality. Nice. :)
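For scale, "shader ops per clock" converts to ops per second by multiplying by the core clock. The 136 ops/clock figure is from the comment above; the 550MHz clock is the figure Sony quoted for RSX at E3, taken here as an assumption:

```python
# Peak shader throughput from per-clock ops and core clock.
# 136 ops/clock is the RSX figure cited above; 550MHz is the
# clock Sony quoted at E3 (treated as an assumption here).
ops_per_clock = 136
clock_hz = 550_000_000

ops_per_second = ops_per_clock * clock_hz
print(f"{ops_per_second / 1e9:.1f} billion shader ops/sec")  # prints "74.8 billion shader ops/sec"
```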
AnandThenMan - Thursday, May 19, 2005 - link
Very, very interesting article. ATi and NVIDIA seem to have diverging paths. All things considered, this tells me that ATi has the more advanced GPU. Does that mean a faster GPU, though?

EODetroit - Thursday, May 19, 2005 - link
How about anything about the new physics processor?

Garyclaus16 - Thursday, May 19, 2005 - link
...sooo much for my bfg6800U... seems like I'm already way behind again.

LanceVance - Thursday, May 19, 2005 - link
"This year's E3 has been, overall, a pretty big letdown"Major new hardware announcements and details from the big three console manufacturers. This is the most exciting E3 that I can remember (I just read about it; never actually been to the show).
Plenty of upcoming titles have been announced and discussed as well. Were the previous E3's way more exciting or is the conference in general just not much more exciting than all the info you can read on the Internet?
knitecrow - Thursday, May 19, 2005 - link
the X360 and PS3 may be even closer in terms of power than anyone thought