There's quite a bit more non-core logic in this chip than Renoir - and a much larger proportion of the die is the GPU - so it's not necessarily very helpful to compare the two.
I laughed when I saw "GPU 12 FLOPs". (Slides are correct.) Then I started to think about what 12 FLOPs feels like and realized 12 FLOPs is still way faster than what I can do. Damn computers.
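For the record, the figure the slide intended is easy to sanity-check. Using the commonly reported shader configuration (52 active CUs at 1825 MHz, 64 FP32 lanes per CU, an FMA counting as two ops; these numbers come from public spec sheets, not from this thread):

```python
# Rough FP32 throughput check for the Series X GPU.
# Assumed figures (widely reported specs, not from this thread):
# 52 CUs, 64 FP32 lanes per CU, 2 ops per clock per lane (FMA), 1.825 GHz.
cus = 52
lanes_per_cu = 64
ops_per_lane = 2          # a fused multiply-add counts as two FLOPs
clock_hz = 1.825e9

flops = cus * lanes_per_cu * ops_per_lane * clock_hz
print(f"{flops / 1e12:.2f} TFLOPs")  # 12.15 TFLOPs, i.e. "12 TFLOPs", not 12 FLOPs
```

So the slide's "12" is the right leading digit; only the prefix went missing.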
I think 'server class' means it's meant to be more stable/reliable in the long term. CPUs and GPUs nowadays implement too many OC techniques out of the box, which is why overclocking these days sucks a lot; all we have done this decade is increase power and frequency, and look how Intel ended up. If you look at AMD's Zen 2 CPUs, the 3700X has the same core count but can boost to 4.4 GHz on paper (in reality, using a good motherboard and RAM combination, the numbers are more likely 4.1-4.2 GHz). The EPYC and Xeon side of CPUs are clocked way lower, and that's why they aren't targeted at gaming. My guess is that AMD's info says 'server class' because it's not boosted like, say, the PS5, and that's why the PS5 requires a larger cooling solution; it just doesn't make sense to me. I wish people would stop chasing peak numbers. I'd rather have a straight line than a zigzag, but that's me, I like efficiency instead of performance bursts.
It does ECC with the GDDR6 memory (a custom thing). They are going to use this same SoC for xCloud servers in Azure, and they designed it with use in the data center in mind, using a virtualized display controller for instance. They’re going to run 4 Xbox One instances from a single chip.
Remember, it's a comparison to their previous products which used tablet-class CPU cores. I think darkz3r0 is right about them emphasising the stable clocks, too.
Microsoft say they were targeting a 4-6x improvement over the GPU in the Xbox One at the same power. Microsoft have an impressive console. All that remains before we can finally call it is the price. Kudos
Yeah, looking at the memory speeds (which are more important than SSD speeds for graphics), this console will deliver PC-class graphics. I like Xbox over PS4 as a device; even the OG Xbox One had better quality cables than the PS4, and since the Xbox One S/X the quality has been a cut above. My Xbox One X beats my PS4 Pro. I had to purchase a new PS4 Pro with the newer power supply because the old one was simply so loud that it bothered me even while wearing a headset; I never had noise issues with the Xbox One X. The quality of the Xbox One X is worth the extra $100 over the PS4 Pro: it has better quality audio and better outputs (Disney+ can do HDR where the PS4 Pro can't), and in games the Xbox One X is not only faster in fps but also better in quality. I had a hard time getting the colors right on my PS4 Pro; later they added an HDR calibration setting, but some games still look washed out. If you are using an IPS panel or a poor quality TV it probably won't matter since the gamma is very low, but on a FALD TV with a good panel (OLED or QLED quality) the Xbox One X looked richer without needing tweaks, so movies are more stunning on it. That's why I adore my Xbox, and I have no doubt they will make the Series X a good device; it just needs games.
I agree. Microsoft learnt their lesson with the Xbox One X: they waited with its release but made a much better device than the PS4 Pro. I bought the PS4 Pro because of the games; I have a good PC to enjoy the games I could play on it, but I wanted the exclusives, so I got the PS4 Pro. As for the question of what matters more for GPU performance: traditionally only the memory speed matters, because assets and textures are preloaded, but with the new consoles and the ability to load on the fly, things might get more complicated. Anyway, for me the PS5 is almost a guaranteed buy over the Xbox, mostly because of the games and PS4 backwards compatibility.
Wow! So much fanboyism! First off, the PS4 cable is good quality, which makes me wonder how fanboyish someone can be, or whether you ever owned a PS4 Pro. Second, I have a PS4 Pro and never had as bad a sound issue as you are describing. I am not denying it was a bit loud, but nothing like you said (it might have needed a little clean-up, which you have obviously never done). You must never have owned a PS4 Pro, because it clearly can do HDR, and if you had a PS4 Pro and got a washed-out looking game, it's just because your TV is not certified for HDR. So buy one that has an HDR certificate and voila, everything looks amazing! I suggest you try one game on a PS4 Pro with an actual HDR-certified TV to see what real HDR is, instead of running around telling everyone lies!
This would be a very attractive APU for a gaming PC. It's a shame that the integrated console design, with one big and fast pool of memory, hasn't been copied by PC designs with open application installation; enthusiasts are still left with only legacy architectures with DIMMs and a GPU sitting across a PCIe bus with its own separate pool of memory. Yes, you get upgrade options and the opportunity to swap in a snapshot of the state-of-the-art GPU once in a while, but the integration really does improve efficiency, and the snapshots will matter less as process improvements slow. Furthermore, the cost of the entire console is way less than a high-end GPU alone.
Microsoft can afford a custom design like this because it gets subsidized by licensing in the end; it would be great if it licensed the APU for implementation by other OEMs, for something meant to work with Valve's Steam, for instance.
GDDR has significantly more latency than DDR, which usually isn't a good trade-off for a general purpose computer. For gaming and streaming, it's a different story.
That's a fair point, and I wouldn't use this to run something like a DB server. However, I wouldn't provision a GPU like this for a server chip either and something architected like this would still run productivity apps just fine.
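The latency-vs-throughput trade-off being discussed can be sketched with a toy model: total fetch time is roughly a fixed latency plus size divided by bandwidth, so small random accesses are latency-bound while big streaming reads are bandwidth-bound. All parameters below are illustrative assumptions, not measured figures for any real DDR or GDDR part:

```python
def fetch_time_ns(size_bytes, latency_ns, bandwidth_gbs):
    """Crude model: one fixed latency plus a bandwidth-limited transfer."""
    return latency_ns + size_bytes / (bandwidth_gbs * 1e9) * 1e9

# Illustrative parameters only (assumed, not datasheet values):
ddr  = dict(latency_ns=60,  bandwidth_gbs=50)    # lower latency, lower bandwidth
gddr = dict(latency_ns=120, bandwidth_gbs=500)   # higher latency, much higher bandwidth

cacheline = 64            # a pointer chase fetches single cache lines
texture   = 4 * 2**20     # a GPU streams large blocks

print(f"64B fetch : DDR {fetch_time_ns(cacheline, **ddr):.1f} ns vs GDDR {fetch_time_ns(cacheline, **gddr):.1f} ns")
print(f"4MiB read : DDR {fetch_time_ns(texture, **ddr):.0f} ns vs GDDR {fetch_time_ns(texture, **gddr):.0f} ns")
```

Under these toy numbers the cache-line fetch favors the DDR-like part and the streaming read favors the GDDR-like part, which is the point about general-purpose vs gaming/streaming workloads.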
I think it makes perfect sense to redefine the "unit of upgrade" abstraction for an entertainment-oriented PC along similar lines to a console, bundling a CPU and RAM upgrade along with a GPU upgrade, which would give you more performance for the buck now that high-end GPUs retail for > $1k. You can keep your storage, power supply, and PCIe cards separate just like before, and keep apace with any innovations from the console space as well.
That architecture would require support in Windows/Linux drivers. There was a Chinese PC/console hybrid called the Subor Z+ with a custom APU based on a quad-core Zen CPU and a ~1500 SP iGPU coupled with GDDR6 memory, and it had compatibility issues. People also ran Linux on the PS4 and PS4 Pro, and there the 8 small CPU cores were the problem.
After re-reading the parts about the audio processing, here is how I translated it for myself: "we wanted to get Dolby Atmos-like audio, but really wanted to avoid paying license fees for the console and games. So, we came up with this".
Also, found it interesting that they emphasized the AVX256 prowess of their CPU cores. I guess upcoming games will make use of that; the numbers are impressive. Alternatively, is this about using the chip in Azure xCloud?
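The AVX256 emphasis does fit the audio discussion: mixing many gain-scaled source buffers is exactly the kind of multiply-accumulate work that 256-bit SIMD eats up. A sketch of the idea using NumPy's vectorization as a stand-in for hand-written AVX code (the source count and block size here are arbitrary choices, not anything Microsoft stated):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_samples = 300, 512          # e.g. ~300 positional sources, one audio block
sources = rng.standard_normal((n_sources, n_samples)).astype(np.float32)
gains = rng.random(n_sources).astype(np.float32)  # per-source distance attenuation

# Scalar-style mix: one multiply-accumulate at a time (what SIMD lanes replace)
mix_scalar = np.zeros(n_samples, dtype=np.float32)
for src, g in zip(sources, gains):
    mix_scalar += g * src

# Vectorized mix: a single weighted sum over all sources at once
mix_vec = gains @ sources

assert np.allclose(mix_scalar, mix_vec, rtol=1e-3, atol=1e-3)
```

Either way the arithmetic is identical; the vector form just exposes it as wide, cache-friendly FMA work, which is presumably why the CPU's AVX256 throughput matters for the audio pipeline.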
Now if they only gave up on trying to suck people into the Microsoft store...
I want to be able to buy a game on Steam and then play it on any of my various PCs, in the cloud when I am on the road (e.g. Nvidia Geforce Now) or on an Xbox (even a PSx for that matter).
The hassle of being locked into distinct eco-systems for every device is just blocking sales.
Can it run Windows? It has decent PC-class hardware. If Microsoft really cares about reducing e-waste and carbon footprint, it should allow running Windows on the Xbox, so that people would not be forced to buy an extra desktop containing essentially the same basic parts.
58 Comments
eastcoast_pete - Monday, August 17, 2020 - link
Question: how does the audio processing compare to, let's say, that of a decent audio codec like a 1220?
dersteffeneilers - Thursday, August 20, 2020 - link
I think Discord uses Opus, so probably like that.
Crazyeyeskillah - Monday, August 17, 2020 - link
Interesting dialog about the audio pipelines and how much power is really needed. It adds up REAL FAST if you're going to actually source everything in a scene. Imagine standing in a bustling city square: dozens of NPCs' footsteps, conversations, and sound sources, all being fed into a 3D positional system. 300-400 almost seems like quite a small number when you factor in the thousands that should be present and accounted for in a real environment. At least we're headed in the right direction!
Cheesecake16 - Monday, August 17, 2020 - link
Sorry Ian, I tried to get an answer about the Xbox Series X's GPU TDP, because if the TDP is the same as the Xbox One's GPU (120W at worst) then AMD has a monster of a GPU on their hands.
Ian Cutress - Monday, August 17, 2020 - link
The problem when asking for GPU TDP is that it's monolithic. I guess if you'd said 'SoC'-level 'average' power you might have got more of an answer. I think his answer about 'TDP is different based on workload' was a massive cop-out and someone disingenuous given what the definition of TDP actually is, and that engineers in the room understand nuance around TDP numbers. Moreover, it doesn't have to be a number, it could be a range.

Perhaps the question should have been: what thermal profile in watts will the cooling for the chip have to manage? Or how big will the power brick be? Sometimes asking questions is an art, so the presenter doesn't box themselves into a corner with their own interpretation of what the question is getting at.
bill54 - Monday, August 17, 2020 - link
wtf, I can smell the hostility here that you hold towards MS/Xbox. Now I recall you tried to do some similar "gotcha" BS with the Xbox One back then, though I don't remember about what.

If he said "we're not talking about TDP", what's the big deal? At this stage they're not talking about a lot of things, like, ya know, the price.

I don't know why you tried to frame it as a hostile "gotcha", Ian. Wanna bet the less powerful PS5 will likely have a higher TDP? Considering they've already been shown to have a much larger volume case (the PS5 is utterly massive) and to use some crazy liquid metal cooling system, all because they had to overclock to over 2.2 GHz to try to catch up to the Xbox even a little on the spec sheet and get over double-digit TF. Now that's bad engineering.
kulareddy - Tuesday, August 18, 2020 - link
No way, Sony (and also Microsoft) will not use liquid metal TIM.
Eliadbu - Tuesday, August 18, 2020 - link
To be honest, this is just speculation. A more capable cooling solution can go both ways: dissipating more heat or lowering noise levels. I think after the PS4 Pro fan noise issues Sony decided to have a more capable cooling solution to avoid such a scenario. We don't know which console consumes more power, and we don't know if the PS5 has the same GPU architecture as the Xbox Series X, so save the criticism until we know all the details.

And the criticism of Ian was not justified in my opinion, because he has a point: the explanation of why he won't comment is disingenuous. TDP is well defined, so you don't need to take a specific environment or other elements into account. If he had just commented that he can't give more information right now, no one would have complained.
d0x360 - Wednesday, August 19, 2020 - link
The base PS4 and the Pro are louder than my PC, Xbox One and Xbox One X combined lol. I don't think the PS5 will be much better. They can't use a very large, slower-spinning fan in that case; I just don't see how it could fit (besides sideways), and I don't see how that would be beneficial (sideways) because the system has to work both vertically and horizontally.
Eliadbu - Wednesday, August 19, 2020 - link
There are a few possible orientations and sizes. Probably the main benefit of having a larger volume is having room for a larger heatsink, which by itself helps enormously with dissipation. See for example the 3-slot GPUs: they don't usually have better or bigger fans, but they do have a much larger heatsink, which helps with cooling by enabling the fans to run at significantly lower RPM for the same performance (compared to 2-slot designs), lowering the overall noise produced.
Spunjji - Wednesday, August 19, 2020 - link
What a bizarrely aggressive response. Ian didn't frame anything as a "gotcha", let alone exhibit hostility. Calm down and slow your roll.
close - Thursday, August 20, 2020 - link
Ian's comment: "I think his answer [...] was a massive cop-out and someone disingenuous"

Ian's answer when Intel was showing a *clearly* overclocked CPU cooled by a hidden 1HP chiller under the table was to "applaud" and ask no questions. Later on, when the jig was up, he published a weak retraction/admonishment saying that "this was not communicated as well as it should have been on stage" or "we did tell Intel that we had hoped that the presenter would have spent more time on stage talking about the system in play [...] Our commentary was taken on board by the Intel team we spoke to".
I'd also say that Ian's comment now sounds particularly aggressive, especially given his past reactions to what was a clear and deliberate lie/misdirection from Intel. Judge the difference in the aggressiveness of the responses yourself. It's the sort of bias that should be highlighted at every opportunity.
https://www.anandtech.com/show/12932/intel-confirm...
https://www.anandtech.com/show/12893/intels-28core...
Meteor2 - Thursday, August 20, 2020 - link
Chill out
close - Friday, August 21, 2020 - link
I'm always chill when I'm 100% right. ;)
dotjaz - Sunday, August 23, 2020 - link
What would you call evading such a simple answer? I'd call it a massive cop-out. As Ian said, they didn't even have to give an exact answer; just a range like "package TDP around 100-120W" would suffice. Yet they chose not to disclose at all.
close - Sunday, August 30, 2020 - link
@dotjiz: "What would you call evading such a simple answer?"

Ian could have said "this was not communicated as well as it should have been on stage". He obviously found it an appropriate answer when it came to Intel's orchestrated deception.

Calling this "a massive cop-out" and, more importantly, "disingenuous" when it comes to AMD avoiding a straight answer really shines a spotlight on the journalist's bias. And their journalistic integrity.
Spunjji - Friday, August 21, 2020 - link
I'm going to repeat my recommendation. There's no reasonable way to read aggression into what he said - "massive cop-out" is a fair summary of the answer he got and tbh it's fairly neutral language - he didn't say the guy was an ass or a liar, just that his answer dodged the question. Comparing it to the chiller demo is an extended reach.
Alexvrb - Sunday, August 23, 2020 - link
In the comment section here, I'm not sure I'd call his statement exactly neutral. He did imply a degree of dishonesty. Here's the part you avoided quoting: "someone disingenuous". I assume he meant somewhat, for the record.

With that being said, this isn't as big of a deal as Close is saying IMO. The way I see it, he doesn't hate AMD or MS... he's just not afraid of them. Most journalists are far more worried about pissing off Intel, so they treat them with kid gloves unless *everyone* is going after them on a particular issue. Safety in numbers and all that.
close - Sunday, August 30, 2020 - link
I'm sure that Ian doesn't hate AMD, and I am not suggesting he is a shill or on Intel's payroll. But he *is* biased and can't seem to work around it or make an effort to seem more balanced in his texts. The hallmark of a good journalist is keeping their tone equally neutral whether they're writing for or against the things they personally like more or less.

And he's obviously treating Intel with kid gloves in a way I haven't seen from any other popular outlet (like Ars Technica or GamersNexus). I could give you dozens of examples of Ian treating Intel with kid gloves, from not burning Intel at the stake for making a fool of him (and other journalists) who failed to spot obvious signs that other people spotted even without being in the room, to singing massive praise in reviews of CPUs that barely brought 5% over the previous gen, so the comparison to ones 3-4 generations old was hammered again and again.
close - Sunday, August 30, 2020 - link
@Spunkjji: "he didn't say the guy was an ass or a liar"

We must have different definitions of calling someone's statements "disingenuous". It literally means insincere. And I understand a random on the comment section may not have a solid grasp of... words. But a journalist has no such excuse. Again, he called an orchestrated deception "not communicated as well as it should have been", and someone not wanting to give a detailed answer after a slide deck "a cop-out" and "disingenuous".
So then I can just say Ian's way of expressing his unhappiness towards AMD compared to Intel is *extremely* disingenuous. I guess you don't develop integrity no matter how many titles you slap in front of your name.
d0x360 - Wednesday, August 19, 2020 - link
Crazy liquid metal? Liquid metal is just an alternative to thermal paste. It's more efficient... but it also has to be replaced, or should be, yearly, because it dries out.

If Sony really is using liquid metal as their TIM, then it's a massive mistake that will lead to throttling at best and dead systems at worst.
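For a sense of scale, the difference liquid metal makes can be estimated from simple 1-D conduction through the TIM layer: dT = P * t / (k * A). All numbers below are ballpark assumptions (typical paste vs liquid-metal conductivities, a guessed bond line, power, and die size), not Sony's actual figures:

```python
def tim_delta_t(power_w, thickness_m, k_w_mk, area_m2):
    """Temperature drop across a uniform TIM layer (1-D conduction)."""
    return power_w * thickness_m / (k_w_mk * area_m2)

power = 150          # assumed SoC power under load, W
bond_line = 50e-6    # assumed 50 micron TIM thickness
die_area = 3.6e-4    # assumed ~360 mm^2 die, in m^2

for name, k in [("paste (k~8.5 W/mK)", 8.5), ("liquid metal (k~73 W/mK)", 73)]:
    print(f"{name}: dT = {tim_delta_t(power, bond_line, k, die_area):.2f} K")
```

Under these assumptions the paste drops a couple of kelvin and liquid metal well under one, so the raw conductivity win is modest; the commonly cited liquid-metal worries are long-term ones like pump-out and corrosion of aluminum rather than day-one temperatures.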
darealist - Tuesday, August 18, 2020 - link
RDNA2 will only be able to do economic Ray Tracing L O L
frbeckenbauer - Tuesday, August 18, 2020 - link
So are current and next-gen Nvidia GPUs. Look at the gigantic framerate hit you get when trying to do full RT; it's not worth it. You have to apply it smartly.
d0x360 - Wednesday, August 19, 2020 - link
I dunno... Control uses ray tracing for quite a bit and it shows, in both the visual quality and the huge performance hit... Buuut chances are these consoles have more RT hardware than 1st-gen RTX, and when you combine that with VRS and Microsoft's own version of DLSS2... it should perform just fine at 4K, even with ray tracing.

On a 2080 Ti I have to render Control at 1080p to get a locked (output) 4K60 at the game's max settings with DLSS2 in quality mode. It can get close at a rendering resolution of 1440p, but not close enough, plus the frame rate is all over the place.
That being said, DLSS2 is such an amazing improvement over v1 that in Control it essentially looks native, and in Death Stranding it looks better than native, but the render resolution is also higher.
Point is... It can be done with no sacrifice to fidelity.
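The render resolutions quoted above translate directly into shading-work savings; counting pixels makes the trade-off concrete (the resolutions are just the ones mentioned in the comment, not an exhaustive list of DLSS modes):

```python
def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)
render_1440p = pixels(2560, 1440)
render_1080p = pixels(1920, 1080)

print(f"1440p renders {render_1440p / native_4k:.0%} of 4K's pixels")  # 44%
print(f"1080p renders {render_1080p / native_4k:.0%} of 4K's pixels")  # 25%
```

Shading a quarter of the pixels and upscaling is what makes "4K60 with ray tracing on" reachable at all on this class of hardware.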
hecksagon - Thursday, August 20, 2020 - link
What they mean is that the scene is still a raster render, with some ray tracing used to create shadows or real-time reflections on certain objects. No hardware right now can render a modern AAA game fully ray traced, and it won't be able to for a long time. I doubt it will even be pursued, because the compute resources required are better used for other things.
Spunjji - Wednesday, August 19, 2020 - link
That's the same as literally every other GPU out there. Full real-time ray-tracing at current resolutions and expected detail levels is still impossible with hardware available today.

Whether it's better than Turing and Ampere is unknown, but you seem to have made up your mind...
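A back-of-the-envelope ray budget shows why fully path-traced AAA rendering is out of reach while hybrid effects are not. The samples-per-pixel counts and the ~10 GRays/s hardware figure below are rough illustrative assumptions, not measured specs:

```python
def grays_per_sec(width, height, rays_per_pixel, fps):
    """Rays needed per second, in billions (GRays/s)."""
    return width * height * rays_per_pixel * fps / 1e9

W, H, FPS = 3840, 2160, 60
# Hybrid: e.g. roughly one reflection/shadow ray per pixel on average
hybrid = grays_per_sec(W, H, rays_per_pixel=1, fps=FPS)
# "Full" path tracing: many samples per pixel, several bounces each
full = grays_per_sec(W, H, rays_per_pixel=100 * 4, fps=FPS)

budget = 10  # assumed hardware capability, GRays/s
print(f"hybrid: {hybrid:.1f} GRays/s (budget {budget})")  # ~0.5, fits easily
print(f"full  : {full:.0f} GRays/s (budget {budget})")    # orders of magnitude over
```

Hybrid effects fit inside the budget with room for the raster work; anything approaching offline-quality sample counts blows past it by orders of magnitude, which is why every current GPU picks its RT battles.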
Alexvrb - Sunday, August 23, 2020 - link
You should probably read up on the subject. The RDNA2 architecture doesn't really have much to do with RT. *Just like Nvidia*, it's a separate hardware block, an RT accelerator. Both Nvidia and AMD are using hybrid rendering: you blend traditional rasterization and RT, balancing the workload as suited to your game/engine. Having a hybrid chip also means your 2080 Ti or Series X can run existing titles better than its predecessors, since they have gobs of rasterization-capable hardware.
DanNeely - Tuesday, August 18, 2020 - link
How does 16GB of GDDR6 over 10 controllers work?
Zizy - Tuesday, August 18, 2020 - link
6x2GB + 4x1GB. This is also why devs are getting 10GB at full speed + ~3GB at reduced speed (the rest of the reduced-speed pool is for the OS).

But I have no idea why MS wouldn't use 10x2GB for 20GB, all at full speed. It would drive home "our console is the most powerful" even better.
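The asymmetric population works out as follows: with ten 32-bit GDDR6 channels at 14 Gbps (the configuration Microsoft disclosed), the first 1 GB of every chip interleaves across all ten channels, while the extra 1 GB on the six 2 GB chips interleaves across only those six. A quick sketch of the arithmetic:

```python
channels = [2, 2, 2, 2, 2, 2, 1, 1, 1, 1]   # GB per 32-bit channel: 6x2GB + 4x1GB
gbps_per_pin = 14                            # GDDR6 data rate
bits_per_channel = 32

def bandwidth_gbs(n_channels):
    return n_channels * bits_per_channel * gbps_per_pin / 8

fast_pool = len(channels) * min(channels)   # region striped across all 10 channels
slow_pool = sum(channels) - fast_pool       # remainder lives only on the 2GB chips
print(f"fast: {fast_pool} GB @ {bandwidth_gbs(10):.0f} GB/s")  # 10 GB @ 560 GB/s
print(f"slow: {slow_pool} GB @ {bandwidth_gbs(6):.0f} GB/s")   # 6 GB @ 336 GB/s
```

Part of the 6 GB slow pool is reserved for the OS, which is how devs end up with the 10 GB fast + ~3 GB slow split the comment above describes.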
oleyska - Tuesday, August 18, 2020 - link
it's in the slides why...NixZero - Tuesday, August 18, 2020 - link
It is possible that by reducing memory speed they can get better latency for CPU access, as GDDR6 is optimized for throughput and so isn't ideal for CPU use.
drothgery - Tuesday, August 18, 2020 - link
RAM cost, as they pretty much made clear in the presentation, but I wouldn't be surprised at all if a 2023-2024 mid-generation update ups it to 20 GB if DRAM prices drop enough.
Eliadbu - Tuesday, August 18, 2020 - link
Cost. They are paying a pretty penny for the SoC and SSD already; faster and bigger memory means additional $, and in a console, where you have very slim profit margins if any at all, saving those bucks means a lot.
6YearsLater - Tuesday, August 18, 2020 - link
“CUs have 25% better perf/clock compared to last gen”. So, what does this mean? What is this so-called "last gen"? The last-generation platform (Polaris)? Or RDNA1?
Eliadbu - Tuesday, August 18, 2020 - link
Microsoft is not AMD, so when they compare generations they compare their own hardware, meaning the last-generation console. But they should be clear about what the performance metric is: FP throughput? FPS in games? Or something else?
Spunjji - Wednesday, August 19, 2020 - link
I'm guessing they're comparing to Polaris in the Xbox One X; the rest of the performance difference would be down to increased clocks.brucethemoose - Tuesday, August 18, 2020 - link
Holy moly, that audio hardware. I hope we get some kind of accelerator with all that on PC.
brucethemoose - Tuesday, August 18, 2020 - link
Also, I wish they'd mentioned something about video codecs. AV1 or no AV1 in the consoles and RDNA2 is a pretty big deal.
RedOnlyFan - Tuesday, August 18, 2020 - link
From the transistor count and die size, TSMC's 7nm+ density doesn't add up: 42 MTr/mm². GPUs usually achieve higher density, right?
psychobriggsy - Tuesday, August 18, 2020 - link
Yeah, 42.8 MTr/mm² is a bit low given that Renoir can hit 60 MTr/mm² already. Is "7nm Enhanced" actually N7+ EUV, though? It might just be N7P. I know this doesn't fit with what people expected, but with that density... maybe it is using higher-performance, larger transistors in a lot of areas.
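For reference, the density comparison being made here works out as follows (figures as quoted at Hot Chips and in this thread - 15.3B transistors on a 360.4 mm² die for the Series X SoC, 9.8B on 156 mm² for Renoir - treat them as approximate):

```python
# Back-of-envelope transistor-density check for the two 7nm AMD chips discussed.
xsx_transistors_b = 15.3     # billions (Microsoft's Hot Chips figure)
xsx_die_mm2 = 360.4

renoir_transistors_b = 9.8   # AMD's published Renoir figure
renoir_die_mm2 = 156.0

xsx_density = xsx_transistors_b * 1000 / xsx_die_mm2          # MTr/mm^2
renoir_density = renoir_transistors_b * 1000 / renoir_die_mm2

print(round(xsx_density, 1))     # ~42.5 MTr/mm^2
print(round(renoir_density, 1))  # ~62.8 MTr/mm^2
```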
Spunjji - Friday, August 21, 2020 - link
There's quite a bit more non-core logic in this chip than Renoir - and a much larger proportion of the die is the GPU - so it's not necessarily very helpful to compare the two.
wujj123456 - Tuesday, August 18, 2020 - link
I laughed when I saw "GPU 12 FLOPs". (The slides are correct.) Then I started to think about what 12 FLOPs feels like and realized 12 FLOPs is still way faster than what I can do. Damn computers.
liquid_c - Tuesday, August 18, 2020 - link
“Zen2 server class CPU cores”
This line doesn't belong in a marketing slide for a gaming console.
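On the 12 TFLOPS figure mentioned a couple of comments up: it falls out of the shader configuration directly (assumed figures: 52 active CUs at 1.825 GHz, 64 FP32 lanes per CU, with a fused multiply-add counting as two operations per clock):

```python
# Where the headline "12 TFLOPS" comes from for the Series X GPU.
cus = 52              # active compute units
lanes_per_cu = 64     # FP32 ALUs per CU
clock_ghz = 1.825     # fixed GPU clock
ops_per_lane = 2      # FMA = 2 FLOPs per lane per clock

tflops = cus * lanes_per_cu * ops_per_lane * clock_ghz / 1000
print(round(tflops, 2))  # ~12.15
```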
darkz3r0 - Tuesday, August 18, 2020 - link
I think "server class" is meant to imply more stability/reliability in the long term. CPUs and GPUs nowadays implement too many out-of-the-box OC techniques, which is why overclocking these days is underwhelming - all we've done this decade is increase power and frequency (look how Intel ended up). On the AMD Zen 2 side, the 3700X has the same core count and can boost to 4.4 GHz on paper (in reality, with a good motherboard and RAM combination, numbers are more like 4.1-4.2 GHz). EPYC and Xeon CPUs are clocked way lower, which is why they aren't targeted at gaming. My guess is the slide says "server class" because it's not boosted like, say, the PS5 - which is why the PS5 requires a larger cooling solution. I wish people would stop chasing burst performance; I'd rather have a straight line than a zigzag. But that's me - I prefer efficiency over performance bursts.
smithg5 - Tuesday, August 18, 2020 - link
It does ECC with the GDDR6 memory (a custom thing). They are going to use this same SoC for xCloud servers in Azure, and they designed it with use in the data center in mind - using a virtualized display controller, for instance. They're going to run four Xbox One instances from a single chip. I think it's a reasonable thing to say.
Spunjji - Wednesday, August 19, 2020 - link
Remember, it's a comparison to their previous products, which used tablet-class CPU cores. I think darkz3r0 is right about them emphasising the stable clocks, too.
KimGitz - Tuesday, August 18, 2020 - link
Microsoft say they were targeting a 4-6x improvement over the Xbox One's GPU at the same power. Microsoft have an impressive console. All that remains before we can finally call it is the price. Kudos.
darkz3r0 - Tuesday, August 18, 2020 - link
Yeah, looking at the memory speeds (which matter more for graphics than SSD speeds), this console will deliver PC-class graphics. I like Xbox over PS4 as a device - even the OG Xbox One had better-quality cables than the PS4, and since the Xbox One S/X the quality has been a step above. My Xbox One X beats my PS4 Pro. I had to purchase a new PS4 Pro with a newer power supply because the old one was so loud it bothered me even while wearing a headset; I never had noise issues with the Xbox One X. The quality of the Xbox One X is worth the extra $100 over the PS4 Pro: it has better-quality audio and better outputs (Disney+ can do HDR where the PS4 Pro can't), and in games the Xbox One X delivers not only higher FPS but better image quality. I had a hard time getting the colors right on my PS4 Pro; later they added an HDR calibration setting, but some games still look washed out. If you're using an IPS panel or a poor-quality TV it probably won't matter, since the gamma is very low, but on a FALD TV with a good panel (OLED or QLED quality) the Xbox One X looked richer without any tweaking, so movies are more stunning on it. That's why I adore my Xbox, and I have no doubt they'll make the Series X a good device - it just needs games.
Eliadbu - Tuesday, August 18, 2020 - link
I agree - Microsoft learnt their lesson with the Xbox One X: they waited with its release but made a much better device than the PS4 Pro. I bought the PS4 Pro because of the games - I have a good PC for the games I could play on it, but I wanted the exclusives, so I got the PS4 Pro. As for what matters more for GPU performance: traditionally only the memory speed matters, because assets and textures are preloaded, but with the new consoles and their ability to load on the fly, things might get more complicated. Anyway, for me the PS5 is almost a guaranteed buy over the Xbox, mostly because of the games and PS4 backwards compatibility.
MojArch - Tuesday, August 18, 2020 - link
Wow! So much fanboyism! First off, the PS4 cable is of good quality, which makes me wonder how fanboyish someone can be - or whether they ever owned a PS4 Pro!
Second, I have a PS4 Pro and never had a sound issue as bad as you're describing. I'm not denying it was a bit loud, but nothing like you said (it might have needed a little clean-up, which you obviously never did).
You must never have owned a PS4 Pro, because it clearly can do HDR. If you had one and games looked washed out, it's just because your TV is not certified for HDR - buy one with an HDR certificate and voilà, everything looks amazing!
So I suggest you try one game on a PS4 Pro with an actual HDR-certified TV to see what real HDR is, instead of running around telling everyone lies!
Raqia - Tuesday, August 18, 2020 - link
This would be a very attractive APU for a gaming PC. It's a shame that the integrated console design, with one big and fast pool of memory, isn't copied by PC designs with open application installation; enthusiasts are still left with only legacy architectures with DIMMs and a GPU sitting across a PCIe bus with its own separate pool of memory. Yes, you get upgrade options and the opportunity to swap in a snapshot of the state-of-the-art GPU once in a while, but the integration really does improve efficiency, and those snapshots will matter less as process improvements slow. Furthermore, the cost of the entire console is way less than a high-end GPU. Microsoft can afford a custom design like this because it gets subsidized by licensing in the end; it would be great if it licensed out the APU for implementation by other OEMs - as something meant to work with Valve's Steam, for instance.
drothgery - Tuesday, August 18, 2020 - link
GDDR has significantly more latency than DDR, which usually isn't a good trade-off for a general-purpose computer. For gaming and streaming, it's a different story.
Raqia - Tuesday, August 18, 2020 - link
That's a fair point, and I wouldn't use this to run something like a DB server. However, I wouldn't provision a GPU like this for a server chip either, and something architected like this would still run productivity apps just fine. I think it makes perfect sense to redefine the "unit of upgrade" abstraction for an entertainment-oriented PC along similar lines to a console: bundling a CPU and RAM upgrade along with a GPU upgrade would give you more performance for the buck, now that high-end GPUs retail for >$1k. You could keep your storage, power supply, and PCIe cards separate just like before, and keep pace with any innovations from the console space as well.
vithrell - Friday, August 21, 2020 - link
That architecture would require support in Windows/Linux drivers. There was a Chinese PC-console hybrid called the Subor Z+, with a custom APU based on a quad-core Zen CPU and a ~1500-SP iGPU coupled with GDDR6 memory, and it had compatibility issues. People have also run Linux on the PS4 and PS4 Pro, and there the 8 small CPU cores were the problem.
eastcoast_pete - Tuesday, August 18, 2020 - link
After re-reading the parts about the audio processing, here is how I translated it for myself: "We wanted to get Dolby Atmos-like audio, but really wanted to avoid paying license fees for consoles and games. So we came up with this."
eastcoast_pete - Tuesday, August 18, 2020 - link
Also, I found it interesting that they emphasized the AVX256 prowess of their CPU cores. I guess upcoming games will make use of that; the numbers are impressive. Or is this about using the chip in Azure xCloud?
abufrejoval - Thursday, August 20, 2020 - link
Looks like really nice hardware! Now if only they'd give up on trying to suck people into the Microsoft Store...
I want to be able to buy a game on Steam and then play it on any of my various PCs, in the cloud when I am on the road (e.g. Nvidia Geforce Now) or on an Xbox (even a PSx for that matter).
The hassle of being locked into distinct eco-systems for every device is just blocking sales.
Ptosio - Monday, August 24, 2020 - link
Can it run Windows? It has decent PC-class hardware - if Microsoft really cares about reducing e-waste and carbon footprint, it should allow running Windows on the Xbox, so that people aren't forced to buy an extra desktop containing essentially the same basic parts.