"Consoles have long development processes, and are thus already behind the curve at launch"
That's only true for the most recent generation of consoles. Most other generations of consoles launched with more advanced hardware than PCs, and the PCs then caught up. The first GPU with hardware T&L was in the N64 in 1996; PCs didn't get that until the GeForce 256 in 1999. The first unified shader GPU was in the Xbox 360 in 2005; PCs didn't get unified shaders until a year later. Etc.
You can pick individual specs, but the fact remains that high-end PCs are globally faster than consoles at launch, and the gap only widens with time. Consoles are cheaper than high-end PCs, however.
As takeshi7 said, this is a new thing. The Xbox 360 / PlayStation 3 were better at launch than even an ultra-high-end gaming PC was. It has only been with the Xbone/PS4 that they were slower than PCs at launch.
Also, not sure what you mean by "globally faster".
The PS3 wasn't, but only because it was about as fast as a 360 a year later; still very good for the time, though. But today, 12 years after the 360 launch, GPUs have gotten much bigger in terms of mm^2, and now you can have multiple of them, because why not? So today we've pegged Vega's launch at more than double the relative straight compute power of the (to be released later) Xbox Scorpio.
People have proven they'll pay for giant GPUs with giant cooling blocks, so Nvidia and AMD are more than happy to sell it to them.
I agree. This is a new problem, for consoles to be "outdated" when they are essentially now based on PC hardware (or in Nintendo's case, mobile hardware!). Before, it was more specialized and kept secret, especially in the '90s when we had no idea what console manufacturers were up to. It was speculated for years that Nintendo would go to CD-ROM, but it didn't. The Dreamcast surprised everyone with a proprietary optical format...and the Saturn had dual processors that were a bitch to program for since they were each assigned to different tasks. I still think Sega goes down as the most interesting console developer (with Nintendo close behind) because of their sheer unpredictability and rampant product development cycles. The Genesis had numerous versions and 2 substantial upgrade components, not to mention four controller designs, in a period shorter than most modern console generations. Nintendo kept it simple, only ever making one global upgrade accessory for a console (the N64 Expansion Pak, with its Rambus RDRAM...although the N64DD did launch in limited quantities in Japan, as did the Satellaview satellite network, but Nintendo never planned on launching these outside of Japan's limited audience).
Where Nintendo has always been revolutionary is with controllers. The first-of-its-kind D-pad controller, the first controller with shoulder buttons, whatever the hell the N64 controller was and how amazing it is in retrospect, and obviously the Wii. Even their underdog controllers like the NES Max and the 'dog bone' NES controllers are about as close to perfection as a basic 8-bit controller gets.
Well, to be precise, the Switch is only partially based on mobile hardware (the ARM CPU); the GPU is of the same architecture that is used in desktop parts. The fact that Tegra is a mobile chip doesn't change the fact that Maxwell is a universal architecture designed to scale across platforms.
I own a Genesis, a 32X, and a Sega CD as well. Amazing add-ons, the likes of which were rarely seen for consoles even in those days. They even ran a few gaming gems that might not otherwise have existed.
For the Genesis, what four controller designs are you referring to? There are only two first-party Sega controllers, unless you include the Arcade Sticks or other specialized accessories like the Flight Stick or Enforcer. Saturn had three main first-party designs. The N64 controller... I hated. It was an advanced design, but overall it ended up being an awkward one I had to put up with for years. The analog stick in particular was a pile of junk after a few years of use. Even the analog ("3D") variant of the Saturn controller was better, though for 2D games I preferred the standard Model 2 Saturn controller - it was the gold standard of d-pads for me until the Xbox Elite controllers (unless you count third-party controllers). Dreamcast was the first "modern-design" analog + d-pad controller that I used and really liked, though the Xbox "S" controllers surpassed it in most regards and modern Xbox controllers are superior in all regards (especially the latest "S" model controllers with the rubberized grip, or the Elite with its fancy upgrades).
Saturn's twin main processors weren't "assigned to different tasks" so much as they were hard to feed simultaneously. They were both general-purpose Hitachi SH-2s, but they shared the same bus and couldn't access system memory at the same time. So if you wanted to keep both CPUs crunching, you had to make extremely careful use of the cache memory. On top of that, it had two VDPs which WERE completely different animals, and you had to figure out how to use both to do very different tasks and combine the result into something seamless. Then there were several other subprocessors which you needed to take advantage of to reduce the workload on the CPUs (including but not limited to a 20MHz SH-1 and an 11.3MHz 68K). Not to mention the heavily segmented memory pools. The raw power was greater than that of the PSX, but few developers and games were able to really harness it. That's why when they released the Dreamcast it was a much simpler design - it still had room for interesting tricks and optimizations, but it was far easier for the average developer to target.
I think the main reason the current console generation didn't have performance beyond what high-end gaming PCs could do is that we were just coming off a recession and they wanted their consoles to remain affordable. In every previous generation that wasn't an issue, so consoles could be released with specs and features that were simply not possible on PC.
It was more likely that Sony and MS were tired of eating the early manufacturing costs of custom hardware than any concern for the recession. The PS3 caused Sony to take a MASSIVE hit with every console sold for a couple of years, and in the past that had been a rather normal occurrence with consoles' custom chips/hardware. IIRC, the PS4 was the first PS console to not force Sony to take a loss with every console sold. They also apparently took note of developers complaining about how difficult it was to extract high performance from the Cell, and before it, the odd Emotion Engine/Graphics Synthesizer in the PS2, so they went with an architecture that just about everybody has experience with: a GCN GPU and an x86-64 CPU.
Actually, Sony took a small loss per unit on the PS4 because they outsmarted themselves. They agreed contracts in currencies other than the Japanese yen because they thought the yen would stay very strong; then Abenomics kicked in (the policies of Japanese Prime Minister Shinzo Abe), the yen devalued sharply (as was the intent of Abenomics, to stimulate the Japanese economy), and Sony suddenly found their attempt to squeeze out more profit had backfired.
It wasn't a big loss, and it was not their intent to lose money per sale, but they did still make a modest loss per unit. The XBOne was cost-neutral, IIRC.
@CharonPDX: By 'globally faster' he means 'I work for nVidia in the marketing department, and I'm spreading more FUD for my master, Jen-Hsun Huang.'
The 360 could be argued for on the GPU front, but the PS3 launched a year later and the 8800 GTX was out the same month, which was much more powerful. The Cell, ehh, still a mixed bag, as it always was.
The RSX, even launching a year later, was also decidedly behind the 360's unified shader + eDRAM Xenos.
Agreed. He's just cherry-picking stuff.
Numerically speaking, I agree. However, PCs and PC GPUs are not optimised specifically for gaming but designed to handle multiple tasks and work with lots of other hardware - essentially brute-forcing performance rather than being the most efficient at handling one specific task.
The Xbox 360 numerically couldn't match up to a high-end PC either, but because it was optimised specifically to handle gaming, it could out-perform its equivalent PC builds.
When DF went to Turn 10, they were shown the ForzaTech engine running Forza 6 at 4K with 4K textures in the most intensive situation - max cars, wet weather. With PC-equivalent Ultra settings, the GPU was running at 88% - still 12% extra available. On a GTX 1070 running Forza 6 Apex, this caused a few frame rate drops in a few places, but the GTX 1080 had no trouble. I know it's not necessarily an accurate comparison, but it does show that the 6 TFLOPS of the Scorpio can, with Forza 6, match a GTX 1080 for delivering ultra-setting, 4K, solid-60fps performance. The GTX 1080 is around 9 TFLOPS. The Scorpio may not match the GTX 1080 with other engines and games, but all the optimisation and customisation has enabled the Scorpio to punch much higher than its numerical specs would indicate compared to PCs - not unlike the XB360 did 10 years ago...
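For what it's worth, here's the napkin math behind those teraflop figures; a quick sketch assuming the commonly quoted shader counts and clocks (the GTX 1080 clock here is its reference boost spec, and real cards vary):

```python
# Peak FP32 throughput = 2 ops per shader per clock (fused multiply-add).
def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000

print(tflops(2560, 1.172))  # Scorpio: 40 CUs x 64 SPs at 1172 MHz -> ~6.0
print(tflops(2560, 1.733))  # GTX 1080 at reference boost -> ~8.9
```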
Xbox 360 outperform the equivalent PC? Hardly. For half the generation, games could run on hardware weaker than the Xbox 360. Games like Oblivion, BioShock, Unreal Tournament 3, Call of Duty 3/4/World at War, Gears of War, Assassin's Creed, Left 4 Dead, etc. All ran fine on hardware weaker than the 360.
Then for the other half of the generation, ports were running with higher details out of the box: higher resolutions, tessellation, higher framerates, ambient occlusion, improved lighting, shadowing, physics, larger player counts... higher-resolution textures, better AA and texture filtering... That's not all free, you know.
By the time Battlefield 3 launched the PC was looking almost a generation ahead of consoles.
PC games like Crysis (2007) basically showed what the PC could do during the early parts of that generation... Arguably Crysis graphically can still compete with the best looking Xbox 360 and Playstation 3 games even a decade after release.
******
As for PC GPUs not being "optimised for gaming" - that's a load of poop. They're basically the same GPUs as in the consoles. It's the software stack that deviates, sometimes significantly.
****
As for Forza 6 at 4k, 60fps. Remember. Digital Foundry stated that the game was running with Xbox One levels of visuals, but also had resources to spare. It's not directly comparable to the PC.
Conversely, there is more to performance than FLOPS. A GPU with more FLOPS can actually be beaten by a GPU with fewer FLOPS.
Meanwhile the more powerful Playstation 3 came with a 7800GTX variant, while basically the 8800 was released in literally the same month (and several months before the console came out in the european PAL variant). And literally twice as fast as the 7800 from last year that was one of the worst timing for console gaming ever, and ironically it lasted for ever before the PS4 in 2013 came out. You are listing basically the expectations and ironically those were not performing even better. The N64 was up against much faster 3DFX cards, sure without texture and lighting units, but butting out much higher resolutions and much faster graphics. Feel free to compare quake 2 on glide with 3dfx voodoo 1 cards vs the n64 version.
The cost of a PC in 1996 that meets the minimum system requirements for Quake 2 would have been ~10x the cost of the N64. And that's just the minimum requirements. Sources: http://www.computerhope.com/q2.htm https://books.google.com/books?id=NGNpFuAXu70C&...
You weren't talking about costs in the OP. Stay on topic.
The N64 was still more technologically advanced than any PC you could get in 1996. Just because you could get a PC for $2000 that meets the minimum system requirements for Quake 2, which the N64 could run for $200, doesn't mean that the N64 didn't have capabilities that absolutely no PC had at the time.
Quake 2 actually ran at higher quality already on anything that could run it under the Glide API ;-)
And posting complete PC prices instead of component prices makes you look silly. Prebuilts were, back in the day, even more overpriced than they are today. PC gaming has always been a lot cheaper for tinkerers who build their systems themselves.
Which holds true even today: a $170 Ryzen 1400, a $200 RX 480 and 250 bucks for RAM, board, PSU and a super cheap case, and you are within Scorpio specs already without trouble. People who have been building their own systems for a while usually don't need a new PSU or case either, and the peripherals are reusable from system to system, which at least seems to be true for consoles from the same brand as well.
Either way, more technologically advanced does not mean performing better. And in the case of the N64 that claim is hard to keep up, in the context that the 3dfx Voodoo was delivering quite the performance and offered higher resolutions than even the Wii ;-)
Here, have a direct comparison between Quake 64 and Quake on PC, and notice that the N64 version runs at a quarter of the resolution (320x240) instead of the 640x480 of the Voodoo 1. That's like the difference from 1080p to 4K. Though the even bigger issue with the N64 was the low amount of memory, which reduced the quality of basically everything: textures, map sizes, etc.
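The pixel arithmetic behind that comparison, for anyone checking it:

```python
# 640x480 is 4x the pixels of 320x240, the same factor as 1080p -> 4K.
n64, voodoo = 320 * 240, 640 * 480    # 76,800 vs 307,200 pixels
fhd, uhd = 1920 * 1080, 3840 * 2160   # 2,073,600 vs 8,294,400 pixels
print(voodoo / n64, uhd / fhd)        # 4.0 4.0
```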
https://www.youtube.com/watch?v=OEMs23ihkyA <<< N64
https://www.youtube.com/watch?v=P8qlE5XBopw <<< Voodoo 1
And finally, just for kicks, Quake 1 on an Amiga 1200 from 1992, in software mode and at 800x600, just because I ran Q1 myself without a Voodoo and at the same resolution on my first own x86, which was a 1000,-DM system, a little under $1000.
https://www.youtube.com/watch?v=YCs79Y7HgTY
Your "more technology advanced than any PC" N64 can not hold a candle against even a 4 year old gaming PC ;-)
I played the first two Quakes initially on my DX4 and Pentium 166, both a little older than 1996, just with the software renderer at first. As I can not find a good video for those, here, for the heck of it, is the 1997 Pentium 233 MMX running Quake 3 Arena with the software renderer; the 233 MMX is admittedly a little faster than the 1996 Pentium 200, but Q3A is dramatically more demanding as well ;-)
https://www.youtube.com/watch?v=UpA3NpYJSKs
Good times, when our monitors still capped out at 1280x1024 at around 75Hz to 100Hz; took the industry long enough to finally catch up in frequency. 85Hz was always my favorite back then, and these days I don't notice a difference when I go above that on my 144Hz screen, so I guess it was better balanced back then. ;-)
That's not true, the N64 doesn't have "hardware T&L" as in fixed-purpose hardware for processing transform and lighting (neither do current GPUs, as they use shaders for that). In fact it doesn't even have a traditional rendering pipeline like a modern GPU. The Reality Co-Processor was essentially a software-reprogrammable vector processor attached to a hardware rasterizer. It's both simpler and more flexible than the concurrently available PC 3D cards, but it's essentially a technical dead end because the technology wasn't developed any further, primarily because of claims that it was really difficult to program for (I have no experience programming for it, so I don't have first-hand knowledge). It also processed audio.
In short, very interesting architecture, but not ahead of the PCs at the time. Compare the PC and N64 versions of Turok Dinosaur Hunter, Doom or any of the other very few cross-platform games for examples.
Pretty sure that N64 did not have GPU hardware T&L, at least not the kind we would associate with the Geforce 256. Even a Voodoo 1 had overall superior 3d capabilities to N64, as we saw with the near perfect N64 emulation running UltraHLE on those cards.
What the consoles did have in that era were main CPUs and internal buses that were much more capable than the equivalent PC x86 architectures at generating triangles to feed their 3D rasterizers. But PC hardware development over the past couple of decades has been far more geared to multimedia/gaming than the business machines it originated as, and thus today's x86-64 CPUs + a GPU are really tough to beat for price/performance, even in games. Or the gaming consoles wouldn't be using them.
The N64 had hardware transform and lighting in the sense that the CPU did not have to do the T&L calculations. They were offloaded.
No they weren't. The T&L calc was done by the R4300, the main CPU. It's just that the main CPUs of consoles were much better at doing this than the x86 CPUs we had at the time feeding our 3D rasterizer cards.
Multiple sources disagree with you. The N64 CPU didn't have to do the T&L calculations. https://en.wikipedia.org/wiki/Transform,_clipping,... http://gliden64.blogspot.com/2013/09/hardware-ligh... http://xenol.kinja.com/the-nintendo-64-is-one-of-t...
I stand corrected. However, your statement that "no PC could match the N64 in 1996" is wrong, as the Voodoo 1 was out that year. Granted, you'd spend a buttload on one of them and a good equivalent PC compared to the N64, but you could indeed buy/build a more capable gaming PC then.
I'd love to see a Voodoo 1 card do real-time reflections like the N64 could.
No need to be snarky. There are also many things that the Voodoo 1 excelled at that the RCP didn't, the most obvious example being the N64's awful texture filtering.
And if it wasn't an order of magnitude more powerful than the N64, you never would have gotten UltraHLE to work on a Voodoo 1. Emulation at its very core requires hardware that is at least a decent amount more powerful than the system you are emulating, unless the hardware is nearly identical in architecture (which PCs weren't).
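That overhead is visible in the very shape of an emulator's core loop: every guest instruction costs the host a fetch, a decode and bookkeeping on top of the actual work. A toy sketch (the instruction set here is invented for illustration; UltraHLE famously cut this cost by intercepting high-level library calls instead of interpreting everything):

```python
# Minimal fetch-decode-execute interpreter: many host operations per
# guest instruction, which is why emulation needs a big speed margin.
regs = [0] * 8
program = [("li", 0, 5), ("li", 1, 7), ("add", 2, 0, 1), ("halt",)]

pc = 0
while program[pc][0] != "halt":
    op = program[pc]                      # fetch
    if op[0] == "li":                     # decode + execute
        regs[op[1]] = op[2]
    elif op[0] == "add":
        regs[op[1]] = regs[op[2]] + regs[op[3]]
    pc += 1                               # advance guest program counter

print(regs[2])  # 12
```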
Sorry for the extra additions making the flow seem a bit odd. As more and more of this story was passing my eyes, it was changing some of my assumptions as I went along and required new analysis. We'll reach out to MS and perhaps get some clarification on some of the details and publish a new piece in due course. Ideally I would love to get the same access DF had. I bet MS would love my questions...
That drive for additional depth and research is unquestionably why many of us frequent AT, so don't apologize for that.
Yes, exactly.
I really hope they have better exclusives this time, and that this launches at $499 just in time for my college freshman year.
I don't see there being any exclusives. Whatever they develop is going to be cross-platform, but not really cross-platform; that's the point of shifting to a PC-in-a-box, probably at a financial loss. Develop the game for the PC. Benchmark the game on the Scorpio. Ship the game with locked graphics settings. Profit.
Install base has a bigger impact on getting exclusives than specs. If this turns out to be a great UHD player and D2 has cross-play, I might get one.
Impressive. Microsoft will likely keep this lead for several years; I can't see Sony working on a PS4 Pro Pro. It really seems like they've figured out how to learn from and correct their mistakes in many aspects of their business in the last few years. Scorpio will be far from perfect, but it looks like an impressive setup.
The PS4 released in late 2013. The PS4 Pro released exactly 3 years later. The PS5 will probably follow this cadence and release 3 years after that, meaning availability in time for the 2019 holidays.
When speaking of Microsoft's lead, if by "several years" you mean 2, then you're right.
2 years and Sony drops their next console, and M$ drops one the year after that. I see what the plan is. Well played, M$.
What lead? What are they leading at?
Don't get the fanboy started. PlayStation has mopped the floor with the Xbox for several years now and has an insurmountable lead. Microsoft's strategy of being bigger but late will likely not pay off, but don't tell them that.
They say in the video it's Jaguar cores, and there was no time to use Zen anyway. The SP count is clearly 2560, 64 per CU, and that fits the stated FLOPS too given the clocks. Your napkin-math power numbers are way off, as the box is more than just the SoC, the CPU is not 8 Zen cores, and the GPU is lower-clocked on a better process and with more GDDR5. I would be surprised if the SoC goes above 100W.
So the comment about 'two blocks of cores' doesn't trigger your Zen at all?
Sure, the box is more than just the SoC - my numbers include the DRAM as well (you might have missed that). But if you're saying there's 100W for the SoC, then there's 145W for everything else. 'Everything else' - DVD, network, audio, wifi - probably doesn't amount to more than 30W? (You're unlikely to be using the DVD drive and the full SoC at the same time.) Perhaps my overall number is high, but not as far off as you're thinking.
The PSU would not hit its upper limit at any time in usage; it likely peaks around 80% in a worst-case scenario - the percentage depends a bit on the engineer designing the system and the efficiency curve of the PSU used. So you've got the mobo (including power conversion for the SoC), the RAM (including the GDDR5), HDD, optical, USB (a bunch, for VR and other uses), wifi and more. However, my 100W estimate was based on Jaguar, GPU size, clocks and process, not on PSU dimensions. For Zen, there was no time (even less so to customize it), and 2.3GHz means 0.7V or so, which is wasteful, as on 14nm efficiency for Summit Ridge is great up to 1V and about 3.3GHz.
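A sketch of that power-budget arithmetic; every wattage below is an assumption for illustration, not a measured or published figure:

```python
# Back-of-envelope budget: PSU rating, worst-case draw, then what's left
# for the SoC after the assumed peripherals are subtracted.
psu_rating = 245                      # assumed PSU rating in watts
peak_draw = 0.8 * psu_rating          # ~80% of rating in a worst case
others = {"GDDR5": 20, "HDD": 8, "optical": 5,
          "USB/VR": 15, "wifi/misc": 5, "VRM losses": 15}  # all guesses
soc = peak_draw - sum(others.values())
print(f"peak ~{peak_draw:.0f}W, leaving ~{soc:.0f}W for the SoC")
```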
I do wonder about the L3, but I've yet to form an opinion on that. Likely no Infinity Fabric with Jaguar and Polaris. I do think it is interesting that they run Win 10 on GDDR5 without any DDR3/4. For an HBM-based APU it would be important to be able to use HBM as system DRAM, cost-wise for OEMs.
The current consoles already use 'two blocks of cores': two clusters of four Jaguar cores each. There are mistakes in the table as well. Forty CUs (Compute Units) would total 2560 SPs (Stream Processors), not the 1920 shown in the table, and the current Xbox One CPU runs at 1.75GHz, not 1.6GHz. Also, the current consoles already use APUs, so I'm not sure about the speculation of there being no new silicon design involved. The "slim" versions of both consoles already have Jaguar cores ported to a FinFET process.
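The CU-to-SP arithmetic in question, which also ties out against the stated 6 TFLOPS:

```python
cus = 40
sps = cus * 64                        # 64 SPs per GCN CU -> 2560, not 1920
clock_mhz = 1172                      # Scorpio's stated GPU clock
tflops = 2 * sps * clock_mhz / 1e6    # 2 FP32 ops per SP per clock (FMA)
print(sps, round(tflops, 1))          # 2560 6.0
```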
Are they connected like a CCX? Not asking to be snarky; asking because if so, that would seem to debunk the claim that cross-CCX traffic hits performance. Or would that be different because now it's over Infinity Fabric? Either way, wouldn't nearly all devs have years of experience programming around that?
Wow, nice write-up, a lot of information for sure. At this point there are a lot of unanswered questions, but we are slowly getting there. I do hope it has some sort of Zen core setup in it, as well as Vega of some sort. Those old Jaguar cores need to be put on a shelf to collect dust; nothing says low-end like Jaguar- and Atom-class CPUs, and in a fresh new console design you don't want your shiny new toy to be deemed low-end right out of the gate.
Buzzwords like Zen and Vega are all good for PR marketing; even if we all know they'd be cut down or running slower than the desktop versions, the PR department can spin it in good ways. I also do not think MS wants to go through another release like they did with the first Xbox One, where it was deemed slower right at launch; they want something that wows everyone this time around. That's my take on it, anyway.
There's only so much "awesome" to be had with it, it's still running the same library as the OG XBone so it is crippled out of the gate the same way the PS4 pro is.
Is using GDDR5 for the CPU going to affect the compute side of things much? I remember reading a while back that lower latency RAM (like DDR3) was better for the CPU but I don't know how that has held up.
DDR3 has lower latency in terms of cycles (8 on average, compared to an average of 15 for GDDR5); that's almost double. But due to clock speeds, GDDR5 ends up with lower latency in absolute time, as its clocks are more than double those of DDR3.
Latency as measured in cycles is higher, but the clocks are so much higher that latency in absolute time isn't worlds apart. The memory controller and caches should make the impact negligible too. The PS4 had great success with GDDR5 shared system memory.
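Putting rough numbers on that; the timings below are typical retail examples, not the consoles' actual memory parameters:

```python
# Latency in nanoseconds = CAS cycles / command clock in MHz * 1000.
def cas_ns(cas_cycles, clock_mhz):
    return cas_cycles / clock_mhz * 1000

print(cas_ns(11, 800))   # DDR3-1600 at CL11 -> ~13.8 ns
print(cas_ns(15, 1375))  # 5.5 Gbps GDDR5 (1375 MHz command clock), CL15 -> ~10.9 ns
```

More cycles, but the absolute times land in the same ballpark.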
Doubtful GF has the capacity to be able to serve them on top of their existing customers' requirements. A game console that will sell tens of millions of units with a relatively large die eats up a lot of wafers.
This thing won't sell tens of millions. It's going to be a "hardcore" elite system. Just like with the PS4 Pro, most sales even after its launch will be Xbox One S's, not this one.
I think this entire article needs to be revised, because so much of the speculation is no longer speculation and confirmed by Microsoft in Digital Foundry's articles. Scorpio will have a single CPU+GPU die, just like Xbox One and PS4/PS4 Pro. It will be built on 16-nm FinFET, just like Xbox One S (and PS4 Slim/PS4 Pro). The custom x86 cores are confirmed to be based on Jaguar, which makes sense for backwards-compatibility, but modified for higher performance (exactly 31% better IPC compared to Xbox One, according to MS). The power supply will also be an internal one, meaning no large external power brick (continuing that from the Xbox One S, thank goodness).
I wrote a comment right at the start of the comments section about this.
Just to pick you up on one comment though - '31% better IPC'. If these cores are still Jaguar, the IPC is the same no matter what the frequency. DF didn't state IPC, and the clock difference is 31%, which goes along with '31% better performance'.
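The arithmetic checks out: CPU performance scales roughly as IPC times clock, and the stated clocks alone cover the 31%:

```python
xb1, scorpio = 1.75, 2.3                    # stated CPU clocks in GHz
print(f"{(scorpio / xb1 - 1) * 100:.0f}%")  # 31% faster at identical IPC
```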
I stopped here, bro. Even on desktop there is no single dedicated GPU that can drive 4K at ultra details while maintaining 60 FPS, so how about a crappy console that costs half of what that video card alone costs?!
The thing you're probably overlooking, brosef, is that consoles typically have one and only one set of display settings that can't be altered by the user. Broski, that means that every time you play a console game, you're gaming at the system's max settings. You're effectively gaming at "Ultra" all the time which is awesomesauce, my personal brodaka-brodina-brodudebro.
And anyway, bromance, Ultra varies from game to game so it means a lot of different things. It'd be a massive bromistake to argue for or against something on the basis of such a poorly defined term. *high-bro-fives*
I know someone is going to say, "Well gee BrokenCrayons, if you can't change the settings on a console game, then aren't you also playing at the game's lowest possible settings too?" It's a valid point and to address it, I refer you to the famous Brodinger's Cat. In essence, the settings are both at ultra and at low at the same time. It's only when the gamer perceives them as one or the other that they take on the characteristics of bro-ultraness or bro-lowness. So then, in order to be a console gamer that plays at ultra, you merely have to assert the settings are at ultra and let quantum mechanics do the rest.
Wow, I don't really know what to say, except that I'm glad that you got a laugh out of that. The cat's probably happy too, but I'm not really sure since its quantum state hasn't been observed recently so we have no way of knowing.
Ultra's just a conspiracy to push recommended specs higher. Why else would they always include tessellation? Why are there three console companies? And don't tell me you need a better raster engine to connect the dots.
I love how you make a point while being this snarky :D
Anyway, ultra is pretty meaningless, as resolution always trumps anti-aliasing, for example, and many other detail levels as well. The best reason to still have all of this is marketing stills, and players who have already reached the maximum of their hardware.
This old wisdom might finally break at 6K or 8K, but it certainly holds up at QHD and UHD. Deactivating anti-aliasing is always worth it, and some useless bloom or motion blur is pointless too. Motion blur effects are especially useless, as real motion blur on 80+ Hz screens is, well, real, and thus not only more realistic but also a lot more in line with your perception, immersion and game experience.
I guess you could have said that at the very beginning of this generation of consoles with the original PS4 and Xbox One. But I think most would agree that the consoles switching to an x86 architecture was and is a "good" thing for PC gaming and gaming overall.
Well yes, I mean all but Halo 5 and the Halo collection have ended up on PC through the Windows Store and/or the Play Anywhere promotion, which was very beneficial to me. I guess it's a good way to get Windows 10 into the home, but perhaps it would be more beneficial not to lock it down?
What would the benefit be to Microsoft? A locked down system means close to zero piracy, and game sales are where they make all their money on a console. Also, if it was just a regular Windows 10 PC where you could install Premiere Pro and After Effects and use it as a low-end editing rig, why would anyone buy a much more expensive PC with similar specs? The box is only as cheap as it is because Microsoft is banking on you buying lots of games to play on it.
And because it uses an absurd balance of a $30 CPU, $150 GPU, and $50 RAM (obviously not what MS pays for any of those pieces).
Let's be real, this would be even worse than all the people who got the Pentium Anniversary Edition and a GTX 970, and by a major company. It only works because it's a console.
On the one hand, it's one step again above the PS4 Pro, in raw performance at least. On the other hand, the architecture changes (memory in particular) from the XB1 mean that, unlike the PS4 Pro, cross-compatibility is not automatic. The PS4/Pro situation is pretty easy for consumers: you buy a PS4 game, and it works on both the PS4 and PS4 Pro, always. If this is not the case with the Scorpio and you need to pay close attention to compatibility, that's going to be annoying.
I didn't see anything to suggest that games would have to be specially compiled to run on Scorpio vs. regular Xbox One, and in fact requiring this would essentially make this a whole new console, not a performance update to an existing console ... so I ask, what makes you think this is how things will work?
The different memory architecture: anything using the fast ESRAM no longer has that available. Microsoft have ruled out Scorpio-only SKUs - at least according to the whitepaper Eurogamer received - so developers MUST optimise for the presence of ESRAM or suffer a performance hit on XB1. Back-compat using the GDDR5 to 'emulate' the ESRAM will be a performance bottleneck, as the two behave very differently (and would break anything that implicitly times down to the instruction level).
You're left with a few options: older games need to be reoptimised or suffer a memory bottleneck. New games face either being optimised for one or the other (optimising for the XB1 is the obvious choice) or the extra effort of having two different memory management schemes.
Or you drop implicit back-compat, and use a system similar to XB1's 360 back-compat, and only work for whitelisted titles.
Everything said so far has made it clear all games will work completely on both systems, with no issues going forward. I will take the highly qualified Microsoft engineers at what's been posted so far.
From the DF article, to completely refute your claims of any issues:
"For the record, ESRAM is indeed surplus to requirements in the Scorpio design, as we revealed a couple of months back. "The memory system we've got, we've got enough bandwidth to more than cover what we got from ESRAM," explains Nick Baker. "We simply go and use our virtual memory system to map the 32MB of physical address that the old games thought they got into 32MB in the GDDR5. So latency is higher, but in terms of aggregate performance, the improved bandwidth and improved GPU performance means we don't hit any issues."
This is important: ESRAM has extremely low latencies, while GDDR5 is great at bulk bandwidth. If you are expecting only GDDR5, then you write memory operations to take advantage of that: access big chunks in one go and cram as much as possible of what needs low-latency access into on-chip cache. If you are writing with ESRAM in mind, you take advantage of its ability to handle lots of small accesses with low latency. If you then suddenly lose that ESRAM, you have an extra latency hit applying to a lot of small operations. You either rewrite to batch things and access large chunks, or you have a performance bottleneck.
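Conceptually, what Baker describes is just page-table indirection: the 32MB the old game believes is ESRAM gets pointed at GDDR5-backed pages instead. A toy sketch of the idea (the page size and base addresses here are arbitrary, not the console's real memory map):

```python
PAGE = 4096
ESRAM_BASE, GDDR5_BASE = 0x8000_0000, 0x2000_0000  # made-up addresses

# One page-table entry per 4K page of the 32MB "ESRAM" window.
page_table = {ESRAM_BASE + i * PAGE: GDDR5_BASE + i * PAGE
              for i in range(32 * 1024 * 1024 // PAGE)}

def translate(addr):
    base, offset = addr - addr % PAGE, addr % PAGE
    return page_table.get(base, base) + offset  # unmapped pages unchanged

print(hex(translate(0x8000_0123)))  # 0x20000123: same address to the game,
                                    # different (higher-latency) memory behind it
```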
The "performance bottleneck" you speak of is just "less better performance than could be expected if not trying to use ESRAM in that way". So yeah, games developers who specifically optimize for ESRAM are going to have a game that runs as well as it could on normal XBox One, and not quite as fast as it could (theoretically, this hasn't been proven yet) on Scorpio.
Games not optimized to use ESRAM are going to possibly be slower on XBox One, but theoretically faster on Scorpio than they would be otherwise.
The question is, how much worse is performance on XBox One if you don't try to use ESRAM, how much better is it on Scorpio, and which do you want to optimize for?
They seem to be suggesting that the differences are marginal anyway. If it were me, I'd probably start out assuming that it's better to optimize for the XBox One, and trust that the design of Scorpio isn't going to penalize me to any meaningful degree for doing so.
So latency is higher, but *every* single game ever made (so far) hasn't needed it, I can't believe we wasted so much of the chip on useless memory.
It should be embarrassing when Sony* makes better hardware than you do.
* Yes, I'm old enough to remember the "Back to the Future" line "all the best stuff is made in Japan" being funny (cause it was true). But nobody has wanted a Trinitron for a long time.
Who cares! The idea behind consoles such as the XBox and PlayStation is that a game made for the device plays on it, and the user does not have to worry about how strong or sophisticated the hardware is. So it's totally irrelevant what an XBox is made of. There is no choice when it comes to consoles, at least not after the buyer decides between the two, and he does not need to care what is driving the unit either. And given that the new games will almost certainly be incompatible with their current consoles, the users really do not have any choice but to keep buying these products as long as they want to play games. So people will buy the units even if Microsoft or Sony put salami inside them, just as long as they can play the next game on it.
Without a certain level of hardware, you don't get certain experiences. Sure, people buy for experience, not hardware, but they're not mutually exclusive. Especially as the depth of experiences required to satiate a user goes up.
The depth-of-experience argument only works for the PC, not the consoles. Like reinstalling a game from two years back that did not get past the opening screen, after upgrading or buying a new PC. To be more precise: first comes the experience, and then the depth of it starts to vie for your attention. In the case of the PC, there will not be any experience to begin with when the PC is older than two years. The console gamer will always get his experience, which is what consoles are for, and that is enough. There will be a few "wow"s after they have bought their new console, but it will soon go away and nobody pays any attention to how well textured the hero's clothing or the alien's skin is; he gets on with playing the game, because the game plays well and smooth, because that is what consoles are there for, and because he does not have to worry about the power of the hardware in the box. How good the game actually is, is another story. Not many good games recently.
The 2-year replacement cycle on PCs has been extended significantly since the i-series CPUs were introduced. My PC is 6 years old and gamed at 1080p ultra, and later, with a new video card, at 1440p ultra. I have a 2nd-gen i7.
"(...) it will soon go away and nobody pays any attention to how well textured the hero's clothing or the alien's skin is (...)"
I personally dwell on such aspects, and it is not something that decreases over time. These days, it does appear that caring about such detail may be a challenge for more than a few, but simply giving up on it, or underestimating everyone, is not an acceptable approach.
This is patently false. All Xbox games will work equally on both. Why can't people get this? It's no different from a PC gamer playing BF1 at 1080p, medium settings, right next to a guy playing at 4K and high settings. It's the exact same game content and gameplay; it just looks better. This will be no different.
Apparently XBox One games will run on Scorpio, but I could not find anything that confirms that Scorpio games will run on an XBox One, albeit at a lower resolution.
I highly highly doubt that Microsoft is going to allow games to be released that only work on one platform and not the other. They will require strict compatibility across both platforms, so that no end user ever has to be presented with the situation where a game they bought doesn't work on their system.
It's interesting that a new next-gen console would be using a very low-horsepower CPU. It indicates we've plateaued in CPU requirements and GPU performance is the main feature driver. Silicon area and power consumption also match that assumption.
The PS2 and PS3 were interesting: sold below cost, and at launch they seemed to be the cheapest way to get a DVD or Blu-ray player, respectively. These new consoles aren't much cheaper than just buying equivalent off-the-shelf parts; there doesn't seem to be much HW innovation left in this area.
"AMD technically has several cores potentiall available for Scorpio: Excavator (Bulldozer-based, as seen on 28nm), Jaguar-based (also from 28nm) or Zen based (seen on 14nm GF)."
How about Puma? While still Bobcat-based, you did make the distinction between Excavator and Bulldozer.
Thank you for the insightful article, Ian. It'd be interesting to see how the memory crossbar situation plays out in practice. It's refreshing to read a piece on the subject that is not PR-heavy. I almost fear for Digital Foundry's integrity - I understand that the opportunity Microsoft gave them might have been irresistible, but still their piece is too enthusiastic and reads almost entirely like pure PR; it was a bit strange seeing them do that given their previous work. I'm sure it's just a one-off, though.
What's disappointing to me about Scorpio is that it uses a blower, whereas both the Xbox One and One S utilized open-air coolers. Sure, it looks like it'd be a nice blower, but even if it's on par with NVIDIA's reference design, it'd still be louder than what's come before. Apparently Microsoft is obsessing over size, which at least to me is less important than noise. A prettier box surely sells better than the vague and hard-to-explain premise of quietness.
It's virtual memory mapping that makes it possible. You don't really know where allocated memory is physically placed. In fact, it gets fragmented as you create and destroy objects, even on PCs.
Digital Foundry just released a bit more info about the Scorpio, namely that it supports HDMI 2.1 with VRR and also FreeSync. Which is pretty awesome. Microsoft seems to have checked every box that it had to with this hardware.
The 360's eDRAM was a bit more interesting than it's usually credited for: since the ROPs actually sat on the daughter die with the eDRAM, the bandwidth to the ROPs was drastically higher than the external bus, in the league of 200GB/s. Granted, that's not to the whole GPU, only the ROPs, but those are the bandwidth eaters.
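The arithmetic shows why an on-die link gets into that league; the 2048-bit width below is an illustrative assumption, while the 128-bit case matches the 360's known external GDDR3 bus:

```python
# Bandwidth = bus width in bytes x transfer rate.
def gbs(bus_bits, mtps):
    return bus_bits / 8 * mtps * 1e6 / 1e9

print(gbs(2048, 500))   # a wide on-die path at 500 MHz -> 128 GB/s
print(gbs(128, 1400))   # 360's 128-bit GDDR3 at 1400 MT/s -> 22.4 GB/s
```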
takeshi7 - Thursday, April 6, 2017 - link
"Consoles have long development processes, and are thus already behind the curve at launch"That's only true for the most recent generation of consoles. most other generation of consoles usually launched with more advanced hardware than PCs, but PCs quickly catch up. First GPU with hardware T&L was the N64 in 1996. PCs didn't get that until the Geforce 256 in 1999. First unified shader GPU was the Xbox 360 in 2005. PCs didn't get unified shaders until a year later. etc.
fred666 - Thursday, April 6, 2017 - link
you can pick individual specs but the fact remain that high end PCs are globally faster than console at launch, and the gap only widen with time. Console are cheaper than high end PCs, however.CharonPDX - Friday, April 7, 2017 - link
As takeshi7 said, this is a new thing. Even the Xbox 360 / Playstation 3 were better at launch than even an ultra-high-end gaming PC was. It has only been with the Xbone/PS4 that they were slower than PC at launch.Also, not sure what you mean by "globally faster".
Frenetic Pony - Friday, April 7, 2017 - link
The PS3 wasn't, but only because it was about as fast as a 360 a year later, still very good for the time though. But today, 12 years after the 360 launch GPUs have gotten much bigger in terms of mm^2 and now you can have multiple of them cause why not? So today we've pegged Vega's launch at > double the relative straight compute power of the (to be released later) Xbox Scorpio.People have proven they'll pay for giant GPUs with giant cooling blocks, so Nvidia and AMD are more than happy to sell it to them.
Samus - Friday, April 7, 2017 - link
I agree. This is a new problem for consoles to be "outdated" when they are essentially now based off PC hardware (or in Nintendo's case, mobile hardware!). Before, it was more specialized and kept a secret, especially in the 90's when we had no idea what console manufacturers were up too. It was speculated for years Nintendo would go to CD-ROM but didn't. The Dreamcast surprised everyone with a proprietary optical format...and the Saturn had dual processors that were a bitch to program for since they were each assigned to different tasks. I still think Sega goes down as the most interesting console developer (with Nintendo close behind) because of their sheer unpredictability and rampant product developement cycles. The Genesis had numerous versions and 2 substantial upgrade components, not to mention four controller designs, in a period shorter than most modern console generations. Nintendo kept it simple, only ever making one global upgrade accessory for a console (the N64 RAMBUS expansion pack...although the N64 DD did launch in limited quantities in Japan as did the Nintendo Satellite network, but Nintendo never planned on launching these outside of Japan's limited audience)Where Nintendo had always been revolutionary was with controllers. The first of its kind D-PAD controller, the first controller with shoulder buttons, whatever the hell the N64 controller was and how amazing it is in retrospect, and obviously the Wii. Even their underdog controllers like the NES Max and the 'dog bone' NES controllers are about as close to perfection as a basic 8 bit controller gets.
darkich - Saturday, April 8, 2017 - link
Well to be precise, Switch is only partially based on mobile hardware (ARM CPU), but the GPU is of the same architecture that is used in desktop parts..the fact that Tegra is a mobile chip doesn't change the fact Maxwell is an universal architecture designed to scale across platforms.Alexvrb - Tuesday, April 11, 2017 - link
I own a Genesis, a 32X, and a Sega CD as well. Amazing add-ons, the likes of which were rarely seen for consoles even in those days. They even ran a few gaming gems that might not otherwise have existed.For the Genesis, what four controller designs are you referring to? There's only two first-party Sega controllers, unless you include the Arcade Sticks or other specialized accessories like the Flight Stick or Enforcer. Saturn had three main first-party designs. The N64 controller... I hated. It was an advanced design but overall it ended up being an awkward design I had to put up with for years. The analog stick in particular was a pile of junk after a few years of use. Even the analog stick ("3D") variant of the Saturn controller was better, though for 2D games I preferred the standard Model 2 Saturn controller - it was the gold standard of d-pads for me until Xbox Elite controllers (unless you count third-party controllers). Dreamcast was the first "modern-design" analog + d-pad controller that I used and really liked, though Xbox "S" controllers surpassed it in most regards and modern Xbox controllers are superior in all regards (especially the latest "S" model controllers with the rubberized grip or the Elite with it's fancy upgrades).
Saturn's twin main processors weren't "assigned to different tasks" so much as they were hard to feed simultaneously. They were both general-purpose Hitachi SH-2s, but they shared the same bus and couldn't access system memory at the same time. So if you wanted to keep both CPUs crunching, you had to make extremely careful use of the cache memory. On top of that, it had two VDPs which WERE completely different animals, and you had to figure out how to use both to do very different tasks and combine the result into something seamless. Then there were several other subprocessors which you needed to take advantage of to reduce the workload on the CPUs (including but not limited to a 20Mhz SH-1 and a 11.3Mhz 68K). Not to mention the heavily segmented memory pools. The raw power was greater than that of the PSX, but few developers and games were able to really harness it. That's why when they released the Dreamcast it was a much simpler design - it still had room for interesting tricks and optimizations, but it was far easier for the average developer to target.
takeshi7 - Friday, April 7, 2017 - link
I think the main reason the current console generation didn't have performance beyond what high end gaming PCs could do is that we were just coming off of a recession and they wanted their consoles to remain more affordable. But every previous generation that wasn't an issue so they'd be released with specs and features that were simply not possible on PC.blppt - Sunday, April 9, 2017 - link
It was more likely that Sony and MS were tired of eating the early manufacturing costs of custom hardware than any concern for the recession. PS3 caused Sony to take a MASSIVE hit with every console sold for a couple of years, and in the past that had been a rather normal occurance with consoles' custom chips/hardware. IIRC, the PS4 was the first PS console to not force Sony to take a loss with every console sold. They also apparently took a little concern with developers complaining about how difficult it was to extract high performance from the Cell, and before it, the odd emotion engine/graphics synthesizer in PS2, so they went with an architecture that just about everybody has experience with, a GCN GPU and an X86-64 cpu.h4rm0ny - Saturday, April 15, 2017 - link
Actually Sony took a small loss per unit on PS4 because they outsmarted themselves. They agreed contracts in other than the Japanese Yen because they thought the Yen would be very strong and then Abeonomics kicked in (policies of the Japanese Prime Minister Shinzo Abe) and the Japanese Yen devalued sharply (as was the intent of Abeonomics to stimulate the Japanese economy) and Sony suddenly found they're attempt to squeeze out more profit had backfired.It wasn't a big loss and it was not their intent to lose money per sale, but they did still make a modest loss per unit. XBOne was cost neutral, iirc.
anubis44 - Sunday, April 9, 2017 - link
@CharonPDX: By 'globally faster' he means 'I work for nVidia in the marketing department, and I'm spreading more FUD for my master, Jen Hsun Huang.'tipoo - Wednesday, April 19, 2017 - link
The 360 could be argued for on the GPU front, but the PS3 launched a year later and the GTX 8800 was out the same month, which was much more powerful. The Cell, ehh, still a mixed bag as it always was.tipoo - Wednesday, April 19, 2017 - link
The RSX, even launching a year later, was also decidedly behind the 360s unified shader + EDRAM Xenos.
Gasaraki88 - Friday, April 7, 2017 - link
Agreed. he's just cherry picking stuff.BAMozzy - Saturday, April 8, 2017 - link
Numerically speaking, I agree. However PCs and PC GPUs are not optimised for gaming specifically but designed to handle multiple tasks and work with multiple other hardware. Essentially brute forcing performance rather than being the most efficient at handling one specific task.The Xbox 360 numerically couldn't match up to a high end PC either but because it was optimised soecifically to handle gaming, it could out-perform its equivalent PC builds.
When DF went to Turn 10, they were shown the ForzaTech engine running Forza 6 at 4k with 4k textures at the most intensive situation - max cars, wet weather. With PC equivalent Ultra Settings, the GPU was running at 88% - still 12% extra available. On a GTX 1070 running Forza 6 Apex, this caused few frame rate drops in a few places but the GTX1080 had no trouble either. I know its not necessarily an accurate comparison but it does show that the 6tflops of the Scorpio, with Forza 6, can match a GTX1080 for delivering a ultra setting, 4k solid 60fps performance. The GTX1080 is around 11tflops. The Scorpio may not match the GTX1080 with other engines and games but all the optimisation and customisation has enabled the Scorpio to punch much higher than its numerical specs would indicate compared to PCs - not unlike the XB360 did 10yrs ago...
StevoLincolnite - Sunday, April 9, 2017 - link
Xbox 360 outperform the equivalent PC? Hardly. Half the Generation the games could run on weaker hardware than the xbox 360.Games like Oblivion, Bioshock, Unreal Tournament 3, Call of Duty 3/4/world at war, Gears of War, Asassins Creed, Left 4 Dead etc'. All ran fine on hardware weaker than the 360.
Then the other half of the generation ports were running with higher details out of the box. Higher resolutions, Tessellation, Higher framerates, ambient occolusion, improved lighting, shadowing, physics, larger player counts... Higher resolution textures, better AA and Texture filtering... That's not all free you know.
By the time Battlefield 3 launched the PC was looking almost a generation ahead of consoles.
PC games like Crysis (2007) basically showed what the PC could do during the early parts of that generation... Arguably Crysis graphically can still compete with the best looking Xbox 360 and Playstation 3 games even a decade after release.
******
As for PC GPU's not being "Optimised for Gaming". - That's a load of poop. It's basically the same GPU's in consoles. It's the software stack that deviates and sometimes significantly.
****
As for Forza 6 at 4k, 60fps. Remember. Digital Foundry stated that the game was running with Xbox One levels of visuals, but also had resources to spare. It's not directly comparable to the PC.
Converesly, there is more to performance than flops. A GPU with more flops can actually be beaten by a GPU with less flops.
tipoo - Wednesday, April 19, 2017 - link
iirc Oblivion was pretty hard to run at launch, or at least max out. I remember people getting like x1900s for it.Granted that could be more about developers knowing what to cull back on fixed hardware consoles.
zavrtak - Thursday, April 6, 2017 - link
Meanwhile the more powerful Playstation 3 came with a 7800GTX variant, while basically the 8800 was released in literally the same month (and several months before the console came out in the european PAL variant). And literally twice as fast as the 7800 from last year that was one of the worst timing for console gaming ever, and ironically it lasted for ever before the PS4 in 2013 came out.
You are listing basically the expectations and ironically those were not performing even better. The N64 was up against much faster 3DFX cards, sure without texture and lighting units, but butting out much higher resolutions and much faster graphics.
Feel free to compare quake 2 on glide with 3dfx voodoo 1 cards vs the n64 version.
takeshi7 - Thursday, April 6, 2017 - link
The cost for a PC in 1996 that meets the minimum system requirements for quake 2 would have been ~10x the cost of the N64. And that's just the minimum requirements.Sources: http://www.computerhope.com/q2.htm
https://books.google.com/books?id=NGNpFuAXu70C&...
Death666Angel - Friday, April 7, 2017 - link
You weren't talking about costs in the OP. Stay on topic.takeshi7 - Friday, April 7, 2017 - link
The N64 was still more technologically advanced than any PC you could get in 1996. Just because you could get a PC for $2000 that meets the minimum system requirements for Quake 2 that the N64 could run for $200 doesn't mean that the N64 didn't have higher capabilities that absolutely No PC had at the time.zavrtak - Sunday, April 9, 2017 - link
Quake 2 actually run in higher quality already with anything that could run under the glide API ;-)And posting complete PCs instead of competent prices makes you look silly.Those were back in the days even more overpriced than they are today. PC gaming always has been a lot cheaper for thinkers who build their systems themselves.
Which olds true even for today, $170 Ryzen 1400, a $200 RX480 and 250 bucks for ram, board, psu and a super cheap case and you are within scorpio specs already without trouble and people who build for a longer time already their systems usually don't need a new PSU or case either, the peripherals are reusable from system to system as well, which at least seems to be true for consoles from the same brand as well.
Either way, more technology advanced those not mean performing better. And in case of the N64 that claim is hard to keep up in context that the 3Dfx voodoo was delivering quite the performance and offered higher resolutions than even the Wii ;-)
Here have direct comparison between between Quake 64 and Quake PC and notice that the n64 version runs at half a quarter of the resolution (320x240) instead of the 640x480 of the voodoo 1. That's like the difference from 1080p to 4k. Though the even bigger issue with the n64 was the low amount of memory which reduced quality of basically everything, textures, map sizes, etc
https://www.youtube.com/watch?v=OEMs23ihkyA <<< n64
https://www.youtube.com/watch?v=P8qlE5XBopw <<< Voodoo 1
And finally, just for the kickers, quake 1 on a Amiga 1200 from 1992 in software mode and 800x600, just because I run q1 myself without voodoo and in the same resolution on my first own x86 which was a 1000,-DM system, a little under $1000.
https://www.youtube.com/watch?v=YCs79Y7HgTY
Your "more technology advanced than any PC" N64 can not hold a candle against even a 4 year old gaming PC ;-)
I played the first two quakes initially on my dx4 and pentium 166, both a little older than 1996, just with software renderer initially, as I can not find a good video for those, for the heck of it, the 1997 Pentium 233 Mmx in quake 3 arena in software renderer, the 233mmx is admittedly a little faster than the 1996 pentium 200, but Q3A is dramatically more demanding as well ;-)
https://www.youtube.com/watch?v=UpA3NpYJSKs
Good times, when our monitors still capped out at 1280x1024 at around 75hz to 100hz, took the industry long enough to finally catch up in frequency. 85hz was always my favorite back then and these days I don't notice a difference when I go above that on my 144hz screen, so I guess it was better balanced back then. ;-)
zavrtak - Sunday, April 9, 2017 - link
Should be "the n64 version runs at a quarter of the resolution", sorry did not catch that little mistake. No edit button is a pain.Flunk - Thursday, April 6, 2017 - link
That's not true, the N64 doesn't have "Hardware T&L" as in fixed-purpose hardware for processing transform and lighting (neither do current GPUs as they use shaders for that). in fact it doesn't even have a traditional rendering pipeline like a modern GPU. The reality co-processor was essentially a software-re-programmable vector processor attached to a hardware rasterizer. It's both more simple and more flexible than the concurrently available PC 3D cards, but it's essentially a technical dead-end because the technology wasn't developed any further, primarily because of claims that it was really difficult to program for (I have no experience programming for it so I don't have first-hand knowledge). It also processed audio.In short, very interesting architecture, but not ahead of the PCs at the time. Compare the PC and N64 versions of Turok Dinosaur Hunter, Doom or any of the other very few cross-platform games for examples.
blppt - Thursday, April 6, 2017 - link
Pretty sure that N64 did not have GPU hardware T&L, at least not the kind we would associate with the Geforce 256. Even a Voodoo 1 had overall superior 3d capabilities to N64, as we saw with the near perfect N64 emulation running UltraHLE on those cards.What the consoles did have in that era were main cpus and internal buses that were much more capable than the equivalent PC x86 architectures at generating triangles to feed their 3d rasterizers . But PC hardware development over the past couple of decades has been far more geared to multimedia/gaming than the business machines it originated as, and thus, today's x86/64 cpus + a GPU are really tough to beat for price/performance even in games. Or the gaming consoles wouldn't be using them.
takeshi7 - Thursday, April 6, 2017 - link
N64 had hardware transform and lighting in the sense that the CPU did not have to do the T&L calculations. They were offloaded.blppt - Thursday, April 6, 2017 - link
No they werent. The T&L calc was done by the R4200, the main cpu. Its just that the main cpus of consoles were much better at doing this than the x86 cpus we had at the time feeding our 3d rasterizer cards.takeshi7 - Friday, April 7, 2017 - link
Multiple sources disagree with you. The N64 CPU didn't have to do the T&L calculations.https://en.wikipedia.org/wiki/Transform,_clipping,...
http://gliden64.blogspot.com/2013/09/hardware-ligh...
http://xenol.kinja.com/the-nintendo-64-is-one-of-t...
blppt - Friday, April 7, 2017 - link
I stand corrected. However, your statement that "No PC could match the N64 in 1996" is wrong, as the Voodoo 1 was out that year. Granted, you'd spend a buttload on one of them and a good equivalent PC compared to the N64, but you could indeed buy/build a more capable gaming PC then.
takeshi7 - Friday, April 7, 2017 - link
I'd love to see a Voodoo 1 card do real-time reflections like the N64 could.
blppt - Friday, April 7, 2017 - link
No need to be snarky. There are also many things that the Voodoo 1 excelled at that the RCP didn't, the most obvious example being the N64's awful texture filtering. And if it wasn't an order of magnitude more powerful than the N64, you never would have gotten UltraHLE to work on Voodoo 1. Emulation at its very core requires hardware that is at least a decent amount more powerful than the system you are emulating, unless the hardware is nearly identical in architecture (which PCs weren't).
Ian Cutress - Thursday, April 6, 2017 - link
Sorry for the extra additions making the flow seem a bit odd. As more and more of this story was passing my eyes, it was changing some of my assumptions as I went along and required new analysis. We'll reach out to MS and perhaps get some clarification on some of the details and publish a new piece in due course. Ideally I would love to get the same access DF had. I bet MS would love my questions...
ImSpartacus - Thursday, April 6, 2017 - link
That drive for additional depth and research is unquestionably why many of us frequent AT, so don't apologize for that.
xaml - Monday, April 17, 2017 - link
Yes, exactly.
Eden-K121D - Thursday, April 6, 2017 - link
I really hope they have better exclusives this time and that this launches at $499, just in time for my college freshman year.
HomeworldFound - Thursday, April 6, 2017 - link
I don't see there being any exclusives. Whatever they develop is going to be cross-platform, but not really cross-platform; that's the point of shifting to a PC-in-a-box, probably at a financial loss.
Develop the game for the PC.
Benchmark the game on the Scorpio.
Ship game with locked graphics settings.
Profit.
euler007 - Thursday, April 6, 2017 - link
Install base has a bigger impact on getting exclusives than specs. If this turns out to be a great UHD player and D2 has cross-play, I might get one.
Drumsticks - Thursday, April 6, 2017 - link
Impressive. Microsoft will likely keep this lead for several years; I can't see Sony working on a PS4 Pro Pro. It really seems like they've figured out how to learn from and correct their mistakes in many aspects of their business in the last few years. Scorpio will be far from perfect, but it looks like an impressive setup.
ToTTenTranz - Thursday, April 6, 2017 - link
The PS4 released in late 2013. The PS4 Pro released exactly 3 years later. The PS5 will probably follow this cadence and release 3 years after that, meaning availability in time for the 2019 holidays. When speaking of Microsoft's lead, if by "several years" you mean 2, then you're right.
MrSensitiveNipples - Thursday, April 6, 2017 - link
2 years and Sony drops their next console, and m$ drops one the year after that. I see what the plan is. Well played, m$.
FourEyedGeek - Friday, April 7, 2017 - link
What lead? What are they leading at?
fanofanand - Sunday, April 9, 2017 - link
Don't get the fanboy started. PlayStation has mopped the floor with the Xbox for several years now and has an insurmountable lead. Microsoft's strategy of being bigger but late will likely not pay off, but don't tell them that.
jjj - Thursday, April 6, 2017 - link
They say in the video it's Jaguar cores, and there was no time to use Zen anyway. The SP count is clearly 2560, 64 per CU, and that fits the stated FLOPS too given the clocks.
Your napkin-math power numbers are way off, as the box is more than just the SoC; the CPU is not 8 Zen cores, and the GPU runs lower clocks on a better process, with more GDDR5. I would be surprised if the SoC goes above 100W.
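For reference, that SP/FLOPS arithmetic is easy to check: FLOPS = SPs x 2 ops (one fused multiply-add per cycle) x clock. A minimal sketch in Python; the ~1172 MHz GPU clock is an assumed figure from other reporting, the rest comes from the thread:

    # Napkin math for the SP count and FLOPS. The 1172 MHz clock is an
    # assumption from other reporting; the other numbers are from the thread.
    cus = 40                     # compute units
    sps_per_cu = 64              # stream processors per CU (GCN)
    clock_ghz = 1.172            # assumed Scorpio GPU clock
    sps = cus * sps_per_cu       # -> 2560 stream processors
    tflops = sps * 2 * clock_ghz / 1000.0   # 2 FLOPs per SP per cycle (FMA)
    print(sps, round(tflops, 2))            # -> 2560 6.0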
Ian Cutress - Thursday, April 6, 2017 - link
So the comment about 'two blocks of cores' doesn't trigger your Zen at all? Sure, the box is more than just the SoC - my numbers include the DRAM as well (you might have missed that). But if you're saying there's 100W for the SoC, then there's 145W for everything else. 'Everything else' - DVD, network, audio, wifi - probably doesn't amount to more than 30W? (You're unlikely to be using the DVD and the full SoC at the same time.) Perhaps my overall number is high, but not as far off as you're thinking.
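Laying that budget out explicitly (a rough sketch; the 245 W total is what the 100 W + 145 W split above implies, not an official figure):

    # Rough power-budget sketch. The 245 W total is implied by the
    # 100 W SoC + 145 W "everything else" split above, not confirmed.
    total_w = 245
    soc_and_dram_w = 100      # jjj's SoC estimate, DRAM included
    peripherals_w = 30        # DVD, network, audio, wifi
    margin_w = total_w - soc_and_dram_w - peripherals_w
    print(margin_w)           # -> 115 W of headroom if those estimates hold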
Eden-K121D - Thursday, April 6, 2017 - link
More like Zen. Also, is it 16nm FFC or FF+?
jjj - Thursday, April 6, 2017 - link
The PSU would not hit its upper limit at any time in usage; it likely peaks around 80% in a worst-case scenario - the % depends a bit on the engineer designing the system and the efficiency curve of the PSU used. So you've got the mobo (including power conversion for the SoC), the RAM (including GDDR5), HDD, optical, USB (a bunch for VR and other things), wifi and others. However, my 100W estimate was based on Jaguar, GPU size, clocks and process, not on PSU dimensions.
For Zen, there was no time (even less so to customize it), and 2.3GHz means 0.7V or so, which is wasteful, as on 14nm efficiency for Summit Ridge is great up to 1V and about 3.3GHz.
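That argument rests on dynamic power scaling roughly with voltage squared times frequency. A toy comparison of the two operating points quoted above (illustrative only; leakage and uncore power are ignored):

    # Dynamic power ~ C * V^2 * f; capacitance cancels in a relative
    # comparison. Both operating points are the ones quoted above.
    def rel_dyn_power(volts, freq_ghz):
        return volts ** 2 * freq_ghz

    console = rel_dyn_power(0.7, 2.3)   # hypothetical down-clocked Zen
    sweet = rel_dyn_power(1.0, 3.3)     # Summit Ridge's efficient range
    print(round(console / sweet, 2))    # -> 0.34 of the power...
    print(round(2.3 / 3.3, 2))          # -> ...for 0.7 of the clock: paying
                                        #    Zen's die area for unused headroom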
Ian Cutress - Thursday, April 6, 2017 - link
I've updated the piece - it flows substantially better without some of the conjecture.
jjj - Thursday, April 6, 2017 - link
Do wonder about L3, but I have yet to form an opinion on that. Likely no Infinity Fabric with Jaguar and Polaris.
I do think it is interesting that they run Win 10 on GDDR5 without any DDR3/4. For an HBM-based APU, it would be important to be able to use HBM as system DRAM, cost-wise for OEMs.
SunnyNW - Thursday, April 6, 2017 - link
The current consoles already use 'two blocks of cores.' They use two clusters of four Jaguar cores each. There are mistakes in the table as well. Forty CUs (Compute Units) would total 2560 SPs (Stream Processors), not the 1920 shown in the table, and the current Xbox One CPU runs at 1.75 GHz, not 1.6 GHz. Also, the current consoles use APUs, so I'm not sure why the speculation that there is no new silicon design involved. The "slim" versions of both consoles already have Jaguar cores ported to a FinFET process.
SunnyNW - Thursday, April 6, 2017 - link
Nice! Those were some quick changes. Also, I'm sure it's safe to say it's going to be 8 cores/8 threads, since Jaguar never had any SMT capability.
takeshi7 - Thursday, April 6, 2017 - link
Don't forget Jaguar already comes in 'two blocks of cores'. Jaguar comes in 4-core clusters, just like Zen.
fanofanand - Sunday, April 9, 2017 - link
Are they connected via CCX? Not asking to be snarky, asking because if so, that would seem to debunk the claim that cross-CCX traffic hits performance. Or would that be different because now it's over Infinity Fabric? Either way, wouldn't nearly all devs have years of experience programming around that?
fanofanand - Sunday, April 9, 2017 - link
God, please, Anandtech, give us an edit button. Each cluster is a CCX; what I mean is, if the design is the same, why such a penalty in Ryzen?
rocky12345 - Thursday, April 6, 2017 - link
Wow, nice write-up, a lot of information for sure. At this point there are a lot of unanswered questions, but we are slowly getting there. I do hope it has some sort of Zen core setup in it, as well as Vega of some sort. Those old Jaguar cores need to be put on a shelf to collect dust; nothing says low-end like Jaguar and Atom-based CPUs, and in a fresh new console design you don't want the shiny new toy you are about to release to the world deemed low-end right out of the gate. Catchwords like Zen and Vega are all good for PR marketing; even if we all know they are cut down or running slower than the desktop versions, the PR department can spin it in good ways. I also do not think MS wants to go through another release like they did with the first Xbox One, where it was deemed slower right at launch; they want something that wows everyone this time around. So that's my take on it, anyways.
fanofanand - Sunday, April 9, 2017 - link
There's only so much "awesome" to be had with it, it's still running the same library as the OG XBone so it is crippled out of the gate the same way the PS4 pro is.LostWander - Thursday, April 6, 2017 - link
Is using GDDR5 for the CPU going to affect the compute side of things much? I remember reading a while back that lower-latency RAM (like DDR3) was better for the CPU, but I don't know how that has held up.
Xajel - Thursday, April 6, 2017 - link
DDR3 has lower latency in terms of cycles (8 on average, compared to an average of 15 in GDDR5); that's almost double. But due to clock speeds, GDDR5 has lower absolute latency, as its clocks are more than double those of DDR3.
LostWander - Thursday, April 6, 2017 - link
Oh alright, makes sense. Thanks for the info!
tipoo - Wednesday, April 19, 2017 - link
Latency as measured in cycles is higher, but the clocks are so much higher that latency in absolute time isn't worlds apart. The memory controller and caches should make the impact negligible too. The PS4 had great success with GDDR5 shared system memory.
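The cycles-to-nanoseconds conversion is just cycles divided by clock. A quick sketch using the cycle counts quoted above with illustrative speed grades (the clocks here are assumptions, not the console's actual timings):

    # CAS latency in ns = cycles / clock_GHz. Cycle counts are from the
    # comment above; the clocks are illustrative assumptions.
    ddr3_ns = 8 / 0.933     # DDR3-1866: CL8 at a 933 MHz clock -> ~8.6 ns
    gddr5_ns = 15 / 1.7     # 6.8 Gbps GDDR5: CL15 at a 1.7 GHz command clock -> ~8.8 ns
    print(round(ddr3_ns, 1), round(gddr5_ns, 1))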
Amoro - Thursday, April 6, 2017 - link
16nm at TSMC? Interesting... just like the PS4 Pro. I wonder why they didn't go with GloFo.
Jtaylor1986 - Thursday, April 6, 2017 - link
Doubtful GF has the capacity to service them or meet their requirements. A game console that will sell tens of millions of units with a relatively large die eats up a lot of wafers.
Jumangi - Thursday, April 6, 2017 - link
This thing won't sell tens of millions. It's going to be a "hardcore" elite system. Just like with the PS4 Pro, most sales even after it's launched will be Xbox One S's, not this one.
drothgery - Friday, April 7, 2017 - link
In 2018, sure. In 2020, not so much.
NextGen_Gamer - Thursday, April 6, 2017 - link
I think this entire article needs to be revised, because so much of the speculation is no longer speculation and is confirmed by Microsoft in Digital Foundry's articles. Scorpio will have a single CPU+GPU die, just like Xbox One and PS4/PS4 Pro. It will be built on 16nm FinFET, just like Xbox One S (and PS4 Slim/PS4 Pro). The custom x86 cores are confirmed to be based on Jaguar, which makes sense for backwards compatibility, but modified for higher performance (exactly 31% better IPC compared to Xbox One, according to MS). The power supply will also be an internal one, meaning no large external power brick (continuing that from the Xbox One S, thank goodness).
Eden-K121D - Thursday, April 6, 2017 - link
Maybe Excavator, because it has been out for a long time, and Zen has very high IPC compared to Jaguar.
Ian Cutress - Thursday, April 6, 2017 - link
I wrote a comment right at the start of the comments section about this. Just to pick you up on one point though: '31% better IPC'. If these cores are still Jaguar, the IPC is the same no matter what the frequency. DF didn't state IPC, and the clock difference is 31%, which goes along with '31% better performance'.
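The arithmetic, for anyone following along: per-core performance scales as IPC x frequency, and the two publicly stated clocks account for the whole figure on their own.

    # Per-core performance ~ IPC x frequency. With Jaguar IPC unchanged,
    # the clock bump alone explains the quoted 31%.
    xbox_one_ghz = 1.75
    scorpio_ghz = 2.3
    print(round(scorpio_ghz / xbox_one_ghz - 1, 3))   # -> 0.314, i.e. ~31%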
Ian Cutress - Thursday, April 6, 2017 - link
I've updated the piece - it flows substantially better without some of the conjecture.
MANICX100 - Thursday, April 6, 2017 - link
Some corrections for the table. Firstly, the Xbox One CPU was clocked at 1.75 GHz, not 1.6. Secondly, Eurogamer gave Scorpio's memory bandwidth as 326 GB/s.
Ian Cutress - Thursday, April 6, 2017 - link
Yup, sorry, that was a copy/paste from an old table.
YazX_ - Thursday, April 6, 2017 - link
"as well as a substantial part of 4K gaming."i stopped here, bro, even on desktop there is no single DEDICATED GPU can drive 4k at Ultra details while maintaining 60 FPS, so how about a crappy console that costs half of what that video card only costs?!
Ian Cutress - Thursday, April 6, 2017 - link
So everything needs to be Ultra? Good for you.
BrokenCrayons - Thursday, April 6, 2017 - link
The thing you're probably overlooking, brosef, is that consoles typically have one and only one set of display settings that can't be altered by the user. Broski, that means that every time you play a console game, you're gaming at the system's max settings. You're effectively gaming at "Ultra" all the time which is awesomesauce, my personal brodaka-brodina-brodudebro. And anyway, bromance, Ultra varies from game to game so it means a lot of different things. It'd be a massive bromistake to argue for or against something on the basis of such a poorly defined term. *high-bro-fives*
BrokenCrayons - Thursday, April 6, 2017 - link
One last thing and I'm done, I promise! I know someone is going to say, "Well gee BrokenCrayons, if you can't change the settings on a console game, then aren't you also playing at the game's lowest possible settings too?" It's a valid point and to address it, I refer you to the famous Brodinger's Cat. In essence, the settings are both at ultra and at low at the same time. It's only when the gamer perceives them as one or the other that they take on the characteristics of bro-ultraness or bro-lowness. So then, in order to be a console gamer that plays at ultra, you merely have to assert the settings are at ultra and let quantum mechanics do the rest.
bji - Thursday, April 6, 2017 - link
You are officially awesome. You deserve better than the crappy comments system of Anandtech. I mean that sincerely.
BrokenCrayons - Friday, April 7, 2017 - link
Wow, I don't really know what to say, except that I'm glad you got a laugh out of that. The cat's probably happy too, but I'm not really sure, since its quantum state hasn't been observed recently, so we have no way of knowing.
Shadowmaster625 - Thursday, April 6, 2017 - link
bruh...
lmcd - Thursday, April 6, 2017 - link
Ultra's just a conspiracy to push recommended specs higher. Why else would they always include tessellation? Why are there three console companies? And don't tell me you need a better raster engine to connect the dots.
zavrtak - Sunday, April 9, 2017 - link
I love how you make a point while being this snarky :D
Anyway, ultra is pretty meaningless, as resolution always trumps anti-aliasing, for example, and many other detail levels as well; the best reason to still have all of this is marketing stills, and players who have already reached the maximum of their hardware.
This old wisdom might finally break at 6K or 8K, but it certainly holds up at QHD and UHD. Deactivating anti-aliasing is always worth it, and some useless bloom or motion blur is pointless too. Motion blur effects are especially useless, as real motion blur based on 80+ Hz screens is, well, real, and thus not only more realistic but also a lot more in line with your perception, immersion and game experience.
HomeworldFound - Thursday, April 6, 2017 - link
So it's a PC in a box with proprietary software to lock it down.
SunnyNW - Thursday, April 6, 2017 - link
I guess you could have said that at the very beginning of this generation of consoles, with the original PS4 and Xbox One. But I think most would agree that the consoles switching to an x86 architecture was and is a "good" thing for PC gaming and gaming overall.
HomeworldFound - Thursday, April 6, 2017 - link
Well, yes. I mean, all but Halo 5 and the Halo Collection have ended up on PC through the Windows Store and/or the Play Anywhere promotion, which was very beneficial to me. I guess it's a good way to get Windows 10 into the home, but perhaps it would be more beneficial to not lock it down?
gerz1219 - Thursday, April 6, 2017 - link
What would the benefit be to Microsoft? A locked-down system means close to zero piracy, and game sales are where they make all their money on a console. Also, if it was just a regular Windows 10 PC where you could install Premiere Pro and After Effects and use it as a low-end editing rig, why would anyone buy a much more expensive PC with similar specs? The box is only as cheap as it is because Microsoft is banking on you buying lots of games to play on it.
lmcd - Thursday, April 6, 2017 - link
And because it uses an absurd balance of a $30 CPU, $150 GPU, and $50 RAM (obviously not what MS pays for any of those pieces). Let's be real, this would be even worse than all the people who paired a Pentium Anniversary Edition with a GTX 970, and from a major company. It only works because it's a console.
edzieba - Thursday, April 6, 2017 - link
On the one hand, it's one step again above the PS4 Pro, in raw performance at least. On the other hand, the architecture changes (memory in particular) from the XB1 mean that, unlike with the PS4 Pro, cross-compatibility is not automatic. The PS4/Pro situation is pretty easy for consumers: you buy a PS4 game, and it works on both the PS4 and PS4 Pro, always. If this is not the case with the Scorpio and you need to pay close attention to compatibility, that's going to be annoying.
bji - Thursday, April 6, 2017 - link
I didn't see anything to suggest that games would have to be specially compiled to run on Scorpio vs. regular Xbox One, and in fact requiring this would essentially make this a whole new console, not a performance update to an existing console... so I ask, what makes you think this is how things will work?
edzieba - Thursday, April 6, 2017 - link
The different memory architecture: anything using the fast ESRAM no longer has that available. Microsoft have ruled out Scorpio-only SKUs - at least according to the whitepaper Eurogamer received - so developers MUST optimise for the presence of ESRAM or suffer a performance hit on the XB1. Back-compat using the GDDR5 to 'emulate' the ESRAM will be a performance bottleneck, as the two behave very differently (and would break anything that implicitly times down to the instruction level). You're left with a few options: older games need to be reoptimised, or suffer a memory bottleneck. New games face either being optimised for one or the other (optimising for the XB1 is the obvious choice) or the extra effort of having two different memory management schemes.
Or you drop implicit back-compat, and use a system similar to XB1's 360 back-compat, and only work for whitelisted titles.
Jumangi - Thursday, April 6, 2017 - link
Everything said has made it clear all games will work completely on both systems with no issues going forward. I will take the word of Microsoft's highly qualified engineers based on what's been posted so far.
Jumangi - Thursday, April 6, 2017 - link
From the DF article, to completely refute your claims of any issues:
"For the record, ESRAM is indeed surplus to requirements in the Scorpio design, as we revealed a couple of months back. "The memory system we've got, we've got enough bandwidth to more than cover what we got from ESRAM," explains Nick Baker. "We simply go and use our virtual memory system to map the 32MB of physical address that the old games thought they got into 32MB in the GDDR5. So latency is higher, but in terms of aggregate performance, the improved bandwidth and improved GPU performance means we don't hit any issues."
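What Baker describes amounts to a fixed address-range remap. A minimal sketch of the mechanism (every address and the layout here are hypothetical, purely to illustrate the idea):

    # Hypothetical sketch of the ESRAM-to-GDDR5 remap described above.
    # Legacy titles keep addressing the old 32 MB ESRAM range; the virtual
    # memory system redirects those accesses into a reserved GDDR5 window.
    # All addresses are made up for illustration.
    ESRAM_BASE = 0x80000000           # hypothetical legacy ESRAM base
    ESRAM_SIZE = 32 * 1024 * 1024     # 32 MB
    GDDR5_WINDOW = 0x200000000        # hypothetical reserved GDDR5 region

    def translate(addr):
        """Redirect legacy ESRAM addresses into the GDDR5 window."""
        if ESRAM_BASE <= addr < ESRAM_BASE + ESRAM_SIZE:
            return GDDR5_WINDOW + (addr - ESRAM_BASE)
        return addr                   # everything else is GDDR5 already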
edzieba - Friday, April 7, 2017 - link
"So latency is higher"This is important: ESRAM has extremely low latencies, while GDDR5 is great at bulk bandwidth. If you are expecting only GDDR5 then you write memory operations to take advantage of that, and access bug chunks in one go and cram as much into on-chip cache as possible that you need low-latency access to. If you are writing with ESRAM in mind, you take advantage of that ability to access lots of small chunks with low latency. If you then suddenly lose that ESRAM, you now have an extra latency hit applying to a lot of small operations. You either re-write to batch things and access large chunks, or you have a performance bottleneck.
bji - Friday, April 7, 2017 - link
The "performance bottleneck" you speak of is just "less better performance than could be expected if not trying to use ESRAM in that way". So yeah, games developers who specifically optimize for ESRAM are going to have a game that runs as well as it could on normal XBox One, and not quite as fast as it could (theoretically, this hasn't been proven yet) on Scorpio.Games not optimized to use ESRAM are going to possibly be slower on XBox One, but theoretically faster on Scorpio than they would be otherwise.
The question is: how much worse is performance on Xbox One if you don't try to use ESRAM, how much better is it on Scorpio, and which do you want to optimize for?
They seem to be suggesting that the differences are marginal anyway. If it were me, I'd probably start out assuming that it's better to optimize for the Xbox One, and trust that the design of Scorpio isn't going to penalize me to any meaningful degree for doing so.
wumpus - Friday, April 7, 2017 - link
So latency is higher, but *every* single game ever made (so far) hasn't needed it. I can't believe we wasted so much of the chip on useless memory. It should be embarrassing when Sony* makes better hardware than you do.
* Yes, I'm old enough to remember the "Back to the Future" line "all the best stuff is made in Japan" being funny (because it was true). But nobody has wanted a Trinitron for a long time.
mobutu - Thursday, April 6, 2017 - link
... that magnetic old-time spinner ...
versesuvius - Thursday, April 6, 2017 - link
Who cares! The idea behind consoles such as Xbox and PlayStation is that a game made for the device plays on it, and the user does not have to worry about how strong or sophisticated the hardware is. So it is totally irrelevant what an Xbox is made of. There is no choice when it comes to consoles, at least not after the buyer decides between the two, and he does not need to care what is driving the unit either. And given that the new games will almost certainly be incompatible with their current consoles, the users really do not have any choice but to keep buying these products as long as they want to play games. So, people will buy the units even if Microsoft or Sony put salami inside them, just as long as they can play the next game on them.
Ian Cutress - Thursday, April 6, 2017 - link
Without a certain level of hardware, you don't get certain experiences. Sure, people buy for the experience, not the hardware, but they're not mutually exclusive. Especially as the depth of experience required to satiate a user goes up.
versesuvius - Friday, April 7, 2017 - link
The depth-of-experience argument only works for the PC, not the consoles. Like reinstalling a game from two years back that did not get past the opening screen, after upgrading or buying a new PC. To be more precise, first comes the experience, and then the depth of it starts to vie for your attention. In the case of the PC, there will not be any experience to begin with when the PC is older than two years. The console gamer will always get his experience, which is what the consoles are for, and that is enough. There will be a few "wow"s after they have bought their new console, but that will soon go away, and nobody pays any attention to how well textured the hero's clothing or the alien's skin is; they get on with playing the game, because the game plays well and smooth, because that is what the consoles are there for, and because they do not have to worry about the power of the hardware in the box. How good the game actually is, is another story. Not many good games recently.
Icehawk - Saturday, April 8, 2017 - link
The 2-year replacement cycle on PCs has been extended significantly since the i-series CPUs were introduced; my PC is 6 years old and gamed at 1080p ultra, and later, with a new video card, at 1440p ultra. I have a 2nd-gen i7.
xaml - Monday, April 17, 2017 - link
"(...) it will soon go away and nobody pays any attention to how well textured the hero's clothing or the alien's skin is (...)"I personally nurture on such aspects and it is not something which decreases over time. These days, it does appear like sustainability may be a challenge to some or to not few, but to simply give up on it or to underestimate everyone, these are not acceptable approaches.
Jumangi - Thursday, April 6, 2017 - link
This is patently false. All Xbox games will work equally on both. Why can't people get this? It's no different than a PC gamer playing BF1 at 1080p, medium settings, right next to a guy playing at 4K and high settings. It's the exact same game content and gameplay; it just looks better. This will be no different.
versesuvius - Friday, April 7, 2017 - link
Apparently Xbox One games will run on Scorpio, but I could not find anything confirming that Scorpio games can run on the Xbox One, albeit at a lower resolution.
bji - Friday, April 7, 2017 - link
I highly, highly doubt that Microsoft is going to allow games to be released that only work on one platform and not the other. They will require strict compatibility across both platforms, so that no end user is ever presented with the situation where a game they bought doesn't work on their system.
webdoctors - Thursday, April 6, 2017 - link
It's interesting that a new next-gen console would be using a very low-horsepower CPU. It indicates we've plateaued in CPU requirements and GPU performance is the main feature driver. Silicon area and power consumption also match that assumption. The PS2 and PS3 were interesting: sold at below cost, they seemed to be the cheapest way to get a DVD or Blu-ray player. These new consoles aren't much cheaper than just buying equivalent off-the-shelf parts; there doesn't seem to be much HW innovation left in this area.
Jumangi - Thursday, April 6, 2017 - link
This isn't a new generation of console. It's an improved version of the current one. All Xbox games are playable on both.
hechacker1 - Thursday, April 13, 2017 - link
*Maybe* for a console. But try pairing a PC with Jaguar cores and an RX 480 and tell me it doesn't suffer at 4K. Of course it will.
Gigaplex - Thursday, April 6, 2017 - link
"AMD technically has several cores potentiall available for Scorpio: Excavator (Bulldozer-based, as seen on 28nm), Jaguar-based (also from 28nm) or Zen based (seen on 14nm GF)."How about Puma? While still Bobcat-based, you did make the distinction between Excavator and Bulldozer.
beginner99 - Friday, April 7, 2017 - link
It's a console. It's not gonna be Zen and it's not going to be Vega. Too expensive, and it would also make backwards compatibility a pain in the ass. Besides that, can you guys enable HTTPS for the comment login? We are in 2017. Sending plaintext passwords is retarded.
bji - Friday, April 7, 2017 - link
Anandtech's comments system is so RIDICULOUSLY retarded in many, many ways. I just can't understand why they don't switch over to Disqus.
scmorange16 - Friday, April 7, 2017 - link
Any details about pricing?
zodiacfml - Friday, April 7, 2017 - link
The speculation is sound, but I hope to be surprised with recently released or unreleased technology.
yhselp - Saturday, April 8, 2017 - link
Thank you for the insightful article, Ian. It'd be interesting to see how the memory crossbar situation plays out in practice. It's refreshing to read a piece on the subject that is not PR-heavy. I almost fear for Digital Foundry's integrity - I understand that the opportunity Microsoft gave them might have been irresistible, but still, their piece is too enthusiastic and reads almost entirely like pure PR; it was a bit strange seeing them do that given their previous work. I'm sure it's just a one-off, though. What's disappointing about Scorpio to me is that it uses a blower, whereas both the Xbox One and One S utilized open-air coolers. Sure, it looks like it'd be a nice blower, but even if it's on par with NVIDIA's reference design, it'd still be louder than what's come before. Apparently, Microsoft are obsessing about size, which at least to me is less important than noise. A prettier box surely sells better than the vague and hard-to-explain premise of quietness.
Kieran12k - Saturday, April 8, 2017 - link
Devs only have access to 8 GB of GDDR5 for games, so that means they can't use the full 326 GB/s of bandwidth. 4 GB are reserved for the OS. I think you forgot that?
hechacker1 - Thursday, April 13, 2017 - link
It's virtual memory mapping that makes this possible. You don't really know where the allocated memory is physically placed. In fact, it gets fragmented as you create and destroy objects, even on PC parts.
rarson - Tuesday, April 11, 2017 - link
Digital Foundry just released a bit more info about the Scorpio, namely that it supports HDMI 2.1 with VRR, and also FreeSync. Which is pretty awesome. Microsoft seems to have checked every box that it had to with this hardware.
tipoo - Wednesday, April 19, 2017 - link
The 360's eDRAM was a bit more interesting than it gets credit for, since the ROPs actually sat on the daughter die with the eDRAM; the bandwidth from the eDRAM to the ROPs was drastically higher than the commonly quoted figure, in the league of 200GB/s. Granted, that's not to the whole GPU, only the ROPs, but those are the bandwidth eaters.
http://img.cinemablend.com/cb/3/d/5/2/9/7/3d5297b5...