I don't think so. Based on the information they gave away in the video presentation, the memory will have a >300GB/s bandwidth. If we take into account that the PS4Neo is rumored to have >270GB/s, this hints at a super fast GDDR5 or a slow GDDR5X, but not HBM.
The XOne does not have a unified memory architecture; there are RAM and ESRAM inside. 320GB/s is probably the aggregate bandwidth of the ESRAM and the system RAM, and you only get the peak bandwidth when using both memories at the same time.
The ESRAM was partly there to make up for the slow (for graphics) DDR3 system RAM. I doubt Scorpio would use ESRAM when they will surely need at least fast GDDR5 to sufficiently feed the beefy GPU.
I would imagine they would have to keep the ESRAM for backwards compatibility purposes. I imagine at least some games were developed with that 32MB "L4 cache" in mind.
If the memory in Scorpio is indeed faster than the ESRAM in XBO (320GB/s compared to some aggregate), you should be able to just partition off 32MB of SRAM and 'pretend' it's ESRAM with no problem. You usually don't break backwards compatibility by having faster memory.
But latency will be different, and in optimized console games I believe it's not too far-fetched that this could matter. Console games can be optimized so extremely because devs can rely on systems being identical, including latency. So I would think the ESRAM would be needed again.
Faster in terms of bandwidth, yes, but latency with GDDR5 is somewhat high. So there are technical hurdles if they really wanted to rip out the ESRAM. I don't think they will. The ESRAM will shrink VERY nicely on GloFo/Samsung's 14nm LPP process. They could bump speed up on it too - worst case scenario it truly is used as a large L4. An alternative is that they double down and make it faster and larger. Like 128MB of ESRAM. But I find that about as likely as them removing it entirely.
Anyway, what's this talk of keeping Jaguar cores, Ian? If they continue to use a Cat core, they would use Puma+ with substantially higher clocks. 2.5GHz+ would be well within the realm of possibility.
The ESRAM can only be used by the GPU, and I don't think the latency matters much, considering that GPUs are designed to deal with high latency. For GPUs, bandwidth is much more important.
Wouldn't Scorpio just do an online update to the executable during install and bypass emulation for the most part? Maybe use emulation if someone has a slow/limited internet connection.
They could retain the eSRAM and use it as an L4 cache for the CPU and/or for retaining backwards compatibility. At 14nm I doubt the eSRAM is going to be taking up such a large % of the die-area.
No. ESRAM is just a cache. You can remove cache layers in a hierarchy and the CPU/GPU doesn't care. As long as there is system memory to pull from, you can disable every cache except L1 (because it has dedicated instruction and data lines and thus isn't like L2/L3/L4) and the CPU wouldn't notice.
Sorry to post in an old thread but this is incorrect. First of all it's not just a cache. Second it's not about what the "CPU notices" but it's about what the code is attempting to utilize. XB1 games utilize that ESRAM directly and in some cases heavily to extract as much bandwidth as possible. Removing it would make backwards compatibility more difficult (though not impossible with emulation).
Yeah, if that image is real, then it looks like there are 8 memory packages, which would be too many for a realistic HBM-based solution. So it's likely either GDDR5 or GDDR5X. Eight packages strongly suggests a 256-bit bus.
And once you've got that bus width locked down, then you need the memory to run at an effective 10 Gbps if you want to get to 320 GB/s of bandwidth. It's convenient that GDDR5X is already available today at that speed.
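For reference, the arithmetic behind that guess is straightforward: GDDR5/GDDR5X packages each have a 32-bit interface, so the package count fixes the bus width, and bandwidth follows from the per-pin data rate. A quick sketch using the figures above (illustrative, not confirmed specs):

```python
# GDDR5/GDDR5X packages each expose a 32-bit interface, so the package
# count pins down the total bus width; bandwidth is then bus width
# times per-pin data rate, divided by 8 bits per byte.
PINS_PER_PACKAGE = 32

def bus_width_bits(packages: int) -> int:
    """Total memory bus width implied by the number of DRAM packages."""
    return packages * PINS_PER_PACKAGE

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits * gbps_per_pin / 8

bus = bus_width_bits(8)           # 8 packages -> 256-bit bus
print(bus)                        # 256
print(bandwidth_gb_s(bus, 10.0))  # 10 Gbps GDDR5X -> 320.0 GB/s
```

So eight visible packages plus the quoted 320 GB/s figure line up neatly with 10 Gbps memory on a 256-bit bus.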
I wish the new consoles used HBM2 memory, but they probably won't. However, I do believe they MUST use Zen CPUs. Using small cores again would be disappointing. Quad-core Zen with 8 threads would make for a great easy transition for console developers as well, even if it has something like 1.2GHz clock speed.
I don't think there is a Jaguar on 14nm FinFET. My understanding is that no one wanted to pay to reengineer 28nm planar Jaguar for 14nm FinFET, as it's not just a simple die shrink. As such, PS4K "Neo" isn't Jaguar and neither is Scorpio. I've seen "Neo" referred to as "Zen-lite", and if Scorpio isn't coming out until next Christmas it may well be full Zen. That said, my sources aren't the greatest, so if AnandTech says otherwise I'd believe you over them any day.
This makes sense. I think they will both feature Zen, but PS4Neo will have an underclocked RX480 while the Scorpio will have an overclocked one. The RX480 hints at >5TFlops, which can easily mean around 5.5TFlops; an overclock can easily push it to 6TFlops.
Dude, they use fully custom designs on these consoles nowadays... it's definitely gonna have more CUs than an RX480. They don't take cookie-cutter CPU designs and slap them in a console; it is tailor-made now.
No they don't, they don't use those GPUs. They might be comparable, but it's not a 7790 and a 7870. The 7790 used 1GB of GDDR5; the Xbox One uses DDR3. It's called semi-custom because the cores, both CPU and GPU, are within a particular architecture. Full custom would be a RISC CPU and totally different GPU cores, or something like that.
I would say that, compared to previous consoles, these are MUCH less custom than earlier versions. Cell processor, anyone? This console generation uses AMD CPU and GPU IP, just re-arranged efficiently on-die with respect to the memory system.
Definitely a higher CU count GPU. Overclocking a GPU too much generates an unreasonable amount of power consumption, and in a space-conscious console, heat is a major issue.
If that were true then they'd already be running the cards at those speeds as stock in the PC space. AMD have no reason to leave performance on the table there.
I'd argue more CUs and a lower clock than the PC variant is more likely, with some spare CUs to sacrifice to keep yields nice and high.
Huh...crazy. That would be awesome if they're using AMD's full CPU...would be a huge upgrade, not to mention that it's apparently got piles more GPU hardware.
This seems like it might actually be a serious upgrade, worth thinking about as an Xbox 4.
And yay for backwards compatibility.
*cough* Hey, Nintendo? Remember how so far you've only matched the systems from 2005? Uh...
Memory shown in the images was separate and almost certainly GDDR. Bandwidth was quoted in the presentation as 320GB/s, so I am guessing either 5Gbps GDDR5 on a 512-bit bus or 10Gbps GDDR5X on a 256-bit bus?
Also I'm loving the fact that the supposed 10 year lifespan of the current consoles has been completely thrown out of the window at this point...
Almost certainly the latter. A 512-bit bus would be too expensive for a mass market part. 7.5GHz GDDR5 and a 384-bit bus with an asymmetric mix of RAM chip sizes to hit 8GB isn't totally impossible; but by next year GDDR5X production should be in full swing, so there's little reason to use the older RAM type anymore. Especially since plain GDDR5 will probably be starting to ramp down in volume, which means there won't be any future price/power cuts as the manufacturing processes are improved.
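A quick sweep over the bus-width/data-rate combinations being floated in this thread shows which ones actually land on the quoted 320 GB/s (figures are the ones discussed above, purely illustrative):

```python
# Enumerate the memory configurations mentioned in the thread and check
# which ones hit the quoted 320 GB/s figure.
TARGET_GB_S = 320.0

configs = [
    (512, 5.0),   # wide bus, plain GDDR5
    (384, 7.5),   # asymmetric middle option
    (256, 10.0),  # narrow bus, GDDR5X
]

for bus_bits, gbps in configs:
    bw = bus_bits * gbps / 8  # GB/s
    hit = "matches" if bw == TARGET_GB_S else "misses"
    print(f"{bus_bits}-bit @ {gbps} Gbps = {bw:.0f} GB/s ({hit})")
```

Both the 512-bit/5 Gbps and 256-bit/10 Gbps options land exactly on 320 GB/s; the 384-bit middle option overshoots to 360 GB/s.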
Well, they still might last 10 years. The PS3 is still being made and it is not even compatible with current-gen titles. The PS4 and Neo are (supposed to be) fully compatible, so the release of a more powerful version doesn't really terminate the original PS4. They can coexist as long as new games work on both machines.
What we have here is definitely the PC-ization of consoles: more frequent updates, multiple performance/price products, but still the same platform. I'm expecting that the original PS4 will be shrunk to a Lite/Mini version, but regardless, we will see exactly the same parts being made for years. Since there will be noticeable price differentiation between the PS4 and Neo, the PS4 will remain a solid choice for more casual gamers, people who game on smaller/lower-res TVs, or as a secondary console for the kids' room (where an old 720p TV might have already found refuge).
After all, not everyone wants to invest in a top-of-the-line gaming PC, and console gamers tend to be more frugal/less hardware-enthusiast than PC gamers. I really expect they will embrace and keep buying the cheaper PS4.
I don't know if original PS4 will last 10 years exactly, but I will be surprised if that hardware isn't available 8 years after release, in some form.
Personally, I am happy with these refreshes... as long as compatibility is preserved. I like collecting physical copies of games, and still have all my PS2 and PS3 titles. I also have PS2 and PS3 in good condition, but even if I don't use them much/at all these days, it does cross my mind what to do once hardware dies. Having new hardware capable of running old games, yep - definitely one happy gamer here.
Funny, when the PS4 and XB1 came out a lot of insiders were saying they would be each company's last console and that PC gaming would take back over... Now not only were they not the last consoles, they aren't even the last PS4 and XB1 consoles. Until PC gaming is not a massive pain in the rear, consoles will still have a place.
At this point consoles are highly specialized x86 computers. Just like you would use a Quadro/FirePro with a Xeon/Opteron for workstation use, because it is optimized for certain loads, you use consoles with dedicated hardware to get high performance at a lower price than with a general-purpose PC. Even though your console and PC can run the same code base, the console will run it better on the given hardware.
Yes, but the difference is usability. Consoles are super easy and have a consistent controller setup that you don't need to futz with. The main benefit is more fun, less config... Of course the downside is less power and lesser graphics quality.
LOL... I get that. It used to be fun for me too... until life got busier, then I had to make some difficult cuts to my time budget ;) . Other than that though, a lot of people (like people that would never visit tech sites like this) couldn't deal with it on a PC regardless of their available time.
I still maintain a gaming PC... and use it... albeit less than the console. Now that even traditionally single-player games are getting some multiplayer content, I'm finding it less and less interesting to deal with the hacks and cheats which run rampant in pretty much every PC multiplayer game. Yes, they were there before, but it just might be that I am getting too old, and my gaming time too limited to be wasted on something frustrating - and being cheat-killed by some kid with a through-the-wall exploit is definitely getting into the "frustrating" category. I don't mind losing games - in fact I was never really above average - but like in sports, it is fine only as long as it is fair, and hats off to people with real skills. People with hacks, no thanks.
My friends insisted on playing The Division on PC, even though we all have PS4s as well. I agreed - some of them have monster PCs and want to use them. However, we don't go into the multiplayer Dark Zone. We tried a handful of times and often got annihilated through multiple walls and floors... so, for playing the game with better visuals (not that the console versions look bad at all!), we are missing part of the game, and quite a solid part, too. Wasn't worth it, imho.
"my gaming time too limited to be wasted on something frustrating" - Exactly. You can fit a lot of different "frustrating" things into that one comment, but it really sums it up pretty well. PC gaming will always look the best and have some great aspects, but with my limited time, I'd just rather play and have fun than deal with it.
Yep... BF3 was kind of an eye-opener for me. I played it both on PS3 and PC. The PC version looked better by a margin - my 6870 could run the game at 1080p and pretty much maxed out sans AA - and it also had 64-player matches, versus 24 on PS3.
But... outside of playing with friends, when I was on my own I chose to play on PS3 much more often than on PC. It just felt more relaxing and balanced - everyone playing on the same hardware, mostly with the same controllers, and most of all, no crazy on-the-run head-shots and other metahuman skills... or "skills". Sure, some of that might have been down to my internet speed/lag and opponents' real skills, but when you ambush someone and spray him with an LMG from a well-concealed position and he basically turns around, spots you and head-shots you with the first round, all in one smooth motion, without even slowing down (shooting from the hip?)... well, it might be luck or crazy skill, but it does raise some suspicion as well. Some people troll-cheat and make it obvious just to annoy others. Some smart-cheat and try to make it look as credible as possible.
Ultimately, that game made me reconsider my priorities. Simplicity, less cheating and the fun that came out of that apparently took the best seats. Regardless of weaker looks and other downgrades, BF3 on PS3 was simply more enjoyable for me.
They probably learned their lesson when the devs complained about perf 6 months before it was even released - coupled with the explosive popularity of VR/4K gaming shortly after they ship consoles that are barely pushing 720p and 1080p. I think both Sony and MS missed their marks - Microsoft more so.
Yup, both the XB1 and PS4 were pretty small jumps from their predecessors when you figure the 7-year gap into it. If we have a 6 TFLOPS console in 2017, that is kind of huge. I almost can't believe it.
Exactly. Many, Sony and Microsoft included, believed the console market had already peaked and would only shrink further with smartphones and tablets growing in popularity. Of course they were completely wrong, and Sony themselves admitted they weren't even sure why the PS4 was selling as quickly as it was (the fastest-selling console in history for them). So obviously both played it safe with relatively cheap hardware. In the past they would usually lose money on the hardware and make it back with the software, but the PS4 and XBone boxes were both profitable from day one.
A bit small for 2017. At that point they should target 4K gaming and higher resolutions than current VR headsets. It would be a bit tricky, as this chip could already be 300-400mm² on 16FF depending on what cores they use, but Scorpio might feel outdated by launch, even more so if Sony and Nintendo do similar or faster things before them.
4k gaming on console class hardware is probably another year or two out. Even the fastest current GPUs on the market struggle at the resolution. Next year will probably see top end PC GPUs capable of doing it at reasonable FPS/Quality levels; but the 1180TI and R590 (Fury 2?) will cost more than a gaming console itself. It'll be 2018/19 before a console size GPU has that much power, and I doubt MS will want to launch a new console that soon after the 2017 model. A 2020 launch would be more likely.
4K should be viable on a >6TFlops Polaris/Vega GPU with 320GB/sec memory bandwidth if they target Medium/High settings with 0xMSAA. They can also run the games at 30 fps. Another alternative is to scale existing Xbox One games from 720-900p to 4K @ 60Hz. That should be a piece of cake for the Scorpio given the specs outlined. While you are correct that 6TFlops isn't enough for 4K @ 60Hz with everything maxed out, this is not necessary for the console market just yet. 2018/2019 also seems completely out of the question now if both Sony and MS are releasing Neo/Scorpio in 2017. In that case, I'd expect next-generation consoles to come out in 2020/2021.
Actually during the presentation Phil Spencer stated that the Scorpio is being made to handle 4K gaming with high quality visuals (along with VR). The Xbox S (the slim) is supposed to be capable of up-scaling current games to 4K output according to a Microsoft official.
You'll have the same graphics as you do now, but with 4x the pixels, which should be doable because you get like 4.5x the performance with the new console. I don't know how "techies" still can't get this easy calculation yet.
Not sure how people like you still fail to realize that FLOPs don't directly translate to performance in video games. The R9 290 is 4.8 AMD TFLOPs and the Fury X is 8.6 TFLOPs. The Fury X is not even close to that much faster in video games than an R9 290.
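Both halves of that exchange are easy to check: the pixel-count scaling is exact arithmetic, while the TFLOPS comparison shows why paper ratios mislead. A quick sketch using the numbers quoted above (the TFLOPS figures are the commenters' own; real-game speedups are smaller):

```python
# 4K vs 1080p pixel counts, and the on-paper TFLOPS ratio quoted above.
def pixels(w: int, h: int) -> int:
    """Total pixel count of a display resolution."""
    return w * h

scale_4k = pixels(3840, 2160) / pixels(1920, 1080)
print(scale_4k)              # 4.0 -> 4K pushes exactly 4x the pixels of 1080p

flops_ratio = 8.6 / 4.8      # Fury X vs R9 290 peak TFLOPS, as cited above
print(round(flops_ratio, 2)) # ~1.79x on paper; real-game gains are smaller
```

The point of the reply stands: a 1.79x paper-FLOPS gap does not produce a 1.79x frame-rate gap, so "4.5x the FLOPS covers 4x the pixels" is not a safe conclusion.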
In order to maintain compatibility, will it not require the same sort of eSRAM implementation? If so, then maybe they are playing number games to arrive at 320GB/s. Could it be a 100GB/s eSRAM interface combined with a 200GB/s GDDR5 interface, or, using some goofy sum-of-squares math, maybe 250GB/s + 100GB/s = 320GB/s? Also, doesn't the CPU have some number of GFLOPS of floating-point performance? Could this be a 5.5TFLOP GPU combined with a few-hundred-GFLOP CPU to total 6 TFLOPs? At any rate, it is probably going to require 40 CUs, which means a 44 or even a 48 CU die for yields.
CPU cores are only in the tens of GFLOPs/core. Even if they are doing some sort of shenanigans with memory bandwidth, adding the CPU power to the GPU power ends up as a rounding error in a TFLOP number.
An 8-core CPU running at 3GHz using AVX-512 can in theory achieve 32 single-precision operations per cycle per core. I.e., a 3GHz Zen would be 768 GFLOPS (on paper) and 4GHz would be 1 TFLOPS!
Still not much compared to gpu, but enough to be more than just rounding error compared to ~5 TFLOPS GPU.
I was thinking the same. Pretty sure it will need to have eSRAM, and knowing Microsoft, there will be / already is marketing spin around that. Remembering how keen they were on tossing eSRAM's theoretical max. contribution into X1 performance, and describing X1 success with vague things like how many miles, bullets, headshots and whatnot have X1 gamers achieved - I'm expecting the same will continue.
Ultimately, I will not be surprised if Scorpio and Neo are pretty much the same hardware, with the addition of eSRAM on Scorpio's side. Using the same SoC makes sense from an economy perspective: AMD can make one SoC selling in greater numbers more cheaply than two SoCs each selling only half as much. I wouldn't even be surprised if MS and Sony make some secret pact under the table and stop competing on hardware - they both got bitten by underperforming yet overpriced hardware. Yes, one will end up with the upper hand, but having bad marketing and poorer multiplatform titles for the whole generation is quite a risk to take if it can be avoided; I will not be surprised if they agree on a standardized SoC and compete with accessories and exclusive titles, while securing equality for 3rd-party games.
Are we sure they aren't using the classic marketing move of combining the memory bandwidth numbers to make them bigger, even though that's not how it works in the real world?
AMD quite confusingly refers to their new architecture as both GCN4 and Polaris in press releases and presentations. I guess they can be used interchangeably, but I find the terminology used by anandtech and other tech sites makes more sense. GCN4 is the architecture and Polaris refers to specific GPUs based on GCN4, Polaris 10 and 11.
I've only seen presentations from AMD explaining the star naming system -- they want the same recognition as the likes of Maxwell and Pascal for their architectures. Polaris is one, Vega is the next.
It sounds like GCN4 refers to the shader architecture, the CUs, geometry (tessellator), scheduler, etc, while Polaris encompasses GCN4 and other technologies integrated on die such as the display and multimedia engines.
If the die render is correct then it can't be 8 Zen cores + an RX 480, because the die is too small for 14nm. I think it would have to be 400+ mm² - a very expensive die for a console.
It might be 10nm, but I think it's too early for a 10nm die shrink of Zen (especially for GloFo) and too expensive for a console.
Unless Microsoft is willing to throw money away (see LinkedIn) to gain back the console market at any cost.
Jaguar has already been ported to 14nm in XBOX One S.
It seems both Scorpio and Neo will have the same SoC but Microsoft will want more powah - higher clocked, thus bulkier than Neo (a repeat of what happened this current gen).
T1beriu - do you have a link that Jaguar has been ported to 14nm for the Xbox One S? I haven't seen anything that suggests that, and some articles that suggest otherwise (that neither Sony nor Microsoft wanted to pay for Jaguar to be ported to FinFET).
Though I suppose it would be a huge tip-off if it is the case that the Xbox One S has 14nm Jaguar - also suggesting that the PS4 Neo could be a shrunk Jaguar plus Polaris on a single APU. Seems odd that it would be worth it to do that for Jaguar. Better to just wait a few months, save your $100M, and get Zen in there.
Well, a smaller Jaguar would be backward compatible with the previous generation, while Zen would not be. If this were really a new console, Zen would be wise, but because this is just a power upgrade, the beefier Jaguar makes it much easier to do.
40% smaller + they included the big power brick inside the box. That tells me of a big power reduction, thus a new SoC on 14nm.
Another hint is the old consoles. Every time a console goes slim it's because the SoC manufacturing process shrinks. The PS3 and Xbox 360 had 3 or 4 process shrinks during their lifetimes. So it's no big deal to put some money into porting Jaguar to 14nm.
Have you ever seen that PS4? "40% smaller + they included the big power brick inside the box. That tells me of big power reduction, thus new SoC on 14nm" That is a horrible assumption considering the PS4 already did those things from launch.
That's considered quad-core, not an 8-core CPU like MS mentioned. HT is not a core; it's 4 cores, 8 threads. You don't advertise a quad-core i7 as eight cores.
Since Zen has multi-threading, it could easily be just a quad-core Zen with 8 threads, so the game engines/developers won't have to change their architectures too much. Even in quad-core form and even with a 1Ghz clock speed, Zen should still be significantly faster than the current Jaguar CPUs.
I believe this is great marketing for AMD. Current-gen APUs can't handle 1080p with decent settings; most APUs can only run content at 60fps at 720p with medium settings. For Microsoft to advertise that in a year and a half their custom-designed SoC will play 4K at 60fps leaves me wondering if the (non-custom) Zen APU will deliver at least 1440p @ high settings.
It's quite a jump in IGP performance. Definitely worth looking forward to this.
A ZEN APU can't deliver 1440p @ high settings because of low memory bandwidth (DDR4). When HBM comes to APUs, then we'll see that kind of performance coming for an APU and that's 2-3 years away.
I doubt we'll see HBM2 on 14nm - or ever. AMD already says Navi will use a "next-gen memory", although it's still not clear whether that refers to scaling up HBM2 to all products, or a completely new type of memory.
I'm surprised a site like Anandtech isn't considering 4-core CPU with 8 threads, and assumes by default it's going to be an 8-core Zen. That would be way too expensive for a console, and it's completely unnecessary to provide a big boost over the current CPU.
In fact, we should consider ourselves lucky it will even be a quad-core Zen, and not a higher-clocked dual-core one with 4 threads, or worse, another Jaguar successor. I'd be disappointed if it's not a quad-core Zen with 8 threads, though - it just makes the most sense.
With all this emphasis on 4K, and Phil Spencer saying the new Scorpio is "built specifically to lead the console industry into true 4K gaming...", it makes me wonder: would it not be better to use the extra GPU power and console resources for better graphical fidelity at 1080p? Better shadows, lighting, etc. can go a long way toward bringing photo-realistic gaming with 6 Tflops at 1080p. Full HD resolution is already pretty crisp, and with good AA I can't help but wonder if this would not be the better direction. Any thoughts?
Developers can choose to use it how they see fit. I think dynamic resolution and dynamic graphics will become more common especially with first party. You can already see it with Forza 6 Apex dynamic resolution and graphics. It's really cool.
Read this shortly after the E3 presentation, but didn't have time to comment.
Thank you so much for releasing this in a timely manner - when I hit the AnandTech bookmark after the presentation, this was precisely the article I was hoping for. Thank you for taking the time to write it, on time. Very much appreciated.
6 TFLOPS is already slow right now, and let's not forget this comes out in Holiday 2017. By Holiday 2017 the RX 570 will probably be capable of delivering that. Not to mention these are AMD FLOPs we are talking about here. By 2017 NVIDIA's 1180 will probably be pushing 3x as much power as what's in the Scorpio.
In essence this is just another console that M$ will be selling at a profit, and not anything groundbreaking like the Xbox 360. It's going to struggle mightily to even keep 24 FPS at 4K, and cutting-edge games will probably still have trouble maintaining 60 FPS at 1080p.
Pinn - Monday, June 13, 2016
Process shrink. HBM unified CPU/GPU.

Huacanacha - Monday, June 13, 2016
With just 320 GB/s of memory bandwidth it's very unlikely to be HBM. Sounds like high frequency GDDR5 or GDDR5X.

Morawka - Monday, June 13, 2016
You're talking about Microsoft here... the king of emulation and backwards compatibility. If anyone could pull it off without ESRAM, they could.

Eden-K121D - Thursday, April 6, 2017
You're correct, mate.
fallaha56 - Tuesday, June 14, 2016
I'd be thinking 4 hyperthreaded Zen cores...

killeak - Tuesday, June 14, 2016
Talking about the ESRAM on the XB1.
Alexvrb - Friday, July 22, 2016
Also: Killeak, the CPU can access ESRAM.
Eden-K121D - Tuesday, June 14, 2016
I bet they will use GDDR5X, as it will allow for a smaller memory bus.

RiZad - Monday, June 13, 2016
Let's hope it's Zen.

T1beriu - Monday, June 13, 2016
Do they? Last time I heard, AMD had a SEMI-custom division. The Xbox One has a Radeon 7790 and the PS4 has a 7870, and both had Jaguar cores.

So both future consoles will have already-made AMD bits inside them.

Remember: SEMI-custom. Dude.
Morawka - Monday, June 13, 2016 - link
No they don't, they don't use those GPUs. They might be comparable, but it's not a 7790 and 7870. The 7790 used 1GB of GDDR5; the Xbox One uses DDR3. It's called semi-custom because the cores, both CPU and GPU, are within a particular architecture. Full custom would be a RISC CPU and totally different GPU cores, or something like that.
ishould - Monday, June 13, 2016 - link
I would say that, compared to previous consoles, these are MUCH less custom than earlier versions. Cell processor, anyone? This console generation uses AMD CPU and GPU IP, just re-arranged efficiently on-die with respect to the memory system.
dragonsqrrl - Monday, June 13, 2016 - link
The PS4 was comparable to the 7790, the XBox One had fewer CUs but was also GCN 1.1.
dragonsqrrl - Monday, June 13, 2016 - link
Never mind, the PS4 had more CUs than Bonaire, but I'm pretty sure its GPU was also based on GCN 1.1.
Eden-K121D - Thursday, April 6, 2017 - link
You're so freaking right
revanchrist - Monday, June 13, 2016 - link
Definitely a higher-CU-count GPU. Overclocking too much out of a GPU generates an unreasonable amount of power consumption. In a space-conscious console, heat is a major issue.
taikamya - Monday, June 13, 2016 - link
I agree heat is a major issue, but I don't think an RX 480 is too far behind 6 TFLOPS; the overclock, I imagine, is rather small.
Spunjji - Tuesday, June 14, 2016 - link
If that were true then they'd already be running the cards at those speeds as stock in the PC space. AMD have no reason to leave performance on the table there. I'd argue more CUs and a lower clock than the PC variant is more likely, with some spare CUs to sacrifice to keep yields nice and high.
Wolfpup - Monday, June 13, 2016 - link
Huh... crazy. That would be awesome if they're using AMD's full CPU... it would be a huge upgrade, not to mention that it's apparently got piles more GPU hardware. This seems like it might actually be a serious upgrade, worth thinking about as an Xbox 4.
And yay for backwards compatibility.
*cough* Hey, Nintendo? Remember how so far you've only matched the systems from 2005? Uh...
Morawka - Monday, June 13, 2016 - link
there is a Jaguar on 14nm, it's being used in the new slim.
Spunjji - Tuesday, June 14, 2016 - link
I haven't seen any info on that and my Google searches aren't very enlightening - do you have a source?
Krysto - Tuesday, June 14, 2016 - link
That makes sense. I do hope it works out that way. Zen needs a lot of love.
Phill49 - Monday, June 13, 2016 - link
Memory shown in the images was separate and almost certainly GDDR. Bandwidth was quoted in the presentation as 320GB/s, so I am guessing either 5 Gbps GDDR5 on a 512-bit bus or 10 Gbps GDDR5X on a 256-bit bus? Also, I'm loving the fact that the supposed 10-year lifespan of the current consoles has been completely thrown out of the window at this point...
DanNeely - Monday, June 13, 2016 - link
Almost certainly the latter. A 512-bit bus would be too expensive for a mass-market part. 7.5 Gbps GDDR5 and a 384-bit bus with an asymmetric mix of RAM chip sizes to hit 8GB isn't totally impossible; but by next year GDDR5X production should be in full swing, so there's little reason to use the older RAM type anymore. Especially since plain GDDR5 will probably be starting to ramp down in volume, which means there won't be any future price/power cuts as the manufacturing processes are improved.
nikon133 - Monday, June 13, 2016 - link
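The asymmetric-mix idea is easy to sanity-check: a 384-bit bus means twelve 32-bit GDDR5 chips, and you can enumerate which mixes of 1 GB and 0.5 GB parts reach 8 GB. A rough sketch - the chip densities are illustrative, not a known configuration:

```python
# A 384-bit bus = twelve 32-bit GDDR5 chips. Enumerate the mixes of 1 GB and
# 0.5 GB parts that total 8 GB. Chip sizes here are illustrative assumptions.
def asymmetric_mixes(total_chips=12, target_gb=8.0):
    """Return (num_1GB_chips, num_0.5GB_chips) pairs hitting target_gb."""
    return [(big, total_chips - big)
            for big in range(total_chips + 1)
            if big * 1.0 + (total_chips - big) * 0.5 == target_gb]

print(asymmetric_mixes())  # [(4, 8)] -> four 1 GB chips plus eight 0.5 GB chips
```

Only one mix works, which shows why such asymmetric configurations are awkward: the extra capacity on the larger chips sits on a narrower slice of the bus.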
Well, they still might last 10 years. The PS3 is still being made and it is not even compatible with current-gen titles. The PS4 and Neo are (supposed to be) fully compatible, so the release of a more powerful version doesn't really terminate the original PS4. They can coexist as long as new games work on both machines. What we have here is definitely the PC-zation of consoles: more frequent updates, multiple performance/price products, but still the same platform. I'm expecting the original PS4 will be shrunk to a Lite/Mini version, but regardless, we will see exactly the same parts being made for years. Since there will be noticeable price differentiation between the PS4 and Neo, the PS4 will remain a solid choice for more casual gamers, people who game on smaller/lower-res TVs, or as a secondary console for the kids' room (where an old 720p TV might have already found refuge).
After all, not everyone wants to invest in a top-of-the-line gaming PC, and console gamers tend to be more frugal/less hardware-enthusiast than PC gamers. I really expect they will embrace and keep buying the cheaper PS4.
I don't know if original PS4 will last 10 years exactly, but I will be surprised if that hardware isn't available 8 years after release, in some form.
Personally, I am happy with these refreshes... as long as compatibility is preserved. I like collecting physical copies of games, and still have all my PS2 and PS3 titles. I also have PS2 and PS3 in good condition, but even if I don't use them much/at all these days, it does cross my mind what to do once hardware dies. Having new hardware capable of running old games, yep - definitely one happy gamer here.
retrospooty - Monday, June 13, 2016 - link
Funny, when the PS4 and XB1 came out a lot of insiders were saying they would be each company's last console and that PC gaming would take back over... Now not only were they not the last consoles, they aren't even the last PS4 and XB1 consoles. Until PC gaming is not a massive pain in the rear, consoles will still have a place.
QinX - Monday, June 13, 2016 - link
At this point consoles are highly specialized x86 computers. Just like you would use a Quadro/FirePro with a Xeon/Opteron for workstation use, because it is optimized for certain loads, you use consoles with dedicated hardware to get high performance at a lower price than with a general-purpose PC. Even though your console and PC can run the same code base, the console will run better with the given hardware.
retrospooty - Monday, June 13, 2016 - link
Yes, but the difference is usability. Consoles are super easy and have a consistent controller setup that you don't need to futz with. The main benefit is more fun, less config... Of course, the downside is less power and lesser graphics quality.
looncraz - Monday, June 13, 2016 - link
For me, the futz *IS* the fun :[
retrospooty - Monday, June 13, 2016 - link
LOL... I get that. It used to be fun for me too... until life got busier, then I had to make some difficult cuts to my time budget ;). Other than that though, a lot of people (like people who would never visit tech sites like this) couldn't deal with it on a PC regardless of their available time.
nikon133 - Monday, June 13, 2016 - link
I still maintain a gaming PC... and use it... albeit less than the console. Now that even traditionally single-player games are getting some multiplayer content, I'm finding it less and less interesting to deal with the hacks and cheats which run rampant in pretty much every PC multiplayer game. Yes, they were there before, but it just might be that I am getting too old, and my gaming time too limited to be wasted on something frustrating - and being cheat-killed by some kid with a through-the-wall exploit is definitely getting into the "frustrating" category. I don't mind losing games - in fact I was never really above average - but like in sports, it is fine only as long as it is fair, and hats off to people with real skills. People with hacks, no thanks. My friends insisted on playing The Division on PC, although we all have PS4s as well. I agreed - some of them have monster PCs and want to use them. However, we don't go into the multiplayer Dark Zone. We tried a handful of times and often got annihilated through multiple walls and floors... so, for playing the game with better visuals (not that the console versions look bad at all!), we are missing part of the game, and quite a solid part, too. Wasn't worth it, imho.
retrospooty - Monday, June 13, 2016 - link
"my gaming time too limited to be wasted on something frustrating" - Exactly. You can fit a lot of different "frustrating" things into that one comment, but it really sums it up pretty well. PC gaming will always look the best and have some great aspects, but for my limited time, I'd just rather play and have fun than deal with it.
nikon133 - Tuesday, June 14, 2016 - link
Yep... BF3 was kind of an eye-opener for me. I played it both on PS3 and PC. The PC version looked better by a margin - my 6870 could run the game at 1080p and pretty much maxed out sans AA - and it also had 64-player matches, versus 24 on PS3. But... outside of playing with friends, when I was on my own I chose to play on PS3 much more often than on PC. It just felt more relaxing and balanced - everyone playing on the same hardware, mostly with the same controllers, and most of all - no crazy on-the-run headshots and other metahuman skills... or "skills". Eventually, some of them might have been down to my internet speed/lag and opponents' real skills, but when you ambush someone and spray him with an LMG from a well-concealed position and he basically turns around, spots you and headshots you with the first round, all in one smooth motion, without even slowing down (shooting from the hip?)... well, it might be luck or crazy skill, but it does raise some suspicion as well. Some people troll-cheat and make it obvious just to annoy others. Some smart-cheat and try to make it look as credible as possible.
Eventually, that game made me reconsider my priorities. Simplicity, less cheating, and the fun that came out of that apparently took the best seats. Regardless of weaker looks and other downgrades, BF3 on PS3 was simply more enjoyable for me.
Michael Bay - Monday, June 13, 2016 - link
With an upgrade cycle this short (itself a blasphemy in the console world!), it might as well be called a PC.
retrospooty - Monday, June 13, 2016 - link
Yup, that and if it's a ">6 TFLOPs" GPU that is a huge jump for only being 3 years later. NICE!
inighthawki - Monday, June 13, 2016 - link
They probably learned their lesson when the devs complained about perf 6 months before it was even released - coupled with the explosive popularity of VR/4K gaming shortly after they shipped consoles that are barely pushing 720p and 1080p. I think both Sony and MS missed their marks - Microsoft more so.
retrospooty - Monday, June 13, 2016 - link
Yup, both the XB1 and PS4 were pretty small jumps from their predecessors when you figure the 7-year gap into it. If we have a 6 TFLOP console in 2017, that is kind of huge. I almost can't believe it.
SunnyNW - Monday, June 13, 2016 - link
Exactly. Many, Sony and Microsoft included, believed the console market had already peaked and would only shrink further with smartphones and tablets growing in popularity. Of course they were completely wrong, and Sony themselves admitted they weren't even sure why the PS4 was selling as quickly as it was (the fastest-selling console in history for them). So obviously both played it safe with relatively cheap hardware. In the past they would usually lose money on the hardware and make it back with the software, but the PS4 and XBone boxes were both profitable from day one.
Krysto - Tuesday, June 14, 2016 - link
I also think they're making a mistake not waiting for 4K VR.
jjj - Monday, June 13, 2016 - link
A bit small for 2017. At that point they should target 4K gaming and higher-res VR than current headsets. It would be a bit tricky, as this chip could already be 300-400mm2 on 16FF depending on what cores they use, but Scorpio might feel outdated by launch, even more so if Sony and Nintendo do similar or faster things before them.
DanNeely - Monday, June 13, 2016 - link
4K gaming on console-class hardware is probably another year or two out. Even the fastest current GPUs on the market struggle at that resolution. Next year will probably see top-end PC GPUs capable of doing it at reasonable FPS/quality levels; but the 1180 Ti and R590 (Fury 2?) will cost more than a gaming console itself. It'll be 2018/19 before a console-size GPU has that much power, and I doubt MS will want to launch a new console that soon after the 2017 model. A 2020 launch would be more likely.
RussianSensation - Monday, June 13, 2016 - link
4K should be viable on a >6 TFLOPS Polaris/Vega GPU with 320GB/sec memory bandwidth if they target Medium/High settings with 0xMSAA. They can also run the games at 30 fps. Another alternative is to scale existing Xbox One games from 720-900p to 4K @ 60Hz. That should be a piece of cake for the Scorpio given the specs outlined. While you are correct that 6 TFLOPS isn't enough for 4K @ 60Hz with everything maxed out, this is not necessary for the console market just yet. 2018/2019 also seems completely out of the question now if both Sony and MS are releasing Neo/Scorpio in 2017. In that case, I'd expect next-generation consoles to come out in 2020/2021.
SunnyNW - Monday, June 13, 2016 - link
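The scaling cost is easy to put numbers on - pixel counts scale with area, so the jump from the Xbox One's common render resolutions to 4K works out like this:

```python
# How many times more pixels each jump to 4K (3840x2160) requires.
def pixel_ratio(src, dst=(3840, 2160)):
    """Ratio of total pixels between two (width, height) resolutions."""
    return (dst[0] * dst[1]) / (src[0] * src[1])

print(pixel_ratio((1280, 720)))   # 9.0  -> 720p to 4K
print(pixel_ratio((1600, 900)))   # 5.76 -> 900p to 4K
print(pixel_ratio((1920, 1080)))  # 4.0  -> 1080p to 4K
```

So a console quoted at roughly 4.5x the shader throughput has headroom for a straight 1080p-to-4K jump, but a 9x pixel increase from 720p titles would need settings or frame-rate compromises.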
Actually during the presentation Phil Spencer stated that the Scorpio is being made to handle 4K gaming with high quality visuals (along with VR). The Xbox S (the slim) is supposed to be capable of up-scaling current games to 4K output, according to a Microsoft official.
RiZad - Thursday, June 16, 2016 - link
The Xbox One S was never stated to upscale games.
Krysto - Tuesday, June 14, 2016 - link
You'll have the same graphics as you do now, but with 4x the pixels, which should be doable because you get something like 4.5x the performance with the new console. I don't know how "techies" still can't get this easy calculation.
MapRef41N93W - Saturday, June 25, 2016 - link
Not sure how people like you still fail to realize that FLOPs don't directly translate to performance in video games. The R9 290 is 4.8 AMD TFLOPs and the Fury X is 8.6 TFLOPs. The Fury X is not even close to that much faster in video games than an R9 290.
taisserroots - Monday, June 13, 2016 - link
You sure they didn't post CPU+GPU teraflop performance?
extide - Monday, June 13, 2016 - link
Even if they did, the CPU would add so little that it would hardly matter.
Shadowmaster625 - Monday, June 13, 2016 - link
In order to maintain compatibility, will it not require the same sort of eSRAM implementation? If so, then maybe they are playing number games to arrive at 320GB/s. Could it be a 100 GB/s eSRAM interface combined with a 200 GB/s GDDR5 interface, or, using some goofy sum-of-squares math, maybe 250GB/s + 100GB/s = 320GB/s? Also, doesn't the CPU have some number of GFLOPS of floating-point performance? Could this be a 5.5 TFLOP GPU combined with a few-hundred-GFLOP CPU to total 6 TFLOPS? At any rate, it is probably going to require 40 CUs, which means a 44 or even a 48 CU die for yields.
DanNeely - Monday, June 13, 2016 - link
CPU cores are only in the tens of GFLOPs/core. Even if they are doing some sort of shenanigans with memory bandwidth, adding the CPU power to the GPU power ends up as a rounding error in a TFLOP number.
zepi - Monday, June 13, 2016 - link
An 8-core CPU running at 3GHz using AVX-512 can in theory achieve 32 single-precision operations per cycle per core, i.e. a 3GHz Zen would be 768 GFLOPS (on paper) and 4GHz would be 1 TFLOPS! Still not much compared to a GPU, but enough to be more than just a rounding error compared to a ~5 TFLOPS GPU.
http://stackoverflow.com/questions/15655835/flops-...
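Those numbers come straight from the standard peak-FLOPS formula (cores x clock x FLOPs per cycle). A quick sketch - note that 32 FLOPs/cycle is an AVX-512-with-FMA assumption, not a confirmed Zen capability:

```python
# Theoretical peak = cores x clock (GHz) x single-precision FLOPs per cycle.
# 32 FLOPs/cycle assumes AVX-512 with FMA - an assumption, not a Zen spec.
def peak_gflops(cores, ghz, flops_per_cycle):
    """Paper single-precision peak in GFLOPS."""
    return cores * ghz * flops_per_cycle

print(peak_gflops(8, 3.0, 32))  # 768.0  -> the 3 GHz example above
print(peak_gflops(8, 4.0, 32))  # 1024.0 -> ~1 TFLOPS at 4 GHz
```

Halve the per-cycle figure for an AVX2-class core and the paper peak halves with it, which is why the CPU contribution stays a rounding error next to a multi-TFLOPS GPU.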
BillyONeal - Tuesday, June 14, 2016 - link
The only hardware implementing AVX512 is Xeon Phi; you're not going to see that on "cat" cores.
blppt - Tuesday, June 14, 2016 - link
Plus, even i7-level AVX2 turns your CPU into a nuclear furnace - I can only imagine the heat problems for a set-top-box console.
nikon133 - Monday, June 13, 2016 - link
I was thinking the same. Pretty sure it will need to have eSRAM, and knowing Microsoft, there will be / already is a marketing spin around that. Remembering how keen they were on tossing eSRAM's theoretical max contribution into X1 performance, and describing X1 success with vague things like how many miles, bullets, headshots and whatnot X1 gamers have achieved - I'm expecting the same will continue. Eventually, I will not be surprised if Scorpio and Neo are pretty much the same hardware, with the addition of eSRAM on Scorpio's side. Using the same SoC makes sense from an economy perspective; AMD will do one SoC selling in greater numbers more cheaply than two SoCs each doing only half as much. I even wouldn't be surprised if MS and Sony made some secret pact under the table and stopped competing on hardware - they both got bitten by underperforming yet overpriced hardware. Yes, one will end up with the upper hand, but risking bad marketing and poorer multiplatform titles for the whole generation is quite a risk to take if it can be avoided; I will not be surprised if they agree on a standardized SoC and compete with accessories and exclusive titles, while securing equality for 3rd-party games.
Cygni - Monday, June 13, 2016 - link
Are we sure they aren't using the classic marketing move of combining the memory bandwidth numbers to make them bigger, even though that's not how it works in the real world?
h4rm0ny - Tuesday, June 14, 2016 - link
Vega will probably also be "GCN4". So whilst Polaris is GCN4, GCN4 is not Polaris.
Meteor2 - Monday, June 13, 2016 - link
You can stop saying 'GCN 4'. The architecture is Polaris.
dragonsqrrl - Monday, June 13, 2016 - link
AMD quite confusingly refers to their new architecture as both GCN4 and Polaris in press releases and presentations. I guess they can be used interchangeably, but I find the terminology used by AnandTech and other tech sites makes more sense: GCN4 is the architecture, and Polaris refers to specific GPUs based on GCN4, Polaris 10 and 11.
Meteor2 - Tuesday, June 14, 2016 - link
I've only seen presentations from AMD explaining the star naming system -- they want the same recognition as the likes of Maxwell and Pascal for their architectures. Polaris is one, Vega is the next.
dragonsqrrl - Wednesday, June 15, 2016 - link
AMD has referred to it as "4th gen GCN" or GCN 4 in slides: http://www.anandtech.com/show/9886/amd-reveals-pol...
It sounds like GCN4 refers to the shader architecture, the CUs, geometry (tessellator), scheduler, etc, while Polaris encompasses GCN4 and other technologies integrated on die such as the display and multimedia engines.
T1beriu - Monday, June 13, 2016 - link
If the die render is correct, then it can't be 8 Zen cores + RX 480, because the die is too small for 14nm. I think it would have to be 400+ mm2 - a very expensive die for a console. It might be 10nm, but I think it's too early for a 10nm die shrink of Zen (especially for GloFo) and too expensive for a console.
Unless Microsoft is willing to throw money away (see LinkedIn) to gain back the console market at any cost.
T1beriu - Monday, June 13, 2016 - link
So 8 x Jaguar + Polaris 10 (RX 480) on 14 nm. Jaguar has already been ported to 14nm in the XBOX One S.
It seems both Scorpio and Neo will have the same SoC, but Microsoft will want more powah - higher clocked, thus bulkier than the Neo (a repeat of what happened this current gen).
jagadiesel1 - Monday, June 13, 2016 - link
T1beriu - do you have a link showing that Jaguar has been ported to 14nm for the Xbox One S? I haven't seen anything that suggests that, and some articles suggest otherwise (that neither Sony nor Microsoft wanted to pay for Jaguar to be ported to FinFET). Though I suppose it would be a huge tip-off if the Xbox One S does have 14nm Jaguar - also suggesting that the PS4 Neo could be a shrunk Jaguar plus Polaris on a single APU. Seems odd that it would be worth it to do that for Jaguar. Better to just wait a few months, save your $100M, and get Zen in there.
haukionkannel - Tuesday, June 14, 2016 - link
haukionkannel - Tuesday, June 14, 2016 - link
Well, a smaller Jaguar would be backward compatible with the previous generation, while Zen would not be. If this were really a new console, Zen would be wise, but because this is just a power upgrade, a beefier Jaguar makes it much easier to do.
Meteor2 - Tuesday, June 14, 2016 - link
Scorpio is 18 months away. It will be Zen.fallaha56 - Tuesday, June 14, 2016 - link
correct
T1beriu - Thursday, June 16, 2016 - link
Have you seen the XBOX One S? :) 40% smaller + they included the big power brick inside the box. That tells me of a big power reduction, thus a new SoC on 14nm.
Another hint is the old consoles. Every time a console goes slim it's because the SoC manufacturing process shrinks. The PS3 and XBOX 360 had 3 or 4 process shrinks during their lifetimes. So it's no big deal to put some money into porting Jaguar to 14nm.
RiZad - Thursday, June 16, 2016 - link
Have you ever seen the PS4? "40% smaller + they included the big power brick inside the box. That tells me of big power reduction, thus new SoC on 14nm" - that is a horrible assumption, considering the PS4 already did those things from launch.
fallaha56 - Tuesday, June 14, 2016 - link
Think 4 hyperthreaded Zen cores vs 8. Not so big...
BehindEnemyLines - Sunday, June 19, 2016 - link
That's considered a quad-core, not an 8-core CPU like MS mentioned. HT is not a core; it's 4 cores, 8 threads. You don't advertise a quad-core i7 as eight cores.
Krysto - Tuesday, June 14, 2016 - link
Since Zen has multi-threading, it could easily be just a quad-core Zen with 8 threads, so the game engines/developers won't have to change their architectures too much. Even in quad-core form, and even with a 1GHz clock speed, Zen should still be significantly faster than the current Jaguar CPUs.
taikamya - Monday, June 13, 2016 - link
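A back-of-the-envelope way to see why fewer, stronger cores can still come out ahead: relative throughput scales roughly with cores x clock x IPC. The clocks and IPC uplift below are pure illustrative assumptions, not measured figures:

```python
# Rough relative-throughput model: cores x clock x IPC. All numbers here
# are illustrative assumptions, not benchmarks or confirmed specs.
def throughput(cores, ghz, ipc):
    return cores * ghz * ipc

jaguar_8c = throughput(8, 1.75, 1.0)  # baseline: 8 Jaguar cores @ ~1.75 GHz
zen_4c    = throughput(4, 3.0, 1.5)   # assumed ~50% IPC uplift, higher clock

print(round(zen_4c / jaguar_8c, 2))  # 1.29 -> ahead even before counting SMT
```

Under these assumptions the quad-core comes out ahead of the eight Jaguar cores, and SMT would only widen the gap on well-threaded game code.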
I believe this is great marketing for AMD. Current-gen APUs can't handle 1080p with decent settings; most APUs can only run 720p content at 60fps with medium settings. For Microsoft to advertise that in a year and a half their custom-designed SoC will play 4K at 60fps leaves me wondering if the (non-custom) Zen APU will deliver at least 1440p @ high settings. It's quite a jump in IGP performance. Definitely worth looking forward to.
T1beriu - Tuesday, June 14, 2016 - link
T1beriu - Tuesday, June 14, 2016 - link
A Zen APU can't deliver 1440p @ high settings because of low memory bandwidth (DDR4). When HBM comes to APUs, then we'll see that kind of performance from an APU, and that's 2-3 years away.
Eden-K121D - Tuesday, June 14, 2016 - link
The 14nm process needs to be more mature before they could have an APU built on it with HBM2.
Krysto - Tuesday, June 14, 2016 - link
I doubt we'll see HBM2 on 14nm - or ever. AMD already says Navi will use a "next-gen memory", although it's still not clear whether that refers to scaling up HBM2 to all products, or a completely new type of memory.
T1beriu - Thursday, June 16, 2016 - link
That's why I said 2-3 years. :)
Krysto - Tuesday, June 14, 2016 - link
I'm surprised a site like Anandtech isn't considering a 4-core CPU with 8 threads, and assumes by default it's going to be an 8-core Zen. That would be way too expensive for a console, and it's completely unnecessary to provide a big boost over the current CPU. In fact, we should consider ourselves lucky it will even be a quad-core Zen, and not a higher-clocked dual-core one with 4 threads, or worse, another Jaguar successor. I'd be disappointed if it's not a quad-core Zen with 8 threads, though - it just makes the most sense.
T1beriu - Tuesday, June 14, 2016 - link
Because 8 threads ain't the same as 8 cores. That's why. :)
SunnyNW - Tuesday, June 14, 2016 - link
With all this emphasis on 4K, and Phil Spencer saying the new Scorpio is "built specifically to lead the console industry into true 4K gaming..." - it makes me wonder: would it not be better to use the extra GPU power and console resources for better graphical fidelity at 1080p? Better shadows, lighting, etc. can go a long way toward photo-realistic gaming with 6 TFLOPS at 1080p. Full HD resolution is already pretty crisp, and with good AA I can't help but wonder if this would not be the better direction. Any thoughts?
BehindEnemyLines - Sunday, June 19, 2016 - link
Developers can choose to use it how they see fit. I think dynamic resolution and dynamic graphics will become more common, especially with first party. You can already see it with Forza 6 Apex's dynamic resolution and graphics. It's really cool.
fanofanand - Friday, July 22, 2016 - link
But marketing.
yhselp - Thursday, June 16, 2016 - link
Read this shortly after the E3 presentation, but didn't have time to comment. Thank you so much for releasing this in a timely manner - when I hit the AnandTech bookmark after the presentation, this was precisely the article I was hoping for. Thank you for taking the time to write it, on time. Very much appreciated.
Hrel - Monday, June 20, 2016 - link
Spell check: "So Project Scorpio will have easily have more CUs than Xbox One," - should remove one "have".
Also, partially subjective, everywhere it says "Xbox One" should read "Xbone".
Hrel - Monday, June 20, 2016 - link
I'd be very surprised if that 320GB/s figure was a sustained-load figure - RMS, to use audio parlance. More likely it's a burst figure that comes from cache, though what kind and form the cache will take, your guess is as good as mine.
They better fucking not name this the Xbox Two. If they do that, I can't ever buy one, on pure principle.
MapRef41N93W - Saturday, June 25, 2016 - link
6 TFLOPS is already slow right now, and let's not forget this comes out in Holiday 2017. By Holiday 2017 the RX 570 will probably be capable of delivering that. Not to mention these are AMD FLOPs we are talking about here. By 2017, NVIDIA's 1180 will probably be pushing 3x as much power as what's in the Scorpio. In essence, this is just another console that M$ will be selling at a profit, and not anything groundbreaking like the Xbox 360. It's going to struggle mightily to even keep 24 FPS at 4K, and cutting-edge games will probably still have trouble maintaining 60 FPS at 1080p.