AMD-based CPUs are, perhaps, a generation or more behind Intel's current-gen CPUs in many benchmarks, including gaming. Do you think AMD's poor CPU performance will be a major limitation for the PS4, or will it not matter that much (i.e. future games won't be very CPU-bound)?
Might hurt it a bit, but it depends on the clock rate and whether they balance across cores properly. With GPGPU tech being utilized, that should take a good bit of load off the CPU. It also depends on the resolution it's going to be running at.
My bet is that it won't matter much. In multi-threaded scenarios Intel tends to still be ahead of AMD, but the gap is much smaller. And seeing as there are 8 threads available, hopefully developers will start putting more effort into multi-threading their games to take full advantage.
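As a rough illustration of what that kind of multi-threading looks like in practice, here's a minimal C++ sketch (my own toy example, not from any actual engine) that fans per-entity update work out across however many hardware threads the machine reports:

```cpp
// Toy example: split per-entity updates across all available hardware threads.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Entity { float x = 0.0f, vx = 1.0f; };

void update_range(std::vector<Entity>& e, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        e[i].x += e[i].vx * dt;            // independent work, so it parallelizes trivially
}

int main() {
    std::vector<Entity> entities(100000);
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency()); // e.g. 8 on a PS4-like CPU
    const float dt = 1.0f / 60.0f;

    std::vector<std::thread> pool;
    const size_t chunk = entities.size() / workers;
    for (unsigned w = 0; w < workers; ++w) {
        size_t begin = w * chunk;
        size_t end = (w + 1 == workers) ? entities.size() : begin + chunk;
        pool.emplace_back(update_range, std::ref(entities), begin, end, dt);
    }
    for (auto& t : pool) t.join();

    printf("%u worker threads, entity[0].x = %f\n", workers, entities[0].x);
    return 0;
}
```

Real engines use job systems rather than raw threads, of course, but the point stands: the work has to be split up explicitly, which is exactly the effort a fixed 8-core target encourages.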
I don't see it as an issue, especially considering this will be a vast leap over CellBE and the GeForce 7900-class GPU of PS3. Besides, this combo is still driving just 1080P and eventually 4K displays. I think the hardware will be more than sufficient to produce a good gaming experience. Granted, this upcoming generation of console will probably run for 10 years.
You are talking about desktop GPUs. The GPU inside the PS4 is as good as it gets for the money. ~2 TFLOPS with 176 GB/sec suggests a GPU that's extremely close to the HD7970M, and the HD7970M delivers 95% of the performance of a GTX 680M for $350 less.
Since you cannot fit a 200W GPU inside a console, not even a 150W one, an HD7970M-style GPU was hands down the best possible high-end choice for the PS4. You simply cannot do better for price/performance and power consumption right now.
The original XBox 360 shipped with a 203W power supply, and the Xenos GPU was believed to have a TDP of about a hundred watts, so 150 or 200W in a console isn't impossible.
Heck, my Shuttle XPC has a 500W PSU that's much smaller than the 360's power brick. I suspect the large size of the 360's PSU had more to do with it being passively cooled than anything else. If the PS4 (or 720) had an internal GPU that could leverage the active cooling system inside the console, fitting a 150W GPU inside the thing would not be terribly difficult.
Remember, a console can dedicate all its resources to the game, so a 7800 GTX running a PC game won't compare to PS3 graphics; just look at Uncharted 3, for example. So the PS4 will likely outmuscle anything you can do on a 7970M.
I never said it's impossible but it would make the console a lot larger, more expensive to cool, etc. I realize that PS3/360 consoles drew 200-240W of power when they launched. For cost reasons the CPU+GPU are on the same die. That alone means going with a discrete GPU option like HD7870 would have cost a lot more money. At the end of the day Sony is not the company it used to be. It cannot afford to sell an $800 (original BOM for PS3) console for $600 in this market. Not only can they not afford such losses on the hardware but the market will refuse to buy a $599 console.
A generation behind? If you are referencing Titan, it is in no way comparable to what goes into consoles. Looking past the fact that nVidia doesn't have an integrated GPU solution (as they do not make CPUs), nVidia's mobile chips are certainly not a generation ahead of what AMD has. I would actually give AMD the edge in mobile GPUs.
AMD's best CPUs lag behind Intel's right now, but that's not saying much; they're still crazy powerful compared to anything else.
The issue is that this ISN'T AMD's best CPU design, it's the successor to Bobcat, a (better than) Atom competitor. Remains to be seen how that does (Microsoft's using the same thing).
Still, it'll be a lot faster than we have now.
I'll be SUPER disappointed if this doesn't have full backwards compatibility, and also if it has a Wii U-style controller that doesn't work with its battery dead or pulled.
I've never understood the obsession with backward compatibility. For one, unless you have a library of games it's not important, and if you have a library of games you already have a PS3. If you already have a PS3, just keep it to play your old library of games. Problem solved. Two, I would much rather have $100-$200 less in parts, and, by extension, in the retail price of the console, and no backward compatibility than vice versa. And no, you are not going to run PS3 games on the PS4 in emulation, so that is not an option.
So you don't have 3-4 consoles plugged into the TV. Personally, in my household we have a Wii on the kids' TV and a PS2 on the main TV. We are waiting for the PS4... I don't see the point of spending $250 for a PS3 when that can go towards the PS4. But we would love to play some PS3 titles on the same PS4 console.
That would be better than buying a used PS3 for $150 and sucking up space... both with the Console itself, the controllers, cabling, etc.
It's bad enough we'll still have the PS2 for now.
The horse power of the PS4 should allow PS2/PS3 gaming through emulation.
How are they behind Intel exactly? Have you seen the newest game engines? In BF3 and MoH Warfighter, AMD and Intel are neck and neck; it's only in older single-threaded games that Intel beats AMD. For any new game that's multi-threaded, it doesn't matter. I imagine with all PS4 games using similar modern engines like DICE's Frostbite Engine 2.0, AMD and Intel would be neck and neck in CPU performance.
AMD may be head to head with, and sometimes even ahead of, same-priced Intel Core processors, but nowhere near a competitor at the same TDP. Apparently Sony decided the price is more important, and will live with the bigger/noisier design that the higher power consumption of the AMD cores brings with it.
Or they just know that AMD needed this deal badly, and like the leverage this gives them.
C'mon, get your facts right. Jaguar is a low-power core. It has excellent performance/watt and performance/mm^2. There is nothing from Intel which can compete with it. Atom is too slow, SB/IB is too big and too expensive. We are NOT talking about the big server design from AMD, Orochi. And even the Bulldozer cores are not "bigger/noisier" than comparable Intel cores.
To be fair the Bulldozer vs. Sandy Bridge comparison is rather irrelevant given that this is about Jaguar.
Anyway, Intel's single-threaded performance is significantly higher but that's only reflected in games that have computationally intensive threads.
Usually MMOs, RPGs and tactical games.
Shooters and other action-genre games traditionally aren't as CPU-limited, which means it shouldn't impact console gaming as much as it would PC gaming.
That said, the per-thread performance of the PS4's Jaguar cores are likely to be significantly slower than that of the Bulldozer chip used for the comparison.
It's not even one of the high-end CPUs though. It's Jaguar cores, which are the successor of Bobcat (a competitor to Atom). Jaguar seems about as powerful as ARM's Cortex-A15 to me. Not sure why they'd prefer 8 of those over 4 high-end cores. Just because of price?
Also I hope the GPU is at least a GCN one. Can we confirm that?
In Geekbench no ARMs have any results in vector perf. Why?
For Bobcat it is very much about size and speed of cache. E-350 has L2 running at half speed. This is much too slow. Full speed is a must. 8 cores at 2GHz with full speed 4MB L2. The L1 cache for Bobcat is much better than Bulldozer's. Separate L1 for every core must be an optimum. Bulldozer should have 64i+64d for every core...
Remember that the devs will be a lot closer to the 'wire' and won't have Windows in the way, so AMD's disadvantage on Windows might not be as significant. Carmack has repeatedly pushed for MS to allow game devs closer access to the hardware.
Plus cost: if you're losing 20% of the performance but also 50% of the cost... then it makes sense for a mass consumer device.
I wonder what the OS will be. Given Sony's history, I doubt it'll be Linux/BSD. It's probably not going to be Windows or Mac. So what does that leave? Did they make their own x86 OS?
API used by game devs is completely irrelevant (and considering what a strange CPU Cell is, no wonder there are "tasks"), the OS itself is Linux based.
Tasks and threads aren't APIs. But there are plenty of OS specific APIs that would allow someone to figure out the OS just by the API. I haven't been able to find anything online saying the PS3 OS is Linux based.
The PS3 currently uses a custom 'Nix based OS, and the main (or rather the best performing) API is LibGCM which is very low level compared to OpenGL/DirectX etc.
Just as Quadro and FirePro cards use the same silicon as the ordinary gaming cards do, yet produce a considerable performance advantage over the gaming cards, the custom-made CPU and GPU in this unit will benefit from the same kind of treatment. Rest assured that AMD can tweak its products to deliver next-gen performance with this year's silicon. It is only a matter of the volume that is guaranteed to be purchased that tempts a company like AMD or NVIDIA to do so or not, and the potential sales volume for a console, any console, is always very high. At least when it comes to companies with an established developer and software base like Sony and Microsoft. In short, don't worry. The performance will most probably blow your mind.
Jaguar is a low-power core and the successor of Bobcat. Everything that Intel has in this market is Atom. And Atom is pure sh** compared to Jaguar. Btw, benchmarks say nothing most of the time. Even the bigger AMD processors are good, but most of the time slowed by poor software optimization under Windows.
Most software, libraries, and other tools are written with Intel's x86 in mind, not AMD's x86. And software optimization is key to performance. I guess with much closer-to-the-metal programming and tuning, AMD's CPUs aren't as bad as some of those benchmarks make them out to be.
(That doesn't mean they will outperform Intel; it's just that there is still quite a bit more to squeeze out of AMD.)
It's not a problem. The PS4 is designed to offload data-parallel workloads to the iGPU. A throughput-optimized processor (just like a GCN multiprocessor) is better suited for physics and AI calculation, or just any kind of sorting, culling or asset decompression. You can even simulate physics on the iGPU and copy the results back to the CPU in the same frame, so you can create smoke particles that affect the AI.
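To make that concrete, here's roughly what the pattern looks like on the PC side using OpenCL as a stand-in; the PS4's actual compute interfaces aren't public, so treat this as a hedged sketch of the idea rather than how a PS4 title would really do it: dispatch a small physics kernel to the GPU, then read the results back on the CPU within the same frame so game logic can react to them.

```cpp
// Minimal OpenCL host-side sketch: integrate particle positions on the GPU,
// then pull the results back to the CPU before the frame ends.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, nullptr);

    const char* src =
        "__kernel void integrate(__global float* pos, __global const float* vel, float dt) {"
        "  int i = get_global_id(0); pos[i] += vel[i] * dt; }";
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, nullptr, nullptr);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    const size_t n = 4096;
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);
    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY  | CL_MEM_COPY_HOST_PTR, n * sizeof(float), vel.data(), nullptr);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(k, 2, sizeof(float), &dt);

    clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    // Blocking read: the CPU has the simulated positions this same frame,
    // so AI/game logic can react to them (e.g. the smoke-particle example above).
    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(float), pos.data(), 0, nullptr, nullptr);
    printf("pos[0] after one step: %f\n", pos[0]);
    return 0;
}
```

On a unified-memory console the copy-back step should be far cheaper than over PCIe, which is the whole point of the argument above.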
AMD's APU is several generations ahead of Intel's, which makes up for it nicely, doesn't it? So who went for AMD's APU:
1) Sony 2) Microsoft 3) Valve (SteamBox)
It's also a huge win for AMD on another front => games (nearly the only application that needs much computing power that we run on PC) will tend to be multi-threaded from the beginning, so there goes Intel's single thread performance advantage.
NO. Look at the larger picture here. Regardless of Intel or AMD Sony has chosen an x86 multicore CPU which most games today are not even close to being optimized for. This new console will mean that game developers will specifically start coding for multicore x86 architectures and we will likely see huge leaps in performance for games.
This is a win for PC and console gamers, heck it is a win for the entire gaming and software industry.
PC processors have been fast enough that many games still are poorly optimized for more than a few cores. As a software engineer, I can understand why -- it's a pain in the ass.
Consoles are a different world, though. Your game has to compete with the other guy's game on hardware that won't change for years, so there's a strong push to get every bit of performance you can get. This is a huge win for all platforms (even mobile phones have 4+ cores), and will greatly strengthen the market for a highly thread-optimized ecosystem.
Which gives consoles the pros and cons... like the Amiga computers from the '80s-'90s. The Amiga 500 sold from 1987-1991 with no change other than system-board tweaks and shrinks.
The games hit the hardware very, very tightly. So much so that most games didn't allow the multi-tasking OS to run. Remember, this was a 7.14 MHz single-core computer... so they needed as much as they could get from it.
This hurt the Amiga as a computer... even with my 25 MHz high-end Amiga 3000, I had to run "emulation" of the Amiga 500 with AmigaDOS 1.3, rather than my full 32-bit AmigaDOS 2.x or 3.0 OS.
From what I have experienced, and many of us have, gaming will for the most part always be limited by a company's resources and tools as well as the hardware. CPU-bound gaming? Well, from what AMD has planned, an APU is a CPU and a GPU. ;) Does this make any sense?
Lol. But an SSD in a console is needless cost IMHO; games load from optical media anyway, unless you want an SSD large enough to fit every game on it, which would make the console so much more expensive.
512 MB was on the low side when they released the last gen. 8 gigs is on the high side now. 4 gigs of RAM is really all a gaming PC needs. 8 gigs is nice though. I think we won't see RAM size as a problem in these consoles for 10 years.
4GB is all a gaming PC needs right now because the vast majority of AAA releases have been pegged for consoles with their paltry 512MB. Now that devs have 8GB to play with, you'd better believe that 8GB will be bog-standard by the end of 2015.
The fact that the PS4 has 8GB of graphics memory to play with is going to have an interesting effect on PC graphics cards too.
I was gonna say the same but you beat me to it. Now that they've passed the 4GB/64-bit mark, 8GB is going to be paltry by 2015-16. By then most systems should have 64GB of RAM (heck, I have 16GB of RAM now, and I upgraded to that in 2012... it only cost me $70!).
Always surprises me when people forget how tech has exponential growth. Seriously, TEN YEARS? PC already has texture packs that demand over 1GB of video RAM, let alone system RAM, and as one guy said, it's mainly due to porting from the current consoles that have been outdated since 2 years before their release.
Well, the 8GB of RAM is unified between the CPU and GPU. Current gamer PCs have at least 4GB for the CPU and another 2GB for the GPU. The high-end ones would be sitting at 8+3GB and more.
Current PCs have 4GB of DDR3 for the CPU, and another 2GB of GDDR5 for the GPU.
They do not share, though. So the GPU only has 2GB to work with, while the CPU only has 4GB of slower RAM to work with.
The PS4 will have 8GB of unified RAM. So if a developer makes a game and the CPU only needs 1GB of GDDR5 for its tasks, then the GPU can access the rest of the roughly 6.5GB of GDDR5 that the OS is not using in the background.
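Just to put numbers on that split, here's a trivial sketch; the OS reservation below is my own assumption, not a figure Sony has published:

```cpp
// Illustrative only: with unified memory the GPU gets whatever the OS and
// the CPU-side game code don't claim.
#include <cstdio>

int main() {
    const double total_gb = 8.0;   // unified GDDR5 pool
    const double os_gb    = 0.5;   // assumed OS/background reservation (not an official figure)
    const double cpu_gb   = 1.0;   // what this hypothetical title's game code needs
    printf("left for the GPU: %.1f GB\n", total_gb - os_gb - cpu_gb);  // ~6.5 GB, as above
    return 0;
}
```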
The rumours/leaks pegged the PS4 at 4GB of faster GDDR5 whereas the next XBox is thought to have 8GB of slower DDR3. It's good that the PS4 now turns out to have 8GB after-all. It might cost more now, but the cost will drop over the console's life-cycle while the extra RAM will no doubt prove useful to developers.
Yup, I thought it would be either 4GB GDDR5 or 8GB DDR3. Got the best of both worlds. Nice. Even the fanciest of graphics cards usually don't have half that much GDDR5 on them, and none of us can get it for system memory.
The PS4 is locked down. Assuming Sony didn't have a sudden change of heart (a completely massive one...) they'll likely try even harder to lock this thing down than they did with even the PS3.
People will eventually break that (like all prior systems), but then there will be the challenge of getting a regular operating system to actually run on it (getting graphics fully functional will likely be the hardest problem... even leveraging the open source radeon drivers it likely won't be easy). Eventually people might be able to have Steam up and running on it... but by that point, the hardware in the PS4 will likely look fairly antiquated by normal PC standards (just like the PS3/360's hardware is very low end by today's standards).
It took 4 years to break their last system, and actually breaking into it was a lucky combination of OtherOS running on it on top of an insanely silly crypto mistake on the programmers' part.
I can't believe how often idiocy like this gets parroted. You cannot compare to the specs of a PC to the specs of a console. Even when relatively similar base hardware is used, it is like comparing apples to tires. The console is highly optimized to drive graphics and other calculations that are game related. A PC using an HD 7870 will get crushed by a console if it is based on the hardware Anand says it is.
According to AMD, the same hardware can be 5 or 6 times faster when the DirectX or OpenGL layers can be bypassed. Coding for the GPU's native instructions and not going through an abstraction layer can bring big performance improvements.
Since the GPU and CPU share a die and it's likely a custom chip, a shared cache between the GPU and CPU may be possible. If that's done, it would put a PC to shame.
I call shenanigans on those claims. I've wondered why no one's ever really addressed them.
They don't match up with what we see in real life; a GPU similar to what the consoles have delivers similar results (I know, I've got a GeForce 9650M GT I still use daily).
For another, the Xbox is already using DirectX, and the PlayStation 3 OpenGL. If that were really killing performance that much, the PlayStation 2, PlayStation Portable, Wii, and 3DS could all run the same games as the PlayStation 3 and Xbox 360.
For another, if this were true, someone like Carmack would have been talking about it YEARS ago. He's certainly railed against dumb moves in APIs before. All sorts of developers would be talking about it and demanding action from Microsoft.
Maybe there's some bizarre situation where some graphics function really is that much slower through DirectX than OpenGL or something, but there's just no way it's really making things 10 or 100x slower.
No, the facts are kind of correct. The reason performance on a PC is so slow (comparatively) with the same hardware as a console is all about those layers and other overheads.
First you have Windows Direct3D (or OpenGL), which is a general 3D API, plus lots of other performance-hogging systems, and then you have the graphics card drivers themselves.
The PS3 in particular does a really good job (nowadays) of getting rid of those things with the GPU command-buffer generation library (libgcm). It's a better way than OpenGL or DirectX when the hardware is known. An OpenGL layer will likely be written on top for the PS4 (just like the PS3) so as to ease developers into it.
There is so much capability that cannot be used with standard APIs, especially in these new GPUs, and particularly with AMD's new tech and the new APU tech that will be in the PS4.
I'll point you to a great article on why this PS4 will be pretty astonishing in the years ahead, as they are working on a new libgcm.
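I don't know libgcm's actual API, so the snippet below is an entirely made-up sketch of the general idea behind any command-buffer library: the game writes packets the GPU consumes directly, instead of calling a driver that validates state, translates it, and builds those packets on the game's behalf.

```cpp
// Purely illustrative, NOT the real libgcm API: a console-style command buffer
// is just a stream of words the GPU reads, built directly by the game.
#include <cstdint>
#include <vector>

struct CmdBuffer {
    std::vector<uint32_t> words;
    void emit(uint32_t reg, uint32_t value) {  // hypothetical "set register" packet
        words.push_back(reg);
        words.push_back(value);
    }
};

int main() {
    CmdBuffer cb;
    cb.emit(/*hypothetical render-target register*/ 0x0100, /*surface address*/ 0x40000000u);
    cb.emit(/*hypothetical draw register*/          0x0200, /*vertex count*/    3);
    // On a console the GPU would consume cb.words directly; on PC, Direct3D/OpenGL
    // plus the driver sit between the game and this step, which is the overhead
    // being argued about here.
    return static_cast<int>(cb.words.size());
}
```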
Jesus, who says it's slower on PC??? Ever seen how, say, Dragon Age looks on PC vs. PS3/Xbox?
What is vastly different with consoles is that IT MAKES SENSE TO OPTIMIZE FOR PARTICULAR HARDWARE. Now, how much you can win from such optimization depends, of course, on how much the generic code + API it was based on sucked. But there is no way a mature API would slow something down 5-6 times. Even 1.5 times slower is hard to imagine.
I've never seen an article like that and just assumed console development was also done with a 3D API. It seems misleading then to be comparing the "performance" of the PS4 GPU to an HD 7870 when the actual code and graphics that get rendered on the PS4 are at a much higher performance level. I understand that the comparison in an article like this is of the hardware and low level performance but most people probably just extrapolate that to in game performance.
What pariah said. A 7870 in a PC isn't optimized for specifically, because developers have to target it plus two dozen other configurations, so a lot of efficiency is lost; whereas in a console they know EXACTLY what it's capable of and can optimize specifically for it. A 7870 in a console would be similar to a 7950 in a PC. :)
I can't believe how often this stupidity gets repeated. The NUMBER ONE reason PCs require a lot to push games at max is that devs take the cheap way out and use effects like HBAO, HDAO, and SSAO that require a lot of GPU power for almost no return. For example, GoW played at medium settings at 800x600 on a 9700 Pro. If PC hardware were SO inefficient, that wouldn't be possible.
There are some efficiencies to be had, but not on the level that you're suggesting or that has been suggested.
It is a fact that consoles are more efficient. You've got one example, and my 7600 GT with 1GB of RAM and an AMD 3800+ single-core lagged in Gears of War at the lowest settings and 640x480, so I don't buy it.
Not a chance. Developers have access to the 'metal' on the PS4. Certain functions will work 10-100x faster because there is no API overhead. Some of the graphics we've seen at the unveiling are already approaching Crysis 3 level, and an HD7870 cannot even come close to maxing out Crysis 3. In 4-5 years, the type of games the PS4 will be able to run, an HD7870 would not, and certainly not a $550 PC, because that means a Core i3 or a low-end FX-4000.
The launch price of the PS3 ranged from $500-$600, and Steam sales offset hardware costs.
I do agree consoles can get better performance out of hardware but my old HD2600 and E5200 from 2006/2007 can still play most games at 720p low/medium 45-60fps.
Are you currently using PSN on your PS3? Compare the PSN price of multi-platform games to the Steam prices. Most of them match up; thanks to an EU PSN sale, Dishonored and Darksiders 2 are actually cheaper on PSN than on Steam. I am quite sure that a few games are actually cheaper on Steam than on PSN.
However, the way Sony is pushing for aggressive PSN pricing gives me hope that the PC-console digital game download price difference will shrink considerably over the coming years.
That's a great story. Unfortunately it falls apart because it assumes PCs in general, and budget PCs specifically, will still be running AMD 7870-class GPUs in 4-5 years.
They of course will not be. The PS4, on the other hand, will. Even if the claims of speed gains are true, by the time devs are comfortable with the hardware and cranking out highly optimized games on PS4 hardware, the PC will be 3-5 generations past the silicon in the PS4.
The PS4 will cost $400 and have better performance than your $600 PC with a 7870? My i5-3570K, 4GB RAM, and GTX 660 system cost $800 to build last year. The PS4 will be half that and offer better performance.
Yep, the 8GB of GDDR5 is the real monster here! A lot of people seem to forget how much faster this memory is than normal DDR3 or the upcoming DDR4! This really allows the machine to stretch its muscles. The GPU is also much faster than I expected. And 8 cores, even smaller ones, should be sufficient to feed those GPU cores, so this can really fly for a console! 1080p seems to be possible, and good 720p games should be just fine.
x64 isn't an architecture, nor does x86 mean 32 bit. Yes, it is a 64 bit processor. Something can be both x86 compatible and 64 bit...All the chips in the last few years on our PCs have been.
AMD calls the instruction set AMD64 and Intel calls it x86-64. Microsoft only calls it x64 because they think our heads are too mushy to remember x86-64.
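A quick way to see the naming in practice: the same C++ source can check at compile time which x86 flavour it was built for (the macros are the standard GCC/Clang and MSVC predefined ones).

```cpp
// Compile-time check for 32-bit x86 vs. 64-bit x86-64 (a.k.a. AMD64/x64).
#include <cstdio>

int main() {
#if defined(__x86_64__) || defined(_M_X64)
    printf("Built for x86-64 (AMD64/x64), %zu-byte pointers\n", sizeof(void*));
#elif defined(__i386__) || defined(_M_IX86)
    printf("Built for 32-bit x86, %zu-byte pointers\n", sizeof(void*));
#else
    printf("Some other architecture\n");
#endif
    return 0;
}
```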
I wonder how long it will take before the OS is being passed around as ISOs all over the internet for users to install on a normal PC. :D In the same fashion as Mac OS.
I think this is unlikely to work. These still aren't off the shelf parts, I doubt you'll actually be able to build your own PS4 from Newegg. And short of some amazing reverse-engineering by hackers, we won't have drivers for anything except the official hardware.
The opposite is much more likely to be true - Windows, OSX or Linux being installed on this x64 CPU (provided Sony allows it or the console is hacked).
Yes, you might be right; however, this thing is built mostly with off-the-shelf parts. I doubt OS X will be installed on this; Linux and Windows most likely, though I don't know why this would be done (unless it is possible to create partitions for dual-boot without bricking the device).
Must be you have no friends. Consoles are cheaper/easier to replace than a decent computer, if someone is drunk or rowdy. Much easier to throw a disc in and grab some controllers.
I don't play console games anymore, but they have clear advantages over something like a PC. Consoles are simple and easier to operate, great for multi-player with people in the same room, and have exclusives that you can't get on the PC (especially if you're a fan of Japanese developers). They are more family-friendly and you don't have to tweak any settings within games. Also, crashes and freezes are far less frequent on consoles.
If I was still into gaming like I was when I was a teenager, I'd buy a PS4 or Xbox 720.
I'd have to disagree. A PC game without intrusive DRM can be a hell of a lot easier to operate. Most of my PC games just load. I don't have to sign in, I don't have to select my save location. I just click on the game icon, wait ten seconds, click continue, and I'm on my way.
As for crashes, the only ones I've had often were in Fallout 3 and NV, and even then that was using mods.
My main experience with console gaming is Forza 4. A minute just to load the game. Last time I checked, I had more than 50 hours in menus (not including upgrading/tuning). That was for ~250 hours of play all up. Four days of my life just waiting.
I was looking everywhere for detailed info on the specs; thanks for putting up this info and your thoughts on it!!!! Really more than I expected, as some people were saying it was going to be 4GB of RAM (which would've been terrible if it's supposed to last 8 years). The thing is, these specs aren't anywhere near future-proof... in 2 years they'll be completely low-end, let alone 5 years! Anyway, from the games I saw being previewed, the baseline for graphics will be like BF3 Ultra on PC, which is still fantastic!
For PC building, at least, there is no option to install GDDR5 RAM. That kind of RAM is only in GPUs to my knowledge. Is this equivalent to, say, the system's DDR3 RAM? I also realize DDR3 is built into the worse-off GPUs, but I guess my question goes to differentiating between the GPU RAM and the system RAM. They say the PS4 has 8GB of GDDR5 RAM... but it's like everyone is acting like it's the system RAM. Can the whole system run off the GPU RAM? Kind of a two-fer? I hope I phrased this well enough, because I'm not sure I actually got it across correctly. Just so I'm clear here (don't need any wasted replies explaining that DDR3 is worse), I know DDR3 RAM in video cards sucks and GDDR5 RAM rocks. GDDR5 is better.
Using GDDR5 throughout the system got me thinking as well. I've never really known the latency involved with GDDR5 but it sure means that Jaguar has a huge amount of bandwidth.
As others have said, a strong CPU isn't really too necessary here, and as more and more games are multithreaded, I expect even two Jaguars can spit out a decent amount of work (they are, in essence, as powerful as K8 cores albeit with a huge list of ISAs available to them). I do wonder if HSA comes into any of this. Regardless, perhaps it'll convince developers to spend more time on properly threading their games with so many cores available to them, and with them using x86, porting should be easier.
As for "built by AMD", I thought they were just licensing the tech out for somebody else to make it?
I know I'm asking much, but some more technical details regarding the cpu/gpu/ram would be nice... :-)
8 GB of GDDR5 and 7850-level performance is nice (hopefully not comparing to a mobile GPU part). Of course, it also depends on the GDDR5 speed. A unified fast cache would be nice too.
But the CPU is very, very slow by today's (PC gaming) standards. Even with 8 cores, unless it's 3 GHz, the performance won't be enough to push the graphics part to its limit. Don't forget that a game is not quite the optimal task to spread across more than 4 cores, so single-threaded performance is still important. Hell, I've touched notebooks with this kind of CPU and they are painfully slow compared to the cheapest Core i3. They are much better than Atoms, I agree, but that's not the comparison I would make. We need at least desktop Trinity CPU performance here.
Anyway, I know it's impossible, but I would like to see XBMC on this someday... :-)
Do you feel that the CPU performance of existing consoles (Cell and PPC in the Xbox 360) are lacking? Jaguar is much, much better for integer computation than either of those two machines, even when clocked at a far lower rate. For example, the Power architecture in the 360 has *horrible* IPC as it is in-order, with many pipeline hazards. Clock for clock, Jaguar should be on the order of 3-4 times as fast as the PPE in Cell and the Xbox 360, as it is out-of-order, has extensive register renaming, and a very good branch predictor. Couple that with an increase in real core count, and it seems like a sensible decision for Sony to have made.
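If you want a feel for how much a branch predictor matters for IPC, here's a classic toy test (nothing console-specific, just a generic C++ sketch): the same loop over shuffled vs. sorted data hits the predictor very differently, and a simple in-order core without good prediction pays even more per mispredict.

```cpp
// Same loop, same data; only the ordering changes. A mispredicted branch per
// element makes the shuffled run far slower on most CPUs.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

static long long time_loop(const std::vector<int>& v, long long& sum_out) {
    auto t0 = std::chrono::steady_clock::now();
    long long sum = 0;
    for (int x : v)
        if (x >= 128) sum += x;          // data-dependent branch
    auto t1 = std::chrono::steady_clock::now();
    sum_out = sum;                        // keep the result alive so it isn't optimized away
    return std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0).count();
}

int main() {
    std::vector<int> shuffled(1 << 22);
    std::mt19937 rng(1);
    for (int& x : shuffled) x = static_cast<int>(rng() % 256);
    std::vector<int> sorted = shuffled;
    std::sort(sorted.begin(), sorted.end());

    long long s1 = 0, s2 = 0;
    long long us_shuffled = time_loop(shuffled, s1);
    long long us_sorted   = time_loop(sorted, s2);
    printf("shuffled: %lld us, sorted: %lld us (sums %lld / %lld)\n",
           us_shuffled, us_sorted, s1, s2);
    return 0;
}
```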
I also have a 1GHz C-50 Bobcat that completely smokes my old 2GHz AMD 3000+ (which in turn was faster than the 3GHz P4's of the same era). I also have an i7-2630QM laptop. I only notice the extra power when simulating clocks.
Modern CPUs are so much faster in part because they include all the other chips that used to make up a motherboard (like most of the north bridge and the memory controller). Think about grabbing the pen next to you compared to sending someone to the shops to buy a new one. Both pens write just as fast; you're just going to have to wait a while for the one from the shops.
The stuff Xorrax talks about is the other part. A great branch predictor and a good memory hierarchy do great things for IPC.
Sorry, but no. If you think C-50 performance is enough, you're delusional. Just check some CPU benchmarks... Even a newer 1.8GHz Jaguar is way slower than any Sandy Bridge mobile Celeron.
I'm an AMD fan so I'm glad to see them in both major consoles, but we need Trinity performance here, not Jaguar. And please don't compare to PS3 or xbox 360 cpus. Compare to the x86 world...
Pick your comparison; I'm comparing to existing consoles, which these are replacing, not PC gaming.
The size, cost, and power benefits of 8 Jaguar cores over something else don't seem like a bad decision. As others have mentioned, it will motivate developers to optimize for multi-core more than they have historically.
Links to benchmarks? Jaguar isn't in the market yet, so I'd be surprised to find anything meaningful out there other than data from engineering samples, which may be suspect.
You can't compare CPU performance on a PC to that on a console with minimal OS overheads.
Besides, with more heterogeneous compute capability in the PS4, you could offload a heck of a lot to the main GPU, freeing up the CPUs for other less demanding tasks.
I think an 8 core 1.6GHz Jaguar is fine. No it's not the highest end you can get on PC, but they have costs to consider. Plus it will have some modifications that we don't know of too, and no Jaguar on the PC side has access to GDDR5 memory. Developers will have to work around its limitations, that's just par for the course for consoles.
I'm not really savvy to this stuff. Can someone do any sort of preliminary cost analysis to guess how much the components would be? Any sort of general idea would be fine. Just wondering how heavy of a loss Sony is expected to take on this.
First we have to estimate the transistor count. My estimate is 3.2 billion: 2.8 billion for the GPU and 400 million for the CPU. For a chip that size, Sony is probably paying $80-$100 apiece for the first million, and who knows what after that. I can't find the price of an HD7950M GPU die, but it can't be much more than $100, so this chip isn't going to be much more than that either. Since the volume is higher, I would expect it to be closer to $70 than $100. I'm sure AMD is eating it compared to the margins they get from discrete GPU sales.
Really, AMD...? A company with a $2.70 stock price; that tells you just how capable this company is... not. I guess when you go with the worst, you get what you pay for. Give me an Intel i7 processor anytime; I'll pay extra for performance...
I don't mind them using AMD, but Jaguar? Really? Ouch. It's Cell craziness all over again, only I'm not sure 8x Jaguar will really provide that much more performance than the CellBE. At 1.6GHz, Jaguar is basically something like 30% faster than Atom at 1.6GHz, which means something like a 1.6GHz Ivy Bridge Core or a Piledriver core would be three times faster. I would have much rather seen 3GHz quad-core Piledriver or similar than octal-core Jaguar.
Of course a CPU like the FX8350 would be often bored when coupled with a HD7850 feeding 1080p. So it would also be a good idea to replace the GPU with something in the 7950 or better range. And if Sony still manages to sell that box for <500$, everybody would be happy.
LOL, good one, but you misspelled your name. 8 Jaguar cores are about the same size as a single Core i7 core, on a far cheaper process. For the money you could probably get a quarter to half of a single Intel core. Try gaming on that :)
I'm not sure about that. Had you said Piledriver "module", then I wouldn't be quibbling, but as Jaguar should be roughly equivalent to K8 in performance per core, and Phenom was only a modest step up, I couldn't see how a Piledriver core on its own would outperform it so massively especially considering they're smaller than Stars cores themselves. Even so, if you meant module, I'd agree 100%.
I'm actually quite disappointed with the PS4. Apart from using hardware that's barely current-gen, the new services seem to be blatant ripoffs of what NVidia and Nintendo are doing, and nothing actually new seems to be included in the package. The streaming service produces horrible image quality, from what I've seen of the Killzone video. Also, the lack of any next-gen processing like tessellation in the video worries me.
Anyone who was expecting a sub 600 dollar console to be better than PCs costing more was setting themselves up for disappointment anyways. The rumored specs were pretty much dead on, with a pleasant surprise in the 8GB GDDR5.
None of us can get GDDR5 as system memory, and most of us don't have half that, most not even a quarter of it, in a GPU.
And every time a new console launches, there are articles about which inexpensive PCs are better, but those kind of miss the point IMO. Let's see which runs games better in 7 years. Developers won't care about your 7-year-old PC, but they will care about the current generation of console hardware and will ensure their game runs well on it.
The reason a PC won't last as long, even if it's more powerful, is that PC graphics improve more over time. Even many current console ports already look significantly better on PC. So yes, a console will obviously always run its games well, as its hardware stays the same. But if PC games stayed at the same level of graphics for many years at a time, then PCs would easily be able to do the same.
Personally I'd prefer the better graphics, and if it means upgrading now and again, I don't have a problem with that.
I'm not saying otherwise, mind. Just that they are different philosophies, and the influx of articles elsewhere detailing how such-and-such computers are already better than it is silly.
... "The PS4 also supports instant suspend/resume."
That is going to be huge for home consoles. Bigger than anyone can even imagine. Who the hell needs saves when you can just suspend the console at a point and then start up right where you left off?
No logos, no menus, no copyrights, no waiting. You sit down. You pick up your controller. Your PS4 with its camera detects you and the game just starts up. Before you have even sat down, the game is sitting staring at you paused. Waiting for you to hit the Options button to continue.
You need to go to the potty, then the grocery store to pick up more Red Bull, nachos, and a card for apologizing to your girlfriend for forgetting her birthday while playing the new Killzone? You come back an hour later, you sit down. It's on, waiting for you again.
Especially if it can leverage HDMI-CEC to manage your HDTV and receiver on and off, too.
Huh? I don't see why this is a big deal. You can just pause the game and turn the TV off or do something else. Computers stay on all day and night, why not the PS3 too? I've left it on for days and never had any issues.
I think the point of this was that it's a low power mode.
It's not so much that the PS3 can't stay on all day and night, it can, easily, I've left mine on Folding@Home for weeks at a time. It's just every time I think to pause the game and just turn off the TV, I have a little mental battle; How long til I'm back at the TV? How much electricity is the system going to be drawing needlessly? How much time is going to be wasted waiting for the game to reload?
All of that is now gone, cause it just suspends. It's not so much that it wasn't possible, just that it's made to do it now. You can turn it off mindlessly even if you're going to be coming back in 15 minutes or 5 hours, which will be nice for a console, especially if the game only allows you to save it at particular points in the game.
What I'm missing in most "previews" I've seen so far is the fact that the Jaguar cores are the first-generation cores using HSA. For sure that'll have an effect on performance when comparing to current-generation CPUs, making comparisons really hard to do. This PS4 will pack so much more punch than the specs suggest when compared to current PCs.
People aren't thinking about some things when they comment about the CPU being an AMD CPU.
1: These are low power Jaguar Cores, not Piledriver/Bulldozer. That means 5-25W range for power draw. Using TSMC 28nm HKMG Process.
2: You can't expect the performance of a standard Jaguar APU to be equal to this APU, due to the difference in memory controller. A standard Jaguar uses DDR3, whereas this will have access to much higher-speed GDDR5. It should be interesting to see if this gets around the memory-controller bottleneck present since the Phenom.
3: This should change the landscape for gaming. Since all new games will be able to run on multithreaded PC hardware. And in this coming gens case, it should benefit AMD with most of them being pre-optimized for AMD hardware.
Had to add another one, what I'd give for an edit function on this.
4: These Jaguar CPU cores are all full function x86-64 CPU cores. Which means all new games for these consoles will be programmed to run in native x86-64 mode. This should mean, most PC ports will come able to run in native x86-64 mode as well. Which should add much better visual quality, and give more access to RAM for programmers.
GDDR5 actually has a higher latency than DDR3, much higher actually. I wonder how that will play out. With GPUs the latency doesn't matter as much as bandwidth is the most important thing, but with CPUs after a point bandwidth isn't as important and a super high latency in comparison to DDR3 could hurt.
I wouldn't worry too much about latency affecting CPU performance; it certainly never harmed most PC desktop CPUs that much when we jumped from DDR1 > 2 > 3.
Bandwidth is still king for graphics, and that's mainly what this machine is built for.
I'm aware; maybe it won't be a big deal, that's why I said I was just wondering about it. But I think the latency difference from DDR1 to DDR3 is smaller than from DDR3 to GDDR5. Maybe the clock rate cancels some or most of that out. Again, I'm unsure, just speculating.
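A crude way to see the difference between latency-bound and bandwidth-bound access (a generic C++ sketch, nothing specific to GDDR5): dependent pointer-chasing waits on every load, while a sequential sum just streams through memory.

```cpp
// Latency vs. bandwidth in miniature: the chase can't start a load until the
// previous one finishes (latency-bound); the sum issues independent sequential
// loads the prefetcher can stream (bandwidth-bound).
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const size_t n = 1u << 24;                    // ~64 MB of 32-bit indices
    std::vector<uint32_t> next(n);
    std::iota(next.begin(), next.end(), 0u);

    // Sattolo's algorithm: turn the identity into one big random cycle,
    // so the chase touches every element before it repeats.
    std::mt19937 rng(42);
    for (size_t k = n - 1; k > 0; --k)
        std::swap(next[k], next[rng() % k]);

    auto t0 = std::chrono::steady_clock::now();
    uint32_t i = 0;
    for (size_t s = 0; s < n; ++s) i = next[i];   // each load depends on the last
    auto t1 = std::chrono::steady_clock::now();

    uint64_t sum = 0;
    for (size_t s = 0; s < n; ++s) sum += next[s];
    auto t2 = std::chrono::steady_clock::now();

    auto ms = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    printf("pointer chase: %lld ms, linear sum: %lld ms (i=%u, sum=%llu)\n",
           (long long)ms(t0, t1), (long long)ms(t1, t2), i, (unsigned long long)sum);
    return 0;
}
```

GPU-style workloads look like the second loop, which is why the latency penalty of GDDR5 matters much less to them than to pointer-heavy CPU code.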
People here are talking about how AMD CPUs are slower than Intel CPUs. This is true in a WINDOWS environment. What we all must remember is that this system will have an OS and software specifically optimized for AMD x86 multi-threaded processors. AMD CPUs in Windows are victims of poor optimization. AMD doesn't crank out a widely-used compiler for their CPUs like Intel does. The PS4 simply will not have that problem.
We're about to see the true power of an x86 system!
Methinks it is because Intel's are simply too expensive, and Intel is unlikely to negotiate on price. It would be pretty silly to spend half the budget on the CPU (i.e. $200 for an i7 in a $400-odd console) when the money could be better spent elsewhere.
Seems to me hardware-wise the new Xbox and the PS4 will be very similar, and they will have similar performance. The main difference between them will be their UI and online features. I don't really see the need to buy new systems yet, as I have too many unfinished games on both the 360 and PS3. Granted, the graphics will be nicer, but will the gameplay be that different? The social stuff is not important to gameplay for me. Eventually I will get both new systems once something comes out that I must have; Halo 5 and the new Killzone would interest me. Hard to justify the expense of new systems when the current ones still have life in them.
Many people here seem to have unrealistic expectations for PS4. The supposed "high-end" has moved a lot from where it used to be. There is no way you are going to get a $1000, 300 Watt graphics card in console targeted at approximately half that price (or less?). How many people actually buy these ridiculously expensive, power hungry cards anyway? I wouldn't buy a video card for more than $300 (and even that seems high); the price/performance ratio is not there for these super high-end offerings.
Also, does anyone think that Intel or Nvidia was ever really an option? If they went with a separate CPU and GPU, this increases cost and power significantly (communication off chip waste power and reduces performance). Intel has plenty of CPUs which could have been an option, but they do not have a powerful enough gpu for a single chip solution. Nvidia isn't really an option either. They have the gpu, but no good cpus. Would you rather have an nvidia GPU with some integrated ARM cores? AMD is really the only one who can easily provide both a powerful GPU and sufficient cpu power for this chip. AMD wins on price, performance, and power since neither Nvidia nor Intel can offer this on a single chip.
Direct comparison with the amount of memory in a PC is not relevant either. Most of the memory in a current PC is wasted; the GPU memory is really only a high-bandwidth cache for stuff that is in system memory. The 8 GB of GDDR5 on the PS4 should be significantly more efficient compared to the PC architecture. Hopefully it is all accessible from the GPU and CPUs, and is all in a single, cache coherent memory space; this seems like it this could be figured out from dev kits...?
It would be nice if we could get a similar architecture in the PC space. With how small current CPU cores are, and with the memory controller on the CPU, it really doesn't make sense to have CPUs and GPUs separate any more. The actual integer cores are tiny these days. Most of the CPU is taken up with caches and FP units. If you have a GPU on the same die, then the CPU FP units are a lot less important. You still need to have FP units for mixed code, but for serious FP processing, just execute it on the GPU. Although, without using super fast on-board memory, the bandwidth to the GPU will be lacking. It sounds like Intel may be trying to solve this with more on-die memory; most people don't actually need more CPU cores anyway. I was expecting the PS4 to use a large on-die memory, but this probably isn't necessary since they went with such large and fast external memory.
For a PC though, I would want the ability use multiple GPU/CPU/Memory cards.
"GPU memory is really only a high-bandwidth cache for stuff that is in system memory"
That's not correct. There's no need to keep textures in system RAM once they're in graphics-card RAM. Also, you completely ignore virtual memory. When a game is run on the PC, if other software is using RAM that the game needs, a lot of it will be paged out.
The console, on the other hand, will likely not have virtual memory and will keep at least 1GB if not 2GB of that RAM for its operating system and other software (streaming, social, ...).
And BTW, we do have a similar architecture on the PC side. AMD APUs have been available for a while, and AMD's plan has been to offer more and more integration over time. The main differences here are a skew towards more graphics power and less CPU power (which isn't necessarily good for a general-purpose PC chip) and a lot more memory bandwidth. It would be interesting to see if any of these make it to the PC space in some form.
It is a shame the bottleneck will be physical media. We may have to discuss in the future the long load screens for the initial loads that will take place. I love the massive capacity Blu-ray affords, but it is the slowest kid on the block. Then the mechanical hard drive is going to be an issue. I hope that Sony allows the ability to swap out the hard drive (SATA 3) and supports SSDs through the firmware, which would boost overall performance. Hell, the option to load almost everything onto an SSD off of the Blu-ray would be an incredible option... the Xbox 360 was allowing this, if I am not mistaken. Overall it is appearing to be one heck of a system for pure gaming.
A lot of odd comments are being made about the future of PC gaming, and the current state of PC gaming.
Someone said that most games need 4GB of RAM to play? Really? I can play multiple PC games like SimCity and Far Cry 3 without even touching the 4GB mark on my system, and that is with Windows 7 64-bit in the background (which alone only uses 1.3GB of DDR3).
I have yet to even get close to 100% RAM usage, no matter what I'm doing (except if I'm purposely doing it in a prime64 stress test).
8GB of GDDR5 will be matched by 2015 at a reasonable price?! That is a funny thing to consider. PCs don't even use GDDR5 for anything but GPUs. There isn't a CPU even in the works right now that will use GDDR5, not that I've read or heard about anyway. As for a PC GPU using 8GB of GDDR5 by 2015 at a mass-market price? Hell no. The only ones I could even find are well into the $2,000 range. It seems for the moment PCs are generally going to be using 3GB of GDDR5 as a standard within the next few years.
There are plenty of PC exclusives out there like the total war series, Metro and others that do not worry about the console world. Not sure why people think developers will actually hold back on their titles simply for the sake of the console world. All they have to do is create the engine for the PC and then basically turn everything to Normal / High settings for the consoles. Turn off AA, turn on DX9, etc. This type of thing is done on the PC all day long and it takes all of 10 seconds to change your settings around to match your system's capabilities.
Paulman - Wednesday, February 20, 2013 - link
AMD-based CPU's are, perhaps, a generation or more behind Intel's current-gen CPU's in many benchmarks, including gaming. Do you think AMD's poor CPU performance will be a major limitation for the PS4, or will it not matter that much (i.e. future games won't be very CPU-bound)?steve walther - Wednesday, February 20, 2013 - link
Might hurt it a bit, but it depends on the clock rate or if they balance across cores properly, but with GPGPU tech being utilized, that should take a good bit of load of the CPUAlso depends on the resolution its gonna be running at.
suprem1ty - Wednesday, February 20, 2013 - link
My bet is that it wouldn't matter much. In multi-threaded scenarios Intel tends to still be ahead of AMD, but the gap is much smaller. And seeing as how there is 8 threads available hopefully developers will start putting more effort into multi-threading their games to take full advantage.mga318 - Wednesday, February 20, 2013 - link
If the GPU is between a Radeon HD 7850 and a 7870, it probably won't a problem.MonkeyPaw - Wednesday, February 20, 2013 - link
I don't see it as an issue, especially considering this will be a vast leap over CellBE and the GeForce 7900-class GPU of PS3. Besides, this combo is still driving just 1080P and eventually 4K displays. I think the hardware will be more than sufficient to produce a good gaming experience. Granted, this upcoming generation of console will probably run for 10 years.Wreckage - Wednesday, February 20, 2013 - link
Their GPU is also a generation behind NVIDIA's GPU.RussianSensation - Wednesday, February 20, 2013 - link
You are talking about desktop GPUs. The GPU inside the PS4 is as good as it gets for the $. ~ 2 Tflops with 176GB/sec suggests a GPU that's extremely close to HD7970M. HD7970M delivers 95% of the performance of a GTX680M for $350 less:http://www.notebookcheck.net/Review-Update-Radeon-...
Since you cannot fit a 200W GPU inside a console, not even a 150W one, HD7970M style GPU was hands down the best possible high-end choice for PS4. You simply cannot do better for the price/performance and power consumption right now.
Guspaz - Wednesday, February 20, 2013 - link
The original XBox 360 shipped with a 203W power supply, and the Xenos GPU was believed to have a TDP of about a hundred watts, so 150 or 200W in a console isn't impossible.Heck, my Shuttle XPC has a 500W PSU that's much smaller than the 360's power brick. I suspect the large size of the 360's PSU had more to do with it being passively cooled than anything else. If the PS4 (or 720) had an internal GPU that could leverage the active cooling system inside the console, fitting a 150W GPU inside the thing would not be terribly difficult.
blanarahul - Thursday, February 21, 2013 - link
I did some calculations and it seems that the APU has 256-bit wide GDDR5 memory clocked at 5500 MHz and the GPU is clocked at 800 MHz.blanarahul - Thursday, February 21, 2013 - link
GPU performance should be at par with the 7850 but it is slower than 7970M. It does have higher memory bandwidth as compared to 7970M.sohcermind - Sunday, February 24, 2013 - link
Remember a console can dedicate all its resources to graphics so a 7800gtx on pc game won't compare to the PS3 graphics just look at Uncharted 3 for example. So likely the PS4 will outmuscle anything you can do on a 7970m.RussianSensation - Friday, February 22, 2013 - link
I never said it's impossible but it would make the console a lot larger, more expensive to cool, etc. I realize that PS3/360 consoles drew 200-240W of power when they launched. For cost reasons the CPU+GPU are on the same die. That alone means going with a discrete GPU option like HD7870 would have cost a lot more money. At the end of the day Sony is not the company it used to be. It cannot afford to sell an $800 (original BOM for PS3) console for $600 in this market. Not only can they not afford such losses on the hardware but the market will refuse to buy a $599 console.gruffi - Thursday, February 21, 2013 - link
No, it's vice versa.Stuka87 - Thursday, February 21, 2013 - link
A generation behind? If you are referencing Titan, it is in no way comparable to what goes into consoles. Looking past the fact that nVidia doesnt have an integrated GPU solution (As they do not make CPU's), nVidia's mobile chips are certainly not a generation ahead of what AMD has. I would actually give AMD the edge in mobile GPU's.sirroman - Thursday, February 21, 2013 - link
If it's confirmed to be GCN, you are wrong.Wolfpup - Thursday, February 21, 2013 - link
AMD's best CPUs lag behind Intel's right now, but that's not saying anything-they're still crazy powerful compared to anything else.The issue is that this ISN'T AMD's best CPU design, it's the successor to Bobcat, a (better than) Atom competitor. Remains to be seen how that does (Microsoft's using the same thing).
Still, it'll be a lot faster than we have now.
I'll be SUPER disappointed if this doesn't have full backwards compatibility, and also if it has a WIi U-style controller that doesn't work with it's battery dead or pulled.
HammerStrike - Thursday, February 21, 2013 - link
I've never understood the obsession with backward compatibility. For one, unless you have a library of games, it's not important, and if you have a library of games you already have a PS3. If you already have a PS3 just keep it to play your old library of games. Problem solved. Two, I would much rather have $100-$200 less in parts, and, by extension, retail price of the console and no backward compatibility then vice versa. And no, you are not going to run PS3 games on the PS4 in emulation, so that is not an option.Belard - Thursday, February 21, 2013 - link
Simplicity. That is why.So you don't have 3-4 consoles plugged into the TV. Personally, in my household - we have a Wii on a kids TV. A PS2 on the main TV. We are waiting for the PS4... I don't see the point of spending $250 for a PS3 when that can go towards the PS4. But, we would love to play some PS3 titles on the same same PS4 console.
That would be better than buying a used PS3 for $150 and sucking up space... both with the Console itself, the controllers, cabling, etc.
Its bad enough we'll still have the PS2 for now.
The horse power of the PS4 should allow PS2/PS3 gaming through emulation.
poohbear - Thursday, February 21, 2013 - link
how are they behind intel exactly? have u seen the newest game engines? in BF3 and MoH Warfighter AMD & Intel are neck and neck, its only for older single threaded games that intel beat AMD out. Any new game that's multi threaded doesn't matter. I imagine with all PS4 games using similar modern engines like Dice's Frostbite Engine 2.0 that AMD or Intel would be neck and neck in CPU performance.ShieTar - Thursday, February 21, 2013 - link
AMD may be head to head with and sometimes even in front of a same-price intel core processors, but not nearly with a competitor at the same TDP. Apparently Sony decided the price is more important, and will live with the bigger/noisier design that the higher power consumption of the AMD cores brings with it.Or they just know that AMD needed this deal badly, and like the leverage this gives them.
gruffi - Thursday, February 21, 2013 - link
C'mon, get your facts right. Jaguar is a low-power core. It has excellent performance/watt and performance/mm^2. There is nothing from Intel which can compete with it. Atom is too slow, SB/IB is too big and too expensive. We are NOT talking about the big server design from AMD, Orochi. And even the Bulldozer cores are not "bigger/noisier" than comparable Intel cores.plonk420 - Thursday, February 21, 2013 - link
amd (bulldozer) vs intel performance (sandy):"(pretty much) the same"
AvP
BF3
Battleforge
Crysis 2
DA2
Hard Reset (1600p)
Metro 2033
Stalker: COP (1600p)
Shogun 2 (1600p)
WoW Pre-Pandaria? (1600p)
3DMark11
Unigine Heaven 2.0
"similar"
Arkham City
Dirt 3
Stalker: COP (1200p)
Shogun 2 (1200p)
WoW Pre-Pandaria? (1200p)
"pitifully slower"
Civ5
Hard Reset (1200p)
SC2
Skyrim
plonk420 - Thursday, February 21, 2013 - link
oh, sauce is techPowerUpxaml - Friday, February 22, 2013 - link
Sauce?sohcermind - Sunday, February 24, 2013 - link
sauce = source that's how you say it in a New York accent.Exodite - Thursday, February 21, 2013 - link
To be fair the Bulldozer vs. Sandy Bridge comparison is rather irrelevant given that this is about Jaguar.Anyway, Intel's single-threaded performance is significantly higher but that's only reflected in games that have computationally intensive threads.
Usually MMOs, RPGs and tactical games.
Shooters and other action-genre games traditionally aren't as CPU-limited, which means it shouldn't impact console gaming as much as it would PC gaming.
That said, the per-thread performance of the PS4's Jaguar cores are likely to be significantly slower than that of the Bulldozer chip used for the comparison.
Krysto - Thursday, February 21, 2013 - link
It's not even one of the high-end CPU's though. It's Jaguar cores, which are the successor of Bobcat (competitor to Atom). Jaguar seems about as powerful as ARM's Cortex A15 to me. Not sure why they'd prefer 8 of those over 4 high-end cores. Just because of price?Also I hope the GPU is at least a GCN one. Can we confirm that?
Oldboy1948 - Saturday, February 23, 2013 - link
In Geekbench no ARMs have any results in vector perf. Why?For Bobcat it is very much about size and speed of cache. E-350 has L2 running at half speed. This is much too slow. Full speed is a must. 8 cores at 2GHz with full speed 4MB L2.
The L1 cache for Bobcat is much better than Bulldozer's. Separate L1 for every core must be an optimum. Bulldozer should have 64i+64d for every core...
cjb110 - Thursday, February 21, 2013 - link
Remember that the Devs will be a lot closer to the 'wire', and wont have Windows in the way, so AMD's disadvantage in Windows might not be as significant. Carmack has repeatedly pushed for MS to allow game devs closer access to the hardware.Plus cost, if your loosing 20% performance AND 50% cost...then it makes sense for a mass consumer device.
ionis - Thursday, February 21, 2013 - link
I wonder what the OS will be. Given Sony's history, I doubt it'll be Linux/BSD. It's probably not going to be Windows or Mac. So what does that leave? Did they make their own x86 OS?
medi01 - Thursday, February 21, 2013 - link
PS3 is Linux-based; what on Earth could PS4 be based on? Windows 8?
ionis - Thursday, February 21, 2013 - link
I thought PS3 games used tasks. If it were Linux-based, wouldn't it use threads?
medi01 - Thursday, February 21, 2013 - link
The API used by game devs is completely irrelevant (and considering what a strange CPU Cell is, no wonder there are "tasks"); the OS itself is Linux-based.
ionis - Thursday, February 21, 2013 - link
Tasks and threads aren't APIs. But there are plenty of OS-specific APIs that would allow someone to figure out the OS just by the API. I haven't been able to find anything online saying the PS3 OS is Linux-based.
powerarmour - Friday, February 22, 2013 - link
The PS3 currently uses a custom 'nix-based OS, and the main (or rather the best-performing) API is LibGCM, which is very low-level compared to OpenGL/DirectX etc.
versesuvius - Thursday, February 21, 2013 - link
Just as Quadro and FirePro cards use the same silicon as the ordinary gaming cards do, yet deliver a considerable performance advantage over the gaming cards, the custom-made CPU and GPU in this unit will benefit from the same kind of treatment. Rest assured that AMD can tweak its products to deliver next-gen performance with this year's silicon. It is only a matter of the guaranteed purchase volume that tempts a company like AMD or NVIDIA to do so or not, and the potential sales volume for a console, any console, is always very high. At least when it comes to companies with an established developer and software base like Sony and Microsoft. In short, don't worry. The performance will most probably blow your mind.
gruffi - Thursday, February 21, 2013 - link
Jaguar is a low-power core and the successor to Bobcat. Everything that Intel has in this market is Atom. And Atom is pure sh** compared to Jaguar. Btw, benchmarks say nothing most of the time. Even the bigger AMD processors are good, but most of the time are slowed by poor software optimization under Windows.
iwod - Thursday, February 21, 2013 - link
Most software, libraries, and other tools are written with Intel's x86 in mind, not AMD's x86. And software optimization is key to performance. I guess with much closer-to-the-metal programming and tuning, AMD's CPUs aren't as bad as some of those benchmarks make them out to be. (That doesn't mean they will outperform Intel; it's just that there is still quite a bit more to squeeze out of AMD.)
zlatan - Thursday, February 21, 2013 - link
It's not a problem. The PS4 is designed to offload data-parallel workloads to the iGPU. A throughput-optimized processor (like a GCN multiprocessor) is better suited to physics and AI calculation, or any kind of sorting, culling or asset decompression. You can even simulate physics on the iGPU and copy the results back to the CPU in the same frame, so you can create smoke particles that affect the AI.
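Roughly, the frame structure would look something like the sketch below (illustrative C++ only; std::async stands in here for a GPU compute dispatch, since the real work would run as a compute job on the GCN compute units):

```cpp
#include <cstdio>
#include <future>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Stand-in for the data-parallel job that would actually run on the GPU:
// integrate smoke particles for one frame.
std::vector<Particle> simulateParticles(std::vector<Particle> p, float dt) {
    for (auto& q : p) { q.x += q.vx * dt; q.y += q.vy * dt; q.z += q.vz * dt; }
    return p;
}

int main() {
    std::vector<Particle> smoke(10000, Particle{0, 0, 0, 1.0f, 2.0f, 0.5f});
    const float dt = 1.0f / 60.0f;

    // Kick off the throughput-heavy work first (on the console this would be a compute dispatch)...
    auto gpuJob = std::async(std::launch::async, simulateParticles, smoke, dt);

    // ...do serial CPU work here (input, scripting, game logic)...

    // ...then pull the results back within the same frame so the AI can react to them.
    smoke = gpuJob.get();
    std::printf("first particle moved to x=%.4f\n", smoke[0].x);
    return 0;
}
```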
medi01 - Thursday, February 21, 2013 - link
AMD's APU is several generations ahead of Intel's, which makes up for it nicely, doesn't it? So who went for AMD's APU:
1) Sony
2) Microsoft
3) Valve (SteamBox)
It's also a huge win for AMD on another front: games (nearly the only application we run on a PC that needs much computing power) will tend to be multi-threaded from the beginning, so there goes Intel's single-threaded performance advantage.
cknobman - Thursday, February 21, 2013 - link
NO. Look at the larger picture here. Regardless of Intel or AMD, Sony has chosen a multicore x86 CPU, which most games today are not even close to being optimized for.
This new console will mean that game developers will specifically start coding for multicore x86 architectures and we will likely see huge leaps in performance for games.
This is a win for PC and console gamers, heck it is a win for the entire gaming and software industry.
medi01 - Thursday, February 21, 2013 - link
It's a loss for Intel (the single-threaded crown; much, much less so multi-threaded) and for nVidia.
Sivar - Thursday, February 21, 2013 - link
This guy knows what he's talking about. PC processors have been fast enough that many games are still poorly optimized for more than a few cores. As a software engineer, I can understand why -- it's a pain in the ass.
Consoles are a different world, though. Your game has to compete with the other guy's game on hardware that won't change for years, so there's a strong push to get every bit of performance you can get. This is a huge win for all platforms (even mobile phones have 4+ cores), and will greatly strengthen the market for a highly thread-optimized ecosystem.
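Here's a minimal sketch of what that pressure pushes you toward: chop the per-frame work into independent slices and hand one to each core (plain std::thread, nothing console-specific; the names and numbers are made up for illustration):

```cpp
#include <cstdio>
#include <thread>
#include <vector>

// Each worker updates an independent slice of the entity data, so no locks are needed.
void updateSlice(std::vector<float>& positions, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) positions[i] += 3.0f * dt;
}

int main() {
    std::vector<float> positions(80000, 0.0f);
    const float dt = 1.0f / 60.0f;
    const std::size_t workers = 8;   // e.g. one worker per Jaguar core
    const std::size_t chunk = positions.size() / workers;

    std::vector<std::thread> pool;
    for (std::size_t w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = (w + 1 == workers) ? positions.size() : begin + chunk;
        pool.emplace_back(updateSlice, std::ref(positions), begin, end, dt);
    }
    for (auto& t : pool) t.join();   // synchronize before the frame is submitted

    std::printf("sample position after one frame: %f\n", positions[0]);
    return 0;
}
```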
Belard - Thursday, February 21, 2013 - link
Which gives consoles the same pros and cons... like the Amiga computers from the '80s~'90s. The Amiga 500 sold from 1987~1991 with no change other than systemboard tweaks and shrinks. The games hit the hardware very, very tightly; so much so that most games didn't even allow the multi-tasking OS to run. Remember, this was a 7.14MHz single-core computer... so they needed as much as they could get out of it.
This hurt the Amiga as a computer... even with my 25MHz high-end Amiga 3000, I had to run "emulation" of the Amiga 500 with ADOS 1.3, rather than my full 32-bit/ADOS 2.x or 3.0 OS.
DimaB - Monday, March 4, 2013 - link
From what I have experienced, and many of us have, for the most part gaming will always be based on the resources of the company and its tools, as well as the hardware. CPU-bound gaming? Well, from what AMD has planned, an APU is a CPU and a GPU. ;)
Does this make any sense?
ImSpartacus - Wednesday, February 20, 2013 - link
I'm betting Mr. Shimpi is kicking back the drink right about now. He's probably surrounded by a few dozen high-capacity SSDs. God help us if the HDD isn't user-replaceable.
tipoo - Thursday, February 21, 2013 - link
Lol. But SSDs in a console are a needless cost imho; games load from optical media anyway, unless you want an SSD large enough to fit every game on it, which would make the console so much more expensive.
Flunk - Thursday, February 21, 2013 - link
I don't think we really know enough about the PS4 to say that yet.
meyerkev248 - Wednesday, February 20, 2013 - link
And then in 5 more years, we'll be complaining about the console limitation of 8 GB.
SlyNine - Wednesday, February 20, 2013 - link
512 MB was on the low side when they released the last gen. 8 gigs is on the high side now. 4 gigs of RAM is really all a gaming PC needs; 8 gigs is nice though. I think we won't see RAM size as a problem in these consoles for 10 years.
coffeetable - Wednesday, February 20, 2013 - link
4Gb is all a gaming PC needs right now because the vast majority of AAA releases have been pegged for consoles with their paltry 512Mb. Now that devs have 8Gb to play with, you'd better believe that 8Gb will be bog-standard by the end of 2015. The fact that the PS4 has 8Gb of graphics memory to play with is going to have an interesting effect on PC graphics cards too.
poohbear - Thursday, February 21, 2013 - link
Was gonna say the same, but you beat me to it. Now that they've passed the 4GB/64-bit mark, 8GB is gonna be paltry by 2015-16. By then most systems should have 64GB of RAM (heck, I have 16GB of RAM now, and I upgraded to that in 2012... it only cost me $70!)
sohcermind - Sunday, February 24, 2013 - link
16GB of RAM is pointless for gaming.
piroroadkill - Thursday, February 21, 2013 - link
4Gb = 512MB. The capital B is bytes, lowercase b is bits.
sohcermind - Sunday, February 24, 2013 - link
4GB of RAM is enough for PC exclusives too, don't kid yourself; plus, PCs use a ton of RAM on background tasks which consoles don't need.
TheSlamma - Thursday, February 21, 2013 - link
Always surprises me when people forget how tech grows exponentially. Seriously, TEN YEARS? PCs already have texture packs that demand over 1GB of video RAM, let alone system RAM, and as one guy said, it's mainly due to porting from the current rag consoles that were outdated two years before their release.
Death666Angel - Thursday, February 21, 2013 - link
Well, the 8GB of RAM is unified between the CPU and GPU. Current gamer PCs have at least 4GB for the CPU and another 2GB for the GPU. The high-end ones would be sitting at 8+3GB and more.
IAmRandom301982 - Tuesday, April 16, 2013 - link
Current PCs have 4GB of DDR3 for the CPU, and another 2GB of GDDR5 for the GPU. They do not share, though. So the GPU only has 2GB to work with, while the CPU only has 4GB of slower RAM to work with.
The PS4 will have 8GB of unified RAM. So if a developer makes a game and the CPU only needs 1GB of GDDR5 for its tasks, then the GPU can access the remaining 6.5GB or so of GDDR5 that the OS is not using in the background.
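A rough illustration of the difference (illustrative C++ only; the memcpy stands in for the PCIe upload a discrete card needs, and the extra pointer stands in for the GPU's view of the one shared pool):

```cpp
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    const std::size_t assetBytes = 1 << 20;   // a 1 MiB "texture"

    // Split-pool PC model: the asset lives in system RAM, then a second copy is
    // uploaded into VRAM (the memcpy stands in for the PCIe transfer).
    std::vector<unsigned char> systemRam(assetBytes, 0x7f);
    std::vector<unsigned char> vram(assetBytes);
    std::memcpy(vram.data(), systemRam.data(), assetBytes);

    // Unified-pool model: one allocation; the CPU fills it and the GPU (represented
    // here by nothing more than another pointer) reads the very same bytes.
    std::vector<unsigned char> unified(assetBytes, 0x7f);
    const unsigned char* gpuView = unified.data();

    std::printf("split pools hold %zu bytes total, unified pool holds %zu (gpuView=%p)\n",
                systemRam.size() + vram.size(), unified.size(),
                static_cast<const void*>(gpuView));
    return 0;
}
```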
Stuka87 - Thursday, February 21, 2013 - link
4Gb is *NOT* enough for any current games. Most games I play will easily consume 4GB of RAM. Then once you add Windows and such, we are far over that.
Stuka87 - Thursday, February 21, 2013 - link
That first 4Gb should be 4GB.
SlyNine - Saturday, February 23, 2013 - link
When you are talking about RAM, it's kinda implied that you're talking in bytes and not bits.
The rumours/leaks pegged the PS4 at 4GB of faster GDDR5, whereas the next Xbox is thought to have 8GB of slower DDR3. It's good that the PS4 now turns out to have 8GB after all. It might cost more now, but the cost will drop over the console's life-cycle, while the extra RAM will no doubt prove useful to developers.
tipoo - Thursday, February 21, 2013 - link
Yup, I thought it would either be 4GB GDDR5 or 8GB DDR3. Got the best of both worlds. Nice. Even the fanciest of graphics cards usually don't have half of that GDDR5 on them, and none of us can get it for system memory.
nathanddrews - Wednesday, February 20, 2013 - link
Steam.
Zink - Wednesday, February 20, 2013 - link
No, it will be all locked down. A $550 PC with an HD 7870 will get you the same performance and allow you to upgrade the hardware.
SlyNine - Wednesday, February 20, 2013 - link
What is locked down? Steam? It runs on any PC: Linux, Mac, or Windows. That's locked down?
KitsuneKnight - Wednesday, February 20, 2013 - link
The PS4 is locked down. Assuming Sony didn't have a sudden change of heart (a completely massive one...), they'll likely try even harder to lock this thing down than they did with even the PS3. People will eventually break that (like all prior systems), but then there will be the challenge of getting a regular operating system to actually run on it (getting graphics fully functional will likely be the hardest problem... even leveraging the open source radeon drivers, it likely won't be easy). Eventually people might be able to have Steam up and running on it... but by that point, the hardware in the PS4 will likely look fairly antiquated by normal PC standards (just like the PS3/360's hardware is very low end by today's standards).
medi01 - Thursday, February 21, 2013 - link
It took 4 years to break their last system, and actually breaking into it was a lucky combination of Other OS running on it on top of an insanely silly crypto mistake on the programmers' part.
Pariah - Wednesday, February 20, 2013 - link
I can't believe how often idiocy like this gets parroted. You cannot compare the specs of a PC to the specs of a console. Even when relatively similar base hardware is used, it is like comparing apples to tires. The console is highly optimized to drive graphics and other calculations that are game related. A PC using an HD 7870 will get crushed by a console if it is based on the hardware Anand says it is.
user991 - Wednesday, February 20, 2013 - link
According to AMD, the same hardware can be 5 or 6 times faster when the DirectX or OpenGL layers are bypassed. Coding to the GPU's native instructions instead of going through an abstraction layer can yield big performance improvements:
http://www.bit-tech.net/hardware/graphics/2011/03/...
Since the GPU and CPU share a die and it's likely a custom chip, a shared cache between the GPU and CPU may be possible. If that's done, it would put a PC to shame.
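As a purely hypothetical sketch of the idea (made-up opcodes and packet layout, not real libgcm or GCN packets), "coding to the metal" amounts to the game writing command words straight into a buffer the GPU consumes, instead of going through a runtime and a driver:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Invented opcodes, purely for illustration.
enum : std::uint32_t { SET_TEXTURE = 0x01, SET_SHADER = 0x02, DRAW_INDEXED = 0x03 };

struct CommandBuffer {
    std::vector<std::uint32_t> words;
    void push(std::uint32_t w) { words.push_back(w); }
};

// The game records GPU commands directly; there is no validation layer, no state
// tracking and no driver translation pass between this code and the hardware.
void recordDraw(CommandBuffer& cb, std::uint32_t texture, std::uint32_t shader,
                std::uint32_t indexCount) {
    cb.push(SET_TEXTURE);  cb.push(texture);
    cb.push(SET_SHADER);   cb.push(shader);
    cb.push(DRAW_INDEXED); cb.push(indexCount);
}

int main() {
    CommandBuffer cb;
    recordDraw(cb, /*texture=*/7, /*shader=*/3, /*indexCount=*/36);
    // On a PC, the equivalent draw call passes through the D3D/GL runtime and the
    // vendor driver before anything like these words is generated.
    std::printf("recorded %zu command words\n", cb.words.size());
    return 0;
}
```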
evilpaul666 - Wednesday, February 20, 2013 - link
I don't know what all PS3 devs are using (obviously), but I've heard on PixelJunk's podcast that they use OpenGL for graphics.
Wolfpup - Thursday, February 21, 2013 - link
I call shenanigans on those claims. I've wondered why no one's ever really addressed them. They don't match up to what we see in real life: a GPU similar to what the consoles have delivers similar results (I know, I've got a GeForce 9650M GT I still use daily).
For another, the Xbox is already using DirectX, and the PlayStation 3 OpenGL. If that was really killing performance that much, the PlayStation 2, PlayStation Portable, Wii, and 3DS could all run the same games as the PlayStation 3 and Xbox 360.
For another, if this were true, someone like Carmack would have been talking about it YEARS ago. He's certainly railed against dumb moves in APIs before. All sorts of developers would be talking about it and demanding action from Microsoft.
Maybe there's some bizarre situation where some graphics function really is that much slower through DirectX than OpenGL or something, but there's just no way it's really making things 10 or 100x slower.
Blighty - Thursday, February 21, 2013 - link
No, the facts are kind of correct. The reason performance on a PC is so slow (comparatively) versus the same hardware in a console is all about those layers and other overheads. First you have Windows' Direct3D (or OpenGL), which is a general 3D API, plus lots of other performance-hogging systems, and then you have the graphics card drivers themselves.
The PS3 in particular does a really good job (nowadays) of getting rid of those things with the GPU command buffer generation library (libgcm). It's a better approach than OpenGL or DirectX when the hardware is known. An OpenGL layer will likely be written on top for the PS4 (just like on the PS3) so as to ease developers into it.
There is so much capability that cannot be used with standard APIs, especially in these new GPUs, and particularly with AMD's new tech and the new APU tech that will be in the PS4.
I'll point you to a great article on why this PS4 will be pretty astonishing in the years ahead, as they are working on a new libgcm.
http://www.gameranx.com/updates/id/12324/article/p...
medi01 - Thursday, February 21, 2013 - link
Jesus, who says it's slower on PC??? Ever seen how, say, Dragon Age looks on PC vs. PS3/Xbox?
What is vastly different with consoles is IT MAKES SENSE TO OPTIMIZE FOR PARTICULAR HARDWARE. Now, how much you can win from such optimization depends, of course, on how much the generic code + API it was based on sucked. But there is no way a mature API would slow something down 5-6 times. Even 1.5 times slower is hard to imagine.
SlyNine - Saturday, February 23, 2013 - link
Compare GOW on an X800 XT to GOW on the Xbox 360. Your "facts" were just destroyed, because the X800 XT looks better! The reason PC games run slower now is the effects that are really easy to implement and hard to execute, like HBAO for example.
Zink - Wednesday, February 20, 2013 - link
I've never seen an article like that and just assumed console development was also done with a 3D API. It seems misleading then to be comparing the "performance" of the PS4 GPU to an HD 7870 when the actual code and graphics that get rendered on the PS4 are at a much higher performance level. I understand that the comparison in an article like this is of the hardware and low-level performance, but most people probably just extrapolate that to in-game performance.
poohbear - Thursday, February 21, 2013 - link
What Pariah said. A 7870 in a PC is not specifically optimized for, because developers have to target it and two dozen other configurations, so a lot of efficiency is lost; whereas in a console they know EXACTLY what it's capable of and can optimize specifically for it. A 7870 in a console would be similar to a 7950 in a PC. :)
SlyNine - Saturday, February 23, 2013 - link
I can't believe how often this stupidity gets repeated. The NUMBER ONE REASON PCs require a lot to push games at max is that devs take the cheap way out and use effects like HBAO, HDAO, and SSAO that require a lot of GPU power for almost no return. For example, GOW played at medium settings at 800x600 on a 9700 Pro. If PC hardware were SO inefficient, that wouldn't be possible. There are some efficiencies to be had, but not on the level that you're suggesting or that has been suggested.
sohcermind - Sunday, February 24, 2013 - link
It is a fact that consoles are more efficient. You've got one example, and my 7600 GT with 1GB of RAM and a single-core AMD 3800+ lagged on Gears of War at the lowest settings and 640x480, so I don't buy it.
SlyNine - Monday, February 25, 2013 - link
Well, I don't buy that shit at all either. A 7600 GT should be able to play GOW no problem on low @ 640x480. My 9700 Pro only had a 3200+ Barton backing it up. Medium, 800x600, 20-40 fps. Don't believe it? I don't care.
sohcermind - Monday, February 25, 2013 - link
And that PC I built in 2006 cost double what an Xbox 360 did and wasn't even as powerful.
RussianSensation - Wednesday, February 20, 2013 - link
Not a chance. Developers have access to the 'metal' on PS4. Certain functions will work 10-100x faster because there is no API overhead. Some of the graphics we saw at the unveiling are already approaching Crysis 3 level, and an HD 7870 cannot even come close to maxing out Crysis 3. In 4-5 years, the type of games the PS4 will be able to run, an HD 7870 would not, and certainly not a $550 PC, because that means a Core i3 or a low-end FX-4000.
stickmansam - Wednesday, February 20, 2013 - link
http://pcpartpicker.com/p/ERS8
Cough Cough
The launch price of the PS3 ranged from $500-$600, and Steam sales offset hardware costs.
I do agree consoles can get better performance out of hardware, but my old HD 2600 and E5200 from 2006/2007 can still play most games at 720p low/medium at 45-60fps.
thedarknight87 - Wednesday, February 20, 2013 - link
Are you currently using PSN on your PS3? Compare the PSN prices of multi-platform games to the Steam prices. Most of them match up; thanks to an EU PSN sale, Dishonored and Darksiders 2 are actually cheaper on PSN than on Steam. I am quite sure that a few games are actually cheaper on Steam than on PSN. However, the way Sony is pushing for aggressive PSN pricing gives me hope that the PC-console digital game download price difference will shrink considerably over the coming years.
stickmansam - Thursday, February 21, 2013 - link
Nope, but hopefully consoles will have games at 50-75% off like Steam does.
SlyNine - Saturday, February 23, 2013 - link
They will, huh? Source?
minijedimaster - Thursday, February 21, 2013 - link
That's a great story. Unfortunately it falls apart because it assumes PCs in general, and budget PCs specifically, will still be running AMD 7870-class GPUs in 4-5 years. They of course will not be. The PS4, on the other hand, will. Even if the claims of speed gains are true, by the time devs are comfortable with the hardware and cranking out highly optimized games on PS4 hardware, the PC will be 3-5 generations past the silicon in the PS4.
sohcermind - Sunday, February 24, 2013 - link
The PS4 will cost $400 and have better performance than your $600 PC with a 7870. My i5-3570K, 4GB RAM, and GTX 660 system cost $800 to build last year. The PS4 will be half that and offer better performance.
sheh - Wednesday, February 20, 2013 - link
176GB/sec memory to CPU?!
RussianSensation - Wednesday, February 20, 2013 - link
Shared with the GPU. Possibly a 256-bit memory bus width @ 5500MHz. It is more memory bandwidth than the fastest GCN mobile GPU, the HD 7970M:
http://www.amd.com/us/products/notebook/graphics/7...
Pretty impressive since a console can't realistically include 150-200W GPUs like HD7970 or GTX670/680.
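The 176GB/sec figure falls right out of those rumored numbers; a quick back-of-the-envelope check, treating 5500MHz as the effective GDDR5 data rate on a 256-bit bus:

```cpp
#include <cstdio>

int main() {
    const double busWidthBits  = 256.0;  // rumored bus width
    const double effectiveGTps = 5.5;    // 5500 MT/s effective; GDDR5 moves 4 bits per pin per
                                         // base clock, so that implies roughly a 1375MHz base clock
    const double bytesPerTransfer = busWidthBits / 8.0;
    std::printf("%.0f GB/s\n", bytesPerTransfer * effectiveGTps);  // 32 * 5.5 = 176
    return 0;
}
```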
SlyNine - Saturday, February 23, 2013 - link
No, I'm pretty sure the 7970 has more, at around 264 GB/s.
silverblue - Sunday, February 24, 2013 - link
He did say 7970M though. :)
SlyNine - Monday, February 25, 2013 - link
Guess I should read more lol.
sheh - Thursday, February 21, 2013 - link
If true, that alone could make a hacked PS4 a good compute platform. That's far more than anything PCs have.
haukionkannel - Thursday, February 21, 2013 - link
Yep, the 8GB of GDDR5 is the real monster here! A lot of people seem to forget how much faster this memory is than normal DDR3 or the upcoming DDR4! This really allows this machine to stretch its muscles. The GPU is also much faster than I expected. And 8 cores, even smaller ones, should be sufficient to feed those GPU cores, so this can really fly for a console! 1080p seems to be possible, and good 720p games should be just fine.
sohcermind - Sunday, February 24, 2013 - link
LOL, 1080p is more than fine. Split-screen Battlefield 3 @ 1080p @ max settings will be possible on the PS4, imo.
powerarmour - Friday, February 22, 2013 - link
Think of the PS4 as an 8GB graphics card primarily, with some integrated CPUs ;)
Soulkeeper - Wednesday, February 20, 2013 - link
I'm really impressed by the choice of 8GB of fast shared GDDR5. It should have no problem putting Llano/Trinity to shame.
This thing will be a beast :)
Sullitude - Wednesday, February 20, 2013 - link
Surely this is x64; a 32-bit processor can't utilise 8GB of RAM, IIRC.
faizoff - Thursday, February 21, 2013 - link
They mean it's an x86-architecture CPU. It still supports 64-bit computing.
Wolfpup - Thursday, February 21, 2013 - link
x64 isn't a thing. It's just an annoying way someone came up with to write "64-bit x86 CPU". Drives me nuts every time I see it.
tipoo - Thursday, February 21, 2013 - link
x64 isn't an architecture, nor does x86 mean 32-bit. Yes, it is a 64-bit processor. Something can be both x86-compatible and 64-bit... all the chips in our PCs for the last few years have been.
Flunk - Thursday, February 21, 2013 - link
AMD calls the instruction set AMD64 and Intel calls it x86-64. Microsoft only calls it x64 because they think our heads are too mushy to remember x86-64.
ricardoduarte - Wednesday, February 20, 2013 - link
I wonder how long it will take before the OS is being passed around in ISOs all over the internet for users to install on a normal PC :D in the same fashion as Mac OS.
Sullitude - Wednesday, February 20, 2013 - link
I think this is unlikely to work. These still aren't off-the-shelf parts; I doubt you'll actually be able to build your own PS4 from Newegg. And short of some amazing reverse-engineering by hackers, we won't have drivers for anything except the official hardware. The opposite is much more likely to be true - Windows, OSX or Linux being installed on this x64 CPU (provided Sony allows it or the console is hacked).
ricardoduarte - Wednesday, February 20, 2013 - link
Yes, you might be right; however, this thing is built mostly with off-the-shelf parts. I doubt OSX will be installed on this; Linux and Windows most likely, though I don't know why this would be done (unless it is possible to create partitions for dual-boot without bricking the device).
kmh0909 - Wednesday, February 20, 2013 - link
No one needs dedicated gaming consoles.
Fujikoma - Wednesday, February 20, 2013 - link
Must be you have no friends. Consoles are cheaper/easier to replace than a decent computer if someone is drunk or rowdy. Much easier to throw a disc in and grab some controllers.
Menoob - Wednesday, February 20, 2013 - link
How can you draw such a stupid conclusion? How old are you, anyway? If you think getting drunk is somehow cool, you must be particularly green.
thedarknight87 - Wednesday, February 20, 2013 - link
No need to be judgemental, mate; the guy has drunk, rowdy friends, he doesn't think getting drunk is cool. Even if he does think so, it's his prerogative.
tzhu07 - Thursday, February 21, 2013 - link
I don't play console games anymore, but they have clear advantages over something like a PC. Consoles are simple and easier to operate, great for multi-player with people in the same room, and have exclusives that you can't get on the PC (especially if you're a fan of Japanese developers). They are more family-friendly and you don't have to tweak any settings within games. Also, crashes and freezes are far less frequent on consoles. If I were still into gaming like I was when I was a teenager, I'd buy a PS4 or Xbox 720.
This Guy - Thursday, February 21, 2013 - link
I'd have to disagree. A PC game without intrusive DRM can be a hell of a lot easier to operate. Most of my PC games just load. I don't have to sign in, I don't have to select my save location. I just click on the game icon, wait ten seconds, click continue, and I'm on my way. As for crashes, the only ones I've had often were in Fallout 3 and NV, and even then that was with mods.
My main experience with console gaming is Forza 4. A minute just to load the game. Last time I checked I had more than 50 hours in menus (not including upgrading/tuning). That was for ~250 hours of play all up. Four days of my life just waiting.
SlyNine - Saturday, February 23, 2013 - link
PCs also have great games you won't see on consoles.
poohbear - Thursday, February 21, 2013 - link
I was looking everywhere for detailed info on the specs; thanks for putting up this info and your thoughts on it!!!! Really more than I expected, as some people were saying it was gonna be 4GB of RAM (which would've been terrible if it's supposed to last 8 years). The thing is, these specs aren't anywhere near futureproof... in 2 years they'll be completely low end, let alone 5 years! Anyways, from the games I saw being previewed, the baseline for graphics will be like BF3 Ultra on PC, which is still fantastic!
ilikemoneygreen - Thursday, February 21, 2013 - link
For PC building, at least, there is no option to install GDDR5 RAM. That kind of RAM is only in GPUs to my knowledge. Is this equivalent to, say... the system's DDR3 RAM? I also realize DDR3 is built into the worse-off GPUs, but I guess my question goes to differentiating between the GPU RAM and the system RAM. They say the PS4 has 8GB of GDDR5 RAM... but it's like everyone is acting like it's the system RAM. Can the whole system run off the GPU RAM? Kind of a 2fer? I hope I phrased this well enough, because I'm not sure if I actually got it across correctly. Just so I'm clear here (don't need any wasted replies explaining that DDR3 is worse), I know DDR3 RAM in video cards sucks and GDDR5 RAM rocks. GDDR5 is better.
Xorrax - Thursday, February 21, 2013 - link
The 8GB of GDDR5 *is* also the system RAM. It's unified and shared between the CPU and GPU.
silverblue - Thursday, February 21, 2013 - link
Using GDDR5 throughout the system got me thinking as well. I've never really known the latency involved with GDDR5, but it sure means that Jaguar has a huge amount of bandwidth. As others have said, a strong CPU isn't really too necessary here, and as more and more games are multithreaded, I expect even two Jaguars can spit out a decent amount of work (they are, in essence, as powerful as K8 cores, albeit with a huge list of ISA extensions available to them). I do wonder if HSA comes into any of this. Regardless, perhaps it'll convince developers to spend more time on properly threading their games with so many cores available to them, and with them using x86, porting should be easier.
As for "built by AMD", I thought they were just licensing the tech out for somebody else to make it?
tipoo - Thursday, February 21, 2013 - link
The GDDR5 is one unified pool. The CPU and GPU both access the same pool.
Mugur - Thursday, February 21, 2013 - link
I know I'm asking much, but some more technical details regarding the CPU/GPU/RAM would be nice... :-) 8GB of GDDR5 and 7850 performance levels are nice (hopefully not compared to a mobile GPU part). Of course, it also depends on the GDDR5 speed. A unified fast cache would be nice too.
But the CPU is very, very slow by today's (PC gaming) standards. Even with 8 cores, unless it's 3GHz, the performance won't be enough to push the graphics part to its limit. Don't forget that a game is not quite the optimal task to spread across more than 4 cores, so single-threaded performance is still important. Hell, I've touched notebooks with this kind of CPU and they are painfully slow compared to the cheapest Core i3. They are much better than Atoms, I agree, but that's not the comparison I would make. We need at least desktop Trinity CPU performance here.
Anyway, I know it's impossible, but I would like to see XBMC on this someday... :-)
Xorrax - Thursday, February 21, 2013 - link
Do you feel that the CPU performance of the existing consoles (Cell, and the PPC in the Xbox 360) is lacking? Jaguar is much, much better for integer computation than either of those two machines, even when clocked at a far lower rate. For example, the Power architecture in the 360 has *horrible* IPC, as it is in-order with many pipeline hazards. Clock for clock, Jaguar should be on the order of 3-4 times as fast as the PPE in Cell and the Xbox 360, as it is out-of-order, has extensive register renaming, and a very good branch predictor. Couple that with an increase in real core count, and it seems like a sensible decision for Sony to have made.
This Guy - Thursday, February 21, 2013 - link
Still run a 3GHz Pentium 4? Why not? I also have a 1GHz C-50 Bobcat that completely smokes my old 2GHz AMD 3000+ (which in turn was faster than the 3GHz P4s of the same era). I also have an i7-2630QM laptop. I only notice the extra power when simulating clocks.
Modern CPUs are so much faster in part because they include all the other chips that used to make up a motherboard (like most of the north bridge and the memory controller). Think about grabbing the pen next to you compared to sending someone to the shops to buy a new one. Both pens write just as fast; it's just that you're going to have to wait a while for the one from the shops.
The stuff Xorrax talks about is the other part. A great branch predictor and a good memory hierarchy do great things for IPC.
Mugur - Thursday, February 21, 2013 - link
Sorry, but no. If you think C-50 performance is enough, you're delusional. Just check some CPU benchmarks... Even a newer 1800MHz Jaguar is way slower than any Sandy Bridge mobile Celeron. I'm an AMD fan, so I'm glad to see them in both major consoles, but we need Trinity performance here, not Jaguar. And please don't compare to the PS3 or Xbox 360 CPUs. Compare to the x86 world...
Xorrax - Thursday, February 21, 2013 - link
Pick your comparison -- I'm comparing to the existing consoles, which these are replacing, not PC gaming. The size, cost, and power benefits of 8 Jaguar cores over something else don't seem like a bad trade-off. As others have mentioned, it will motivate developers to optimize for multi-core more than they have historically.
Links to benchmarks? Jaguar isn't in the market yet, so I'd be surprised to find anything meaningful out there other than data from engineering samples, which may be suspect.
powerarmour - Friday, February 22, 2013 - link
You can't compare CPU performance on a PC to a console with minimal OS overheads. Besides, with more heterogeneous compute capability in the PS4, you could offload a heck of a lot to the main GPU, freeing up the CPUs for other less demanding tasks.
tipoo - Thursday, February 21, 2013 - link
I think an 8-core 1.6GHz Jaguar is fine. No, it's not the highest end you can get on a PC, but they have costs to consider. Plus it will have some modifications that we don't know of, and no Jaguar on the PC side has access to GDDR5 memory. Developers will have to work around its limitations; that's just par for the course for consoles.
abrowne1993 - Thursday, February 21, 2013 - link
I'm not really savvy to this stuff. Can someone do any sort of preliminary cost analysis to guess how much the components would be? Any sort of general idea would be fine. Just wondering how heavy of a loss Sony is expected to take on this.
Shadowmaster625 - Thursday, February 21, 2013 - link
First we have to estimate the transistor count. My estimate is 3.2 billion: 2.8 billion for the GPU and 400 million for the CPU. For a chip that size, Sony is probably paying $80-$100 apiece for the first million, and who knows what after that. I can't find the price of an HD 7950M GPU die, but it can't be much more than $100. So this chip isn't going to be much more than that either. Since the volume is higher, I would expect it to be closer to $70 than $100. I'm sure AMD is eating it compared to the margins they get from discrete GPU sales.
babyleg003 - Thursday, February 21, 2013 - link
Really, AMD...? A company with a $2.70 stock price; that tells you just how capable this company is... not. I guess when you go with the worst, you get what you pay for. Give me an Intel i7 processor anytime; I'll pay the extra for performance...
JarredWalton - Thursday, February 21, 2013 - link
I don't mind them using AMD, but Jaguar? Really? Ouch. It's Cell craziness all over again, only I'm not sure 8x Jaguar will really provide that much more performance than the CellBE. At 1.6GHz, Jaguar is basically something like 30% faster than Atom at 1.6GHz, which means something like a 1.6GHz Ivy Bridge core or a Piledriver core would be three times faster. I would have much rather seen a 3GHz quad-core Piledriver or similar than an octa-core Jaguar.
ShieTar - Thursday, February 21, 2013 - link
Of course, a CPU like the FX-8350 would often be bored when coupled with an HD 7850 feeding 1080p. So it would also be a good idea to replace the GPU with something in the 7950 or better range. And if Sony still manages to sell that box for <$500, everybody would be happy.
krumme - Thursday, February 21, 2013 - link
LOL, good one, but you misspelled your name. 8 Jaguar cores are about the same size as a single i7 core. Add the far cheaper process, and for that budget you could probably get a quarter to half of a single Intel core. Try gaming on that :)
Mugur - Thursday, February 21, 2013 - link
EXACTLY.silverblue - Thursday, February 21, 2013 - link
I'm not sure about that. Had you said Piledriver "module", then I wouldn't be quibbling, but as Jaguar should be roughly equivalent to K8 in performance per core, and Phenom was only a modest step up, I can't see how a Piledriver core on its own would outperform it so massively, especially considering they're smaller than Stars cores themselves. Even so, if you meant module, I'd agree 100%.
piroroadkill - Thursday, February 21, 2013 - link
Haha, what? Then stick with the PC Gaming Master Race. Consoles are not made to compete in raw specs with maxed-out PCs.
Origin64 - Thursday, February 21, 2013 - link
I'm actually quite disappointed with the PS4. Apart from using hardware that's barely current-gen, the new services seem to be blatant ripoffs of what NVidia and Nintendo are doing, and nothing actually new seems to be included in the package. The streaming service produces horrible image quality, from what I've seen of the Killzone video. Also, the lack of any next-gen processing like tessellation in the video worries me.
ponderous - Thursday, February 21, 2013 - link
Nice to finally see some info on the PS4.
tipoo - Thursday, February 21, 2013 - link
Anyone who was expecting a sub-$600 console to be better than PCs costing more was setting themselves up for disappointment anyway. The rumored specs were pretty much dead on, with a pleasant surprise in the 8GB of GDDR5. None of us can get GDDR5 as system memory, and most of us don't have half that, most not even a quarter of it, in a GPU.
And every time a new console launches there are articles about which inexpensive PCs are better, but those kind of miss the point imo; let's see which runs games better in 7 years. Developers won't care about your 7-year-old PC; they will care about the current generation of console hardware, though, and will ensure their game runs well on it.
B3an - Thursday, February 21, 2013 - link
The reason a PC won't last as long, even if it's more powerful, is because PC graphics improve more over time. Even many current console ports right now look significantly better on PC. So yes, a console will obviously always run its games well, as its hardware stays the same. But if PC games stayed at the same level of graphics for many years at a time, then PCs would easily be able to do the same. Personally I'd prefer the better graphics, and if it means upgrading now and again I don't have a problem with that.
tipoo - Thursday, February 21, 2013 - link
I'm not saying otherwise, mind. Just that they are different philosophies, and the influx of articles elsewhere detailing how such-and-such computers are already better than it is silly.
ionis - Thursday, February 21, 2013 - link
I was really hoping Sony would announce that the PS4 had Rift support.
HisDivineOrder - Thursday, February 21, 2013 - link
... "The PS4 also supports instant suspend/resume." That is going to be huge for home consoles. Bigger than anyone can even imagine. Who the hell needs saves when you can just suspend the console at a point and then start up right where you left off?
No logos, no menus, no copyrights, no waiting. You sit down. You pick up your controller. Your PS4 with its camera detects you and the game just starts up. Before you have even sat down, the game is sitting staring at you paused. Waiting for you to hit the Options button to continue.
You need to go to the potty, then the grocery store to pick up more Red Bull, nachos, and a card for apologizing to your girlfriend for forgetting her birthday while playing the new Killzone? You come back an hour later, you sit down. It's on, waiting for you again.
Especially if it can leverage HDMI-CEC to manage your HDTV and receiver on and off, too.
That would feel like the future, wouldn't it?
wedouglas - Saturday, February 23, 2013 - link
Huh? I don't see why this is a big deal. You can just pause the game and turn the TV off or do something else. Computers stay on all day and night; why not the PS3 too? I've left it on for days and never had any issues. I think the point of this was that it's a low-power mode.
singhjeet - Tuesday, February 26, 2013 - link
It's not so much that the PS3 can't stay on all day and night; it can, easily. I've left mine on Folding@home for weeks at a time. It's just that every time I think to pause the game and just turn off the TV, I have a little mental battle: How long till I'm back at the TV? How much electricity is the system going to be drawing needlessly? How much time is going to be wasted waiting for the game to reload? All of that is now gone, because it just suspends. It's not so much that it wasn't possible, just that it's made to do it now. You can turn it off mindlessly even if you're going to be coming back in 15 minutes or 5 hours, which will be nice for a console, especially if the game only allows you to save at particular points.
J-M - Thursday, February 21, 2013 - link
What I'm missing in most "previews" I've seen so far is the fact that the Jaguar cores are the first-generation cores using HSA. For sure that'll have an effect on performance compared to current-generation CPUs, making comparisons really hard to do. This PS4 will pack so much more punch than the specs suggest when compared to current PCs.
Mathos - Thursday, February 21, 2013 - link
People aren't thinking about some things when they comment about the CPU being an AMD CPU.
1: These are low-power Jaguar cores, not Piledriver/Bulldozer. That means a 5-25W range for power draw, using TSMC's 28nm HKMG process.
2: You can't expect the performance of a standard Jaguar APU to be equal to this APU, due to the difference in memory controller: a standard Jaguar uses DDR3, whereas this will have access to much higher-speed GDDR5. It should be interesting to see if this gets around the memory controller bottleneck present since the Phenom.
3: This should change the landscape for gaming, since all new games will be able to run on multithreaded PC hardware. And in this coming gen's case, it should benefit AMD, with most of them being pre-optimized for AMD hardware.
Mathos - Thursday, February 21, 2013 - link
Had to add another one; what I'd give for an edit function on this.
4: These Jaguar CPU cores are all full-function x86-64 cores, which means all new games for these consoles will be programmed to run in native x86-64 mode. This should mean most PC ports will be able to run in native x86-64 mode as well, which should allow much better visual quality and give programmers access to more RAM.
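A trivial illustration of what native x86-64 buys over a 32-bit build (assuming the machine actually has the memory to back the allocation):

```cpp
#include <cstdio>
#include <vector>

int main() {
    // 8 bytes in an x86-64 build, 4 bytes in a 32-bit build.
    std::printf("pointer size: %zu bytes\n", sizeof(void*));

    // A single allocation bigger than the old ~4GB ceiling: routine for a 64-bit
    // process, impossible for a 32-bit one (and it obviously needs physical RAM
    // or swap to back it, so treat this as illustrative).
    const std::size_t fiveGiB = 5ull * 1024 * 1024 * 1024;
    std::vector<char> big(fiveGiB, 0);
    std::printf("allocated %zu bytes in one block\n", big.size());
    return 0;
}
```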
tipoo - Thursday, February 21, 2013 - link
GDDR5 actually has a higher latency than DDR3, much higher actually. I wonder how that will play out. With GPUs the latency doesn't matter as much, as bandwidth is the most important thing, but with CPUs, beyond a point bandwidth isn't as important, and a much higher latency than DDR3 could hurt.
silverblue - Friday, February 22, 2013 - link
Yes, but how much would the latency matter when we're talking about an effective 5.5GHz memory speed (1,375MHz GDDR5)?
powerarmour - Friday, February 22, 2013 - link
I wouldn't worry too much about latency affecting CPU performance; it certainly never harmed most PC desktop CPUs that much when we jumped from DDR1 > 2 > 3. Bandwidth is still king for graphics, and that's mainly what this machine is built for.
tipoo - Saturday, February 23, 2013 - link
I'm aware; maybe it won't be a big deal, that's why I said I was just wondering about it. But I think the latency difference from DDR1 to DDR2 to DDR3 is smaller than that from DDR3 to GDDR5. Maybe the clock rate cancels some or most of that out. Again, I'm unsure, just speculating.
shaolin95 - Friday, February 22, 2013 - link
So you want to go back to DDR and 2-2-2-5 memory?
tipoo - Saturday, February 23, 2013 - link
Because that's what I said.wingless - Thursday, February 21, 2013 - link
People here are talking about how AMD CPUs are slower than Intel CPUs. This is true in a WINDOWS environment. What we all must remember is that this system will have an OS and software specifically optimized for AMD x86 multi-threaded processors. AMD CPUs in Windows are victims of poor optimization. AMD doesn't crank out a widely-used compiler for their CPUs like Intel does. The PS4 simply will not have that problem. We're about to see the true power of an x86 system!
demacia89 - Thursday, February 21, 2013 - link
Methinks it is because Intel's are simply too expensive, and Intel is unlikely to negotiate on price. It would be pretty silly to spend half the budget on the CPU (i.e. $200 for an i7 in a $400-odd console) when the money could be better spent elsewhere.
Q2013 - Friday, February 22, 2013 - link
Seems to me that hardware-wise the new Xbox and the PS4 will be very similar, and they will have similar performance. The main thing different between them will be their UI and online features. I don't really see the need to buy new systems yet, as I have too many unfinished games on both the 360 and PS3. Granted, the graphics will be nicer, but will the gameplay be that different? The social stuff is not important to gameplay to me. Eventually I will get both new systems once something comes out that I must have; Halo 5 and the new Killzone would interest me. Hard to justify the expense of new systems when the current ones still have life in them.
jamescox - Saturday, February 23, 2013 - link
Many people here seem to have unrealistic expectations for PS4. The supposed "high-end" has moved a lot from where it used to be. There is no way you are going to get a $1000, 300 Watt graphics card in console targeted at approximately half that price (or less?). How many people actually buy these ridiculously expensive, power hungry cards anyway? I wouldn't buy a video card for more than $300 (and even that seems high); the price/performance ratio is not there for these super high-end offerings.
Also, does anyone think that Intel or Nvidia was ever really an option? If they went with a separate CPU and GPU, this increases cost and power significantly (communication off-chip wastes power and reduces performance). Intel has plenty of CPUs which could have been an option, but they do not have a powerful enough GPU for a single-chip solution. Nvidia isn't really an option either. They have the GPU, but no good CPUs. Would you rather have an Nvidia GPU with some integrated ARM cores? AMD is really the only one who can easily provide both a powerful GPU and sufficient CPU power for this chip. AMD wins on price, performance, and power, since neither Nvidia nor Intel can offer this on a single chip.
Direct comparison with the amount of memory in a PC is not relevant either. Most of the memory in a current PC is wasted; the GPU memory is really only a high-bandwidth cache for stuff that is in system memory. The 8 GB of GDDR5 on the PS4 should be significantly more efficient compared to the PC architecture. Hopefully it is all accessible from the GPU and CPUs, and is all in a single, cache-coherent memory space; this seems like it could be figured out from dev kits...?
It would be nice if we could get a similar architecture in the PC space. With how small current CPU cores are, and with the memory controller on the CPU, it really doesn't make sense to have CPUs and GPUs separate any more. The actual integer cores are tiny these days. Most of the CPU is taken up with caches and FP units. If you have a GPU on the same die, then the CPU FP units are a lot less important. You still need to have FP units for mixed code, but for serious FP processing, just execute it on the GPU. Although, without using super fast on-board memory, the bandwidth to the GPU will be lacking. It sounds like Intel may be trying to solve this with more on-die memory; most people don't actually need more CPU cores anyway. I was expecting the PS4 to use a large on-die memory, but this probably isn't necessary since they went with such large and fast external memory.
For a PC though, I would want the ability use multiple GPU/CPU/Memory cards.
ET - Sunday, February 24, 2013 - link
"GPU memory is really only a high-bandwidth cache for stuff that is in system memory"That's not correct. There's no need to keep textures in system RAM once they're in graphics card RAM. Also you completely ignore virtual memory. When a game is ran on the PC, if other software is using RAM the game needs a lot of it will be paged out.
The console, on the other hand, will likely not have virtual memory and will keep at least 1GB if not 2GB of that RAM for its operating system and other software (streaming, social, ...).
And BTW, we do have a similar architecture on the PC side. AMD APUs have been available for a while, and AMD's plan has been to offer more and more integration with time. The main differences here are a skew towards more graphics power and less CPU power (which isn't necessarily good for a general-purpose PC chip) and a lot more memory bandwidth. It would be interesting to see if any of these make it to the PC space in some form.
sohcermind - Sunday, February 24, 2013 - link
"While I'm happier with Sony's (and MS') CPU selection this time around..."
Microsoft hasn't announced anything; for all you know, the CPU could be the 16-core CPU that was rumored a year ago.
DimaB - Monday, March 4, 2013 - link
It is a shame the bottleneck will be physical media. We may have to discuss in the future the long load screens and initial loads that will take place. I love the massive capacity Blu-ray affords, but it is the slowest kid on the block.
Then the mechanical hard drive is going to be an issue.
I hope that Sony allows the ability to swap out the hard drive (SATA 3) and has support for SSDs through the firmware, which would boost performance overall.
Hell, the option to load almost everything onto an SSD off of Blu-ray would be an incredible option...
The Xbox 360 allowed this, if I am not mistaken...
Overall it is appearing to be one heck of a system for pure gaming.
IAmRandom301982 - Tuesday, April 16, 2013 - link
A lot of odd comments are being made about the future of PC gaming and its current state. Someone said that most games need 4GB of RAM to play? Really? I can play multiple PC games like SimCity and Far Cry 3 without even touching the 4GB mark on my system, and that is with Windows 7 64-bit in the background (which alone only uses 1.3GB of DDR3).
I have yet to even get close to 100% RAM usage, no matter what I'm doing (except if I'm purposely doing it with a prime64 stress test).
8GB of GDDR5 will be matched by 2015 at a reasonable price?! That is a funny thing to consider. PCs don't even use GDDR5 for anything but GPUs. There isn't a CPU even in the works right now that will be using GDDR5, not that I've read or heard about anyway. As for a PC GPU using 8GB of GDDR5 by 2015 at a mass-market price? Hell no. The only ones I could even find are well into the $2,000 range. It seems at the moment PCs are generally going to be using 3GB of GDDR5 as a standard within the next few years.
There are plenty of PC exclusives out there, like the Total War series, Metro, and others, that do not worry about the console world. Not sure why people think developers will actually hold back on their titles simply for the sake of the console world. All they have to do is create the engine for the PC and then basically turn everything down to Normal/High settings for the consoles. Turn off AA, drop to DX9, etc. This type of thing is done on the PC all day long, and it takes all of 10 seconds to change your settings around to match your system's capabilities.