I am a doctoral student in Cultural Anthropology, with a bachelor's in English & Creative Writing. I specialize in subcultures and cognition.
I love gaming, and I have followed the industry and its technology since I was a kid in the '80s. I have gamed primarily on PC since 2000, though I still follow console news and hardware as well. I was also a sales associate at Micro Center for a while, which was a great experience and got me into PC hardware.
I worked as a mapper and beta tester for the mod Action Half-Life. My maps, most of which have vanilla Half-Life Deathmatch versions, are available on my website.
While most of the system's specs leaked ahead of time, the big surprise at Sony's PS4 event was the inclusion of 8GB of *DDR5 RAM. The quantity was a surprise, but the real shock was that all of it would be DDR5 - the fast, expensive RAM used on PC video cards. Was this a sign that the system was particularly powerful? Was it a foolish waste of money on Sony's part? I wasn't sure what to make of the decision, so I did a little thinking and research.
Since the PS4 is based on PC hardware, that's the first place to compare. At this point a "typical" mid-range gaming PC probably has a $200 video card with 1 to 2GB of DDR5 onboard, along with 8GB of DDR3 system memory and a 4- to 8-core processor.
The PS4 uses an AMD "APU" - a CPU and GPU merged into one unit.
Why don't PCs use DDR5 as system RAM? There are two reasons. First, DDR3 actually has lower latency than DDR5, making it better suited to the CPU's quick, general-purpose memory accesses. Second, while DDR3's bandwidth is a fraction of DDR5's, it is still more than adequate to keep the CPU fed with data. If we replaced it with gaming DDR5, we might actually see reduced CPU performance due to DDR5's higher latency.
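To put rough numbers on that bandwidth gap, here's a back-of-the-envelope sketch in Python. Peak throughput is just bus width (in bytes) times the effective transfer rate; the specific bus widths and rates below are illustrative typical values for this era of hardware, not any one product's official spec:

```python
def peak_bandwidth_gbs(bus_width_bits, transfers_per_sec):
    """Peak memory throughput in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * transfers_per_sec / 1e9

# Dual-channel DDR3-1600: 128-bit combined bus, 1600 MT/s effective
ddr3 = peak_bandwidth_gbs(128, 1600e6)   # ~25.6 GB/s
# A mid-range GDDR5 video card: 256-bit bus, 5000 MT/s effective (assumed figures)
gddr5 = peak_bandwidth_gbs(256, 5000e6)  # ~160 GB/s
print(f"DDR3: {ddr3:.1f} GB/s, GDDR5: {gddr5:.1f} GB/s")
```

Even generously clocked DDR3 lands at a small fraction of what a video card's GDDR5 delivers, which is why the system RAM is "enough for the CPU" but nowhere near enough for a serious GPU.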
So if Sony was going for a regular PC setup, it would seem optimal to use a mix of DDR3 and DDR5. But the PS4 is based off of AMD's new "APU" style architecture, which integrates the CPU and GPU on a single die. While not as fast as having a separate video card, this has advantages. It's generally better than other "onboard" GPUs - where a GPU is built into the motherboard - because it gives you greater bandwidth between CPU/GPU while reducing power consumption and heat emission. These are concerns not only for notebooks and small PCs, but for consoles as well.
Sony's vague description of the PS4's hardware. "Supercharged?" Is that like Blast Processing?
APUs still share RAM between the CPU and GPU, however. Any "integrated" video card that borrows system RAM - like those in most laptops and cheap desktop PCs - is usually terrible for gaming. Why? RAM borrowed by the video card effectively reduces available system RAM, which may already be in short supply. It can also saturate the memory bus, leaving the CPU memory-starved. The biggest problem, though, is that DDR3 system RAM simply isn't designed for graphics work. It has plenty of bandwidth for the CPU and lower latency, but its throughput is meager compared to DDR5's. So even a good GPU will be held back if it has to share DDR3 system RAM.
Suddenly, then, it's clear why Sony used DDR5 - they had to if they wanted to avoid bottlenecking the GPU. Tom's Hardware studied how the graphics performance of AMD's Trinity APUs in PCs scales with memory bandwidth. Even with overclocked DDR3, the RAM appears to hold back the GPU. Sony wouldn't want the PS4 to suffer from that problem.
So what does this mean for our assessment of the PS4's capabilities?
First, we can't compare the PS4 directly to an AMD APU like Trinity. Those chips are bottlenecked by DDR3, but the PS4's DDR5 provides a robust 176GB/s of bandwidth, comparable to current mid-range PC video cards. The GPU is also clocked at 800MHz, which puts it within a stone's throw of regular desktop GPU speeds. So GPU performance should be closer to a discrete desktop video card than to integrated or APU graphics solutions. AnandTech suggests performance comparable to an HD7850 or HD7870 - excellent $200-$250 mid-range PC cards.
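That 176GB/s figure checks out with the same simple arithmetic, assuming the widely reported 256-bit memory bus and 5500MT/s effective GDDR5 transfer rate (both are press/leak figures, not numbers Sony stated on stage):

```python
# Sanity-checking the PS4's reported 176 GB/s memory bandwidth.
bus_bytes = 256 / 8          # reported 256-bit bus -> 32 bytes per transfer
transfer_rate = 5.5e9        # reported 5500 MT/s effective GDDR5 rate
bandwidth_gbs = bus_bytes * transfer_rate / 1e9
print(bandwidth_gbs)  # 176.0
```

Compare that to the ~25GB/s of a dual-channel DDR3 PC and the gulf between the PS4 and a DDR3-fed APU is obvious.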
While it's easy to make fun of Sony's move to PC-inspired hardware, the truth is that the PS4 is a unique design, even compared to the AMD PCs from which it is most closely derived.
Second, DDR5's latency might slow down the system's CPU performance. The architecture is unique, though, so it's hard to say yet whether this will be an issue. It's possible the CPU is designed to take advantage of the extra bandwidth, for example, or that Sony's DDR5 is lower-latency than usual. At this point the CPU seems to be the weakest link; it will probably take heavily multithreaded programming to take advantage of its 8 cores. Of course, developers of PS3 exclusives like Naughty Dog have been programming for the Cell and its parallel SPUs for a long time, which might give them an advantage here. And it's likely that the PS4's CPU, like other AMD CPUs, can turn some cores down or off and crank up others for games that use fewer threads.
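For a rough illustration of what "heavily multithreaded" means in practice, here's a toy Python sketch that splits one frame's worth of per-entity work across eight workers. The core count is the only PS4-specific number here; the "entity update" is entirely made up, and real engines would use native threads or job systems rather than Python:

```python
# Toy sketch of data-parallel game work: divide per-entity updates
# (physics, AI, animation) across a pool of workers, one per CPU core.
from concurrent.futures import ThreadPoolExecutor

NUM_CORES = 8  # the PS4's reported core count

def update_entity(entity_id):
    # Stand-in for real per-entity game logic.
    return entity_id * 2

entities = range(1000)
with ThreadPoolExecutor(max_workers=NUM_CORES) as pool:
    results = list(pool.map(update_entity, entities))
print(sum(results))
```

The hard part for developers isn't the fan-out shown here - it's finding work that actually splits cleanly across eight cores without the threads tripping over shared state, which is exactly the skill Cell-era studios have been forced to build.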
Overall, having a large, shared pool of fast RAM bodes well for the system's power and longevity. With 8GB of RAM, it's unlikely memory will become a limiting factor for 3-4 years. And when it does, the flexible setup will let developers tailor RAM allocation to their game's specific needs. This is a huge improvement over the PS3. While the PS3 has the same total amount of RAM as the 360, its memory is split between system and graphics. That limitation has been linked to the PS3's issues in Skyrim and other games. Sony obviously wanted to avoid being in that situation again.
The new Killzone demo was visually impressive to say the least.
How "powerful" will the PS4 be in the long run? Most of the demos at the (non)unveiling event were running on PCs which approximated the PS4's specs; it's uncertain whether we've even seen a game running on the actual PS4 hardware yet. What we did see was pretty impressive, though, especially the Killzone demo.
We also saw a PS4 variant of the Unreal Engine 4 tech demo. While less impressive than the PC demo, remember that the PC demo was shown running on a $500 GTX680. If Epic can get anywhere near the GTX680's performance on hardware they are still learning, that's a pretty good sign.
Unreal Engine 4 on PS4. Not as nice as on a $500 video card, but still pretty damn shiny.
This all leaves me wondering about the new Xbox, though. Its specs are similar to the PS4's (its silicon is also from AMD), but it is rumored to use slower DDR3. It's possible, of course, that the "nextBox" will have customized DDR3 of some sort, a faster buffer memory to compensate, or other changes in architecture. Hopefully we'll get those answers soon.
In the meantime, Sony fans certainly have something to feel good about when it comes to the PS4's hardware. This is probably going to be a more powerful system than expected, and one better positioned to last. Its hardware should be comparable to a mid-range gaming PC when it comes out, and as a dedicated gaming machine it will get a bit more mileage out of that hardware. Though I do wonder how much it will cost. It's hard to imagine a price below $500, and even at that price Sony would probably be taking a loss... but that's a question for another day, or another blog at any rate!
*Note: Video cards and the PS4 actually use "GDDR5", which I abbreviated to "DDR5" because I thought it would be less confusing. GDDR5 is specific to video cards and is designed for high bandwidth. When "regular" DDR5 is eventually developed for use as system RAM, it should have latency suited to processors. Sorry if this caused any confusion; I'll fix the error when I have time to change the images as well.