Do you remember getting dizzy on a moving school bus, arguing with some fat kid about which console had more "bits"? It was a staple debate of the '80s and early '90s, and the winner was whichever fanboy could also throw around a few words about "sprite counts" and "clock speed." Nostalgia perhaps prompted InsertCredit to ask a Sony PR rep about the PlayStation 3's "bits" to find out how that now-irrelevant unit of measurement stacks up against those legacy systems:
"The PS3 is 128 bit, but it is more 128 bit than the others. The number of bits isn't really a very good measure anymore. To be honest, it hasn't been a good measure since the PS1 days. That said...
"Most single pieces of data fit in 32 or 64 bits. The benefit of 128 bits is that you can operate on 4 pieces of 32-bit data at the same time, which is called SIMD (Single Instruction, Multiple Data). This is only useful for data that needs the same operation on all 4 pieces, which is common in games for things such as 3D graphical transformations, physics simulation, collision detection, etc. 128 bits is the 'sweet spot' of price and performance, so that is what everyone seems to have settled upon. For graphics, it is even trickier to explain. The biggest difference is that in the past, graphics chips were 'fixed-function.' Now, they are programmable. But people don't really talk about it in terms of bits; instead, they usually measure in terms of flops."
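The SIMD idea is easy to see in actual code. Below is a minimal sketch using x86 SSE intrinsics rather than the PS3's Cell (an assumption made purely for illustration, since SSE is the 128-bit SIMD most readers can compile at home): a single instruction adds four packed 32-bit floats at once.

/* simd_demo.c -- one 128-bit SSE instruction operating on 4 x 32-bit floats.
   Illustrative only; not PS3/Cell code. Compile: gcc -msse simd_demo.c */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics: 128-bit __m128 registers */

int main(void) {
    /* Two vectors of four 32-bit floats, e.g. x/y/z/w of a 3D point */
    __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);

    /* Single Instruction, Multiple Data: one ADDPS adds all four lanes */
    __m128 sum = _mm_add_ps(a, b);

    float out[4];
    _mm_storeu_ps(out, sum);   /* copy the four lanes back to memory */
    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    /* Prints: 11.0 22.0 33.0 44.0 -- four adds for the price of one */
    return 0;
}

A scalar CPU would need four separate add instructions here; the 128-bit register does it in one, which is exactly the 4-pieces-at-a-time win the rep describes.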
Eight Intellivisions = PlayStation 3
Jaguar + N64 = PlayStation 3
Intellivision + Genesis + Jaguar - Saturn + Neo Geo + NES + 32X = PlayStation 3
Dreamcast - Wii + NES x SNES = PlayStation 3
Dreamcast / PS2 x SuperGrafx + PC-FX + Neo Geo Pocket Color - CD-i + Nintendo 64 + 3DO = PlayStation 3
Trivia: Did you know that the Intellivision was actually 16-bit?