As a nerd, I am frequently irritated by minor technical inaccuracies, like hard drive labels and Ted Stevens. Recently a game called Bit Boy!! hit WiiWare, and it is GEEKING ME OUT because it perpetuates the myth that "bit" counts are related to console generations.
This is easy to believe: I myself, growing up, thought that each successive console generation had "double the bits." The NES and Master System had eight. SNES and Genesis, 16. Virtual Boy and 32X? 32 bits. And of course, the Nintendo Ultra 64. But the pattern has since disappeared, and the notion that bit widths and console generations are directly related is, in fact, total nonsense.
- Hold on a second! I thought a 32-bit operating system could only use about three GB of RAM, not four! That's because address ranges in the last gigabyte are typically reserved for hardware memory mapping: when your CPU uses these addresses, it isn't looking at main memory, but at other stuff like coprocessor registers or controller signal lines.
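The arithmetic behind that missing gigabyte is just powers of two. A quick sketch (the 1 GiB reservation is a hypothetical round number; the actual size of the memory-mapped region varies by chipset and hardware):

```python
# 32 address lines give 2**32 distinct byte addresses.
addressable = 2 ** 32
print(addressable)               # 4294967296 bytes
print(addressable / 2 ** 30)     # 4.0 GiB of total address space

# Suppose the chipset maps devices into the top 1 GiB
# (a made-up round number; real systems differ):
mmio_reserved = 1 * 2 ** 30
usable_ram = addressable - mmio_reserved
print(usable_ram / 2 ** 30)      # 3.0 GiB left for actual RAM
```

So the RAM behind those reserved addresses isn't broken or missing, it's just unreachable: the CPU has no addresses left to give it.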
- Graphics coprocessors have bus widths too, but they refer exclusively to vector calculations and to communication between the GPU and VRAM. Traditionally this was just your frame buffer, but as texture calculations, shaders, and GPGPU computing become more prevalent, this could start to be a relevant figure. You can currently buy an 896-bit graphics card.
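To see why that width could matter, here's a rough back-of-the-envelope peak-bandwidth calculation (the clock figure is a made-up GDDR3-era number for illustration, not any real card's spec):

```python
# Peak memory bandwidth: bytes moved per transfer (bus width / 8)
# times effective transfers per second (the memory's effective clock).
def peak_bandwidth_gb_s(bus_width_bits, effective_clock_hz):
    return bus_width_bits / 8 * effective_clock_hz / 1e9

# Hypothetical 2 GHz effective memory clock:
print(peak_bandwidth_gb_s(448, 2.0e9))   # 112.0 GB/s
print(peak_bandwidth_gb_s(896, 2.0e9))   # 224.0 GB/s -- double the bus, double the bandwidth
```

Note this is a bandwidth figure, not a "generation" figure: doubling the bus width doubles how fast the GPU can shovel pixels and textures, which is exactly why it's a meaningful spec for VRAM traffic and a meaningless one for ranking consoles.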
- I have never played Bit Boy!!