If you own a games console, you’ve probably seen people arguing about whose console is best. Perhaps they’ll say their console has better games, or the games are cheaper. Back when video game consoles were new, there was a different kind of debate: the bits war.
“Bits” sound a little weird, but they’re what defined how much thinking a processor can do, so they’re very important. Back in the day when games consoles were a very new concept, more bits meant more potential, and thus more ambitious games.
Have you ever heard of “8-bit” before? It’s a term used to refer to the really old consoles, like the Nintendo Entertainment System. 8-bit means that the console’s processor worked with numbers made up of 8 binary digits (bits) at a time, which put a hard limit on how big those numbers could get.
So, what was the maximum number that 8-bit processors could handle? To work it out, you take the number 2 (which represents the two states of binary; 1 and 0) and multiply it by itself once for every bit the processor has. For 8-bit, that means multiplying eight 2s together: 2 × 2 = 4, then 4 × 2 = 8, then 8 × 2 = 16, and so on, until we’ve used eight 2s in total. The answer tells us 8-bit’s limit.
Really smart, boffin-type people have a quick way of writing out this process. It’s called raising a number to a “power,” and it’s represented by a teeny tiny number next to the number you’re multiplying. If we were super smart, we could write the above calculation as 2⁸. That little 8 tells us to multiply eight 2s together. If you want to type this equation into a calculator, you need to type this: 2^8.
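If you’d rather let a computer do the multiplying, most programming languages have a power shortcut built in. Here’s a quick sketch in Python, showing both the long way and the short way (in Python, `**` means “to the power of”):

```python
# The long way: multiply eight 2s together, one at a time
result = 1
for _ in range(8):
    result = result * 2
print(result)  # 256

# The short way: ** means "to the power of"
print(2 ** 8)  # 256
```

Both lines print the same answer, because `2 ** 8` is just shorthand for that loop of doublings.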
If we type 2^8 into a calculator, we get 256. That’s how many different values an 8-bit number on the NES could hold. As such, games on the NES revolved around this number; for example, each pixel could only be one of 256 pre-selected colors, and characters could only carry 256 different amounts of a specific item.
In the very first The Legend of Zelda game, players could only hold up to 255 rupees because of this limitation. Why 255 and not 256? Because computers are weird and start counting from “0” instead of “1” like we do, so the 256 possible values run from 0 to 255.
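You can check that 0-to-255 cap yourself with a tiny Python sketch (the variable names here are just made up for illustration):

```python
BITS = 8                 # an 8-bit number, like on the NES
values = 2 ** BITS       # how many different values fit in 8 bits
biggest = values - 1     # counting starts at 0, so the top value is one less

print(values)   # 256 different values...
print(biggest)  # ...but the biggest one you can actually store is 255
```

That off-by-one between “how many values” (256) and “the biggest value” (255) is exactly why Link’s wallet topped out at 255 rupees.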
Of course, the console has to “remember” all these different numbers. There’s a number for each pixel color on-screen, one for the money the player has, one for the amount of life they have, and so on. As such, the console used memory to store all these numbers between 0-255, so it could keep track of everything.
But what if we wanted to go bigger? The best way to do this is to increase the number of bits; that way, the processor can handle bigger numbers, and you can do bigger and better calculations without pesky 256-value restrictions. This is how the “bit wars” started, with console makers trying to stuff in as many bits as they could to impress people.
The first step up was 16-bit, which the Sega Genesis and the Super Nintendo Entertainment System used. If you run the fancy-pants calculation above, you’ll see that 16-bit processors could handle 65,536 different values. That means they had 65,536 colors to choose from, and players could score up to 65,535 points and hold up to 65,535 of an item; much bigger than 255!
The next step up was 32-bit, which the Sega Saturn and Sony PlayStation both used. Their processors could handle 4,294,967,296 different values, which was a lot more than 16-bit!
Nintendo, however, had other plans; they released a console with a 64-bit processor called the Nintendo 64 (hence the name!). Its maximum was, are you ready? 18,446,744,073,709,551,616 different values. Wow!
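Every console generation above follows the same rule: one extra bit doubles the count. A little Python loop can print the whole bit-war scoreboard at once:

```python
# One doubling per bit: the limits behind each console generation
for bits in (8, 16, 32, 64):
    print(f"{bits}-bit: {2 ** bits:,} different values")
```

Running this prints 256 for 8-bit, 65,536 for 16-bit, 4,294,967,296 for 32-bit, and that monster 18,446,744,073,709,551,616 for 64-bit.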
So how many bits do today’s computers use? Well, as it turns out, we have no real use for numbers bigger than the really big one above, so the processors we use in everyday life come to a stop at 64-bit.
There was a need to upgrade past 32-bit, though, as a 32-bit processor couldn’t use more than 4GB of RAM (you can see above how 32-bit’s biggest count was around 4 billion, and 4GB of RAM has around 4 billion ‘spaces’ in its memory). 64-bit processors can handle up to 16 exabytes of RAM, which is 17,179,869,184GB! We clearly don’t need anywhere near that much RAM yet, so there’s no rush to move past 64-bit processors.
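The RAM limit comes from the same counting rule: every byte of memory needs its own address, and an address is just a number the processor has to hold. A rough Python sketch of that arithmetic (assuming 1GB = 1,073,741,824 bytes, the way memory sizes are usually counted):

```python
# A 32-bit processor can only form 2**32 different addresses,
# so it can only point at 2**32 different bytes of RAM.
addresses_32 = 2 ** 32
print(addresses_32 // (1024 ** 3))  # 4, as in 4GB

# A 64-bit processor can form 2**64 addresses:
addresses_64 = 2 ** 64
print(f"{addresses_64 // (1024 ** 3):,}")  # 17,179,869,184 GB
```

That second number is the 16 exabytes mentioned above, written out in gigabytes.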