
History of Zero


Zero is one of the most interesting numbers. There are lots of words we use to describe zero: nada, zilch, nothing, zip, scratch. But zero is an invention used to solve a problem: how do you account for nothing when you count your sheep or wheat or whatever you need to track?

The Sumerians were the first culture to create a counting system, which the Babylonians inherited and adapted. However, it's unclear whether they had a way to account for the idea of nothing when they counted and made calculations.

Instead, the Sumerians, Mayans, and many other cultures treated nothing as a placeholder, not a concept. The 0 in the number 10 meant there was nothing in the first column, and the 0 in 202 meant there was nothing in the second column, the column that counts groups of ten. Today, by contrast, we think of zero as a number in its own right: the average of -1 and 1, for example.
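The placeholder role of zero can be sketched in a few lines of Python (a modern illustration, of course, not anything ancient counters had):

```python
# Decompose a base-10 number into its place values.
# A 0 digit is a placeholder: it marks a column that holds nothing.
def place_values(n: int) -> list[int]:
    digits = [int(d) for d in str(n)]
    # Pair each digit with its power of ten, least significant first,
    # then restore most-significant-first order.
    return [d * 10 ** i for i, d in enumerate(reversed(digits))][::-1]

print(place_values(202))  # [200, 0, 2] -- the 0 marks an empty tens column
print(place_values(10))   # [10, 0] -- the 0 marks an empty ones column
```

Without the placeholder, 202 and 22 would look identical, which is exactly the bookkeeping problem zero was invented to solve.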

The idea and use of zero as a number, not just a placeholder, is credited to the Indian astronomer and mathematician Brahmagupta in 628 AD. His symbol for zero was a dot under a number. The idea of zero appears a few centuries earlier, but Brahmagupta was the first to use zero the way we use it today: as a number useful for calculations.

By 773 AD, his idea of zero had reached Baghdad, where it became part of the Arabic number system we use. The Arabic number system migrated to Europe with the Moorish conquest of Spain. The Italian mathematician and businessman Fibonacci expanded the use of zero to carry out business calculations without an abacus, a position-based counting tool.

Why does zero matter as a number? We would not have computers without the binary system which treats zero as a number. The binary system is the foundation of our computers. It’s also important for algebra, calculus, and other math. In other words, the world we know would not exist without zero, nada, zilch, zip, scratch.

Learn More

The Origin of Zero

Who Invented Zero?