The History and Importance of Zero

Zip. Zilch. Naught. The number zero (represented by the digit 0) serves double duty in Western mathematics as a placeholder and as a representation of the absence of value. It can trace its ancestry back more than two thousand years, but it was popularized in Europe by the mathematician Fibonacci in the 13th century. Without zero, the world as we know it would probably still exist, but it would be far more cumbersome.

The history of zero dates back long before Fibonacci. The Babylonians were the first known to use a zero numeral, in records dating back to the 3rd century BC, where it appeared as a pair of slanted wedge marks serving as a placeholder. The Mayans independently developed a form of zero in their system of numerals: their Long Count calendars, which charted their history, used a glyph meaning the same thing as zero, and the earliest known example of the glyph dates to 36 BC, from a site in southern Mexico. The Chinese used zero in computations as far back as the 1st century AD, though a written digit for it would not appear there until about 600 years later.

The Indians were the first to use the decimal place-value system on which our modern numbers are based. Though references to zero date back to 498 AD, the first verifiable written zero appears on a temple inscription from 876 AD. The Islamic scholar Al-Khwarizmi brought the numeral system to the Abbasid Caliphate, and in 1202 Fibonacci wrote Liber Abaci, a treatise arguing to European readers that the Hindu-Arabic numeral system was superior to Roman numerals. It would not be until the 15th century that the printing press spread Hindu-Arabic numerals throughout Europe, and it would take another century for them to enter common use.

So what do we do with this number? The most common practical application of zero is in arithmetic, especially with large numbers. Let's say you have two thousand seventy dollars in your bank account, and you cash a check worth one thousand eighty dollars. How much money do you have? If you're using Roman numerals, you have to figure out what MMLXX plus MLXXX is. Roman numerals are easy to count with, but much tougher to calculate with. With Hindu-Arabic numerals, you just add 2070 and 1080. See the difference? With a zero as a placeholder, you can add, subtract, multiply and divide large numbers by breaking them down one digit at a time.
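
Here is that sum worked digit by digit, the way place value makes routine:

    2070
  + 1080
  ------
    3150

The zeros hold the empty columns open: 0 + 0 in the ones place, 7 + 8 = 15 in the tens (write 5, carry 1), 0 + 0 + 1 = 1 in the hundreds, and 2 + 1 = 3 in the thousands, giving 3150. The Roman version, MMLXX + MLXXX = MMMCL, offers no such column-by-column shortcut.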

But when zero is used as a value, it opens up a completely different world of applications. Here's an example of a quartic equation, which relies on one side equaling zero:

3X^4 + 6X^3 - 123X^2 - 126X + 1080 = 0
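
This particular equation happens to factor neatly, so its solutions are the four values of X that make the left side equal zero:

3X^4 + 6X^3 - 123X^2 - 126X + 1080 = 3(X - 5)(X - 3)(X + 4)(X + 6)

Setting each factor to zero gives X = 5, X = 3, X = -4, and X = -6; multiplying the factors back out reproduces every coefficient above.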

For those of us who aren't mathematicians, why would you want to solve quartic equations in the first place? For a simple answer, look at any Pixar film. Every object in that film is drawn by a computer, and to generate the objects and any special effects, the computer must process countless equations. A donut (or torus, in mathematical terms) is just one of the shapes that can be generated by a quartic equation. Without a zero, drawing complex three-dimensional shapes becomes vastly harder. And this is just one of the many equations in which zero plays an essential part.
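
For the curious, one standard way to write a torus as a quartic surface is the implicit equation below, where R is the distance from the center of the tube to the center of the donut and r is the radius of the tube:

(x^2 + y^2 + z^2 + R^2 - r^2)^2 = 4R^2(x^2 + y^2)

A point (x, y, z) lies on the surface exactly when the difference between the two sides equals zero, which is precisely the test a renderer can apply.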

Computers might still be possible without zero, but they would be less efficient. Large numbers would still exist without zero, but they would be more cumbersome to work with. Higher mathematics would still exist without zero, but the proofs would be far less elegant and require far more work. With its myriad uses, zero is worth far more than the sum of its parts.