
Zero is sometimes called nought (naught) or nothing or nil or O (pronounced oh).

Zero is a very strange number. It is neither positive nor negative. If you add zero to, or subtract zero from, any number, that number stays the same. If you multiply any number by zero, you get zero. Any number raised to the power of zero is one, so **2^{0} = 1**. However, it gets even worse. You cannot divide a number by zero. You cannot take the zeroth root of a number. It's hard to say what zero to the power of zero is.
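These properties of zero can be checked directly in Python (a small sketch; the value of n is arbitrary):

```python
# Checking zero's arithmetic properties with an arbitrary number n.
n = 7

print(n + 0)   # adding zero leaves n unchanged: 7
print(n - 0)   # so does subtracting zero: 7
print(n * 0)   # multiplying by zero gives zero: 0
print(n ** 0)  # any number to the power of zero is one: 1

try:
    n / 0      # division by zero is undefined
except ZeroDivisionError as err:
    print("cannot divide by zero:", err)
```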

If you have a zero remainder in a division, then you have a whole number as an answer.
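Python's built-in `divmod` shows this directly (a quick sketch):

```python
# A zero remainder means the division comes out to a whole number.
quotient, remainder = divmod(12, 4)
print(quotient, remainder)   # 3 0 -- remainder zero, so 12/4 is whole

quotient, remainder = divmod(13, 4)
print(quotient, remainder)   # 3 1 -- non-zero remainder, so 13/4 is not whole
```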

In Boolean logic, zero represents False.
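Most programming languages follow this convention; in Python, for example:

```python
# Zero is treated as False; non-zero numbers as True.
print(bool(0))      # False
print(bool(1))      # True
print(0 == False)   # True -- in Python, False literally equals the integer zero
```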

There is no year zero in the BC / AD year numbering system. When the numbering goes from BC (before Christ) to AD (anno domini) the year numbering goes: 3BC, 2BC, 1BC, 1AD, 2AD, 3AD.

BC is sometimes known as BCE (before the common era) and AD as CE (common era).

If you want to count how many things there are, for example, how many apples there are in a bowl, then you start counting from one, or you end up with the wrong answer. But computer people and mathematicians sometimes start counting from zero. Fibonacci numbers start from zero, so does the series for evaluating e.
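Both of those zero-based sequences are easy to sketch in Python (the index convention F(0) = 0, F(1) = 1 is the usual one for Fibonacci numbers):

```python
import math

# Fibonacci numbers, conventionally indexed from zero: F(0)=0, F(1)=1, ...
def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]

# The series for e also starts from the zeroth term: e = 1/0! + 1/1! + 1/2! + ...
e_approx = sum(1 / math.factorial(n) for n in range(15))
print(e_approx)  # very close to math.e
```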

Zero has two meanings. It can be a number in its own right. "I have two apples, and give you two. I now have zero apples." This idea seems normal to us, but it confused the ancient Greeks, and medieval people. How can you say that *nothing* is a number? If you have one or two apples, then fine - you can see them, and see that their number is one of their properties. You can owe someone an apple (because they gave you one and you ate it, so now you must find another apple to give back) so you have minus one apple. But no apples is nothing. You might just as well say that you have no aardvarks! However, the Indian mathematician Brahmagupta understood that zero was a real number, and gave rules for its use in 628 AD, which were mostly accurate (he ran into trouble with dividing by zero). Once Arabic numbers were accepted, the meaning of the number zero became understood. It would be impossible to have modern mathematics without it.

The other meaning of zero is as a place holder. Early number systems such as the Egyptians' had no zero. They didn't need one, as they had a unary system. Two was two of the symbols for one, three was three of those symbols, and so on. By ten, the number of symbols was getting out of hand, so they introduced a new symbol meaning ten, which also got repeated for twenty, thirty and so on. There were also symbols for hundred, thousand and so on. They tended to have a particular order, but they didn't need to. You could muddle the digits up and still work out how big the number was. The classical Greeks had a different system, with different letters for each digit from one to nine, and yet more letters for ten, twenty and so on. But they didn't need a zero either.

The Babylonians developed a positional system. This used the position of the digits to show how big they were. We have a positional system as well, so 123 is bigger than 97, even though the digits 9 and 7 are bigger than 1, 2 or 3. You put digits in columns, and there is a units column, a tens column, a hundreds column, and so on. This means that there may be no digit in one particular column, such as two hundreds, no tens and five units. We would write this as 205 and not think twice about it. The middle zero is a place holder - "this column is empty".

But the Babylonians took a long time to realise that they needed a place holder. They left a space, but spaces have a way of getting left out. Anyway, what about a number like 2001? Do you leave a bigger space? Eventually the Babylonians developed a place holder, but only within a number, not at the end. It was rather a clumsy digit. Later on, by 130 AD, Ptolemy the Greek astronomer was using the Babylonian number system, but with zero represented by a circle.
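The place-value idea is easy to demonstrate (a sketch: each column multiplies its digit by a power of ten, and the zero simply marks an empty column):

```python
# Building 205 from its digits, column by column.
digits = [2, 0, 5]
value = 0
for d in digits:
    value = value * 10 + d
print(value)  # 205

# Equivalently: two hundreds, no tens, five units.
print(2 * 100 + 0 * 10 + 5 * 1)  # 205
```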

It was the Indians again who developed a consistent use of the zero. It started as a dot. Fibonacci helped to introduce the Indian number system (now called Arabic numbers) into Europe. He said "The nine Indian figures are: 9 8 7 6 5 4 3 2 1. With these nine figures, and with the sign 0 ... any number may be written." It sounds as if he didn't realise that zero was a digit!

Interestingly enough, the Mayans developed a positional number system using zero. Obviously enough, this didn't influence any European number system.

I said above that you cannot divide by zero. In mathematics, you need to watch out for this. It can lead to odd results! Look carefully at the argument below.

Let: a = b

Multiply both sides by a: a^{2} = ab

Add a^{2} - 2ab: a^{2} + a^{2} - 2ab = ab + a^{2} - 2ab

Simplify: 2(a^{2} - ab) = a^{2} - ab

Divide by a^{2} - ab: 2 = 1

Oops!

What went wrong? It's the line which tells you to divide by *a^{2} - ab*. Since a = b, we have a^{2} - ab = a^{2} - a^{2} = 0, so that step divides by zero - which is exactly what you cannot do.
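You can see the hidden zero directly (a sketch; the value 3 is arbitrary, since any a = b gives the same result):

```python
# When a = b, the quantity a^2 - ab is zero, so the last step of the
# "proof" secretly divides by zero.
a = b = 3
print(a**2 - a*b)  # 0

try:
    (2 * (a**2 - a*b)) / (a**2 - a*b)
except ZeroDivisionError:
    print("the division step is where the argument breaks down")
```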

© Jo Edkins 2007