Is $0$ a natural number?

Simple answer: sometimes yes, sometimes no; it's usually stated explicitly (or implied by notation). From the Wikipedia article:

In mathematics, there are two conventions for the set of natural numbers: it is either the set of positive integers $\{1, 2, 3, \dots\}$ according to the traditional definition; or the set of non-negative integers $\{0, 1, 2,\dots\}$ according to a definition first appearing in the nineteenth century.

That said, more often than not I've seen the natural numbers represent only the 'counting numbers' (i.e. excluding zero). This was the traditional historical definition, and makes more sense to me. Zero is in many ways the 'odd one out': indeed, historically it was not discovered (described?) until some time after the natural numbers.


There is no "official rule"; it depends on what you want to do with the natural numbers. Originally they started from $1$ because $0$ was not given the status of a number.

Nowadays if you see $\mathbb{N}^+$ you may be assured we are talking about the numbers from $1$ upwards; $\mathbb{N}$ usually denotes the numbers from $0$ upwards.

[EDIT: the original definitions of the Peano axioms, as given in Arithmetices principia: nova methodo, may be found at https://archive.org/details/arithmeticespri00peangoog ; have a look.]
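
To make the convention concrete, here is a minimal sketch (mine, not Peano's notation) of Peano-style naturals in Python, with a constant for the starting point and a successor operation. Peano's original text starts from $1$, while the modern set-theoretic treatment starts from $0$; nothing in the axioms themselves forces one choice over the other:

```python
# Peano-style naturals: a starting constant plus a successor operation.
# Whether the constant is read as "0" or "1" is purely a convention.
class Zero:
    def __repr__(self):
        return "0"

class Succ:
    def __init__(self, pred):
        self.pred = pred
    def __repr__(self):
        return f"S({self.pred!r})"

def add(m, n):
    # Addition by recursion on the second argument:
    # m + 0 = m,  m + S(n) = S(m + n)
    if isinstance(n, Zero):
        return m
    return Succ(add(m, n.pred))

one = Succ(Zero())
two = Succ(one)
print(add(two, one))  # S(S(S(0))), i.e. 3
```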


I think that modern definitions include zero as a natural number. But sometimes, especially in analysis courses, it can be more convenient to exclude it.

Pros of considering $0$ not to be a natural number:

  • generally speaking, $0$ is not natural at all: it is special in so many respects;

  • people naturally start counting from $1$;

  • the harmonic sequence $1/n$ is defined for every natural number $n$;

  • the $1$st number is $1$;

  • in taking limits, $0$ plays a role symmetric to that of $\infty$, and the latter is not a natural number.

Pros of considering $0$ a natural number:

  • the starting point of set theory is the empty set, which can be used to represent $0$ in the construction of the natural numbers; the number $n$ can then be identified with the set of the first $n$ natural numbers (see the sketch after this list);

  • computers start counting from $0$;

  • the remainders in integer division by $n$ are the $n$ different numbers from $0$ to $n-1$ (also illustrated after this list);

  • it is easy to exclude one designated element when we need the naturals without zero; by contrast, it is complicated to adjoin a new element if we don't already have it;

  • the integers, the reals and the complex numbers all include zero, which seems much more important than $1$ in those sets (they are symmetric with respect to $0$);

  • there is notation for a set without $0$ (for example $\mathbb{R}_0$ or $\mathbb{R}_*$), or for the positive numbers ($\mathbb{R}_+$), but no clear notation for a set with $0$ adjoined;

  • the degree of a polynomial can be zero, as can the order of a derivative.
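
As a small illustration of the set-theoretic point above (my example, not from the answer): in the von Neumann construction, $0$ is the empty set and the successor of $n$ is $n \cup \{n\}$, so the number $n$ literally is the set of the first $n$ natural numbers:

```python
# Von Neumann naturals: 0 is the empty set, n + 1 = n ∪ {n},
# so n is the set {0, 1, ..., n-1} and len(n) == n.
def von_neumann(n):
    """Return the von Neumann encoding of n as a frozenset."""
    current = frozenset()              # 0 is the empty set
    for _ in range(n):
        current = current | {current}  # successor: n ∪ {n}
    return current

three = von_neumann(3)
print(len(three))                  # 3: the set has exactly 3 elements
print(von_neumann(2) in three)     # True: 2 is one of the first 3 naturals
```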
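
And a quick check of the remainder and zero-based-counting points (again my sketch, not from the answer): the possible remainders modulo $n$ start at $0$, which is exactly why zero-based indexing makes wrap-around access a single modulo operation with no off-by-one correction:

```python
# The remainders of integer division by n are exactly 0, 1, ..., n-1.
n = 7
remainders = sorted({k % n for k in range(100)})
print(remainders)      # [0, 1, 2, 3, 4, 5, 6]: n values, starting at 0

# Zero-based indexing lets a remainder be used directly as an index.
days = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
print(days[10 % n])    # Thu: day 10 wraps to index 3 with no offset fix
```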

I have seen children measure things with a ruler by aligning the $1$ mark instead of the $0$ mark. It is difficult to explain to them why you have to start from $0$ when they are used to starting counting from $1$. The marks on the ruler identify the end of each centimeter, not the start, since the first centimeter goes from $0$ to $1$.

An example where counting from $1$ leads to somewhat misleading names is the naming of intervals between musical notes: the interval between C and F is called a fourth, because four notes are involved: C, D, E, F. However, the distance between C and F is actually three scale steps. This has the ugly consequence that a fifth above a fourth ($4+3$ steps) is an octave ($7$ steps), not a ninth! On the other hand, if you put your first finger on the C note of a piano, your fourth finger goes to the F note.
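
The off-by-one in interval names can be made explicit (hypothetical helper names, just for illustration): an interval name counts notes inclusively, so stacking intervals means adding step counts, not names:

```python
def steps(name):
    # An interval "name" counts notes inclusively (1-based);
    # its size in scale steps is one less (0-based).
    return name - 1

def stack(a, b):
    # Stacking two intervals adds their step counts,
    # then converts back to a 1-based name.
    return steps(a) + steps(b) + 1

print(stack(4, 5))  # 8: a fourth plus a fifth make an octave, not a ninth
```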

I would say that in natural language the correspondence between cardinal numbers and ordinal numbers is off by one, thus distinguishing two sets of natural numbers: one starting from $0$ and one starting from 1st. The 1st of January was day number $0$ of the new year. And "zeroth" has no meaning in natural language...