98

Is there a consensus in the mathematical community, or some accepted authority, to determine whether zero should be classified as a natural number?

It seems as though formerly $0$ was considered in the set of natural numbers, but now it seems more common to see definitions saying that the natural numbers are precisely the positive integers.

  • 0
    Perhaps this also might help ⟶ https://math.stackexchange.com/questions/2601711/is-this-a-valid-proof-that-there-are-infinitely-many-natural-numbers (2018-03-19)

8 Answers

52

Simple answer: sometimes yes, sometimes no, it's usually stated (or implied by notation). From the Wikipedia article:

In mathematics, there are two conventions for the set of natural numbers: it is either the set of positive integers $\{1, 2, 3, \dots\}$ according to the traditional definition; or the set of non-negative integers $\{0, 1, 2,\dots\}$ according to a definition first appearing in the nineteenth century.

That said, more often than not I've seen the natural numbers taken to be only the 'counting numbers' (i.e. excluding zero). This was the traditional historical definition, and it makes more sense to me. Zero is in many ways the 'odd one out' - indeed, historically it was not discovered (or described) until some time after the other natural numbers.

  • 0
    @CharlesStewart I was taught the exact opposite. (2018-01-04)
33

There is no "official rule"; it depends on what you want to do with the natural numbers. Originally they started from $1$, because $0$ was not given the status of a number.

Nowadays if you see $\mathbb{N}^+$ you may be assured we are talking about numbers from $1$ above; $\mathbb{N}$ is usually for numbers from $0$ above.

[EDIT: the original formulation of the Peano axioms, as given in *Arithmetices principia: nova methodo*, can be found at https://archive.org/details/arithmeticespri00peangoog .]

  • 7
    Clapham-Nicholson, *The Concise Oxford Dictionary of Mathematics* (4th edition): «natural number - One of the numbers 1, 2, 3, ... **Some authors also include 0**. The set of natural numbers is often denoted by N.» Eric Weisstein, *Concise Encyclopedia of Mathematics* (2nd ed): «N: The SET of NATURAL NUMBERS (the POSITIVE INTEGERS 1, 2, 3, ...; Sloane's A000027), denoted N; also called the WHOLE NUMBERS. Like whole numbers, **there is no general agreement on whether $0$ should be included in the list of natural numbers**.» (2015-01-08)
18

I think that modern definitions include zero as a natural number. But sometimes, especially in analysis courses, it can be more convenient to exclude it.

Pros of considering $0$ not to be a natural number:

  • generally speaking $0$ is not natural at all. It is special in so many respects;

  • people naturally start counting from $1$;

  • the harmonic sequence $1/n$ is defined for any natural number $n$;

  • the $1$st number is $1$;

  • in making limits, $0$ plays a role which is symmetric to $\infty$, and the latter is not a natural number.

Pros of considering $0$ a natural number:

  • the starting point for set theory is the empty set, which can be used to represent $0$ in the construction of the natural numbers; the number $n$ can then be identified with the set of the first $n$ natural numbers;

  • computers start counting from $0$;

  • the remainders in integer division by $n$ are the $n$ different numbers from $0$ to $n-1$;

  • it is easy to exclude a single known element when we need the naturals without zero; it is more complicated to adjoin a new element if we don't already have it;

  • the integer, real and complex numbers include zero, which seems much more important than $1$ in those sets (they are symmetric with respect to $0$);

  • there is notation for sets without $0$ (for example $\mathbb R_0$ or $\mathbb R_*$) and for positive numbers ($\mathbb R_+$), but no clear notation for a set together with $0$;

  • the degree of a polynomial can be zero, as can the order of a derivative.
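The set-theoretic construction in the first pro above can be made concrete. The following is an illustrative Python sketch (not part of the original answer) that encodes each natural number as a von Neumann ordinal, where $0$ is the empty set and $n+1 = n \cup \{n\}$:

```python
# Von Neumann construction: 0 is the empty set, and n+1 = n ∪ {n},
# so the number n is literally the set of the n natural numbers below it.
def von_neumann(n):
    """Return the von Neumann encoding of n as nested frozensets."""
    current = frozenset()              # 0 = {} (the empty set)
    for _ in range(n):
        current = current | {current}  # successor: n+1 = n ∪ {n}
    return current

three = von_neumann(3)
print(len(three))             # the encoding of n has exactly n elements → 3
print(frozenset() in three)   # 0 is one of the first three naturals → True
```

Note how the construction only works if $0$ is admitted as a natural number: the empty set has to be the starting point.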

I have seen children measure things with a ruler by aligning the $1$ mark instead of the $0$ mark. It is difficult to explain to them why you have to start from $0$ when they are used to starting counting from $1$. The marks on the ruler identify the ends of the centimeters, not their starts, since the first centimeter goes from $0$ to $1$.

An example where counting from $1$ leads to somewhat misleading names is the naming of intervals between musical notes: the interval between C and F is called a fourth, because there are four notes: C, D, E, F. However, the distance between C and F is actually three scale steps. This has the ugly consequence that a fifth above a fourth (4+3) is an octave (7), not a ninth! On the other hand, if you put your first finger on the C note of a piano, your fourth finger lands on the F note.

I would say that in natural language the correspondence between cardinal numbers and ordinal numbers is off by one, thus distinguishing two sets of natural numbers, one starting from $0$ and one starting from 1st. The 1st of January is day number $0$ of the new year. And "zeroth" has no meaning in natural language...

10

There are the two definitions, as you say. However, defining the natural numbers as the strictly positive integers is actually the older convention; including $0$ among the natural numbers is a definition that first appeared in the nineteenth century.

The Peano Axioms for natural numbers take $0$ to be one though, so if you are working with these axioms (and a lot of natural number theory does) then you take $0$ to be a natural number.
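As an illustration (a hypothetical sketch, not drawn from the axioms' original text), addition in the Peano style is defined by recursion from $0$ and the successor function, mirrored here on Python integers:

```python
# Peano-style addition, defined by recursion on the second argument:
#   m + 0    = m
#   m + S(n) = S(m + n)
# where S is the successor function and 0 is the base of the recursion.
def succ(n):
    return n + 1  # stand-in successor on Python ints

def add(m, n):
    return m if n == 0 else succ(add(m, n - 1))

print(add(2, 3))  # → 5
print(add(7, 0))  # 0 is the additive identity → 7
```

The base case of the recursion is exactly why these formulations take $0$ as the first natural number: without it, addition has no natural stopping point.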

  • 1
    @celtschk: Interesting point, thank you. It makes sense also in the way it was probably intended, namely "$0$ is one of the natural numbers," i.e., just as $15$ is one. In any case it is not a real problem, just potentially confusing. (2013-08-01)
8

These lecture notes from a combinatorics course given for many years by N.G. de Bruijn suggest a helpful alternative:

Due to the confusion caused by N. Bourbaki about the natural numbers, we feel obliged to define: $\begin{align}\Bbb N_0 & = \{0,1,2,\ldots\}\quad \text{ and } \\ \Bbb N_1 & = \{1,2,3,\ldots\}. \end{align}$

(Page 4)

  • 0
    So would $\mathbb{N}_2 = \{2, 3, 4, 5, 6,\ldots\} = \mathbb{N}_{>1}$? The right-most term is how I was taught to express a set of that special case, but it seems like there are *many* different kinds of notation, purely because there are only $26$ letters in the alphabet. (2018-01-04)
7

According to ISO 80000-2:2009: Quantities and Units - Part 2: Mathematical signs and symbols to be used in the natural sciences and technology, page 6:

$\mathbb{N}=\{0,1,2,3,\ldots\}$ $\mathbb{N^*}=\{1,2,3,\ldots\}$


  • 0
    I think [Edsger W. Dijkstra](https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EWD831.html) has a useful comment on this. (2018-10-03)
4

As others have said, there's no consensus on this. However, if you need unambiguous notation, you can use: $\mathbb{Z}_{\geq 0}, \mathbb{Z}_{\geq 1}.$ This is a good option if you're writing something short and sweet, e.g. for posts to this website. In something longer, like an article or PhD thesis, you may wish to spend a sentence or two establishing a convention that is more visually appealing. Personally, I use: $\mathbb{N} = \{0,1,2,3,\cdots\}, \qquad \mathbb{W} = \{1,2,3,\cdots\}.$ The motivation behind $\mathbb{W}$ is that its elements can be referred to as "whole" numbers (although, as others have said, the term "whole number" is highly ambiguous unless and until you tell the reader precisely what you mean).

Anyway, what I really wanted to say is that even if our only interest is number theory, nonetheless we still need both of these number systems, because prime factorization sets up an isomorphism between the monoid $\mathbb{W}$ (with multiplication) and the monoid of all finitely-supported functions $\mathrm{Prime} \rightarrow \mathbb{N}$ (with pointwise addition.)
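That isomorphism can be checked numerically. Here is a minimal sketch (the trial-division `factor` helper is my own, not from the answer) in which `collections.Counter` plays the role of a finitely-supported function $\mathrm{Prime} \rightarrow \mathbb{N}$:

```python
from collections import Counter

def factor(n):
    """Exponent map of n >= 1: a finitely-supported function Prime -> N,
    computed by trial division."""
    exps, p = Counter(), 2
    while p * p <= n:
        while n % p == 0:
            exps[p] += 1
            n //= p
        p += 1
    if n > 1:
        exps[n] += 1   # leftover n is itself prime
    return exps

# Multiplying positive integers corresponds to pointwise addition of
# their exponent maps (Counter's '+' adds counts pointwise):
a, b = 12, 90          # 12 = 2^2·3, 90 = 2·3^2·5
assert factor(a * b) == factor(a) + factor(b)
print(dict(factor(a * b)))   # → {2: 3, 3: 3, 5: 1}
```

The codomain of the exponent map must contain $0$ (a prime may simply not occur in a factorization), while the domain of the isomorphism must exclude it, which is exactly the point about needing both number systems.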

  • 0
    Go here $\longrightarrow$ https://www.barcodesinc.com/articles/mathematical-terms-dictionary.htm and look at the $W$ section with *Whole Numbers*. It says that $\mathbb{W} = \{0, 1, 2, 3,\ldots\}$. (2018-02-02)
3

I remember all of my courses at University using only positive integers (not including $0$) for the Natural Numbers. It's possible that they had come to an agreement amongst the Maths Faculty, but during at least two courses we generated the set of natural numbers in ways that wouldn't make sense if $0$ was included.

One involved the cardinality of Sets of Sets, the other defined the natural numbers in terms of the number $1$ and addition only ($0$ and Negative Integers come into the picture later when you define an inverse to addition).
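As a rough sketch of that second construction (the `closure_under_addition` helper is hypothetical, not from the course), closing $\{1\}$ under addition yields all the positive integers, while $\{0\}$ closes to just $\{0\}$, which is why $0$ cannot arise in that approach:

```python
def closure_under_addition(seed, limit):
    """Smallest set containing `seed` and closed under addition,
    truncated at `limit` for illustration."""
    s = set(seed)
    changed = True
    while changed:
        changed = False
        for a in list(s):
            for b in list(s):
                if a + b <= limit and a + b not in s:
                    s.add(a + b)
                    changed = True
    return s

print(sorted(closure_under_addition({1}, 6)))  # → [1, 2, 3, 4, 5, 6]
print(sorted(closure_under_addition({0}, 6)))  # → [0] (0 generates nothing new)
```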

As a result when teaching the difference between Integers and Natural Numbers I always define $0$ as an integer that isn't a Natural Number.

  • 0
    @badp Ok, you have $0$ and addition. $0+0=0$. Where's the rest of $\mathbb N$? (2016-10-17)