Is $0$ an Infinitesimal?

$0$ is infinitesimal.

Natural language is a bad reference for mathematical definitions; it's 'optimized' for quickly conveying meaning in 'natural' settings, not for expressing things precisely. There are all sorts of conventions: for example, if you ever hear someone talk about a "small number", you're supposed to assume there's a good reason for using that phrase instead of "zero", and thus assume the number is, in fact, nonzero, despite the fact that zero is a small number.

For a nonnumeric example of this phenomenon, if I told you I lived near Paris, you would infer that I do not live in Paris.

With that in mind, I am not surprised to find that the English meaning of infinitesimal excludes zero.

However, that makes for a bad mathematical definition. The typical mathematical usage of infinitesimal is in a sense where 0 would be included; e.g. in nonstandard analysis, if $f$ is a standard, continuous function with $f(0) = 0$, then we would like to say "$f(x)$ is infinitesimal whenever $x$ is infinitesimal". Being able to say that requires that we consider $0$ to be infinitesimal; if we did not, then we would have to say something more awkward, like "$f(x)$ is either infinitesimal or zero whenever $x$ is infinitesimal".
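As a concrete instance of the problem, take the constant function $f \equiv 0$: it is standard and continuous with $f(0) = 0$, yet $$ f(\varepsilon) = 0 \qquad \text{for every infinitesimal } \varepsilon, $$ so the clean phrasing above is only available if $0$ itself counts as infinitesimal.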


The definition you mention from your textbook doesn't really make sense when taken literally, at least when taken out of context like this.


The only context I can think of where infinitesimals make sense is a formalization of nonstandard analysis. In this context,

$\epsilon \in {}^*\mathbb{R}$ is called infinitesimal if $|\epsilon| < r$ for all positive real numbers $r$.
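Note that $\varepsilon = 0$ satisfies this condition trivially: $$ |0| = 0 < r \qquad \text{for every positive real number } r, $$ so under this definition $0$ is an infinitesimal.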

The set of infinitesimals $I$ is an ideal in the ring of finite hyperreal numbers. Modding out by the ideal $I$, one recovers the original field of real numbers $\mathbb{R}$. In fact, the important standard part function is the natural ring homomorphism from the finite hyperreals to the reals induced by taking this quotient.
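Spelling this out (writing $F$ for the ring of finite hyperreals, a notation not used above): $$ F/I \cong \mathbb{R}, \qquad \operatorname{st}(x) = \text{the unique real } r \text{ such that } x - r \in I. $$ In particular, for a real number $x$ we need $x - \operatorname{st}(x) = 0$ to lie in $I$, which already forces $0 \in I$.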

All ideals contain $0$, so in this context one always wants $0$ to be infinitesimal. Also, most definitions involving infinitesimals (e.g., $f(x) - f(a)$ is infinitesimal whenever $x - a$ is infinitesimal) require that $0$ is infinitesimal. In cases where we don't want to include $0$, we should explicitly ask for a nonzero infinitesimal.


That $\varepsilon$ is an infinitesimal means that the sum $$ |\varepsilon|+\cdots+|\varepsilon| $$ remains less than $1$ no matter how many terms are added, so long as there are finitely many.
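In symbols, the condition is $$ n\,|\varepsilon| < 1 \qquad \text{for every positive integer } n; $$ taking $\varepsilon = 0$ gives $n \cdot 0 = 0 < 1$ for every $n$.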

By that definition, $0$ is an infinitesimal. But the term is seldom used except when talking about non-zero infinitesimals.