Why does $\lim\limits_{h \to 0} \vert f(x+h)-f(x)\vert = 0$ suffice to show uniform continuity?

It is not true: the statement $$(\forall x\in\mathbb R):\lim_{h\to0}\bigl\lvert f(x+h)-f(x)\bigr\rvert=0$$ is equivalent to continuity, not to uniform continuity. However, the statement $$\lim_{h\to0}\sup_{x\in\mathbb R}\bigl\lvert f(x+h)-f(x)\bigr\rvert=0$$ does imply uniform continuity.
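Here is a sketch of why the second statement suffices: given $\epsilon>0$, the limit provides a $\delta>0$ such that $\sup_{x\in\mathbb R}\lvert f(x+h)-f(x)\rvert<\epsilon$ whenever $0<\lvert h\rvert<\delta$. Then for any $x,y\in\mathbb R$ with $0<\lvert x-y\rvert<\delta$, setting $h=y-x$ gives $$\lvert f(y)-f(x)\rvert=\lvert f(x+h)-f(x)\rvert\le\sup_{t\in\mathbb R}\lvert f(t+h)-f(t)\rvert<\epsilon,$$ and this single $\delta$ works for every $x$, which is exactly uniform continuity.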


The argument you saw was probably the following:

$f$ is uniformly continuous if and only if $$\lim\limits_{ h \to 0}\vert f(x+h)-f(x)\vert=0 \mbox{ uniformly in } x \in \mathbb R$$

Here uniformly in $x$ means that for each $\epsilon >0$ one can choose the same $\delta>0$ for all $x$. If you write out explicitly what it means for $\delta$ not to depend on $x$, you will see that this is exactly the definition of uniform continuity.
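Written out, a sketch of the unpacking: the uniform limit says $$(\forall \epsilon>0)(\exists \delta>0)(\forall x\in\mathbb R)(\forall h):\ 0<\lvert h\rvert<\delta\implies\lvert f(x+h)-f(x)\rvert<\epsilon.$$ Substituting $y=x+h$ turns $0<\lvert h\rvert<\delta$ into $0<\lvert y-x\rvert<\delta$, and the statement becomes the usual $\epsilon$-$\delta$ definition of uniform continuity, with one $\delta$ serving every $x$.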


Take $f(x)=x^2,$ which obviously satisfies the pointwise condition in the question. Let $\epsilon=1.$ Then for any $\delta>0$ and any $h$ with $0<|h|<\delta,$ $|f(x+h)-f(x)|=|(x+h)+x|\,|(x+h)-x|=|2x+h|\,|h|$. With $\delta$ fixed, you can certainly choose $x$ so that $|f(x+h)-f(x)|>\epsilon,$ as the computation below shows.
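For a concrete witness (one possible choice among many, with $\epsilon=1$ as above): given $\delta>0$, take $h=\delta/2$ and $x=1/\delta$. Then $$\lvert f(x+h)-f(x)\rvert=\lvert 2x+h\rvert\,\lvert h\rvert=\left(\frac{2}{\delta}+\frac{\delta}{2}\right)\frac{\delta}{2}=1+\frac{\delta^{2}}{4}>1=\epsilon,$$ so no single $\delta$ can work for all $x$, and $x\mapsto x^2$ is continuous but not uniformly continuous.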