If $\sum_{n_0}^{\infty} a_n$ diverges, prove that $\sum_{n_0}^{\infty} \frac{a_n}{a_1+a_2+\ldots+a_n} = +\infty $

This problem has an interesting history from the early days of real analysis attached to it. I really like this story, and the proof it contains is also different from the ones in the other answers posted.


In 1827, L. Olivier published a proof in Crelle's Journal (Journal für die reine und angewandte Mathematik) of the following theorem:

Theorem (Olivier): A series of positive terms converges if and only if $\lim_{n\to\infty} na_n = 0$.

Not long after this paper had been published, Niels Henrik Abel responded with a short note pointing out that the theorem "does not seem to hold true": the series $\sum_{n=2}^\infty\frac{1}{n\log(n)}$ diverges even though $\lim_{n\to\infty}n\cdot \frac{1}{n\log(n)} = 0$, contradicting the theorem above.
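As a quick check of both halves of the counterexample (a standard computation, not part of Abel's note): since $x\mapsto \frac{1}{x\log(x)}$ is decreasing on $[2,\infty)$, the integral test gives

$$\sum_{n=2}^{N}\frac{1}{n\log(n)} \;\geq\; \int_{2}^{N+1}\frac{dx}{x\log(x)} \;=\; \log\log(N+1)-\log\log 2 \;\longrightarrow\; \infty,$$

while $n\cdot\frac{1}{n\log(n)} = \frac{1}{\log(n)}\to 0$.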

Abel then followed this up with another short note showing that there cannot exist any function $\phi(n)$ such that

$$\sum_{n=1}^\infty a_n ~~~\text{converges} \iff \lim_{n\to\infty}\phi(n)a_n = 0$$

The proof is remarkably simple and uses the lemma below (which is essentially the statement this question asks about). I will present Abel's original proof here. It's from memory, so it's not word for word, but the main idea behind the proof should be intact.


Lemma: If a series of positive terms $\sum_{k=1}^\infty a_k$ diverges, then so does the series $\sum_{k=2}^\infty \frac{a_k}{a_1+a_2+\ldots + a_{k-1}}$.

Proof: Let $s_n = \sum_{k=1}^{n}a_k$. Then, for $k\geq 2$,

$$\log\left(\frac{s_k}{s_{k-1}}\right) = \log\left(1+\frac{a_k}{s_{k-1}}\right) < \frac{a_k}{s_{k-1}}$$

since $\log(1+x) < x$ for $x > 0$. Summing over $k=2,3,\ldots,n$, the left-hand side telescopes and we obtain

$$\log\left(\frac{s_n}{s_{1}}\right) < \sum_{k=2}^n \frac{a_k}{s_{k-1}}$$
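Written out, the telescoping used on the left-hand side is simply

$$\sum_{k=2}^{n}\log\left(\frac{s_k}{s_{k-1}}\right) = \sum_{k=2}^{n}\bigl(\log s_k - \log s_{k-1}\bigr) = \log s_n - \log s_1 = \log\left(\frac{s_n}{s_1}\right).$$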

Now if $s_n = \sum_{k=1}^n a_k$ diverges then so does $\log(s_n)$, and it follows that $\sum_{k=2}^\infty \frac{a_k}{s_{k-1}}$ diverges as well. (The series in the question, with $s_k$ instead of $s_{k-1}$ in the denominator, then also diverges: if $a_k/s_k \not\to 0$ this is immediate, and otherwise $s_k/s_{k-1} \to 1$, so the two series diverge together by limit comparison.)
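To see the lemma at work in a familiar case, take $a_k = 1$ for all $k$. Then $s_{k-1} = k-1$ and the lemma's series is

$$\sum_{k=2}^{\infty}\frac{a_k}{s_{k-1}} = \sum_{k=2}^{\infty}\frac{1}{k-1},$$

the harmonic series; its divergence is exactly what the bound $\log(s_n/s_1) = \log n < \sum_{k=2}^{n}\frac{1}{k-1}$ predicts.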


Proof of Abel's theorem above: Assume there exists a function $\phi(n)$ with the property that $\sum a_n$ converges if and only if $\lim_{n\to\infty}\phi(n)a_n = 0$. Then $\sum_{k=1}^\infty \frac{1}{\phi(k)}$ diverges since $\lim_{n\to\infty} \phi(n)\frac{1}{\phi(n)} = 1 \neq 0$. By the lemma above this implies that $$\sum_{k=2}^\infty \frac{\frac{1}{\phi(k)}}{\frac{1}{\phi(1)}+\frac{1}{\phi(2)} + \ldots + \frac{1}{\phi(k-1)}}$$ also diverges, but $$\lim_{n\to\infty} \phi(n)\frac{\frac{1}{\phi(n)}}{\frac{1}{\phi(1)}+\ldots + \frac{1}{\phi(n-1)}} = \lim_{n\to\infty}\frac{1}{\frac{1}{\phi(1)}+\ldots + \frac{1}{\phi(n-1)}} = 0$$

which, by the assumed property of $\phi$, would mean that this last series converges, a contradiction. No such function $\phi(n)$ can therefore exist.
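It is perhaps worth noting how the construction connects back to the first counterexample: applying it to $\phi(n) = n$ gives $\frac{1}{\phi(k)} = \frac{1}{k}$, the denominator $\frac{1}{1}+\frac{1}{2}+\ldots+\frac{1}{k-1}$ grows like $\log k$, and so the constructed series behaves like

$$\sum_{k=2}^{\infty}\frac{1}{k\log k},$$

essentially the series Abel used against Olivier's theorem.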


Suppose the $a_n$ are non-negative (indexed here from $n=0$). To prove that the series diverges to $+\infty$, it is enough to prove that for every $N\in\Bbb N$ there exists $N'>N$ such that $$\sum_{n=N}^{N'}\frac{a_n}{a_0+\cdots+a_n}\geq \frac12$$

So fix $N>0$, and notice that for every $M>N$ $$\sum_{n=N}^{M}\frac{a_n}{a_0+\cdots+a_n}\geq\sum_{n=N}^{M}\frac{a_n}{a_0+\cdots+a_{M}}=\frac{a_N+\cdots+a_{M}}{a_0+\cdots+a_{M}}=1-\frac{C}{a_0+\cdots+a_{M}}$$ where $C=a_0+\cdots+a_{N-1}$ is a constant. The subtracted fraction is of the form "constant over something that tends to infinity" (recall that $\sum a_n$ diverges), so it tends to $0$, and for $M$ large enough the block sum is bounded below by $\frac12$. This shows that the series diverges to infinity.
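As a concrete illustration of the argument, take $a_n = 1$ for all $n$ (indexed from $0$): then $C = N$, $a_0+\cdots+a_M = M+1$, and the estimate reads

$$\sum_{n=N}^{M}\frac{1}{n+1} \;\geq\; 1-\frac{N}{M+1},$$

which is at least $\frac12$ as soon as $M\geq 2N-1$, so one may take $N' = 2N$.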


Intuition comes from the continuous analogue: if $f : [a, \infty) \to \Bbb{R}^+$ is a positive piecewise continuous function such that $\int_{a}^{\infty} f(x) \, dx = \infty$, then

$$ \int_{a}^{\infty} \frac{f(x)}{F(x)} \, dx = \log F(\infty) - \log F(a) = \infty $$

where $F$ is an antiderivative of $f$ (chosen so that it is also positive on $[a, \infty)$). Bringing this idea to the discrete setting is not hard. Indeed, when the $a_k$'s are positive you may apply the Stolz–Cesàro theorem to

$$ \lim_{n\to\infty} \frac{\sum_{k=1}^{n} (a_k / s_k)}{\log s_n}. \tag{1} $$

More precisely, either $\limsup_{n\to\infty} a_n / s_n > 0$, in which case the terms of $\sum a_n/s_n$ do not tend to $0$ and the series diverges, or $a_n/s_n \to 0$, in which case (1) converges to $1$ by Stolz–Cesàro and the partial sums $\sum_{k=1}^{n} (a_k/s_k)$ grow like $\log s_n \to \infty$. In either case the series diverges.
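Spelling out the Stolz step in the second case: writing $x_n = a_n/s_n$, the difference quotient in (1) is

$$\frac{a_n/s_n}{\log s_n - \log s_{n-1}} = \frac{x_n}{-\log(1-x_n)} \longrightarrow 1 \qquad (x_n \to 0),$$

since $-\log(1-x)\sim x$ as $x\to 0$, so Stolz–Cesàro indeed gives the value $1$ for (1).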