$\frac{1}{n}\sum_{k=1}^n(X_k-\mathbb E[X_k])$ converges a.s. to $0$

A proof, along the lines of what you started, can be derived from Kolmogorov's inequality.

For $n < m$, Kolmogorov's inequality applied to the martingale $(Y_k - Y_n)_{n \leq k \leq m}$ gives, for all $\epsilon > 0$, $$ P\Big(\max_{n \leq k \leq m} |Y_k - Y_n| > \epsilon\Big) \leq \frac{1}{\epsilon^2} \sum_{k = n+1}^m \frac{\operatorname{Var}(X_k)}{k^2}. $$
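The right-hand side is simply the second moment of the final term of this martingale: writing, as I assume was done before, $Y_n = \sum_{k=1}^n \frac{X_k - \mathbb E[X_k]}{k}$, independence of the $X_k$ gives $$ \mathbb E\big[(Y_m - Y_n)^2\big] = \operatorname{Var}(Y_m - Y_n) = \sum_{k=n+1}^m \frac{\operatorname{Var}(X_k)}{k^2}. $$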

So, letting $m \rightarrow \infty$ in this bound and using $\sup_{n,m \geq l} |Y_m - Y_n| \leq 2 \sup_{k \geq l} |Y_k - Y_l|$, we get, for all $\epsilon > 0$,

$$ P\Big(\inf_l \sup_{n,m \geq l} |Y_m - Y_n| > \epsilon\Big) \leq \lim_{l \rightarrow \infty} P\Big(\sup_{n,m \geq l} |Y_m - Y_n| > \epsilon\Big) \leq \lim_{l \rightarrow \infty} \frac{4}{\epsilon^2} \sum_{k > l} \frac{\operatorname{Var}(X_k)}{k^2} = 0, $$

the last limit vanishing because $\sum_k \operatorname{Var}(X_k)/k^2 < \infty$ by hypothesis.

Now take $\epsilon_p \rightarrow 0$. Define $$ \Omega_p = \Big\{ \inf_l \sup_{n,m \geq l} |Y_m - Y_n| \leq \epsilon_p \Big\}, $$ and $\Omega' = \bigcap_p \Omega_p$. Then $P(\Omega') = 1$ and, for all $\omega \in \Omega'$, $(Y_n(\omega))$ is a Cauchy sequence, hence converges. Kronecker's Lemma then finishes the proof, as before.
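To spell out that last step: Kronecker's Lemma states that if $0 < b_n \uparrow \infty$ and $\sum_k a_k / b_k$ converges, then $\frac{1}{b_n} \sum_{k=1}^n a_k \rightarrow 0$. Applied pathwise on $\Omega'$ with $a_k = X_k(\omega) - \mathbb E[X_k]$ and $b_k = k$ (under the same assumed definition of $Y_n$ as above), the convergence of $Y_n(\omega)$ gives $$ \frac{1}{n} \sum_{k=1}^n \big( X_k(\omega) - \mathbb E[X_k] \big) \longrightarrow 0, $$ which is the claimed almost sure convergence.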

Comment

This does not really "avoid the martingale machinery", though: the argument replaces the martingale convergence theorem with Kolmogorov's inequality, which is itself a maximal inequality for martingales.

It does not seem easy to get away from using the martingale property in one way or another.

The alternative argument proposed by the previous answer uses the fact that a series of independent summands that converges in probability must also converge almost surely. The standard proof of this fact is a stopping-time argument, which also relies on independence, a property stronger than the martingale property.
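That answer does not say which stopping-time argument it has in mind; one standard version (recalled here only as an illustration) rests on Ottaviani's maximal inequality: for independent summands with partial sums $S_k$ and any $\alpha, \beta > 0$, $$ P\Big(\max_{1 \leq k \leq n} |S_k| > \alpha + \beta\Big) \, \min_{1 \leq k \leq n} P\big(|S_n - S_k| \leq \beta\big) \leq P\big(|S_n| > \alpha\big). $$ Its proof decomposes over the first (stopping) time $k$ at which $|S_k|$ exceeds $\alpha + \beta$ and uses the independence of $S_n - S_k$ from that event; convergence in probability then controls both factors along the tail of the series, and a Cauchy-type criterion yields almost sure convergence.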


It is well known that a series of independent random variables converges almost surely iff it converges in distribution iff it converges in probability. If you are willing to assume this result, the rest is very easy. To show that $Y_n$ converges in probability, it is enough to prove convergence in $L^{2}$. But convergence in $L^{2}$ follows from the following basic fact from functional analysis: if $(x_n)$ is a sequence in a Banach space such that $\sum \|x_n\| < \infty$, then $\sum x_n$ converges in norm.
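For reference, that fact is a two-line completeness argument: if $\sum_n \|x_n\| < \infty$, then for $m > n$, $$ \Big\| \sum_{k=n+1}^m x_k \Big\| \leq \sum_{k=n+1}^m \|x_k\| \longrightarrow 0 \quad \text{as } n \to \infty, $$ so the partial sums form a Cauchy sequence and converge by completeness of the space.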