Completeness of $\ell^2$ done right

Right: once the uniform convergence of $(f_n)$ is shown, you can interchange the limits by Theorem 7.11 (using $\infty$ as an accumulation point of $\mathbf{N}$).

To show the uniform convergence, for $k \in \mathbf{N}$, let $P_k \colon \ell^2 \to \ell^2$ be the projection setting all components with index $> k$ to $0$, i.e.

$$(P_k(x))_j = \begin{cases} x_j, & j \leqslant k \\ 0, & j > k.\end{cases}$$

Then we note that $f_n(k) = \lVert P_k(x_n)\rVert^2$, and consequently
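For concreteness, the projection $P_k$ and the identity $f_n(k) = \lVert P_k(x_n)\rVert^2$ can be sketched numerically on finite truncations. A toy check in Python (the function names and the sample vector are mine, not part of the argument):

```python
import math

def P(k, x):
    """Truncation projection: keep components x_1, ..., x_k, zero the rest.

    x is a finite list standing in for an l^2 sequence; the 1-based
    indexing of the math becomes 0-based here."""
    return [xj if j < k else 0.0 for j, xj in enumerate(x)]

def norm(x):
    """Euclidean (l^2) norm of a finite list."""
    return math.sqrt(sum(xj * xj for xj in x))

x = [1.0, 0.5, 0.25, 0.125]
k = 2
# f(k), the sum of the first k squared components, equals ||P_k(x)||^2
f_k = sum(xj * xj for xj in x[:k])
print(abs(f_k - norm(P(k, x)) ** 2) < 1e-12)  # True
```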

$$\lvert f_n(k) - f_m(k)\rvert = \bigl\lvert \lVert P_k(x_n)\rVert^2 - \lVert P_k(x_m)\rVert^2\bigr\rvert = \bigl\lvert\lVert P_k(x_n)\rVert - \lVert P_k(x_m)\rVert\bigr\rvert\cdot\bigl(\lVert P_k(x_n)\rVert + \lVert P_k(x_m)\rVert\bigr).$$

Now $P_k$ never increases the norm, i.e. $\lVert P_k(x)\rVert \leqslant \lVert x\rVert$ for all $x\in \ell^2$. Moreover, since $(x_n)$ is a Cauchy sequence in $\ell^2$, the real sequence $\bigl(\lVert x_n\rVert\bigr)$ is Cauchy (by the reverse triangle inequality) and hence bounded, say $\lVert x_n\rVert \leqslant K$ for all $n$. Thus from the above we obtain

$$\lvert f_n(k) - f_m(k)\rvert \leqslant 2K\bigl\lvert \lVert P_k(x_n)\rVert - \lVert P_k(x_m)\rVert\bigr\rvert.$$

By the reverse triangle inequality it follows that

$$\lvert f_n(k) - f_m(k)\rvert \leqslant 2K\lVert P_k(x_n) - P_k(x_m)\rVert.$$

But $P_k$ is linear, and it never increases norm, so

$$\lvert f_n(k) - f_m(k)\rvert \leqslant 2K\lVert P_k(x_n - x_m)\rVert \leqslant 2K\lVert x_n - x_m\rVert.$$

This bound is independent of $k$, and hence

$$\lVert f_n - f_m\rVert_{\infty} := \sup \{ \lvert f_n(k) - f_m(k)\rvert : k \in \mathbf{N}\} \leqslant 2K\lVert x_n - x_m\rVert.$$
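As a sanity check, the final bound can be tested on arbitrary finite vectors. A sketch in Python (the sample vectors are made up for illustration):

```python
import math

def norm(x):
    """Euclidean (l^2) norm of a finite list."""
    return math.sqrt(sum(t * t for t in x))

def f(x, k):
    """f(k) = ||P_k(x)||^2, the sum of the first k squared components."""
    return sum(t * t for t in x[:k])

x_n = [0.9, -0.4, 0.2, 0.05]
x_m = [1.0, -0.5, 0.1, 0.0]
K = max(norm(x_n), norm(x_m))  # any K with ||x_n||, ||x_m|| <= K works

# sup over k of |f_n(k) - f_m(k)| versus 2 K ||x_n - x_m||
lhs = max(abs(f(x_n, k) - f(x_m, k)) for k in range(1, len(x_n) + 1))
rhs = 2 * K * norm([a - b for a, b in zip(x_n, x_m)])
print(lhs <= rhs)  # True
```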

And since $(x_n)$ is a Cauchy sequence in $\ell^2$, it follows that $(f_n)$ is a Cauchy sequence with respect to the uniform norm; as the space of bounded functions on $\mathbf{N}$ is complete under this norm, $(f_n)$ converges uniformly, and the uniform limit must agree with the pointwise limit $f$.


Your hypotheses that $\|\mathbf x_n\|$ is bounded and that $\mathbf x_n$ converges to $\mathbf y$ pointwise are not enough for uniform convergence; in fact they give only weak convergence. A simple counterexample is the standard basis sequence $\mathbf x_n$ with $x_{n,j}=0$ for $j\ne n$ and $x_{n,n}=1$: it converges pointwise to $\mathbf 0$, yet $\|\mathbf x_n\|=1$ for all $n$.
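The counterexample is easy to see numerically: each basis vector has unit norm, yet every fixed component is eventually $0$. A toy illustration in Python (names are mine):

```python
import math

def e(n, length):
    """Standard basis vector e_n as a finite list (n is 1-based)."""
    return [1.0 if j == n - 1 else 0.0 for j in range(length)]

def norm(x):
    """Euclidean (l^2) norm of a finite list."""
    return math.sqrt(sum(t * t for t in x))

N = 10
# the norms stay at 1, so they cannot converge to ||0|| = 0 ...
assert all(norm(e(n, N)) == 1.0 for n in range(1, N + 1))
# ... even though each fixed component j is 0 for all n > j + 1
j = 3
print([e(n, N)[j] for n in range(1, N + 1)])  # 1.0 only at n = 4
```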

You really need to make more use of the fact that $(\mathbf x_n)$ is Cauchy. Denote by $a^{[k]}$ the truncation of the sequence $a$:
$$ a^{[k]}=\{a_1,a_2,a_3,\ldots,a_k,0,0,\ldots\}. $$
Given $\epsilon>0$, choose $N$ so that $\|\mathbf x_n-\mathbf x_m\|\le\epsilon$ for all $m,n\ge N$. Then for $m,n\ge N$ and every fixed $k$,
$$ \left|\|\mathbf x_n^{[k]}\|-\|\mathbf x_m^{[k]}\|\right|\le\|\mathbf x_n^{[k]}-\mathbf x_m^{[k]}\|\le\|\mathbf x_n-\mathbf x_m\|\le\epsilon. $$
Since the truncation involves only finitely many components, pointwise convergence gives $\mathbf x_m^{[k]}\to\mathbf y^{[k]}$, so taking the limit for $m\to+\infty$ we get, for $n\ge N$,
$$ \left|\|\mathbf x_n^{[k]}\|-\|\mathbf y^{[k]}\|\right|\le\epsilon,\ \forall k, $$
that is, $\|\mathbf x_n^{[k]}\|$ converges to $\|\mathbf y^{[k]}\|$ uniformly in $k$.
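The key chain of inequalities (truncation never increases the norm of the difference, then the reverse triangle inequality) can be sanity-checked on finite vectors. A sketch, with made-up sample data:

```python
import math

def norm(x):
    """Euclidean (l^2) norm of a finite list."""
    return math.sqrt(sum(t * t for t in x))

def trunc(x, k):
    """a^{[k]}: keep the first k components, zero the rest."""
    return x[:k] + [0.0] * (len(x) - k)

x_n = [0.7, -0.2, 0.4, 0.1]
x_m = [0.6, -0.3, 0.5, 0.0]
d = norm([a - b for a, b in zip(x_n, x_m)])

# | ||x_n^[k]|| - ||x_m^[k]|| | <= ||x_n - x_m|| for every k
ok = all(abs(norm(trunc(x_n, k)) - norm(trunc(x_m, k))) <= d
         for k in range(1, len(x_n) + 1))
print(ok)  # True
```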

This is all you need, provided you exchange the limits for
$$ \phi_n(k)=\|\mathbf x_n^{[k]}\|=\sqrt{f_n(k)},\qquad \phi(k)=\|\mathbf y^{[k]}\|=\sqrt{f(k)} $$
instead of $f_n$ and $f$.