Infinite Sum of Normals

Adding the missing hypothesis that the random variables are independent: if every $X_k$ is normal $\mathcal N(\mu_k,\sigma_k^2)$, and if the series $\sum\limits_k\mu_k$ and $\sum\limits_k\sigma_k^2$ both converge, then yes, the series $\sum\limits_kX_k$ converges in distribution.

Easiest approach: characteristic functions plus Lévy's continuity theorem.


Let's consider the sequence $\{S_n\}$ of finite partial sums: $\displaystyle S_n = \sum_{k=1}^n X_k$. As you state, this is a sequence of normal random variables, each distributed as $$ S_n\sim\mathcal{N}\left(m_n,s_n^2\right) $$ where $\displaystyle m_n=\sum^n_{k=1}\mu_k$ and $\displaystyle s_n^2=\sum^n_{k=1}\sigma_k^2$. Now the question is: does $S_n$ converge in some suitable sense to $\displaystyle S\sim\mathcal{N}\left(m,s^2\right)$ as $n\rightarrow\infty$, given $\displaystyle m=\lim_{n\rightarrow\infty}m_n$ and $\displaystyle s^2=\lim_{n\rightarrow\infty}s_n^2$ exist?
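Incidentally, this is where the characteristic-function route mentioned above pays off: for a normal variable the characteristic function is explicit, and (using independence)
$$
\varphi_{S_n}(t)=\prod_{k=1}^n\exp\left(i\mu_k t-\tfrac12\sigma_k^2t^2\right)=\exp\left(i m_n t-\tfrac12 s_n^2t^2\right)\xrightarrow[n\to\infty]{}\exp\left(i m t-\tfrac12 s^2t^2\right),
$$
and the limit is the characteristic function of $\mathcal N(m,s^2)$, which is continuous at $t=0$. Lévy's continuity theorem then yields $S_n\to\mathcal N(m,s^2)$ in distribution directly. (Independence is what lets the characteristic function of the sum factor into the product.)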

For the "suitable sense" of convergence, let us use convergence in distribution. This kind of convergence means that the cumulative distribution functions (CDFs) converge pointwise to the limiting CDF at every point where the limiting CDF is continuous; since the limiting distribution is normal, its CDF is continuous everywhere. There is also a result known as Scheffé's lemma, which says that if the probability density functions (PDFs) converge pointwise, then the distributions converge in distribution as well. So this is what we'll test: does the sequence of PDFs converge to the limiting PDF? Let $f_n(x)$ be the sequence of PDFs of the $S_n$ variables, $$ f_n(x) = \frac{1}{s_n\sqrt{2\pi}}\exp\left(-\frac{(x-m_n)^2}{2s_n^2}\right). $$ To finish off a proof, you have to show that $$ f(x) = \lim_{n\rightarrow\infty}f_n(x) = \frac{1}{s\sqrt{2\pi}}\exp\left(-\frac{(x-m)^2}{2s^2}\right). $$ Intuitively, I know this to be true because this sequence of functions doesn't do one of those "nasty" things like become infinitely thin in places; for one, $s_n$ controls the width, and it is monotonically increasing and bounded, hence convergent. But, I'd have to brush up on my real analysis a bit to argue this coherently.
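In fact the missing real-analysis step is short, assuming $s^2>0$ (i.e. at least one $\sigma_k>0$): fix $x$ and note that the map
$$
(m',s')\longmapsto \frac{1}{s'\sqrt{2\pi}}\exp\left(-\frac{(x-m')^2}{2s'^2}\right)
$$
is continuous on $\mathbb{R}\times(0,\infty)$. Since $m_n\to m$ and $s_n\to s$ with $s>0$, continuity gives $f_n(x)\to f(x)$ for every $x$, which is exactly the pointwise convergence Scheffé's lemma needs.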

In any case, once you justify that limit rigorously, you'll have shown that the PDFs of the partial sums converge pointwise, so by Scheffé's lemma the sequence converges in distribution to the normal law you conjectured.
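As a quick sanity check (not a proof), one can also simulate this. Below is a sketch using the hypothetical choice $\mu_k = \sigma_k^2 = 1/k^2$, so that both series converge (each to $\pi^2/6$), followed by a Kolmogorov–Smirnov comparison of a long partial sum against the conjectured limit $\mathcal N(m, s^2)$:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical parameter choice: mu_k = 1/k^2 and sigma_k^2 = 1/k^2,
# so both series converge to pi^2/6.
N = 2000          # number of terms kept in the partial sum S_N
samples = 20_000  # Monte Carlo sample size

k = np.arange(1, N + 1)
mu = 1.0 / k**2
sigma = 1.0 / k   # sigma_k = 1/k, hence sigma_k^2 = 1/k^2

# Draw independent X_k ~ N(mu_k, sigma_k^2) and form S_N = sum_k X_k.
X = rng.normal(loc=mu, scale=sigma, size=(samples, N))
S = X.sum(axis=1)

# Conjectured limiting parameters: m = sum mu_k, s^2 = sum sigma_k^2.
m = mu.sum()
s = np.sqrt((sigma**2).sum())

# Kolmogorov-Smirnov test of the sample against N(m, s^2).
stat, pvalue = stats.kstest(S, 'norm', args=(m, s))
print(f"KS statistic = {stat:.4f}, p-value = {pvalue:.3f}")
```

A small KS statistic (and a non-tiny p-value) is consistent with $S_n$ settling down to the conjectured normal law; of course, simulation can only support, never replace, the argument above.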