Proof that the sum of two Gaussian variables is another Gaussian

I prepared the following as an answer to a question that happened to be closed just as I was putting the finishing touches on my work. I posted it as a separate (self-answered) question, but following suggestions from Srivatsan Narayanan and Mike Spivey, I am putting it here and deleting my so-called question.

If $X$ and $Y$ are independent standard Gaussian random variables, what is the cumulative distribution function of $\alpha X + \beta Y$?

Let $Z = \alpha X + \beta Y$. We assume without loss of generality that $\alpha$ and $\beta$ are positive real numbers, since if, say, $\alpha < 0$, then we can replace $X$ by $-X$ and $\alpha$ by $\vert\alpha\vert$. Then the cumulative distribution function of $Z$ is $$ F_Z(z) = P\{Z \leq z\} = P\{\alpha X + \beta Y \leq z\} = \iint_{\alpha x + \beta y \leq z} \phi(x)\phi(y)\, dx\, dy $$ where $\phi(\cdot)$ is the standard Gaussian density function. But, since the integrand $(2\pi)^{-1}\exp(-(x^2 + y^2)/2)$ has circular symmetry, the value of the integral depends only on the (signed) distance $d$ from the origin to the line $\alpha x + \beta y = z$. Indeed, by a rotation of coordinates that makes this line vertical, we can write the integral as $$ F_Z(z) = \int_{x=-\infty}^d \int_{y=-\infty}^{\infty}\phi(x)\phi(y)\, dy\, dx = \Phi(d) $$ where $\Phi(\cdot)$ is the standard Gaussian cumulative distribution function. But $$d = \frac{z}{\sqrt{\alpha^2 + \beta^2}},$$ and thus the cumulative distribution function of $Z$ is that of a zero-mean Gaussian random variable with variance $\alpha^2 + \beta^2$.
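(Not part of the proof, but the result is easy to sanity-check by Monte Carlo. The sketch below uses NumPy and SciPy; the values $\alpha = 1.5$, $\beta = 2$ and the sample size are arbitrary choices of mine.)

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
alpha, beta = 1.5, 2.0                 # arbitrary positive constants

# Draw independent standard Gaussians and form Z = alpha*X + beta*Y.
x = rng.standard_normal(1_000_000)
y = rng.standard_normal(1_000_000)
z_samples = alpha * x + beta * y

# The claim: P(Z <= z) = Phi(z / sqrt(alpha^2 + beta^2)).
for z in (-2.0, 0.0, 1.0, 3.0):
    empirical = np.mean(z_samples <= z)
    predicted = norm.cdf(z / np.hypot(alpha, beta))
    print(f"z = {z:4.1f}: empirical {empirical:.4f}, predicted {predicted:.4f}")
```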


I posted the following in response to a question that got closed as a duplicate of this one:

It looks from your comment as if the meaning of your question is different from what I thought at first. My first answer assumed you knew that the sum of independent normals is itself normal.

Here $Z = \alpha X + \beta Y$ has density $f_Z(z) = \int_{-\infty}^\infty f_{\alpha X}(x)\, f_{\beta Y}(z-x)\, dx$, and in the integrand you have (ignoring the normalizing constants) $$ \exp\left(-\frac12 \left(\frac{x}{\alpha}\right)^2 \right) \exp\left(-\frac12 \left(\frac{z-x}{\beta}\right)^2 \right) = \exp\left(-\frac12 \left( \frac{\beta^2x^2 + \alpha^2(z-x)^2}{\alpha^2\beta^2} \right) \right). $$ Then the numerator is $$ \begin{align} & \beta^2x^2 + \alpha^2(z-x)^2 \\ \\ = {} & (\alpha^2+\beta^2)x^2 - 2\alpha^2 xz + \alpha^2 z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz\right) + \alpha^2 z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x^2 - 2\frac{\alpha^2}{\alpha^2+\beta^2} xz + \frac{\alpha^4}{(\alpha^2+\beta^2)^2}z^2\right) + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2 \\ \\ = {} & (\alpha^2+\beta^2)\left(x - \frac{\alpha^2}{\alpha^2+\beta^2}z\right)^2 + \alpha^2 z^2 - \frac{\alpha^4}{\alpha^2+\beta^2}z^2, \end{align} $$ and then remember that you still have the $-1/2$ and the $\alpha^2\beta^2$ in the denominator, all inside the "exp" function.

(What was done above is completing the square.)
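(If you want to double-check the algebra rather than redo it by hand, a symbolic check is short in SymPy; this is just a verification sketch, not part of the argument.)

```python
import sympy as sp

x, z, a, b = sp.symbols('x z alpha beta', positive=True)

# The numerator before and after completing the square.
numerator = b**2 * x**2 + a**2 * (z - x)**2
completed = ((a**2 + b**2) * (x - a**2 / (a**2 + b**2) * z)**2
             + a**2 * z**2 - a**4 / (a**2 + b**2) * z**2)

# simplify() returns 0 exactly when the two expressions agree identically.
print(sp.simplify(numerator - completed))   # prints: 0
```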

The factor of $\exp\left(\text{a function of }z\right)$ does not depend on $x$ and so is a "constant" that can be pulled out of the integral.

The remaining integral does not depend on "$z$" for a reason we will see below, and thus becomes part of the normalizing constant.

If $f$ is any probability density function, then $$ \int_{-\infty}^\infty f(x - \text{something}) \; dx $$ does not depend on "something", because one may write $u=x-\text{something}$ and then $du=dx$, and the bounds of integration are still $-\infty$ and $+\infty$, so the integral is equal to $1$.
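(Here is the same observation checked numerically with SciPy's quad, using the standard normal density and a few arbitrary shifts; nothing depends on those choices.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Integrating f(x - something) over the whole real line always gives 1,
# whatever "something" is, by the substitution u = x - something.
for shift in (-3.0, 0.0, 5.0):
    value, _ = quad(lambda x: norm.pdf(x - shift), -np.inf, np.inf)
    print(f"something = {shift:4.1f}: integral = {value:.6f}")
```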

Now look at $$ \alpha^2z^2 - \frac{\alpha^4}{\alpha^2+\beta^2} z^2 = \frac{\alpha^2(\alpha^2+\beta^2) - \alpha^4}{\alpha^2+\beta^2}\, z^2 = \frac{\alpha^2\beta^2}{\alpha^2+\beta^2}\, z^2 = \frac{z^2}{\frac{1}{\beta^2} + \frac{1}{\alpha^2}}. $$

This was to be divided by $\alpha^2\beta^2$, yielding $$ \frac{z^2}{\alpha^2+\beta^2}=\left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2. $$ So the density is $$ (\text{constant})\cdot \exp\left( -\frac12 \left(\frac{z}{\sqrt{\alpha^2+\beta^2}}\right)^2 \right) . $$ Where the standard deviation belongs we now have $\sqrt{\alpha^2+\beta^2}$.
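(As a final cross-check, one can evaluate the convolution integral numerically and compare it with the $N(0, \alpha^2+\beta^2)$ density; again SciPy, and again the constants $\alpha = 1.5$, $\beta = 2$ are arbitrary.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

alpha, beta = 1.5, 2.0                     # arbitrary positive constants
sigma = np.hypot(alpha, beta)              # sqrt(alpha^2 + beta^2)

def convolution_density(z):
    # f_Z(z) = integral of f_{alpha X}(x) * f_{beta Y}(z - x) dx.
    integrand = lambda x: norm.pdf(x, scale=alpha) * norm.pdf(z - x, scale=beta)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value

for z in (-2.0, 0.0, 1.0, 3.0):
    print(f"z = {z:4.1f}: convolution {convolution_density(z):.6f}, "
          f"normal pdf {norm.pdf(z, scale=sigma):.6f}")
```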


I don't know how I missed that one, indeed:
http://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
Thanks, Kaestur Hakarl!