Multiplication of a random variable by a constant

For a random variable $X$ with finite first and second moments (i.e. expectation and variance exist), it holds that $\forall c \in \mathbb{R}: E[c \cdot X] = c \cdot E[X]$ and $\mathrm{Var}[c \cdot X] = c^2 \cdot \mathrm{Var}[X]$.
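These two identities are easy to check by simulation; here is a minimal sketch in Python (the particular $\mu$, $\sigma$, $c$, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, c = 2.0, 1.5, 3.5               # arbitrary choices
x = rng.normal(mu, sigma, size=1_000_000)  # X ~ N(mu, sigma^2)

# E[cX] vs c * E[X]
print(np.mean(c * x), c * np.mean(x))      # essentially identical

# Var[cX] vs c^2 * Var[X]
print(np.var(c * x), c**2 * np.var(x))     # essentially identical
```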

However, the fact that $c \cdot X$ follows the same family of distributions as $X$ does is not trivial and has to be shown separately. (It is not true, e.g., for the Beta distribution, which is also in the exponential family.) You can see it by looking at the characteristic function of the product $c \cdot X$: it is $\exp\{i\mu c t - \frac{1}{2} \sigma^2 c^2 t^2\}$, which is the characteristic function of a normal distribution with $\mu' = c \mu$ and $\sigma' = |c| \sigma$.
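Spelled out, the step behind this is just the substitution $\varphi_{cX}(t) = E\big[e^{it(cX)}\big] = E\big[e^{i(ct)X}\big] = \varphi_X(ct)$; plugging $ct$ into $\varphi_X(t) = \exp\{i\mu t - \frac{1}{2}\sigma^2 t^2\}$ gives the expression above.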


Another way of characterizing a random variable's distribution is by its distribution function; that is, if two random variables have the same distribution function, then they are equal in distribution.

In our case, let $X \sim N(\mu,\sigma^2)$, set $Y = c X$ with $c > 0$, and let $F$ be the distribution function of $X$ and $G$ the distribution function of $Y$. Then:

$G(y) = P[Y \le y] = P[cX \le y] = P\Big[X \le \frac yc\Big] = F\Big(\frac yc\Big)$

Differentiating, we get:

$g(y) = f(\frac yc) \frac1c$

where $g$ is the density function of $Y$ and $f$ is the density function of $X$. Now we just express this as a normal density:

$g(y) = f\Big(\frac yc\Big) \frac1c = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(y/c-\mu)^2}{2\sigma^2}} \cdot \frac1c = \frac{1}{\sqrt{2\pi}(c\sigma)} e^{-\frac{(y-c\mu)^2}{2(c\sigma)^2}}$

But this last expression is the density of a $N(c\mu,(c\sigma)^2)$ distribution.
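If you want a quick numerical confirmation of this algebra, here is a sketch in Python comparing $f(\frac yc)\frac1c$ with the $N(c\mu,(c\sigma)^2)$ density on a grid (the values of $\mu$, $\sigma$, and $c$ are arbitrary):

```python
import numpy as np
from scipy.stats import norm

mu, sigma, c = 1.0, 2.0, 3.0          # arbitrary, with c > 0
y = np.linspace(-20.0, 20.0, 401)

g = norm.pdf(y / c, loc=mu, scale=sigma) / c        # f(y/c) * 1/c
target = norm.pdf(y, loc=c * mu, scale=c * sigma)   # density of N(c*mu, (c*sigma)^2)

print(np.allclose(g, target))  # True
```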

This is a more detailed formulation of what Dilip Sarwate pointed out in the comments above.

The case $c < 0$

$G(y) = P[Y \le y] = P[cX \le y] = P\Big[X \ge \frac yc\Big] = 1 - F\Big(\frac yc\Big)$

Differentiating:

$g(y) = -f(\frac yc) \frac1c = f(\frac yc) \frac{1}{|c|} = \frac{1}{\sqrt{2\pi}(|c|\sigma)} e^{\frac{-(y-c\mu)^2}{2(c\sigma)^2}}$

Note that this does not pose difficulties since $\sqrt{(c \sigma)^2} = |c| \sigma$.
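The same numerical check works for $c < 0$, as long as the scale is written as $|c|\sigma$ (again just a sketch with arbitrary values):

```python
import numpy as np
from scipy.stats import norm

mu, sigma, c = 1.0, 2.0, -3.0         # arbitrary, with c < 0
y = np.linspace(-20.0, 20.0, 401)

g = -norm.pdf(y / c, loc=mu, scale=sigma) / c            # -f(y/c) * 1/c = f(y/c) / |c|
target = norm.pdf(y, loc=c * mu, scale=abs(c) * sigma)   # density of N(c*mu, (c*sigma)^2)

print(np.allclose(g, target))  # True
```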


I'll try to present it in a way that is relatively intuitive but still maintains some mathematical rigor.

Let $Y = kX$, where $X \sim N(\mu,\sigma^2)$ (in the question, $X \sim N(0,1)$).

Now, for a positive integer $k$, we can view $Y$ as $X+X+\cdots+X$ ($k$ times). Thus, we can get the expected value of $Y$ using linearity of expectation; the variance follows from the definition below.

$E(Y)=E(X+X+\cdots+X)=E(X)+E(X)+\cdots+E(X)=kE(X)=k\mu$ (using linearity of expectation).

Next, we can get $Var(Y)=Var(kX)=E((kX)^2)-(E(kX))^2$ (by the definition of variance).

So, $Var(Y)= E(k^2X^2)-(E(kX))^2=k^2E(X^2)-(kE(X))^2$ (using the result proved above for $E(kX)$).

Rewriting, $Var(Y)= k^2E(X^2)-k^2(E(X))^2=k^2(E(X^2)-(E(X))^2)=k^2Var(X)$
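Both results can also be checked symbolically, for example with sympy's statistics module (a minimal sketch; the symbol names are arbitrary):

```python
from sympy import symbols, simplify
from sympy.stats import Normal, E, variance

mu = symbols('mu', real=True)
sigma = symbols('sigma', positive=True)
k = symbols('k', real=True)

X = Normal('X', mu, sigma)  # X ~ N(mu, sigma^2)

print(simplify(E(k * X)))         # k*mu
print(simplify(variance(k * X)))  # k**2*sigma**2
```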

So, we now have $E(Y)$ and $Var(Y)$. We also know that the sum of independent normally distributed variables is normally distributed, which suggests that $Y$, as a sum of normal pieces, should itself be normal. (Strictly speaking, the $k$ summands here are identical copies of $X$, not independent ones, so this step is only heuristic; the density and characteristic-function arguments above make it rigorous.)

So, basically, you now have both the mean and the variance of a normally distributed variable, which pins down its distribution.

$Y \sim N(k\mu,k^2\sigma^2)$, where $\mu=E(X)$ and $\sigma^2=Var(X)$.

In your case, $\mu=0$. Hope it helps!