Resultant probability distribution when taking the cosine of a Gaussian-distributed variable

A quick way to find the mean of $\cos(\theta)$, where $\theta\sim \mathcal{N}(0, \sigma^2)$, is to compute the mean of the complex variable $e^{j\theta}=\cos(\theta)+j\sin(\theta)$, i.e. the characteristic function of the Gaussian evaluated at $1$. We have

$E [e^{j\theta}]=e^{0+(j\sigma)^2/2}=e^{-\sigma^2/2}$

which implies that the mean of the imaginary part $E [\sin(\theta)]$ equals zero and the mean of the real part $E[\cos(\theta)]$ equals $e^{-\sigma^2/2}$.
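
As a quick numerical sanity check (a sketch of my own, not part of the original answer; `sigma` and the sample size are arbitrary choices), the sample means of $\cos(\theta)$ and $\sin(\theta)$ can be compared against $e^{-\sigma^2/2}$ and $0$:

```python
import math
import random

# Monte Carlo check: for theta ~ N(0, sigma^2), E[cos(theta)] = exp(-sigma^2/2)
# and E[sin(theta)] = 0. sigma and n are arbitrary illustrative choices.
random.seed(0)
sigma = 0.7
n = 200_000
samples = [random.gauss(0.0, sigma) for _ in range(n)]
mean_cos = sum(math.cos(t) for t in samples) / n
mean_sin = sum(math.sin(t) for t in samples) / n
print(mean_cos, math.exp(-sigma**2 / 2))  # the two should agree to ~1e-2
print(mean_sin)                           # close to 0
```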

The answer $\mu_1$ derived by Will Jagy is in fact the Taylor series expansion of $e^{-\sigma^2/2}$.

The variance of $\cos(\theta)$ can be obtained as:

$E[\cos^2(\theta)]-E[\cos(\theta)]^2= E\left[\frac{1}{2}+\frac{\cos(2\theta)}{2}\right]- E[\cos(\theta)]^2= \frac{1}{2}+\frac{e^{-2\sigma^2}}{2}-e^{-\sigma^2}= \frac{1}{2}\left[1-e^{-\sigma^2}\right]^2,$

where $E[\cos(2\theta)]=e^{-2\sigma^2}$ follows from the same argument, since $2\theta\sim\mathcal{N}(0,4\sigma^2)$.
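
The closed form for the variance can be checked numerically the same way (again a sketch of my own; the parameter values are arbitrary):

```python
import math
import random

# Monte Carlo check of Var[cos(theta)] = (1 - exp(-sigma^2))^2 / 2
# for theta ~ N(0, sigma^2); sigma and n are arbitrary choices.
random.seed(1)
sigma = 0.9
n = 400_000
cos_vals = [math.cos(random.gauss(0.0, sigma)) for _ in range(n)]
mean = sum(cos_vals) / n
var = sum((c - mean) ** 2 for c in cos_vals) / n
closed_form = 0.5 * (1.0 - math.exp(-sigma**2)) ** 2
print(var, closed_form)  # should agree to Monte Carlo error
```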


I wrote out the first few terms in the power series for $ \cos \theta $ and then the first few terms of the series for $ \cos^2 \theta .$ I used your hypothesis of a normal distribution: the mean of $ \theta $ is $ \mu = 0$ while the variance is some $ \sigma^2 .$

Then I looked up the expected values of $ \theta^2, \; \theta^4, \; \theta^6, \; \theta^8 $ at http://en.wikipedia.org/wiki/Gaussian_distribution#Moments and used them to find good approximations for your new mean $\mu_1$ and variance $\sigma_1^2$ in $$ \mu_1 = E[ \cos \theta ] = 1 - \frac{\sigma^2}{2} + \frac{\sigma^4}{8} - \frac{\sigma^6}{48} + \cdots $$ and $$ \mu_1^2 + \sigma_1^2 = E[ \cos^2 \theta ] = 1 - \sigma^2 + \sigma^4 - \frac{2 \sigma^6}{3} + \cdots $$ Squaring the first series gives $ \mu_1^2 = 1 - \sigma^2 + \frac{\sigma^4}{2} - \cdots, $ so when you subtract you get $ \sigma_1^2 \approx \frac{\sigma^4}{2} .$
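
To see how quickly the small-$\sigma$ approximation becomes accurate, one can compare $\sigma^4/2$ against the closed form $\frac{1}{2}\left(1-e^{-\sigma^2}\right)^2$ from the first answer above (a small sketch of my own; the $\sigma$ values are arbitrary):

```python
import math

# Compare the small-sigma approximation sigma_1^2 ≈ sigma^4 / 2 with the
# exact variance (1 - exp(-sigma^2))^2 / 2; sigma values are arbitrary.
for sigma in (0.05, 0.1, 0.2, 0.5):
    exact = 0.5 * (1.0 - math.exp(-sigma**2)) ** 2
    approx = sigma**4 / 2.0
    print(sigma, exact, approx)
```

The two agree to a relative error of order $\sigma^2$, so the approximation is excellent for small angles.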

I will think about it some more; there is a large theory for calculating moments. But I do not see much to be done in the way of an explicit pdf or cdf.


Hi, I know this was asked a long time ago, but I have just discovered it because I require a similar solution. It is possible to derive an expression, albeit as an infinite summation; for practical purposes, the first few terms of the summation should suffice.

Let $X$ denote a random variable with pdf $f_X(x)$. Let $Y=g(X)$ be a function of $X$. We can specify the cdf of $Y$, denoted $F_Y(y)$ as follows:

$F_Y(y)=\mathbb{P}(g(X)\leq y)=\int\limits_{\Omega}f_X(x)\text{d}x$,

where the domain of integration $\Omega$ is defined as

$\Omega=\left\lbrace x:g(x)\leq y \right\rbrace$

In our case, $g(x)=\cos x$, so we need an expression for the domain of $x\in\mathbb{R}$ such that $\cos x\leq y$. This is given by

$2k\pi+\arccos(y) \leq x \leq 2(k+1)\pi-\arccos(y),\quad k\in\mathbb{Z}$

So integrating over this domain, we obtain

$F_Y(y)=\sum\limits_{k=-\infty}^{\infty} \int\limits_{2k\pi+\arccos(y)}^{2(k+1)\pi-\arccos(y)} f_X(x)\text{d}x$

Now in our case $X\sim\mathcal{N}(0,\sigma^2)$, so

$f_X(x)=\dfrac{1}{\sigma\sqrt{2\pi}}\exp\left(\dfrac{-x^2}{2\sigma^2}\right)$

and the integral of this pdf between limits is given by the cdf of the normal distribution, which we denote $\Phi$:

$\int\limits_{a}^{b}f_X(x)\text{d}x = \Phi(b/\sigma)-\Phi(a/\sigma)$

The cdf of $Y$ is therefore

$F_Y(y)=\sum\limits_{k=-\infty}^{\infty} \Phi\left(\dfrac{2(k+1)\pi-\arccos(y)}{\sigma}\right) - \Phi\left(\dfrac{2k\pi+\arccos(y)}{\sigma}\right)$
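
In practice the sum is truncated; since $f_X$ decays rapidly, a handful of terms around $k=0$ suffices. A sketch of my own (with $\Phi$ written via `math.erf`, and arbitrary parameter values) that checks the truncated cdf against simulation:

```python
import math
import random

def Phi(x):
    # standard normal cdf, expressed via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F_Y(y, sigma, kmax=10):
    # truncated version of the infinite sum; kmax = 10 is already overkill
    acos = math.acos(y)
    return sum(
        Phi((2 * (k + 1) * math.pi - acos) / sigma)
        - Phi((2 * k * math.pi + acos) / sigma)
        for k in range(-kmax, kmax + 1)
    )

# compare against an empirical cdf of cos(theta), theta ~ N(0, sigma^2)
random.seed(2)
sigma, y, n = 1.0, 0.3, 200_000
empirical = sum(math.cos(random.gauss(0.0, sigma)) <= y for _ in range(n)) / n
print(F_Y(y, sigma), empirical)  # should agree to Monte Carlo error
```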

To compute the pdf, take the derivative with respect to $y$, using $\frac{d}{dy}\arccos(y)=-\frac{1}{\sqrt{1-y^2}}$:

$f_Y(y)=\dfrac{dF_Y(y)}{dy} = \sum\limits_{k=-\infty}^{\infty} \dfrac{1}{\sqrt{1-y^2}}\left( f_{X}(2(k+1)\pi-\arccos(y) ) + f_{X}(2k\pi+\arccos(y)) \right)$
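
The truncated pdf sum can be checked against a numerical derivative of the truncated cdf sum (same hedges as above: a sketch of my own with arbitrary parameter choices):

```python
import math

def Phi(x):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_pdf(x, sigma):
    # pdf of N(0, sigma^2)
    return math.exp(-x * x / (2.0 * sigma**2)) / (sigma * math.sqrt(2.0 * math.pi))

def F_Y(y, sigma, kmax=10):
    # truncated cdf sum
    acos = math.acos(y)
    return sum(
        Phi((2 * (k + 1) * math.pi - acos) / sigma)
        - Phi((2 * k * math.pi + acos) / sigma)
        for k in range(-kmax, kmax + 1)
    )

def f_Y(y, sigma, kmax=10):
    # truncated pdf sum
    acos = math.acos(y)
    s = sum(
        normal_pdf(2 * (k + 1) * math.pi - acos, sigma)
        + normal_pdf(2 * k * math.pi + acos, sigma)
        for k in range(-kmax, kmax + 1)
    )
    return s / math.sqrt(1.0 - y * y)

sigma, y, h = 1.0, 0.2, 1e-5
numeric = (F_Y(y + h, sigma) - F_Y(y - h, sigma)) / (2.0 * h)
print(f_Y(y, sigma), numeric)  # central difference should match the pdf
```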

There are probably better ways to do this, and the final summation can possibly be rewritten or simplified, but it matches a numerical check.