Intuition behind the variance formula

We want to measure the expected “deviation” of a random variable from its expected value. What first comes to mind is the expected difference between the r.v. and its expected value, that is $E\bigl(X-E(X)\bigr)$. But this number is always zero.
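To spell that out, linearity of expectation gives

$$E\bigl(X-E(X)\bigr)=E(X)-E\bigl(E(X)\bigr)=E(X)-E(X)=0,$$

so the positive and negative deviations cancel out exactly, no matter how spread out $X$ is.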

Next, as drhab pointed out, is the average distance, i.e., the expected absolute value of the difference between the r.v. and its expected value, that is $E\bigl(|X-E(X)|\bigr)$. Now that appears to be a “natural” measure of the deviation, but there is a big drawback: it is not everywhere differentiable, so we can't bring the tools of analysis to bear on it.
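Concretely, the trouble sits exactly at the point where the deviation is zero:

$$\frac{d}{dx}\,|x| \;=\; \begin{cases} -1, & x<0,\\ +1, & x>0, \end{cases}$$

with no well-defined derivative at $x=0$, i.e., precisely where $X$ hits its mean.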

Now we're stuck; what to do now? Well, let's follow an idea of a great mathematician, in our case Gauss. Early in his career he introduced the expected value of the squared difference between the r.v. and its expected value, that is $E\bigl((X-E(X))^2\bigr)$, a.k.a. the variance of $X$. Its drawback: it isn't intuitive.
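One payoff of this choice is that the square expands nicely. Writing $\mu = E(X)$ and using linearity of expectation again,

$$\operatorname{Var}(X)=E\bigl((X-\mu)^2\bigr)=E(X^2)-2\mu\,E(X)+\mu^2=E(X^2)-\bigl(E(X)\bigr)^2,$$

which is the familiar “mean of the square minus square of the mean” form, and it is easy to differentiate and manipulate.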

On the other hand, as Stefanos wrote: “The moment of the most use is when $k=2$.” That's right, but why is it the moment of most use? The main reason is doubtless Chebyshev's inequality: http://en.wikipedia.org/wiki/Chebyshev%27s_inequality And this inequality is very intuitive.
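For reference, with $\mu=E(X)$ and $\sigma^2=\operatorname{Var}(X)$, Chebyshev's inequality states

$$P\bigl(|X-\mu|\ge k\sigma\bigr)\le \frac{1}{k^2}\qquad\text{for every }k>0,$$

so the variance directly controls how likely the r.v. is to stray far from its mean.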


If I want to 'measure' how much a random variable $X$ is expected to differ from its expected value, then intuitively I think of things like $\mathbb E\bigl(|X-\mathbb E(X)|\bigr)$ or $\mathbb E\bigl((X-\mathbb E(X))^2\bigr)$. Another possibility along the same lines is the square root of the variance, called the standard deviation. Have a look at the answer of Stefanos where it concerns the second moment.
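In symbols, that last quantity is

$$\sigma \;=\; \sqrt{\operatorname{Var}(X)} \;=\; \sqrt{\mathbb E\bigl((X-\mathbb E(X))^2\bigr)},$$

which has the practical advantage of being expressed in the same units as $X$ itself.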