Conditional mean on uncorrelated stochastic variable

$\operatorname{Cov}(X,Y)$ can be $0$ while the variables are still dependent (exhibiting a purely non-linear dependence), in which case it can happen that $E(X\mid Y) \neq E(X)$. More precisely, "mean independence" implies zero covariance, but zero covariance does not imply mean independence: $E(X\mid Y) = E(X) \Rightarrow \operatorname{Cov}(X,Y) = 0$, but not the reverse.
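
Indeed, the direct implication is just the tower property of conditional expectation: if $E(X\mid Y) = E(X)$, then

$$E(XY) = E\Big[E(XY\mid Y)\Big] = E\Big[Y\,E(X\mid Y)\Big] = E(X)E(Y),$$

so $\operatorname{Cov}(X,Y) = E(XY) - E(X)E(Y) = 0$.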

As a simple example, consider the following situation: let $Y$ be a random variable with $E(Y)=0$ and $E(Y^3)=0$, non-degenerate so that $E(Y^2)>0$. Define another random variable $X = Y^2 + u$ with $E(u\mid Y) = 0$. Then $E(X\mid Y) = Y^2$, which varies with $Y$, while $E(X) = E(Y^2) > 0$. Now

$$\text{Cov}(X,Y) = E(XY) - E(X)E(Y) = E\Big[E(XY\mid Y)\Big] - E(Y^2)\cdot 0$$

$$=E\Big[YE(X\mid Y)\Big] = E(Y\cdot Y^2) = E(Y^3)=0.$$

So the covariance is zero, but $X$ is not mean-independent of $Y$.
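
If you want to see this numerically, here is a quick Monte Carlo sketch of the example above; the concrete choices $Y \sim \mathrm{Uniform}(-1,1)$ and Gaussian noise $u$ are illustrative assumptions of mine, not required by the argument.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Y is symmetric about 0, so E(Y) = E(Y^3) = 0; u is independent noise
# with E(u | Y) = 0. Both are illustrative choices, not the only ones.
Y = rng.uniform(-1.0, 1.0, n)
u = rng.normal(0.0, 0.1, n)
X = Y**2 + u

print("Cov(X, Y) ~", np.cov(X, Y)[0, 1])   # close to 0
print("E(X)      ~", X.mean())             # close to E(Y^2) = 1/3

# E(X | Y) estimated on a few bins of Y: it tracks Y^2,
# not the constant E(X), so X is not mean-independent of Y.
for lo, hi in [(-1.0, -0.8), (-0.1, 0.1), (0.8, 1.0)]:
    m = (Y >= lo) & (Y < hi)
    print(f"E(X | Y in [{lo:+.1f},{hi:+.1f})) ~ {X[m].mean():.3f}")
```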


Your assertion is false.

For two random variables $X$ and $Y$, one can consider three measures of unrelatedness:

$1$: Independence: $p(X,Y) = p(X)\,p(Y)$. Equivalently, $P(X\mid Y) = P(X)$. This is the strongest and most important one.

$2$: Unpredictability (this term is not standard): conditioning on one variable does not change the expectation of the other. This property is not symmetric, so there are two possible cases:

$2a$: $E(X|Y) = E(X)$

$2b$: $E(Y|X) = E(Y)$

(These properties can also be stated as: the regression function $E(X\mid Y=y)$, respectively $E(Y\mid X=x)$, is constant.)

$3$: Uncorrelatedness (orthogonality): $\mathrm{Cov}(X,Y) = 0$. Equivalently: $E(XY) = E(X)E(Y)$.

It can be shown that $1 \implies 2a$, $1 \implies 2b$, $2a \implies 3$, and $2b \implies 3$ (hence $1 \implies 3$). No other implications hold.


An example: consider $(X,Y)$ uniformly distributed over the triangle with vertices $(-1,0)$, $(1,0)$, $(0,1)$. One can check (or deduce from the symmetry about the $y$-axis) that $E(X\mid Y)=E(X)$ and $\mathrm{Cov}(X,Y)=0$, but $E(Y\mid X)$ varies with $X$.
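
A quick simulation of this triangle makes the asymmetry visible; the square-folding trick used below to sample the triangle uniformly is a standard one, and the bin edges are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sample uniformly on the triangle A=(-1,0), B=(1,0), C=(0,1):
# draw (u, v) in the unit square, fold points with u + v > 1
# back onto the simplex, then map to the triangle.
A, B, C = np.array([-1.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])
u, v = rng.uniform(size=n), rng.uniform(size=n)
flip = u + v > 1
u[flip], v[flip] = 1 - u[flip], 1 - v[flip]
P = A + np.outer(u, B - A) + np.outer(v, C - A)
X, Y = P[:, 0], P[:, 1]

print("Cov(X, Y) ~", np.cov(X, Y)[0, 1])   # close to 0
print("E(X)      ~", X.mean())             # close to 0; E(X | Y) = 0 by symmetry

# E(Y | X) estimated on bins of X: for this triangle it equals
# (1 - |x|) / 2, so it clearly varies with X.
for lo, hi in [(-0.9, -0.7), (-0.1, 0.1), (0.7, 0.9)]:
    m = (X >= lo) & (X < hi)
    print(f"E(Y | X in [{lo:+.1f},{hi:+.1f})) ~ {Y[m].mean():.3f}")
```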


Assume that $X$ is symmetric Bernoulli, that is, such that $P[X=+1]=P[X=-1]=\frac12$, that $Z$ is symmetric Bernoulli independent of $X$, and that $Y=0$ on $[X=-1]$ while $Y=Z$ on $[X=+1]$.

In other words, the distribution of $(X,Y)$ is $\frac12\delta_{(-1,0)}+\frac14\delta_{(+1,+1)}+\frac14\delta_{(+1,-1)}$. Put yet another way, $Y=\frac12(X+1)Z$. Thus, $E[X]=E[Y]=E[XY]=0$, but $X=2Y^2-1$ almost surely, hence $E[X\mid Y]=2Y^2-1\ne0$ with full probability.
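
Because the distribution is purely atomic, everything above can be verified by exact enumeration over the four equally likely values of $(X,Z)$; the following snippet is just such a check.

```python
from itertools import product

# The four outcomes of (X, Z), each with probability 1/4,
# and the induced value Y = (1/2)(X + 1) Z.
pts = [(x, 0.5 * (x + 1) * z) for x, z in product([-1, 1], [-1, 1])]

def E(f):
    return sum(f(x, y) for x, y in pts) / len(pts)

print("E[X]  =", E(lambda x, y: x))       # 0
print("E[Y]  =", E(lambda x, y: y))       # 0
print("E[XY] =", E(lambda x, y: x * y))   # 0, hence Cov(X, Y) = 0

# X is a deterministic function of Y: X = 2*Y^2 - 1 on every atom,
# so E[X | Y] = 2*Y^2 - 1, which never equals E[X] = 0.
print(all(x == 2 * y**2 - 1 for x, y in pts))   # True
```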

To sum up, $\mathrm{Cov}(X,Y)=0$ does not imply $E[X\mid Y]=E[X]$ (although the converse implication does hold).