What does orthogonal random variables mean?

Orthogonality comes from the idea of a vanishing inner product. For a random variable, $$ \mathbb E \left [ X\right ] = \int_{-\infty}^\infty x \, d\mu_X(x), $$ and the natural inner product on square-integrable random variables is $$ \langle X, Y \rangle = \mathbb E \left [ XY\right ] = \iint_{\mathbb R^2} xy \, d\mu_{X,Y}(x,y), $$ where $\mu_{X,Y}$ is the joint distribution of $(X, Y)$. So orthogonal random variables are those with $\mathbb E \left [ XY \right ] = 0$. (Note that the expectation uses the joint distribution; the double integral factors into $d\mu_X \, d\mu_Y$ only when $X$ and $Y$ are independent.)
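As a quick numerical sanity check (a sketch using NumPy; the pair $X$, $Y = X^2 - 1$ is just an assumed example, chosen because $\mathbb E[XY] = \mathbb E[X^3] - \mathbb E[X] = 0$ for a standard normal $X$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X is standard normal; Y = X^2 - 1, so E[XY] = E[X^3] - E[X] = 0
x = rng.standard_normal(n)
y = x**2 - 1

# sample average of X*Y approximates E[XY]; it should be close to 0
print(np.mean(x * y))
```

Note that these two variables are orthogonal but clearly not independent, which illustrates that orthogonality is a weaker condition.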


Orthogonal means the vectors are perpendicular to each other. We state that by saying that vectors $x$ and $y$ are orthogonal if their dot product (aka inner product) is zero, i.e. $x^\intercal y = 0$.

However, for vectors with random components, the orthogonality condition is modified to use the expected value: $E[x^\intercal y]=0$. This says that an individual outcome of $x^\intercal y$ need not be zero (it may be positive, negative, or occasionally zero), but the expected value $E[x^\intercal y]$ is zero. Keep in mind that the expected value is the mean, or average, over possible outcomes.
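The point above can be sketched numerically (assumed example: two independent standard-normal 3-vectors, for which $E[x^\intercal y] = 0$ componentwise):

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000

# many draws of two independent 3-component standard-normal vectors
x = rng.standard_normal((trials, 3))
y = rng.standard_normal((trials, 3))

# one dot product x^T y per trial
dots = np.einsum('ij,ij->i', x, y)

print(dots.min(), dots.max())  # individual outcomes swing both positive and negative
print(dots.mean())             # but their average is close to 0
```

Each realization of $x^\intercal y$ is almost never zero, yet the sample mean over many trials hovers near zero, which is exactly the $E[x^\intercal y]=0$ condition.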

Naturally, when talking about orthogonality we are talking about vectors; random variables with finite second moment qualify because they form an inner product space under $\langle X, Y\rangle = E[XY]$.