What is meant by "dot product between random variables?"

The space $L^0(\Omega)$ of all random variables on a fixed sample space $\Omega$ is a vector space: the (outcome-wise) sum of two random variables is a random variable, and a scalar multiple of a random variable is again a random variable. So in that sense, random variables can be viewed as "vectors" because they are elements of a vector space.
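As a concrete illustration (a minimal sketch, assuming a finite sample space with three outcomes and made-up values), random variables can be stored as arrays of their values indexed by outcomes, and the vector-space operations are just outcome-wise array operations:

```python
import numpy as np

# Hypothetical finite sample space Omega = {w0, w1, w2}.
# A random variable is a function Omega -> R, stored here as the array of its values.
X = np.array([1.0, -2.0, 3.0])   # X(w0), X(w1), X(w2)
Y = np.array([0.5,  4.0, -1.0])  # Y(w0), Y(w1), Y(w2)

# Outcome-wise sum and scalar multiple are again random variables on Omega,
# i.e. arrays of the same shape -- this is the vector-space structure.
Z = X + Y        # (X + Y)(w) = X(w) + Y(w)
W = 2.5 * X      # (cX)(w) = c * X(w)

print(Z)  # [1.5 2.  2. ]
print(W)  # [ 2.5 -5.   7.5]
```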

By "dot product" they likely mean the $L^2$ inner product, defined by $\langle X, Y \rangle = E[XY]$. This obeys the same basic algebraic properties as the ordinary Euclidean dot product: bilinear (with respect to the addition and scalar multiplication described above), symmetric, positive definite. Strictly speaking, this inner product doesn't necessarily live on $L^0(\Omega)$, but rather on the vector subspace $L^2(\Omega) \subset L^0(\Omega)$ consisting of random variables with finite second moment.


For two jointly distributed discrete random variables, the expectation of their product is a weighted dot product of their value vectors; the weight matrix is diagonal with the (positive) probabilities on the diagonal, which makes it positive definite:

$$ \mathbf{E}[XY] = \sum_{i=1}^n p_i x_i y_i = (x_1,\dots,x_n) \begin{pmatrix} p_1 & \cdots & 0\\ \vdots & \ddots & \vdots \\ 0 & \cdots & p_n \end{pmatrix} (y_1,\dots,y_n)^T$$

Here, $(X,Y)$ has $n$ possible realizations $(x_i, y_i)$ with probabilities $p_i$, $i=1,...,n$.
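A quick numerical check of this identity (with made-up realizations and probabilities), computing $E[XY]$ both as the weighted sum and as the quadratic form $x^\top \operatorname{diag}(p)\, y$:

```python
import numpy as np

# Hypothetical joint distribution: n realizations (x_i, y_i) with probabilities p_i.
p = np.array([0.1, 0.4, 0.3, 0.2])
x = np.array([1.0, 2.0, -1.0, 0.5])
y = np.array([3.0, -1.0, 2.0, 4.0])

# E[XY] as the weighted sum  sum_i p_i x_i y_i ...
e_xy_sum = np.sum(p * x * y)

# ... and as the quadratic form  x^T diag(p) y.
e_xy_form = x @ np.diag(p) @ y

print(e_xy_sum, e_xy_form)             # both equal E[XY]
assert np.isclose(e_xy_sum, e_xy_form)
```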


Suppose you have a collection of $n$ paired samples of two (in general dependent) random variables $X$ and $Y$: $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$.

Then we can view this collection of $n$ samples as a pair of vectors in $\mathbb{R}^n$: $(x_1, x_2, \ldots, x_n)$ and $(y_1, y_2, \ldots, y_n)$.

What your colleague is saying is that we can view the correlation between $X$ and $Y$ (as estimated from the sample) as a kind of normalized inner product between these two vectors: after centering each vector by its mean, the sample correlation is exactly the cosine of the angle between them.
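A sketch of that last statement on simulated data (the data-generating process below is arbitrary, chosen only for illustration): the normalized inner product of the mean-centered sample vectors matches the usual Pearson sample correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# n paired samples of two dependent variables (arbitrary dependence, for illustration)
n = 1000
x = rng.normal(size=n)
y = 0.7 * x + rng.normal(scale=0.5, size=n)

# Center each sample vector, then take a normalized Euclidean inner product:
# the cosine of the angle between the centered vectors.
xc = x - x.mean()
yc = y - y.mean()
cos_angle = np.dot(xc, yc) / (np.linalg.norm(xc) * np.linalg.norm(yc))

# This equals the sample (Pearson) correlation coefficient.
pearson_r = np.corrcoef(x, y)[0, 1]
assert np.isclose(cos_angle, pearson_r)
print(cos_angle, pearson_r)
```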