What is the Difference between Variance and MSE?

The main difference is whether you are considering the deviation of the estimator of interest from the true parameter (this is the mean squared error), or the deviation of the estimator from its expected value (this is the variance). Consequently, we can see that when the bias of the estimator is zero, the variance and mean squared error are equal.

Mathematically, if $\hat \theta$ is an estimator for $\theta$, then $$\operatorname{MSE}[\hat \theta] = \operatorname{E}[(\hat\theta - \theta)^2],$$ whereas $$\operatorname{Var}[\hat\theta] = \operatorname{E}[(\hat\theta - \operatorname{E}[\hat\theta])^2].$$ And regarding the previous remark, $$\operatorname{Bias}[\hat\theta] = \operatorname{E}[\hat\theta - \theta],$$ so when the bias is zero, $\operatorname{E}[\hat\theta] = \theta$ and now we easily see how the MSE and variance become equivalent.
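
As a quick numerical illustration of these definitions, here is a minimal Monte Carlo sketch (assuming NumPy; the population, sample size, and estimator below are arbitrary choices for illustration, not from the question). It uses the biased "plug-in" variance estimator, which divides by $n$ instead of $n-1$, so its bias is visibly non-zero:

```python
# Minimal sketch (assuming NumPy): approximate MSE, Var, and Bias by simulation.
# Hypothetical setup: estimate sigma^2 of a N(0, 2^2) population with the
# biased "plug-in" variance estimator that divides by n.
import numpy as np

rng = np.random.default_rng(0)
theta = 4.0            # true parameter: sigma^2
n, reps = 10, 200_000  # sample size and number of simulated samples

samples = rng.normal(loc=0.0, scale=np.sqrt(theta), size=(reps, n))
theta_hat = samples.var(axis=1, ddof=0)   # one estimate per simulated sample

mse  = np.mean((theta_hat - theta) ** 2)           # E[(theta_hat - theta)^2]
var  = np.mean((theta_hat - theta_hat.mean())**2)  # E[(theta_hat - E[theta_hat])^2]
bias = theta_hat.mean() - theta                    # E[theta_hat] - theta

print(f"MSE  ≈ {mse:.4f}")
print(f"Var  ≈ {var:.4f}")
print(f"Bias ≈ {bias:.4f}")   # about -theta/n = -0.4 here, so MSE > Var
```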

Note, however, that we can also write: $$\operatorname{Var}[\hat\theta] = \operatorname{E}[\hat \theta^2 - 2\hat\theta \operatorname{E}[\hat \theta] + \operatorname{E}[\hat\theta]^2] = \operatorname{E}[\hat\theta^2] - 2\operatorname{E}[\hat\theta]^2 + \operatorname{E}[\hat\theta]^2 = \operatorname{E}[\hat\theta^2] - \operatorname{E}[\hat\theta]^2,$$ so that $$\begin{align*} \operatorname{Var}[\hat\theta] + \operatorname{Bias}^2[\hat\theta] &= \operatorname{E}[\hat\theta^2] - \operatorname{E}[\hat\theta]^2 + (\operatorname{E}[\hat \theta] - \theta)^2 \\ &= \operatorname{E}[\hat\theta^2] - 2\theta \operatorname{E}[\hat\theta] + \theta^2 \\ &= \operatorname{E}[(\hat \theta - \theta)^2] \\ &= \operatorname{MSE}[\hat\theta]. \end{align*}$$
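
If you prefer to see this identity numerically rather than algebraically, here is a small self-contained check (a sketch with a hypothetical setup, assuming NumPy): estimate the mean $\mu$ of a $N(\mu,\sigma^2)$ population with the deliberately biased estimator $\bar X + c$, for which $\operatorname{Var}[\hat\theta] = \sigma^2/n$ and $\operatorname{Bias}[\hat\theta] = c$ exactly, so $\operatorname{MSE}[\hat\theta] = \sigma^2/n + c^2$:

```python
# Minimal sketch (assuming NumPy) checking MSE = Var + Bias^2 by simulation.
# Hypothetical setup: estimate mu of N(mu, sigma^2) with the deliberately
# biased estimator theta_hat = sample mean + c, so Var = sigma^2/n, Bias = c.
import numpy as np

mu, sigma, n, c = 0.0, 2.0, 25, 0.3
rng = np.random.default_rng(1)
theta_hat = rng.normal(mu, sigma, size=(200_000, n)).mean(axis=1) + c

mse  = np.mean((theta_hat - mu) ** 2)   # E[(theta_hat - mu)^2]
var  = theta_hat.var()                  # spread around E[theta_hat]
bias = theta_hat.mean() - mu            # E[theta_hat] - mu

print(f"MSE          ≈ {mse:.4f}")            # ≈ sigma^2/n + c^2 = 0.25
print(f"Var + Bias^2 ≈ {var + bias**2:.4f}")  # ≈ 0.25, matching up to Monte Carlo error
```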


The variance measures how far a set of numbers is spread out, whereas the MSE measures the average of the squares of the "errors", that is, of the differences between the estimator and the quantity being estimated. The MSE of an estimator $\hat{\theta}$ of an unknown parameter $\theta$ is defined as $E[(\hat{\theta}-\theta)^2]$.

The MSE is the second moment (about the origin) of the error, which is why it incorporates both the variance of the estimator and its squared bias (the bias being $E(\hat{\theta})-\theta$).
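
In symbols, writing $e = \hat\theta - \theta$ for the error (a notation introduced here only for this step), the second moment of $e$ about the origin splits into its variance plus its squared mean: $$E[e^2] = \operatorname{Var}[e] + (E[e])^2 = \operatorname{Var}[\hat\theta] + \operatorname{Bias}^2[\hat\theta] = \operatorname{MSE}[\hat\theta],$$ since subtracting the constant $\theta$ does not change the variance.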

In other words, the variance just measures the dispersion of the estimator's values around their mean; the MSE indicates how far the values of the estimator are from the true value of the parameter. The MSE is a comparison of the estimator with the true parameter, as it were. That's the difference.

Edit: I'll use your example. Suppose we have a bull's-eye whose target is the mean of the estimator: the variance measures how far the arrows (estimates) land from that target. Now suppose we have another bull's-eye, and this time the target is the true parameter: the MSE measures how far the arrows land from this target. In the first case, we just measure the dispersion of the values of the estimator with respect to its mean. In the second case, we measure the error we make when estimating the parameter, i.e. we compare the estimates with the true parameter (that's why we want estimators with variance and MSE as small as possible). Don't confuse the mean of an estimator with the true value of the parameter.