Dimension of the sum of two vector subspaces

In the latter case, they are actually the same plane, so their sum is again that same plane (since a subspace is closed under addition).
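For a concrete (purely illustrative) instance: if $U_1=U_2=\operatorname{span}\{e_1,e_2\}\subseteq\Bbb R^3$ (the $xy$-plane), then every sum $u+v$ with $u,v$ in that plane lies in the plane again, so $$U_1+U_2=U_1\quad\text{and}\quad\dim(U_1+U_2)=2=2+2-2.$$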

Here is another (analogous) way to think about it. Let's start with a basis $B_0$ for $U_1\cap U_2.$ We can extend $B_0$ to a basis $B_1$ for $U_1$ and a basis $B_2$ for $U_2$. Then $B_1\cup B_2$ is a basis for $U_1+U_2,$ and $B_1\cap B_2=B_0,$ so $$\begin{align}\dim(U_1+U_2) &= |B_1\cup B_2|\\ &= |B_1|+|B_2|-|B_1\cap B_2|\\ &= |B_1|+|B_2|-|B_0|\\ &= \dim(U_1)+\dim(U_2)-\dim(U_1\cap U_2).\end{align}$$ This generalizes nicely to $\Bbb F^n$, and allows us to avoid geometric arguments that may be less sensible for an arbitrary field $\Bbb F$.
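To see the counting in a concrete case (the specific vectors below are just one illustrative choice), take the coordinate planes $U_1=\operatorname{span}\{e_1,e_2\}$ and $U_2=\operatorname{span}\{e_2,e_3\}$ in $\Bbb R^3$. Here $U_1\cap U_2=\operatorname{span}\{e_2\},$ so we may take $B_0=\{e_2\},$ $B_1=\{e_2,e_1\},$ and $B_2=\{e_2,e_3\},$ giving $$\dim(U_1+U_2)=|B_1\cup B_2|=|\{e_1,e_2,e_3\}|=3=2+2-1=\dim(U_1)+\dim(U_2)-\dim(U_1\cap U_2).$$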

Also, it will never be the case that the intersection of two planes through the origin in $\Bbb R^3$ is precisely $\{0\}.$ If there were two such planes $U_1$ and $U_2,$ then we would have $$\dim(U_1+U_2)=\dim(U_1)+\dim(U_2)-\dim(U_1\cap U_2)=2+2-0=4>3=\dim(\Bbb R^3),$$ which is not possible, since $U_1+U_2$ is a subspace of $\Bbb R^3$.
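For example (again with an illustrative choice of planes), the $xy$-plane $U_1=\operatorname{span}\{e_1,e_2\}$ and the $xz$-plane $U_2=\operatorname{span}\{e_1,e_3\}$ intersect in the $x$-axis, and indeed $$\dim(U_1+U_2)=2+2-1=3=\dim(\Bbb R^3),$$ consistent with the fact that $U_1+U_2=\Bbb R^3$ (the two planes together contain $e_1,e_2,e_3$).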


Cameron Buie's answer addresses the original question sufficiently. However, I'm adding a formal proof of the theorem in question, taken from "Linear Algebra Done Right" by Sheldon Axler.

If $U_1$ and $U_2$ are subspaces of a finite-dimensional vector space, then $$\dim(U_1+U_2)=\dim U_1 + \dim U_2 - \dim(U_1 \cap U_2).$$

Let $u_1,...,u_m$ be a basis of $U_1\cap U_2$; thus $\dim (U_1\cap U_2)=m$. Because $u_1,...,u_m$ is a basis of $U_1\cap U_2$, it is linearly independent in $U_1$. Hence this list can be extended to a basis $u_1,...,u_m,v_1,...,v_j$ of $U_1$, so $\dim U_1 = m+j$. Similarly, extend $u_1,...,u_m$ to a basis $u_1,...,u_m, w_1,...,w_k$ of $U_2$, so $\dim U_2 = m+k$.
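For a concrete illustration (the specific subspaces here are my own choice, not part of Axler's proof): in $\Bbb R^4$, take $U_1=\operatorname{span}\{e_1,e_2,e_3\}$ and $U_2=\operatorname{span}\{e_3,e_4\}$. Then $U_1\cap U_2=\operatorname{span}\{e_3\}$, so we may take $u_1=e_3$ ($m=1$), extend with $v_1=e_1,$ $v_2=e_2$ to a basis of $U_1$ ($j=2$), and extend with $w_1=e_4$ to a basis of $U_2$ ($k=1$).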

We will show that $u_1,...,u_m,v_1,...,v_j,w_1,...,w_k$ is a basis of $U_1+U_2$. This will complete the proof because then we will have $$\dim(U_1+U_2)=m+j+k=\dim U_1 + \dim U_2 - \dim (U_1\cap U_2)$$
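In the illustration above, the combined list is $e_3,e_1,e_2,e_4$, which is indeed a basis of $U_1+U_2=\Bbb R^4$, and the count reads $$\dim(U_1+U_2)=m+j+k=1+2+1=4=3+2-1=\dim U_1+\dim U_2-\dim(U_1\cap U_2).$$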

The list $u_1,...,u_m,v_1,...,v_j,w_1,...,w_k$ clearly spans $U_1+U_2$, since its span contains both $U_1$ and $U_2$. So we just need to show that it is linearly independent. To prove this, suppose:

$$a_1u_1+...+a_mu_m+b_1v_1+...+b_jv_j+c_1w_1+...+c_kw_k=0$$

where all the $a$'s, $b$'s, and $c$'s are scalars. We need to show that they are all $0$.

The equation can be rewritten as

$$c_1w_1+...+c_kw_k=-a_1u_1 - ... -a_mu_m -b_1v_1 - ... - b_j v_j$$

which shows that $c_1w_1+...+c_kw_k\in U_1$. But all the $w$'s are in $U_2$, so the left-hand side must also be an element of $U_1\cap U_2$.

Since $u_1,...,u_m$ is a basis of $U_1\cap U_2$, we can therefore write $$c_1w_1+...+c_kw_k=d_1u_1+...+d_mu_m$$ for some scalars $d_1,d_2,...,d_m$, that is, $$c_1w_1+...+c_kw_k-d_1u_1-...-d_mu_m=0.$$ But $u_1,...,u_m,w_1,...,w_k$ is linearly independent (it is a basis of $U_2$), so this equation implies that all the $c$'s (and $d$'s) equal $0$.

Thus our original equation involving $a,b,c$ becomes

$$a_1u_1+...+a_mu_m+b_1v_1+...+b_jv_j=0$$

But the list $u_1,...,u_m,v_1,...,v_j$ is linearly independent (it is a basis of $U_1$), so this equation implies that all the $a$'s and $b$'s are $0$. We now know that all the $a$'s, $b$'s, and $c$'s are $0$, which proves our original claim.