Vector spaces: when, in the real world, do we check whether something is a vector space or not?

You'll never have to prove something is a vector space in real life. You may want to prove something is a vector space, because vector spaces have a simply enormous amount of theory proven about them, and the non-trivial fact that you wish to establish about your specific object might boil down to a much more general, well-known result about vector spaces.

Here's my favourite example. It's still a little artificial, but I came across it through simple curiosity, rather than through any course or a deliberate search for an example.

Consider the following logic puzzle; it's a classic. You have a $5 \times 5$ grid of squares, each coloured black or white. Every time you press a square, it toggles that square and the (up to four) orthogonally adjacent squares from black to white or from white to black. Your job is to press squares in such a way that you end up with every square white.
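To pin down the rules, here's a quick Python sketch (just one way to model it; the names `press` and `solved` are my own):

```python
import numpy as np

N = 5  # board size

def press(board, r, c):
    """Toggle square (r, c) and its (up to four) orthogonal neighbours, mod 2."""
    out = board.copy()
    for dr, dc in [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]:
        rr, cc = r + dr, c + dc
        if 0 <= rr < N and 0 <= cc < N:
            out[rr, cc] ^= 1  # flip: 0 = white, 1 = black
    return out

def solved(board):
    """The goal state: every square white, i.e. the all-zero board."""
    return not board.any()

# example: start from the all-white board and press the top-left square
start = np.zeros((N, N), dtype=int)
after = press(start, 0, 0)
```

Note that pressing the same square twice gives back the board you started with; that is the $-v = v$ observation below in disguise.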

So, my question to you is: can you reach any given configuration of white and black squares by pressing squares? Put another way, is every $5 \times 5$ grid of black and white squares a valid puzzle with a valid solution?

Well, it turns out that this can be answered easily using linear algebra. We form the vector space of $5 \times 5$ matrices whose entries lie in the field $\mathbb{Z}_2 = \lbrace 0, 1 \rbrace$, with addition of entries and scalars taken mod $2$. We represent white squares by $0$ and black squares by $1$. This vector space is finite and contains exactly $2^{25}$ vectors. Note that every vector is its own additive inverse (as is the case in any vector space over $\mathbb{Z}_2$).
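Spelled out, that is just the one-line computation

$$v + v = 1 \cdot v + 1 \cdot v = (1 + 1)\,v = 0 \cdot v = \mathbf{0}, \qquad \text{so } -v = v.$$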

Also note that the matrices with a single $1$ in one entry and $0$ everywhere else, i.e. the usual standard basis, form a basis for our vector space. Therefore, the dimension of the space is $25$.
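Concretely, writing $E_{ij}$ for the matrix with a $1$ in entry $(i, j)$ and $0$ elsewhere, every configuration $A = (a_{ij})$ can be written in exactly one way as

$$A = \sum_{i=1}^{5} \sum_{j=1}^{5} a_{ij} E_{ij}, \qquad a_{ij} \in \mathbb{Z}_2,$$

which is precisely what it means for the $25$ matrices $E_{ij}$ to form a basis.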

Pressing each square corresponds to adding one of $25$ vectors to the current vector. For example, pressing the top left square will add the vector

$$\begin{pmatrix} 1 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{pmatrix}.$$
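In general, with rows and columns numbered $1$ to $5$ and writing $P^{(r,c)}$ for the press matrix (notation chosen here for convenience), pressing square $(r, c)$ adds the matrix with entries

$$P^{(r,c)}_{ij} = \begin{cases} 1 & \text{if } |i - r| + |j - c| \le 1, \\ 0 & \text{otherwise,} \end{cases}$$

so the matrix displayed above is $P^{(1,1)}$.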

We are therefore trying to find a linear combination of these $25$ vectors that sums to the current vector: if the presses add up to the current configuration, they cancel it and leave the all-white grid (remember that $-v = v$ for all our vectors $v$). Note also that over $\mathbb{Z}_2$ a linear combination is just a sum of some subset of the vectors, and that pressing a square twice is the same as not pressing it at all.

So the question I posed to you boils down to asking whether these $25$ vectors span the $25$-dimensional space. By standard results in finite-dimensional linear algebra, this is equivalent to asking whether the set of $25$ vectors is linearly independent.
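If you'd rather check this computationally than by staring at the board, here is a sketch (the helper names are my own): put the $25$ flattened press vectors into the rows of a $25 \times 25$ matrix over $\mathbb{Z}_2$ and compute its rank by Gaussian elimination mod $2$. The rank comes out to $23$, two short of $25$.

```python
import numpy as np

N = 5

def press_vector(r, c):
    """The 5x5 toggle pattern for pressing square (r, c), flattened to a 0/1 vector."""
    v = np.zeros((N, N), dtype=int)
    for dr, dc in [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]:
        rr, cc = r + dr, c + dc
        if 0 <= rr < N and 0 <= cc < N:
            v[rr, cc] = 1
    return v.ravel()

def rank_mod2(M):
    """Rank of a 0/1 matrix over Z_2, by Gaussian elimination mod 2."""
    M = M.copy() % 2
    rank = 0
    for col in range(M.shape[1]):
        # look for a pivot (a 1) in this column, at or below row `rank`
        pivots = np.nonzero(M[rank:, col])[0]
        if len(pivots) == 0:
            continue
        pivot = rank + pivots[0]
        M[[rank, pivot]] = M[[pivot, rank]]          # move the pivot row into place
        for other in np.nonzero(M[:, col])[0]:
            if other != rank:
                M[other] = (M[other] + M[rank]) % 2  # clear every other 1 in the column
        rank += 1
        if rank == M.shape[0]:
            break
    return rank

A = np.array([press_vector(r, c) for r in range(N) for c in range(N)])
print(rank_mod2(A))  # 23, not 25
```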

The answer is no, they are not linearly independent. In particular, starting from the all-white grid, pressing the buttons highlighted in the following picture returns you to the all-white grid, i.e. the additive identity.

*(image: the highlighted set of buttons to press)*
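For concreteness, one press pattern that does send the all-white grid back to itself (it needn't be exactly the pattern in the picture; a $1$ marks a button to press) is

$$\begin{pmatrix} 1 & 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \\ 1 & 0 & 1 & 0 & 1 \\ 1 & 0 & 1 & 0 & 1 \end{pmatrix}.$$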

Therefore, we have a non-trivial linear combination of the vectors equal to the zero vector, so they are linearly dependent, and hence do not span the space. That is, there must exist configurations that cannot be attained, i.e. there are invalid puzzles with no solution.

I found the linear dependency while playing the game myself and noticing some asymmetries in the automatically generated solutions, even when the puzzle itself was symmetric. Proving the linear dependence is as easy as exhibiting the picture above. I still don't know an elegant way to find an example of a puzzle that can't be solved, though! So my proof is somewhat non-constructive, but very easy if you know some linear algebra and are willing to prove that a set is a vector space.


Vector spaces, and vector space-like sets, crop up in a lot of places. The specific example here is artificial, yes, but it is meant to show why it can be worth checking whether a set of vector-like objects, together with two operations (vector addition and scalar multiplication), actually is a vector space or not: once you know it is, all the standard theory becomes available.