How to quickly check if vectors are an orthonormal basis of a vector space?

If you are using a computing environment where matrix operations are fast, you can check that

$$A^T \cdot A = I$$

where $A$ is the matrix whose columns are your basis vectors: $(i_1 \mid i_2 \mid i_3)$.

Note that by the semantics of matrix multiplication, each entry of the result matrix is the dot product of a pair of basis vectors. Hence it matches the definition of orthonormality exactly: the dot product $\langle i_j, i_k \rangle$ is 1 on the diagonal (when $j = k$) and 0 elsewhere (when $j \ne k$).
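
For example, a minimal NumPy sketch of this check (the helper name `is_orthonormal` and the tolerance are my own choices; floating-point arithmetic makes an exact comparison with $I$ too strict):

```python
import numpy as np

def is_orthonormal(A, atol=1e-10):
    """Return True if the columns of A form an orthonormal set.

    Computes A^T A and compares it to the identity up to a
    floating-point tolerance (exact equality is too strict).
    """
    A = np.asarray(A, dtype=float)
    gram = A.T @ A  # entry (j, k) is the dot product <i_j, i_k>
    return np.allclose(gram, np.eye(A.shape[1]), atol=atol)

# Example: a rotation by 45 degrees about the z-axis maps the
# standard basis of R^3 to another orthonormal basis.
c, s = np.cos(np.pi / 4), np.sin(np.pi / 4)
A = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])
print(is_orthonormal(A))  # True
```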


That is correct.

  1. All vectors need to be linearly independent

This is by definition the case for any basis: the vectors have to be linearly independent and span the vector space. An orthonormal basis is indeed more specific; its vectors are additionally:

  • all orthogonal to each other: "ortho";
  • all of unit length: "normal".

Note that any basis can be turned into an orthonormal basis by applying the Gram-Schmidt process.
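
As a rough sketch of that process in NumPy (the modified Gram-Schmidt variant; the function name `gram_schmidt` is just illustrative, and the input columns are assumed linearly independent):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (modified Gram-Schmidt).

    Assumes the columns are linearly independent; otherwise one of
    the normalization steps divides by (nearly) zero.
    """
    A = np.asarray(A, dtype=float)
    Q = np.zeros_like(A)
    for k in range(A.shape[1]):
        v = A[:, k].copy()
        for j in range(k):
            # Remove the component along each earlier unit vector.
            v -= (Q[:, j] @ v) * Q[:, j]
        Q[:, k] = v / np.linalg.norm(v)  # scale to unit length
    return Q

# A basis of R^3 that is neither orthogonal nor normalized...
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
Q = gram_schmidt(B)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: now orthonormal
```

In practice, `np.linalg.qr` is the more numerically robust way to obtain an orthonormal basis of the same column span.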


A few remarks (added after the comments):

  • the vector space needs to be equipped with an inner product in order to talk about orthogonality of vectors (you are then working in a so-called inner product space);
  • if all vectors are mutually orthogonal (and nonzero, which unit length guarantees), then they are automatically linearly independent, so you do not have to check independence separately if you check orthogonality.

What you write down is correct, and that is exactly the definition of an orthonormal basis.

In Euclidean space, linear independence can be checked via the determinant: the vectors are linearly independent exactly when the determinant of the matrix they form is nonzero. That determinant test is the one thing that makes the matter easier, but it says nothing about orthogonality or unit length.
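
For illustration, a quick NumPy sketch of that determinant check (note that a nonzero determinant only certifies linear independence, not orthonormality):

```python
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
# Nonzero determinant: the columns are linearly independent...
print(abs(np.linalg.det(B)) > 1e-12)    # True
# ...but they are still not orthonormal.
print(np.allclose(B.T @ B, np.eye(3)))  # False
```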