Rotations in space-time

Take a system of four mutually orthogonal vectors aligned with the four axes x, y, z, t of 4D space. Apply an arbitrary rotation, moving the four vectors to some other positions; they remain mutually orthogonal, of course. We now need to show that we can rotate the vectors back to their starting positions using rotations in the six coordinate planes. The inverses of these, applied in reverse order, would then be the six rotations required to reproduce the original rotation.

Start with the vector that was originally aligned with the x-axis; we need to rotate it back there. First use a rotation in the xy plane to move it into the xzt hyperplane. Then use a rotation in the xz plane to move it into the xt plane. Finally, use a rotation in the xt plane to line it up with the x-axis, as required.

The other three orthogonal vectors will have been rotated as well, but since they remain orthogonal to the first vector, they now lie in the yzt hyperplane orthogonal to the x-axis. You can now apply the three rotations in the remaining planes yz, zt and yt to line them up with their original axes, using the solution for 3D space. So it's done.
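Here is a minimal numerical sketch of this procedure in Python with numpy (the names `plane_rotation` and `decompose` are just illustrative, not from any standard library): it reduces an element of SO(4) to the identity using six planar rotations, one per coordinate plane, and then rebuilds it from their inverses.

```python
import numpy as np

def plane_rotation(n, i, j, theta):
    """Rotation by theta in the (x_i, x_j) coordinate plane of R^n."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i], G[i, j] = c, s
    G[j, i], G[j, j] = -s, c
    return G

def decompose(R):
    """Peel a rotation R in SO(n) into n(n-1)/2 planar rotations.

    Mirrors the argument above: for each axis in turn, planar rotations
    zero out the other components of the corresponding column, reducing
    R to the identity; the inverses of those rotations then rebuild R.
    """
    n = R.shape[0]
    A = R.copy()
    steps = []
    for col in range(n - 1):
        for row in range(col + 1, n):
            # Rotate in the (col, row) plane to kill the row-component.
            theta = np.arctan2(A[row, col], A[col, col])
            steps.append((col, row, theta))
            A = plane_rotation(n, col, row, theta) @ A
    assert np.allclose(A, np.eye(n))  # fully reduced to the identity
    # A was reduced by G_m ... G_1 R = I, so R = G_1^{-1} ... G_m^{-1}.
    return [(i, j, -t) for (i, j, t) in steps]

# Check on a random element of SO(4): the Q factor of a QR factorization
# is orthogonal; flip a column if needed to make det = +1.
Q, _ = np.linalg.qr(np.random.randn(4, 4))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

factors = decompose(Q)            # exactly 6 planar rotations in 4D
R = np.eye(4)
for i, j, t in factors:
    R = R @ plane_rotation(4, i, j, t)
print(np.allclose(R, Q))          # True
```

For a 4x4 input, the six planes visited are (0,1), (0,2), (0,3), (1,2), (1,3), (2,3), i.e., exactly the xy, xz, xt, yz, yt and zt planes described above.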


Revised answer:

I think Landau is referring to the extrinsic Euler angle parametrization of rotations (also known as the method of Givens rotations or Jacobi rotations). Basically, there is an explicit algorithm by which one can realize any orientation-preserving orthogonal transformation as a (highly non-unique) sequence of rotations in pairs of coordinates. You can prove the existence of this decomposition in general by induction on the number of coordinates, and this is essentially what Philip Gibbs did in his answer for the case of dimension 4.


Original answer:

The only way I know how to make Landau's statement both precise and correct is to say that the vector space of first-order infinitesimal rotations in 4 dimensions is spanned by infinitesimal rotations in the 6 pairs of axes. (In particular, the word "resolved" here is a bit of a puzzle to me.)

Any linear transformation in $n$ dimensions (including any rotation) can be written as an $n \times n$ matrix, where for each $k$ between $1$ and $n$, the $k$th column of the matrix gives the coordinates of where the $k$th basis vector goes. In order for a transformation to be a rotation, we need the lengths of the vectors to be preserved, and we need the angles between them to stay the same. We can encode these conditions in a succinct equation asserting that our matrix times its transpose is the identity. The set of such transformations is given by the solutions to the matrix equation, so it forms an algebraic subset of the $n^2$-dimensional space of matrices. In fact, it has the structure of a Lie group, called the orthogonal group $O(n)$.
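As a quick sanity check (a minimal numpy sketch, not part of the original argument), one can generate an orthogonal matrix and verify both the matrix equation and the preservation of lengths and angles:

```python
import numpy as np

# The Q factor of a QR factorization of an invertible matrix is orthogonal.
Q, _ = np.linalg.qr(np.random.randn(4, 4))

# Defining equation of O(4): the matrix times its transpose is the identity.
print(np.allclose(Q @ Q.T, np.eye(4)))       # True

# Equivalently, dot products (hence lengths and angles) are preserved.
u, v = np.random.randn(4), np.random.randn(4)
print(np.isclose(u @ v, (Q @ u) @ (Q @ v)))  # True
```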

We can describe infinitesimal rotations by adjoining an infinitesimal element $\epsilon$ to our number system, satisfying $\epsilon \neq 0$ and $\epsilon^2 = 0$. An infinitesimal transformation is a matrix of the form $I + \epsilon M$, where $I$ is the $n \times n$ identity matrix, and $M$ is any matrix with real (non-infinitesimal) entries. In order for this to be a rotation, it is necessary and sufficient that the matrix equation $(I + \epsilon M)(I + \epsilon M)^T = I$ is satisfied. The left side expands to $I + \epsilon M + \epsilon M^T$ (the $\epsilon^2 M M^T$ term vanishes), so the equation becomes $\epsilon (M + M^T) = 0$. Since the entries of $M$ and $M^T$ are non-infinitesimal, this is equivalent to $M$ being skew-symmetric, i.e., $M = -M^T$. That is, the space of first-order infinitesimal rotations is the space of matrices of the form $I + \epsilon M$ with $M$ skew-symmetric; the space of such $M$ is also called the Lie algebra of the group $O(n)$.
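One can imitate the nilpotent $\epsilon$ numerically with a small but finite number (a hedged sketch; `eps` here merely stands in for $\epsilon$): the orthogonality defect of $I + \epsilon M$ is of order $\epsilon^2$ precisely when $M$ is skew-symmetric.

```python
import numpy as np

eps = 1e-6
I = np.eye(4)
A = np.random.randn(4, 4)

M_skew = A - A.T   # satisfies M + M^T = 0
defect = (I + eps * M_skew) @ (I + eps * M_skew).T - I
print(np.abs(defect).max())   # ~ 1e-12, i.e. order eps**2

M_generic = A      # generically M + M^T != 0
defect = (I + eps * M_generic) @ (I + eps * M_generic).T - I
print(np.abs(defect).max())   # ~ 1e-6, i.e. order eps: fails at first order
```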

It remains to find a set that spans the space of skew-symmetric matrices. A natural method is given by taking all pairs of distinct coordinates and, for each pair, choosing an antisymmetric combination, i.e., $e_{ij} - e_{ji}$, where $e_{ij}$ is the matrix that has a 1 in the $i$th row and $j$th column and zeroes elsewhere. This forms a linearly independent set of size $\binom{n}{2}$, which is the dimension of the space of skew-symmetric matrices. If a matrix $M$ has the form $e_{ij} - e_{ji}$, we can think of it as an infinitesimal rotation in the $x_i x_j$ plane, since exponentiating yields the rotation: $$e^{tM} = (e_{ii} + e_{jj}) \cos t + (e_{ij} - e_{ji}) \sin t + \sum_{k \not \in \{i,j \} } e_{kk}.$$ In dimensions 3 and 4, the rotations you listed are precisely those given by pairs of distinct coordinates.
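To check the exponentiation formula concretely (a small scipy sketch; the particular indices and angle are arbitrary choices), compare `expm` of the generator $t(e_{ij} - e_{ji})$ with the closed form above:

```python
import numpy as np
from scipy.linalg import expm

n, i, j, t = 4, 0, 3, 0.7           # e.g. the x-t plane in 4D

# Generator e_ij - e_ji of rotations in the (i, j) plane.
M = np.zeros((n, n))
M[i, j], M[j, i] = 1.0, -1.0

# Closed form: cos/sin block in coordinates i, j, identity elsewhere.
R = np.eye(n)
R[i, i] = R[j, j] = np.cos(t)
R[i, j], R[j, i] = np.sin(t), -np.sin(t)

print(np.allclose(expm(t * M), R))  # True
```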


Just to expand on my comment more: essentially what we are looking at is the special orthogonal group in dimension 4, often abbreviated SO(4). "Special" because its determinant is 1, and "orthogonal" because any $M \in SO(K)$ satisfies $MM^T = I$. (In special relativity it is actually SO(3,1), because you are looking at Lorentzian rather than Euclidean space, so there is a sign difference.) With SO(K), you can rotate axis 1 into axes 2, 3, ..., K; axis 2 can then be rotated into axes 3, 4, ..., K; and so on. Therefore, SO(K) has $(K-1)+(K-2)+\dots+2+1 = \frac{K(K-1)}{2}$ generators.
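The counting can be spelled out by just listing the coordinate planes (a trivial Python sketch):

```python
from itertools import combinations

K = 4
planes = list(combinations(range(K), 2))
print(planes)                           # the 6 coordinate planes of R^4
print(len(planes) == K * (K - 1) // 2)  # True
```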

Intuitively it sorta makes sense that any rotation can be decomposed this way: thinking of the 3-dimensional case, any rotation can be seen as rotating one way and then rotating the other way (very specific, right? :P). Four dimensions just adds another layer on top of that, as you can also rotate any of the other 3 axes into the 4th (called a boost if the 4th is time).