Why are anti-diagonal / persymmetric matrices not as important as diagonal / symmetric matrices?

I don't know how satisfying this answer will be, but I'll give it a shot anyway. The punchline, I think, is that although the "anti-diagonal" properties have just as much aesthetic appeal as their "diagonal" counterparts, the diagonal properties happen to give us information that is more useful for a matrix as it is used mathematically. That is, diagonal symmetry is a more natural thing to look for in the context of linear algebra.

First of all, note that all of these properties are properties of square (that is, $n \times n$) matrices, which are implicitly linear maps from $\Bbb F^n$ to $\Bbb F^n$ (that is, they produce vectors of $n$ entries from vectors of $n$ entries).

The properties that we really care about in linear algebra are the ones that tell us something about how matrices interact with vectors (and ultimately, with other matrices).

Diagonal Matrices

Diagonal matrices are important because they describe a particularly nice class of linear transformations. In particular: $$ \pmatrix{d_1\\&d_2\\&&\ddots \\ &&& d_n} \pmatrix{x_1\\ x_2\\ \vdots \\ x_n} = \pmatrix{d_1 x_1\\ d_2 x_2\\ \vdots \\ d_n x_n} $$ I would say that what a diagonal matrix represents is the fact that the $n$ variables required to specify a vector are decoupled. For example, in order to find the new $x_2$, one only needs to look at the old $x_2$ and do what the matrix says (namely, multiply it by $d_2$).
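
If it helps to see the decoupling concretely, here is a minimal numerical sketch (using numpy; the particular entries are arbitrary choices of mine, not something from the discussion above):

```python
import numpy as np

# Applying a diagonal matrix multiplies each coordinate by its own d_i,
# independently of the other coordinates.  The numbers are arbitrary.
d = np.array([2.0, -1.0, 0.5])
x = np.array([3.0, 4.0, 6.0])

D = np.diag(d)     # the diagonal matrix with entries d_1, ..., d_n
print(D @ x)       # [ 6. -4.  3.]
print(d * x)       # the same result: entrywise products d_i * x_i
```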

When we "diagonalize" a matrix, we're finding a way to describe each vector (that is, $n$ independent "pieces of information") that are similarly decoupled as far as the transformation is concerned. So, for example, the matrix $$ A = \pmatrix{0&1\\4&0} $$ takes a vector $x = (x_1,x_2)$ and produces a new vector $Ax = (x_2,2x_1)$. There's a nice symmetry to that; in particular, applying $A$ twice gives us the vector $A^2 x = (2x_1,2x_2)$, which is to say that $A$ acts like a diagonal matrix whenever you apply it an even number of times.

However, I would argue that we get a clearer picture of what $A$ does if we diagonalize it. In particular, if we write a vector as $x = a_1(1,2) + a_2(1,-2)$, $A$ gives us the new vector $$ Ax = a_1 A(1,2) + a_2 A(1,-2) = 2a_1(1,2) - 2a_2(1,-2) $$ That is, once we know the two pieces of information $a_1$ and $a_2$, we can figure out the new vector using these pieces separately, without having them interact.
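
If you'd like to verify these computations numerically, here is a small sketch (numpy again; not part of the argument, just a check of the example matrix):

```python
import numpy as np

A = np.array([[0, 1],
              [4, 0]])

# Applying A twice gives a diagonal matrix: A @ A == 4 * I.
print(A @ A)                   # [[4 0]
                               #  [0 4]]

# The decoupled directions: A scales (1, 2) by 2 and (1, -2) by -2.
print(A @ np.array([1, 2]))    # [2 4]   == 2 * (1, 2)
print(A @ np.array([1, -2]))   # [-2 4]  == -2 * (1, -2)
```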

So, we see that this anti-diagonal $A$ is nice, but not nearly as simple as the "diagonal version" of the transformation.

Symmetric Matrices

Symmetric matrices are particularly nice when we care about dot products. Dot products are needed whenever you want to think about the angle between vectors in some capacity.

In particular: if we define the dot product $$ (x_1,\dots,x_n) \cdot (y_1,\dots,y_n) = x_1y_1 + \cdots + x_n y_n $$ then a symmetric $A$ will have the property that $$ (Ax) \cdot y = x \cdot (Ay) $$ Ultimately, this whole thing connects back to diagonal matrices, since every symmetric matrix can be diagonalized in the sense described above. The fact that this can always be done is known as the spectral theorem.
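
Here is a quick numerical check of that property (a sketch using numpy; the random symmetric matrix and vectors are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric matrix by symmetrizing a random one.
B = rng.standard_normal((4, 4))
A = B + B.T                 # A.T == A

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# The compatibility with the dot product: (Ax) . y == x . (Ay).
print(np.dot(A @ x, y))
print(np.dot(x, A @ y))     # same value, up to floating-point rounding
```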

Persymmetric matrices, however, don't interact in a particularly nice way with any of the usual operations (like the dot product).


$\newcommand{\dd}{\partial}$If $A$ is an $n \times n$ matrix, and if $x$ and $y$ are columns satisfying $y = Ax$, then the $(i, j)$-entry of $A$ measures the "sensitivity" of $y_{i}$ to changes in $x_{j}$. Precisely, if every variable except $x_{j}$ is held constant, then a change of $\Delta x_{j}$ in $x_{j}$ changes $y_{i}$ by $A_{ij}\,\Delta x_{j}$. (This is immediate from the definition of matrix multiplication, and explains why the derivative matrix of a vector-valued function of $n$ variables has the partial derivative $\dd y_{i}/\dd x_{j}$ in the $(i, j)$ entry.)
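
A quick way to see this sensitivity claim in action (a numpy sketch with arbitrary numbers, not part of the answer itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# If y = A x and only x_j changes by dx, then y_i changes by A[i, j] * dx.
A = rng.standard_normal((3, 3))
x = rng.standard_normal(3)
dx = 0.01
j = 2

x_new = x.copy()
x_new[j] += dx

print(A @ x_new - A @ x)   # the change in y ...
print(A[:, j] * dx)        # ... equals column j of A times dx (exactly, by linearity)
```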

To say $A$ is diagonal is to say $y_{i}$ depends only on $x_{i}$, the variable with the same index. This condition is invariant under relabeling variables. By contrast, anti-diagonality says $y_{i}$ depends only on $x_{n+1-i}$, which refers to the order in which the variables happen to be listed, so it is not invariant under relabeling.
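
To make the relabeling point concrete, here is a small sketch (numpy; the $3 \times 3$ matrices and the particular permutation are just my illustration). Relabeling the variables amounts to conjugating $A$ by a permutation matrix, which preserves diagonality but not anti-diagonality:

```python
import numpy as np

# Swap the first two variables; relabeling acts on A by P.T @ A @ P.
P = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

D = np.diag([1, 2, 3])
print(P.T @ D @ P)          # still diagonal: diag(2, 1, 3)

A = np.array([[0, 0, 1],
              [0, 2, 0],
              [3, 0, 0]])   # anti-diagonal
print(P.T @ A @ P)          # no longer anti-diagonal: a diagonal entry appears
```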

I don't know of a similarly naive reason that symmetry is so important, but naturally the spectral theorem for matrices guarantees a symmetric matrix is orthogonally diagonalizable; that is, up to a rotational change of variables, $A$ is diagonal.
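
For completeness, a sketch of that orthogonal diagonalization using numpy's `eigh` (the random symmetric matrix is my own example):

```python
import numpy as np

rng = np.random.default_rng(2)

B = rng.standard_normal((4, 4))
A = B + B.T                      # symmetric

# eigh returns eigenvalues w and an orthogonal matrix Q of eigenvectors.
w, Q = np.linalg.eigh(A)
print(np.allclose(Q.T @ Q, np.eye(4)))        # True: the change of variables is orthogonal
print(np.allclose(Q.T @ A @ Q, np.diag(w)))   # True: in those coordinates, A is diagonal
```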