Are all eigenvectors, of any matrix, always orthogonal?

In general, the eigenvectors of a matrix are NOT always orthogonal. But for a special class of matrix, the symmetric matrix, the eigenvalues are always real, eigenvectors corresponding to distinct eigenvalues are always orthogonal, and a full orthogonal set of eigenvectors can always be chosen.

For any matrix M with n rows and m columns, multiplying M by its transpose, either M*M' or M'*M, results in a symmetric matrix, so the eigenvectors of this symmetric matrix are always orthogonal.
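A quick numerical check of both claims in NumPy (the specific matrices A and M below are just illustrative examples, not anything from the question):

```python
import numpy as np

# A generic non-symmetric matrix: its eigenvectors need not be orthogonal
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])
_, vecs_A = np.linalg.eig(A)              # eigenvectors are the columns of vecs_A
print(vecs_A[:, 0] @ vecs_A[:, 1])        # nonzero: the two eigenvectors are not orthogonal

# M'*M is symmetric for any M, so its eigenvectors are orthogonal
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 3))           # arbitrary 4x3 matrix
S = M.T @ M                               # 3x3 symmetric matrix
_, vecs_S = np.linalg.eigh(S)             # eigh exploits symmetry and returns orthonormal eigenvectors
print(np.allclose(vecs_S.T @ vecs_S, np.eye(3)))   # True
```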

In the application of PCA, a dataset of n samples with m features is usually represented as an n*m matrix D. Assuming the features are centered, the variances and covariances among those m features can be represented (up to a constant factor) by the m*m matrix D'*D, which is symmetric: the entries on the diagonal represent the variance of each feature, and the entry in row i, column j represents the covariance between features i and j. PCA is applied to this symmetric matrix, so the eigenvectors are guaranteed to be orthogonal.
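A minimal sketch of that PCA step on a toy dataset (the data D below is random and purely illustrative; the features are centered before forming the covariance matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.standard_normal((100, 3))         # n = 100 samples, m = 3 features
Dc = D - D.mean(axis=0)                   # center each feature

C = Dc.T @ Dc / (Dc.shape[0] - 1)         # m x m covariance matrix (symmetric)
eigvals, eigvecs = np.linalg.eigh(C)      # eigh is designed for symmetric matrices

# Columns of eigvecs are the principal directions; they form an orthonormal set
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))   # True
```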


Fix two linearly independent vectors $u$ and $v$ in $\mathbb{R}^2$, and define $Tu=u$ and $Tv=2v$. Then extend $T$ linearly to a map from $\mathbb{R}^2$ to itself. The eigenvectors of $T$ are $u$ and $v$ (or any nonzero multiple of them). Of course, $u$ need not be perpendicular to $v$.
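To make this concrete, here is a small NumPy check with one explicit (and deliberately non-orthogonal) choice of $u$ and $v$; the particular vectors are just an example:

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])                  # independent of u, but not perpendicular to it
P = np.column_stack([u, v])

# T is defined by Tu = u and Tv = 2v; in matrix form, T = P diag(1, 2) P^{-1}
T = P @ np.diag([1.0, 2.0]) @ np.linalg.inv(P)

print(np.allclose(T @ u, u))              # True: u is an eigenvector with eigenvalue 1
print(np.allclose(T @ v, 2 * v))          # True: v is an eigenvector with eigenvalue 2
print(u @ v)                              # 1.0, so the eigenvectors are not orthogonal
```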


In the context of PCA, it is usually applied to a positive semi-definite matrix, such as a matrix cross product $X'X$, or a covariance or correlation matrix.

In this PSD case, all eigenvalues satisfy $\lambda_i \ge 0$, and if $\lambda_i \ne \lambda_j$, then the corresponding eigenvectors are orthogonal. If $\lambda_i = \lambda_j$, then any two orthogonal vectors spanning that eigenspace serve as eigenvectors.
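A small illustration of the repeated-eigenvalue case, using an assumed diagonal PSD matrix with $\lambda = 3$ occurring twice: any orthogonal pair chosen inside that two-dimensional eigenspace works.

```python
import numpy as np

# A PSD matrix with a repeated eigenvalue: lambda = 3 (twice) and lambda = 1
S = np.diag([3.0, 3.0, 1.0])

# Every vector in the x-y plane is an eigenvector for lambda = 3,
# so we are free to pick any orthogonal pair within that eigenspace.
theta = 0.7                                # arbitrary rotation angle
e1 = np.array([np.cos(theta), np.sin(theta), 0.0])
e2 = np.array([-np.sin(theta), np.cos(theta), 0.0])

print(np.allclose(S @ e1, 3 * e1))         # True
print(np.allclose(S @ e2, 3 * e2))         # True
print(np.isclose(e1 @ e2, 0.0))            # True: an orthogonal choice within the eigenspace
```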