Comparing LU and QR decompositions for solving least squares

Matrix Computations

Golub and Van Loan, 3e

$\S$ 5.7.1, p. 270

Comparing flop counts for operations on $n\times n$ matrices:

| Flop count | Algorithm |
|---|---|
| $\frac{2}{3}n^{3}$ | Gaussian elimination |
| $\frac{4}{3}n^{3}$ | Householder orthogonalization |
| $2n^{3}$ | Modified Gram-Schmidt |
| $\frac{8}{3}n^{3}$ | Bidiagonalization |
| $12n^{3}$ | Singular value decomposition |
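To make the comparison concrete, here is a small sketch (my example, not from the book) solving the same square system with three of the factorizations above, using NumPy/SciPy:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, qr, solve_triangular

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

# LU (Gaussian elimination with partial pivoting): ~(2/3) n^3 flops
lu, piv = lu_factor(A)
x_lu = lu_solve((lu, piv), b)

# Householder QR: ~(4/3) n^3 flops; then solve R x = Q^T b
Q, R = qr(A)
x_qr = solve_triangular(R, Q.T @ b)

# SVD: ~12 n^3 flops; x = V diag(1/s) U^T b
U, s, Vt = np.linalg.svd(A)
x_svd = Vt.T @ ((U.T @ b) / s)

print(np.allclose(x_lu, x_qr), np.allclose(x_lu, x_svd))
```

On a well-conditioned random matrix all three agree to working precision; the differences show up only in cost and in how they behave near singularity.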

Three reasons to choose orthogonalization to solve square systems:

  1. Flop counts exaggerate the Gaussian elimination advantage. When memory traffic and vectorization overheads are considered, the $\mathbf{Q}\mathbf{R}$ factorization is comparable in efficiency.

  2. Orthogonalization methods have guaranteed stability; there is no "growth factor" to worry about as in Gaussian elimination.

  3. In cases of ill-conditioning, the orthogonalization methods give an added measure of reliability. $\mathbf{Q}\mathbf{R}$ with condition estimate is very dependable and, of course, SVD is unsurpassed when it comes to producing a meaningful solution to a nearly singular system.
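As an illustration of the third point (my sketch, not the book's), column-pivoted QR gives a cheap condition estimate from the diagonal of $R$, which the SVD's singular values confirm. A Hilbert matrix, which is famously ill-conditioned, makes the effect visible:

```python
import numpy as np
from scipy.linalg import hilbert, qr

A = hilbert(10)  # classic ill-conditioned test matrix

# Column-pivoted QR: the decay of |R[k, k]| gives a cheap rank/condition estimate
Q, R, piv = qr(A, pivoting=True)
r_diag = np.abs(np.diag(R))
print("QR condition estimate:", r_diag[0] / r_diag[-1])

# The SVD gives the exact 2-norm condition number
s = np.linalg.svd(A, compute_uv=False)
print("SVD condition number: ", s[0] / s[-1])
```

The QR estimate is not exact, but it reliably flags the near-singularity at a fraction of the SVD's cost.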


Your reasoning at the top is really odd. The LU decomposition is twice as fast as the standard QR decomposition, and it will solve most systems. There are, however, pathologically ill-conditioned systems and non-square systems; that is where you would use QR or the SVD. The main reason for the SVD is that it lets you be selective about your condition number.
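"Selective about your condition number" can be made concrete with a truncated-SVD solve: discard singular values below `rcond` times the largest one, which caps the effective condition number at roughly `1/rcond`. A hedged sketch using NumPy's `pinv`:

```python
import numpy as np
from scipy.linalg import hilbert

A = hilbert(12)  # condition number around 1e16, effectively singular in doubles
s = np.linalg.svd(A, compute_uv=False)

# Keep only singular values above rcond * s_max; this caps the
# effective condition number of the solve at roughly 1/rcond
rcond = 1e-8
kept = s[s > rcond * s[0]]
print("kept", kept.size, "of", s.size, "singular values")
print("effective condition number:", kept[0] / kept[-1])

# Truncated-SVD solve via the pseudoinverse with the same cutoff
b = np.ones(12)
x = np.linalg.pinv(A, rcond=rcond) @ b
print("residual:", np.linalg.norm(A @ x - b))
```

Choosing `rcond` is choosing how much near-null-space noise you are willing to discard; neither LU nor plain QR offers that knob.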

There are many other decompositions. The Cholesky decomposition is twice as fast as the LU decomposition, but it applies only to positive definite Hermitian matrices. All of this neglects the sparsity of the matrix as well.
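A quick sketch of the Cholesky point (my example), using SciPy's `cho_factor`/`cho_solve` on a matrix made positive definite by construction:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(2)
n = 100
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)   # symmetric positive definite by construction
b = rng.standard_normal(n)

# Cholesky: ~(1/3) n^3 flops, half the cost of LU, and no pivoting needed
c, low = cho_factor(A)
x = cho_solve((c, low), b)

print(np.allclose(A @ x, b))
```

For sparse matrices the picture changes again: factorizations such as `scipy.sparse.linalg.splu` exploit the sparsity pattern, and fill-in rather than flop count often dominates the cost.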