What is the Big-O of linear regression?

You can express this as a matrix equation:

    A x = b

where the data matrix A is 300K rows and 1200 columns, the coefficient vector x is 1200x1, and the right-hand-side vector b is 300Kx1.

If you multiply both sides by the transpose A', you get the normal equations A'A x = A'b, a square system for the unknown coefficients that's 1200x1200. You can use LU decomposition or any other algorithm you like to solve for the coefficients. (This is what least squares is doing.)
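A minimal sketch of that approach in Python, assuming NumPy and SciPy are available; the sizes are scaled down from 300K x 1200 so it runs in a moment:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Scaled-down stand-ins for the 300K x 1200 problem in the question.
m, n = 3000, 120
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))   # data matrix, m x n
b = rng.standard_normal(m)        # right-hand side, m x 1

# Form the normal equations A'A x = A'b (an n x n system).
AtA = A.T @ A                     # O(m*n^2)
Atb = A.T @ b                     # O(m*n)

# Solve the square system with an LU decomposition.
lu, piv = lu_factor(AtA)          # O(n^3)
x = lu_solve((lu, piv), Atb)      # O(n^2) forward/back substitution
```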

So the Big-O behavior is something like O(mn^2), where m = 300K and n = 1200. You'd account for the transpose product A'A (O(mn^2)), the product A'b (O(mn)), the LU decomposition (O(n^3)), and the forward-back substitution (O(n^2)) to get the coefficients; with m >> n, the O(mn^2) term dominates.
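To put rough numbers on it: forming A'A costs about m*n^2 = 300,000 * 1200^2 ≈ 4.3 * 10^11 multiply-adds, while the LU factorization costs on the order of n^3 = 1200^3 ≈ 1.7 * 10^9, so for this shape the matrix product dominates by a factor of roughly 250.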


The linear regression coefficients are computed as β = (X'X)^-1 X'Y.

If X is an (n x k) matrix:

  1. (X' X) takes O(n*k^2) time and produces a (k x k) matrix

  2. The matrix inversion of a (k x k) matrix takes O(k^3) time

  3. (X' Y) takes O(n*k) time and produces a (k x 1) vector

  4. The final multiplication of the (k x k) inverse by the (k x 1) vector takes O(k^2) time

So the Big-O running time is O(k^2*(n + k)).
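A sketch of those four steps in NumPy, with the explicit inverse kept only to mirror the formula (in practice you'd use a linear solver instead of forming the inverse); the sizes here are arbitrary stand-ins:

```python
import numpy as np

n, k = 3000, 120                  # arbitrary stand-in sizes
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))
Y = rng.standard_normal((n, 1))

XtX = X.T @ X                     # step 1: O(n*k^2), (k x k)
XtX_inv = np.linalg.inv(XtX)      # step 2: O(k^3),   (k x k)
XtY = X.T @ Y                     # step 3: O(n*k),   (k x 1)
beta = XtX_inv @ XtY              # step 4: O(k^2),   (k x 1)

# More numerically stable equivalent: solve rather than invert.
assert np.allclose(beta, np.linalg.solve(XtX, XtY))
```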

See also: http://en.wikipedia.org/wiki/Computational_complexity_of_mathematical_operations#Matrix_algebra

If you get fancy, it looks like you can get the time down to O(k^2*(n + k^0.376)) with the Coppersmith–Winograd algorithm: matrix inversion can be done in the same asymptotic time as matrix multiplication, which Coppersmith–Winograd performs in O(k^2.376), so the k^3 inversion term drops to k^2 * k^0.376.
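As a practical aside (not from the answers above): NumPy's np.linalg.lstsq solves the same problem without forming X'X, via an SVD-based LAPACK driver, and stays in the same O(n*k^2) regime for n >= k. A quick consistency check against the normal-equations result:

```python
import numpy as np

n, k = 3000, 120
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))
Y = rng.standard_normal((n, 1))

beta_normal = np.linalg.solve(X.T @ X, X.T @ Y)     # normal equations
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)  # SVD-based solver
assert np.allclose(beta_normal, beta_lstsq)
```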