Why $A$ invertible $\iff \det A\neq 0$

Well, the identity $A^{-1} = \operatorname{adj}(A)/\det(A)$ is also an important result, and it settles both directions of the equivalence.
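For concreteness, in the $2 \times 2$ case the formula reads

$$ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad A^{-1} = \frac{\operatorname{adj}(A)}{\det(A)} = \frac{1}{ad-bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, $$

which is defined exactly when $\det(A) = ad - bc \neq 0$. In the other direction, if $A^{-1}$ exists, then $\det(A)\det(A^{-1}) = \det(I) = 1$, so $\det(A) \neq 0$.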


To continue the proof using your approach:

Use the definition of $\det(\cdot)$ as the unique alternating multilinear function of the columns of the matrix such that $\det(I) = 1$ (as opposed to the definition that gives an algebraic expression for $\det(\cdot)$ in terms of permutations).

$\det(A) \ne 0 \implies $ columns of $A$ are linearly independent $ \implies A$ is full rank.

To get the first arrow, you can do a proof by contradiction. So assume the columns of $A$ are dependent and $A_1 = \sum_{i=2}^n c_iA_i$ for some scalars $c_i$ (reordering the columns if necessary). By multilinearity and the alternating property, subtracting $\sum_{i=2}^n c_iA_i$ from the first column does not change the determinant, so $\det(A_1, \dots, A_n) = \det\bigl(A_1 - \sum_{i=2}^n c_iA_i,\ A_2, \dots, A_n\bigr) = \det(0, A_2, \dots, A_n) = 0$.

This contradicts the assumption that $\det(A) \ne 0$.

To get the second arrow, it's just the definition of rank: $n$ linearly independent columns span an $n$-dimensional column space, so $\operatorname{rank}(A) = n$.

So $A$ is surjective. By the rank–nullity theorem, it is also injective. Therefore it is bijective, proving the existence of $A^{-1}$ as a function. It is then a small step to prove that $A^{-1}$ is linear, and then you can complete the proof.
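Spelled out, the rank–nullity step is

$$ \dim\ker(A) + \operatorname{rank}(A) = n, \qquad \operatorname{rank}(A) = n \implies \dim\ker(A) = 0, $$

so $Ax = 0$ only for $x = 0$, i.e. $A$ is injective.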


To elaborate on some of the other answers.

The determinant is an alternating multilinear form on the columns (or rows) of a matrix. That means that if you write your matrix as $(v_1, \dots, v_n)$ where $v_i \in \mathbf{R}^n$ are the columns of the matrix, then

  1. $\det$ is linear in each column

  2. If $v_i = v_j$ for some $i \ne j$, then $\det(v_1, \dots, v_n) = 0$.

These two properties imply that if $v_1,\dots,v_n$ are linearly dependent, then $\det(v_1,\dots, v_n) = 0$. For suppose that $v_1, \dots, v_n$ are linearly dependent. Then we can write one of the columns as a linear combination of the others, and for simplicity, we will suppose that the first column can be written as a linear combination of the others. Thus, let

$$v_1 = \sum_{i = 2}^n \alpha_i v_i. $$

Then, we have by 1. and 2.,

\begin{align*} \det(v_1,v_2,\dots,v_n) &= \det \left( \sum_{i = 2}^n \alpha_i v_i,v_2,\dots,v_n\right) \\ &= \sum_{i = 2}^n \alpha_i\det \left(v_i,v_2,\dots,v_n\right) \tag{by 1.} \\ &= \sum_{i = 2}^n 0 \tag{by 2.} \\ &= 0 \end{align*}

Thus if $A$ is not invertible, then the columns of $A$ are linearly dependent, so $\det A = 0$. This is the first proof.
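For a concrete instance, take $n = 2$ with dependent columns $v_2 = 2v_1$:

$$ \det\begin{pmatrix} 1 & 2 \\ 3 & 6 \end{pmatrix} = 1\cdot 6 - 2\cdot 3 = 0, $$

exactly as the computation above predicts.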


For the second proof, in terms of elementary matrices, we know that there are 3 kinds of elementary row (or column) operations:

  1. Scale any row by a non-zero $\alpha \in \mathbf{R}$
  2. Swap any two rows
  3. Add $\alpha$ times one row to a different row
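Concretely, in the $3 \times 3$ case one elementary matrix of each kind is (scaling the second row by $\alpha$, swapping the first two rows, and adding $\alpha$ times the first row to the third, respectively):

$$ \begin{pmatrix} 1 & 0 & 0 \\ 0 & \alpha & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ \alpha & 0 & 1 \end{pmatrix}, $$

with determinants $\alpha$, $-1$, and $1$, respectively.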

Each row (or column) operation can be written in terms of matrix multiplication: on the left by an elementary matrix for a row operation, and on the right for a column operation. Moreover, if $E$ is the elementary matrix that performs the operation, then $\det E$ is $\alpha$ in case 1, $-1$ in case 2, and $1$ in case 3; in every case $\det E \ne 0$. When we reduce $A$ by row and column operations, we multiply $A$ on the left by a sequence $E_1, E_2, \dots, E_k$ of elementary matrices and on the right by a sequence $F_1, F_2, \dots, F_l$ to get

$$ E_kE_{k-1}\cdots E_2E_1 \, A \, F_1F_2\cdots F_l = \begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}$$

where $I_r$ is the identity matrix of size $r = \operatorname{rank}(A)$. Taking determinants on both sides and using $\det(MN) = \det(M)\det(N)$, we get

$$ \det(E_k)\cdots \det(E_1)\,\det(A)\,\det(F_1)\cdots\det(F_l) = \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix}.$$

We have $\det(E_k)\cdots\det(E_1)\det(F_1)\cdots\det(F_l) \ne 0$ since $\det(E_i) \ne 0$ and $\det(F_j) \ne 0$ for every $i$ and $j$. If $\det(A) \ne 0$ then we have

$$ \det\begin{pmatrix} I_r & 0 \\ 0 & 0 \end{pmatrix} \ne 0$$

which is only possible if $r = n$ (if $r < n$, the block matrix has a zero row, so its determinant is $0$), i.e. when $A$ has full rank.
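As a small sanity check in the singular case, take the rank-$1$ matrix $A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$. One row operation and one column operation reduce it to block form:

$$ \underbrace{\begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix}}_{E_1} \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \underbrace{\begin{pmatrix} 1 & -2 \\ 0 & 1 \end{pmatrix}}_{F_1} = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, $$

and indeed $\det(A) = 1\cdot 4 - 2\cdot 2 = 0$, consistent with $r = 1 < 2$.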