Prove that if Ax = b has a solution for every b, then A is invertible

For each standard basis vector $e_1, \dots, e_n$, we have some $b_1, \dots, b_n$ such that $A b_i = e_i$, respectively. Then assembling $b_1, \dots, b_n$ as the columns of an $n \times n$ matrix directly gives $A^{-1}$.
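Spelled out, this step amounts to the following (writing $B$ for the assembled matrix, a name introduced here for clarity): since matrix multiplication acts column by column,
$$AB = A\begin{pmatrix} b_1 & \cdots & b_n\end{pmatrix} = \begin{pmatrix} Ab_1 & \cdots & Ab_n\end{pmatrix} = \begin{pmatrix} e_1 & \cdots & e_n\end{pmatrix} = I,$$
so $B$ is at least a right inverse of $A$.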


The result is wrong as stated: taking $A$ to be a rectangular matrix (more columns than rows) easily yields counterexamples. I will therefore suppose you implicitly assumed $A$ to be square (a necessary condition for being invertible).
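For instance (a minimal counterexample, not spelled out above), over any commutative ring take
$$A = \begin{pmatrix} 1 & 0 \end{pmatrix}, \qquad x = \begin{pmatrix} b \\ 0 \end{pmatrix} \;\text{ solves }\; Ax = b,$$
so $Ax=b$ has a solution for every $b$, yet $A$ is not square and hence not invertible.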

This is then a complement to the answer by basket, using a simple argument found in this answer. By hypothesis you can find a matrix $B$ such that $AB=I$, since producing each column of $B$ amounts to solving an equation of the form $Ax=c$, where $c$ is the corresponding column of $I$. Now taking determinants gives $\det(A)\det(B)=1$, so the determinant of $A$ is invertible in your commutative ring. Then $A$ itself is invertible in the matrix ring: namely $\det(B)$ times the adjugate of $A$ (the transpose of its cofactor matrix) gives $A^{-1}$.
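Written out, with $\operatorname{adj}(A)$ denoting the adjugate (notation introduced here), this uses the identity $A\,\operatorname{adj}(A) = \operatorname{adj}(A)\,A = \det(A)\,I$:
$$A\bigl(\det(B)\,\operatorname{adj}(A)\bigr) = \det(B)\,\det(A)\,I = I, \qquad \bigl(\det(B)\,\operatorname{adj}(A)\bigr)A = \det(B)\,\det(A)\,I = I,$$
so $\det(B)\,\operatorname{adj}(A)$ is a two-sided inverse of $A$.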

That was all that was asked for, but multiplying $AB=I$ on the left by $A^{-1}$ shows that in fact $B=A^{-1}$, and hence $BA=I$. Note that commutativity of the base ring (which allowed taking determinants) is essential; the result does not hold over non-commutative rings (even for $1\times1$ matrices, since over such a ring a scalar having a right inverse need not have a left inverse).
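In symbols, once $A^{-1}$ exists,
$$B = \bigl(A^{-1}A\bigr)B = A^{-1}(AB) = A^{-1}I = A^{-1}, \qquad\text{hence}\qquad BA = A^{-1}A = I.$$
A standard non-commutative illustration (not part of the original answer) is the endomorphism ring of an infinite-dimensional vector space: the left-shift operator on sequences has the right shift as a right inverse, but it is not injective and so has no left inverse.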