Elementary proof that if $A$ is a matrix map from $\mathbb{Z}^m$ to $\mathbb Z^n$, then the map is surjective iff the gcd of maximal minors is $1$

Let $R$ be a commutative ring, and let $f : R^n \to R^m$ be a homomorphism of $R$-modules with corresponding $m \times n$ matrix $A$ over $R$. Let $Y_1,\dotsc,Y_v$ be the $m \times m$ submatrices of $A$. Then $f$ is surjective iff $\mathrm{det}(Y_1),\dotsc,\mathrm{det}(Y_v)$ generate the unit ideal of $R$.
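
Over $R = \mathbb{Z}$, generating the unit ideal just means that the gcd of the maximal minors is $1$. Here is a quick sanity check of the statement (a minimal SymPy sketch, not part of the proof; the matrix `A` and the helper `maximal_minors` are illustrative choices):

```python
from functools import reduce
from itertools import combinations
from math import gcd
from sympy import Matrix

def maximal_minors(A):
    """Determinants of all m x m submatrices of an m x n matrix A (m <= n)."""
    m, n = A.shape
    return [A[:, list(cols)].det() for cols in combinations(range(n), m)]

A = Matrix([[2, 3, 5],
            [1, 4, 7]])                        # an illustrative 2 x 3 matrix
minors = maximal_minors(A)                     # [5, 9, 1]
print(reduce(gcd, [abs(int(d)) for d in minors]))  # 1, so A maps Z^3 onto Z^2
```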

Proof (which I learned from Darij Grinberg): Assume that $f$ is surjective. Choosing a preimage of each standard basis vector of $R^m$ and taking these as columns gives an $n \times m$ matrix $B$ with $AB=1_m$. For each $s$, let $Z_s$ denote the $m \times m$ submatrix of $B$ whose rows correspond to the columns of $A$ chosen in $Y_s$. Then the Cauchy-Binet formula (which has a nice graph-theoretic proof) implies $1=\mathrm{det}(AB)=\sum_{s=1}^{v} \mathrm{det}(Z_s) \mathrm{det}(Y_s)$.
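
For concreteness, the Cauchy-Binet identity used in this step can be checked numerically (a SymPy sketch; the matrices `A` and `B` below are arbitrary small integer matrices, not the ones produced by the surjectivity argument):

```python
from itertools import combinations
from sympy import Matrix

A = Matrix([[2, 3, 5],
            [1, 4, 7]])                # m x n with m = 2, n = 3
B = Matrix([[1, 0],
            [3, -1],
            [-2, 1]])                  # n x m

m, n = A.shape
lhs = (A * B).det()
# For each m-element column set S, pair det(A[:, S]) with det(B[S, :]).
rhs = sum(A[:, list(S)].det() * B[list(S), :].det()
          for S in combinations(range(n), m))
print(lhs == rhs)                      # True
```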

Conversely, assume $\sum_s \lambda_s \mathrm{det}(Y_s)=1$ for some $\lambda_s \in R$. Let $B_s$ denote the $n \times m$ matrix obtained from $\mathrm{adj}(Y_s)$ by inserting zero rows at the positions of the columns that were deleted in $A \mapsto Y_s$ (so that $A B_s = Y_s \,\mathrm{adj}(Y_s)$). Let $B = \sum_{s=1}^{v} \lambda_s B_s$. Then we have

$$AB = \sum_s \lambda_s A B_s = \sum_s \lambda_s Y_s \,\mathrm{adj}(Y_s) = \sum_s \lambda_s \mathrm{det}(Y_s)\, 1_m = 1_m.$$
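
The whole converse construction can be run on a small example over $\mathbb{Z}$ (again a SymPy sketch, not the author's code; the matrix `A` is an illustrative choice): compute Bezout coefficients $\lambda_s$ for the maximal minors by iterated extended gcd, pad each $\mathrm{adj}(Y_s)$ with zero rows to form $B_s$, and check that $AB = 1_m$.

```python
from itertools import combinations
from sympy import Matrix, eye, zeros, igcdex

A = Matrix([[2, 3, 5],
            [1, 4, 7]])
m, n = A.shape

col_sets = list(combinations(range(n), m))
Ys = [A[:, list(S)] for S in col_sets]         # the m x m submatrices Y_s
dets = [Y.det() for Y in Ys]

# Bezout coefficients: lambdas with sum(lambdas[s] * dets[s]) == gcd(dets).
lambdas, g = [], 0
for d in dets:
    x, y, g = igcdex(g, d)                     # x*g_old + y*d == g_new
    lambdas = [x * lam for lam in lambdas] + [y]
assert g == 1                                  # the minors generate (1) in Z

B = zeros(n, m)
for lam, S, Y in zip(lambdas, col_sets, Ys):
    adj = Y.adjugate()                         # satisfies Y*adj == det(Y)*I_m
    B_s = zeros(n, m)
    for row_of_adj, col_of_A in enumerate(S):
        B_s[col_of_A, :] = adj[row_of_adj, :]  # nonzero rows at kept columns
    B += lam * B_s

print(A * B == eye(m))                         # True: B is a right inverse
```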

Remark: One can show that $f$ is injective iff $(\mathrm{det}(Y_1),\dotsc,\mathrm{det}(Y_v))$ is a regular ideal.


I'm not sure what kind of machinery you're willing to use. The following proof is short but sophisticated. Throughout, $\phi \colon \mathbb{Z}^n \to \mathbb{Z}^m$ denotes the map given by the $m \times n$ matrix $A$, so the maximal minors are the $m \times m$ ones.

First, recall the three types of integer row operations:

  1. Negating a row,

  2. Switching two rows,

  3. Adding an integer multiple of one row to another.

Each of these operations corresponds to a change of basis in the codomain of $\phi$. Similarly, integer column operations correspond to a change of basis in the domain of $\phi$.
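
To make this correspondence concrete, here is a tiny SymPy sketch (illustrative $2 \times 2$ matrices, not taken from the answer): each type of row operation is left multiplication by an integer matrix of determinant $\pm 1$ whose inverse is again an integer matrix, i.e. a change of basis of $\mathbb{Z}^m$.

```python
from sympy import Matrix

negate_row_0   = Matrix([[-1, 0], [0, 1]])   # type 1: negate row 0
swap_rows_0_1  = Matrix([[0, 1], [1, 0]])    # type 2: swap rows 0 and 1
add_3_row1_to0 = Matrix([[1, 3], [0, 1]])    # type 3: row 0 += 3 * row 1

for E in (negate_row_0, swap_rows_0_1, add_3_row1_to0):
    # each has determinant +-1 and an integer inverse, so it really is a
    # change of basis of Z^2 (the codomain in this toy example)
    print(E.det(), E.inv())
```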

Using integer row operations and integer column operations, any integer matrix can be reduced to Smith normal form: a matrix whose only nonzero entries $d_1, d_2, \dotsc$ lie on the main diagonal and satisfy $d_1 \mid d_2 \mid \cdots$.
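
If SymPy is handy, the reduction itself can be carried out with `smith_normal_form` from `sympy.matrices.normalforms` (a sketch on the same illustrative matrix as above; exact signs in the output may depend on the SymPy version):

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

A = Matrix([[2, 3, 5],
            [1, 4, 7]])
print(smith_normal_form(A, domain=ZZ))   # diag(1, 1) padded with a zero column
```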

Now, here is the proof:

  1. Observe that the gcd of the $m\times m$ minors (the determinants of the $m\times m$ submatrices) is unaffected by integer row and column operations; a numerical check appears in the sketch after this list. (In particular, a type 1 column operation will negate some of the minors, a type 2 column operation will switch certain pairs of minors and negate others, and a type 3 column operation will add an integer multiple of certain minors to other minors. A row operation acts on every $m\times m$ submatrix at once, so it changes each minor by at most a sign.)

  2. Observe that surjectivity is likewise unaffected by row/column operations. To see this, note that row/column operations on $A$ correspond to multiplication on the left/right by invertible integer matrices (of determinant $\pm 1$). Assuming that $A$ is surjective, one sees that $B=PAQ$ is surjective too by solving $Bx=y$ with $x=Q^{-1}u$, where $u$ is a solution of $Au=P^{-1}y$.

  3. Therefore it suffices to prove the statement when $A$ is in Smith normal form. Such a matrix has at most one nonzero $m\times m$ minor, namely the one formed by the columns containing the diagonal entries, and its value is the product $d_1\cdots d_m$ of those entries. So the gcd of the maximal minors is $d_1\cdots d_m$ (or $0$), and this equals $1$ if and only if every $d_i$ is $1$, i.e. if and only if $\phi$ is onto.
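
The following sketch ties steps 1 and 3 together on an illustrative matrix (SymPy, my own check rather than part of the answer): a column operation leaves the gcd of the maximal minors unchanged, and that gcd reappears, up to sign, as the product of the diagonal entries of the Smith normal form, its only possibly nonzero maximal minor.

```python
from functools import reduce
from itertools import combinations
from math import gcd
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

def gcd_of_maximal_minors(M):
    m, n = M.shape
    return reduce(gcd, [abs(int(M[:, list(c)].det()))
                        for c in combinations(range(n), m)])

A = Matrix([[2, 3, 5],
            [1, 4, 7]])                          # illustrative 2 x 3 matrix

# Step 1: a type 3 column operation (col 2 += 4 * col 0) preserves the gcd.
A2 = A.copy()
A2[:, 2] = A2[:, 2] + 4 * A2[:, 0]
print(gcd_of_maximal_minors(A), gcd_of_maximal_minors(A2))    # 1 1

# Step 3: the gcd equals the product of the diagonal entries of the Smith
# normal form (up to sign), and it is 1 exactly when the map is onto.
S = smith_normal_form(A, domain=ZZ)
print(abs(S[0, 0] * S[1, 1]) == gcd_of_maximal_minors(A))     # True
```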