Constructive proof that a kernel consists of nilpotent elements

This answer provides a scheme for constructing a constructive proof, though I'm still working to actually extract the constructive proof explicitly, so please don't accept the answer just yet. (Update: See below.) We'll prove the following statement:

Let $R$ be a reduced ring. Let $A$ be a finitely generated $R$-module and let $B$ be an arbitrary $R$-module. Let injections $\alpha : R \to A$ and $\beta : R \to B$ be given. Then the canonical map $R \to A \otimes_R B$ is injective.

The general case, with $A$ not necessarily finitely generated, follows formally from this case, since $A$ is the directed union of its finitely generated submodules containing the image of $\alpha$, and tensoring with $B$ commutes with colimits.
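
As a sketch of this reduction (only the filteredness of the colimit is used): write $A = \bigcup_i A_i$, where $A_i$ ranges over the finitely generated submodules of $A$ containing $\alpha(1)$; then $$A \otimes_R B \;\cong\; \varinjlim_i \,(A_i \otimes_R B).$$ If $r \in R$ maps to zero in $A \otimes_R B$, then $\alpha(r) \otimes \beta(1)$ already vanishes in some $A_i \otimes_R B$ (filteredness), the corestriction $\alpha : R \to A_i$ is still injective, and the finitely generated case yields $r = 0$.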

We'll prove this statement by working internal to the little Zariski topos of $R$, that is, the topos of sheaves on $\operatorname{Spec}(R)$, as explained in these notes. In this topos $R$, $A$, and $B$ have mirror images $R^\sim$, $A^\sim$, and $B^\sim$ such that $R \to A \otimes_R B$ is injective if and only if $R^\sim \to A^\sim \otimes_{R^\sim} B^\sim$ is a monomorphism in the topos. In order to ultimately be able to extract a fully explicit, constructive, non-toposophic proof, the little Zariski topos needs to be defined in a constructively sensible way; but this is possible. I presume that the extracted proof will look convoluted at first, but it's possible that it could be simplified, even to the point that one wonders why one didn't see it without the help of such tools.

The point is that working internal to that topos simplifies the situation to the easiest case, namely the case where the base ring is a field, so that the proof is almost trivial. This is because the internal universe of the Zariski topos has the following peculiarities:

  • The ring $R^\sim$ is a field in the sense that $1 \neq 0$ and $\forall x {:} R^\sim. \neg(\text{$x$ invertible}) \Rightarrow x = 0$.
  • From this it follows that $\forall x{:}R^\sim. \neg\neg(x = 0) \Rightarrow x = 0$. This is a huge simplification, since it's much easier to verify doubly negated statements: In order to show that $\neg\neg\varphi \Rightarrow \neg\neg\psi$, it suffices to show that $\varphi \Rightarrow \neg\neg\psi$ (see the sketch after this list). Note that this is really a peculiarity of the Zariski topos. The analogous statement $\forall x \in R. \neg\neg(x = 0) \Rightarrow x = 0$ is in general not intuitionistically justified.
  • Any finitely generated module over $R^\sim$ is not not finite free. (There does not not exist a minimal generating family. The usual proof shows that such a family is linearly independent and therefore a basis.)
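
For completeness, here is a sketch of the two intuitionistic deductions used in the second item; nothing beyond the field axiom from the first item is assumed. If $\neg\neg(x = 0)$, then $x$ cannot be invertible (an invertible $x$ satisfying $x = 0$ would give $1 = 0$, so invertibility of $x$ entails $\neg(x = 0)$, contradicting $\neg\neg(x = 0)$), and the field axiom yields $x = 0$. And to see that $\varphi \Rightarrow \neg\neg\psi$ entails $\neg\neg\varphi \Rightarrow \neg\neg\psi$: assume $\neg\neg\varphi$ and, towards a contradiction, $\neg\psi$; then $\varphi$ would give $\neg\neg\psi$, contradicting $\neg\psi$, so $\neg\varphi$, contradicting $\neg\neg\varphi$; hence $\neg\neg\psi$.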

Without further ado, here is the internal proof. Let $r : R^\sim$ be such that $r \cdot (\alpha(1) \otimes \beta(1)) = 0$ in $A^\sim \otimes B^\sim$. We want to verify that $r = 0$, but it suffices to verify that $\neg\neg(r = 0)$. Since $A^\sim$ is not not finite free, we may therefore assume that $A^\sim$ is finite free. Let $(x_1,\ldots,x_n)$ be a basis. Write $\alpha(r) = \sum_i r_i x_i$. Since $A^\sim \otimes B^\sim \cong (B^\sim)^n$, it follows that $r_i \beta(1) = 0$ for all $i$. Since $\beta$ is injective, it follows that $r_i = 0$ for all $i$. Thus $\alpha(r) = 0$. Since $\alpha$ is injective, it follows that $r = 0$.
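
Spelled out (a sketch, using the isomorphism $A^\sim \otimes B^\sim \cong (B^\sim)^n$ induced by the basis, under which $x_i \otimes b$ corresponds to $b$ in the $i$-th component): $$0 \;=\; r \cdot (\alpha(1) \otimes \beta(1)) \;=\; \alpha(r) \otimes \beta(1) \;=\; \sum_i r_i\,(x_i \otimes \beta(1)) \;\longmapsto\; (r_1\beta(1), \ldots, r_n\beta(1)),$$ so each component $r_i \beta(1)$ vanishes, which is where the injectivity of $\beta$ enters.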

Update: Here is a fully explicit constructive proof, obtained by working with @HeinrichD in the comments to unravel the scheme sketched above. Unfortunately it's rather convoluted and not particularly memorable; I hope that it can be simplified.

Lemma 1. Let $R$ be a ring. Let $A$ be an $R$-module with generating family $(x_1,\ldots,x_n)$. Assume that the only $g \in R$ such that one of the $x_i$ is an $R[g^{-1}]$-linear combination of the others in $A[g^{-1}]$ is $g = 0$. Then $A$ is free with $(x_1,\ldots,x_n)$ as a basis.

Proof: Suppose $\sum_i r_i x_i = 0$. Let $i$ be arbitrary. In $A[r_i^{-1}]$, the generator $x_i$ is an $R[r_i^{-1}]$-linear combination of the others. By assumption it follows that $r_i = 0$. Hence the $x_i$ are linearly independent and, being a generating family, form a basis.
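
In more detail (a sketch): from the relation $\sum_j r_j x_j = 0$, inverting $r_i$ lets one solve for $x_i$, $$x_i \;=\; -\sum_{j \neq i} \frac{r_j}{r_i}\, x_j \quad \text{in } A[r_i^{-1}],$$ so $x_i$ is an $R[r_i^{-1}]$-linear combination of the other generators, and the hypothesis of the lemma (applied with $g := r_i$) forces $r_i = 0$.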

Lemma 2. Let $R$ be a reduced ring. Let $A$ be a finitely generated $R$-module. Assume that the only $f \in R$ such that $A[f^{-1}]$ is a free $R[f^{-1}]$-module is $f = 0$. Then $R = 0$.

Proof: By induction on the length $n$ of a given generating family $(x_1,\ldots,x_n)$ of $A$. Note that we'll apply the induction hypothesis not to the ring $R$, but to some localizations of $R$.

If $n = 0$, then $A = 0$, which is free; thus the assumption for $f := 1$ gives $1 = 0$ in $R$, i.e. $R = 0$.

If $n \geq 1$, then we want to verify the assumptions of Lemma 1. Thus let $g \in R$ be given such that one of the $x_i$ is an $R[g^{-1}]$-linear combination of the others in $A[g^{-1}]$. Therefore the $R[g^{-1}]$-module $A[g^{-1}]$ can be generated by $n-1$ elements. By the induction hypothesis (applied to the reduced ring $R[g^{-1}]$ and its module $A[g^{-1}]$, which are easily seen to satisfy the assumptions of the induction hypothesis) it follows that $R[g^{-1}] = 0$ (in this step the assumption enters for many different $f$'s). Therefore $g = 0$.
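
To spell out the parenthetical claim that $R[g^{-1}]$ and $A[g^{-1}]$ satisfy the assumption of Lemma 2 (a sketch; the exponent $k$ below is just a bookkeeping device): an element of $R[g^{-1}]$ can be written as $f/g^k$ with $f \in R$, and inverting it amounts to inverting $fg$, $$R[g^{-1}]\bigl[(f/g^k)^{-1}\bigr] \;\cong\; R[(fg)^{-1}], \qquad A[g^{-1}]\bigl[(f/g^k)^{-1}\bigr] \;\cong\; A[(fg)^{-1}].$$ So if the latter module is free over the former ring, the assumption on $R$ and $A$ (applied with the element $fg$) gives $fg = 0$, hence $f/g^k = fg/g^{k+1} = 0$ in $R[g^{-1}]$.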

Thus, by Lemma 1, $A$ is free. We can finish by using the assumption for $f := 1$.

Corollary. Let $R$ be a reduced ring. Let $A$ be a finitely generated $R$-module. Let $B$ be an arbitrary $R$-module. Let injections $\alpha : R \to A$ and $\beta : R \to B$ be given. Then the canonical map $\alpha \otimes \beta : R \to A \otimes_R B$ is injective.

Proof. Let $r \in R$ such that $r \cdot (\alpha(1) \otimes \beta(1)) = 0$. To verify that $r = 0$, we'll apply Lemma 2 to the reduced ring $R' := R[r^{-1}]$ and the finitely generated $R'$-module $A' := A[r^{-1}]$. Let therefore $f \in R'$ be given such that $A'[f^{-1}]$ is a free $R'[f^{-1}]$-module. Since localization preserves injectivity, the localized maps $R'[f^{-1}] \to A'[f^{-1}]$ and $R'[f^{-1}] \to B[r^{-1}][f^{-1}]$ are still injective, so the canonical map $R'[f^{-1}] \to A'[f^{-1}] \otimes B[r^{-1}][f^{-1}]$ is injective (the easy case!). Therefore $r = 0$ in $R'[f^{-1}]$. Since $r$ is invertible in $R'$, it follows that $R'[f^{-1}] = 0$, so $f$ is nilpotent in $R'$ and therefore $f = 0$, as $R'$ is reduced. Thus Lemma 2 yields $R' = 0$, i.e. $r$ is nilpotent in $R$; since $R$ is reduced, $r = 0$.
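
The "easy case" invoked here is the external counterpart of the internal proof above; a sketch, writing $R'' := R'[f^{-1}]$, $A'' := A'[f^{-1}]$, $B'' := B[r^{-1}][f^{-1}]$ (with $\alpha$ and $\beta$ also denoting the localized maps) and choosing a basis $(x_1, \ldots, x_n)$ of the free module $A''$: if $s \in R''$ satisfies $s \cdot (\alpha(1) \otimes \beta(1)) = 0$ in $A'' \otimes_{R''} B''$, then writing $\alpha(s) = \sum_i s_i x_i$ and using $A'' \otimes_{R''} B'' \cong (B'')^n$ gives $$(s_1 \beta(1), \ldots, s_n \beta(1)) = 0,$$ so all $s_i = 0$ (as $\beta$ stays injective after localization), hence $\alpha(s) = 0$ and $s = 0$ (as $\alpha$ stays injective).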


(Edit: This proof is incorrect, but see the comments below.)

I will use Lemma 6.4 of Eisenbud's Commutative Algebra with a View Toward Algebraic Geometry (used in the proof of the equational criterion of flatness).

Let $\alpha : R \to A$ and $\beta : R \to B$ be the structure maps. Suppose $r \in \ker(R \to A \otimes_{R} B)$, in other words $\alpha(r) \otimes 1_{B} = 0$ in $A \otimes_{R} B$. By the lemma cited above, there exist elements $a_{1},\dotsc,a_{n} \in A$ and $r_{1},\dotsc,r_{n} \in R$ such that $$ \alpha(r) = \sum_{i=1}^{n} \alpha(r_{i})a_{i} $$ and $$ \beta(r_{i}) 1_{B} = 0 $$ for all $i$. In other words $r_{i} \in \ker \beta$, so each $r_{i}$ is nilpotent, hence each $\alpha(r_{i})$ is nilpotent, hence $\alpha(r)$ is nilpotent. Thus there exists some $k$ such that $\alpha(r^{k}) = (\alpha(r))^{k} = 0$, hence $r^{k}$ is nilpotent, hence $r$ is nilpotent.


Let us use $k_\min$.

The references below are to the book Commutative Algebra: Constructive Methods (Lombardi–Quitté, arXiv:1605.04832v1).

We have to prove the following: let $k \to A$ and $k \to B$ be two injective morphisms of commutative reduced rings; if $A \otimes_k B = 0$, then $k = 0$.

1) This is true when $k$ is a discrete field or, more generally, a zerodimensional reduced ring (von Neumann regular). Indeed, $k \to A$ is faithfully flat (VIII-6.2) and $k \to B$ is injective, so $A \to A \otimes_k B$ is injective, thus $A = 0$, thus $k = 0$.
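
Spelled out (a sketch): since $A$ is flat over $k$ (e.g. by the cited faithful flatness), tensoring the injection $k \to B$ with $A$ gives an injection $$A \;\cong\; A \otimes_k k \;\hookrightarrow\; A \otimes_k B \;=\; 0,$$ hence $A = 0$, and then $k = 0$ because $k \to A$ is injective.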

2) This is true when $k$ is a pp-ring (IV-6). Indeed, let $S$ be the monoid of regular elements; $k_S$ is von Neumann regular, and $k_S \to A_S$ remains injective: if $x \in k$ and $x \cdot 1_{A_S} =_{A_S} 0$, we have a $u \in S$ such that $u x \cdot 1_A =_A 0$, thus $ux =_k 0$, thus $x = 0$ in $k_S$. Similarly $k_S \to B_S$ is injective. If $A \otimes_k B = 0$, then $A_S \otimes_{k_S} B_S \simeq k_S \otimes_k (A \otimes_k B) = 0$, thus $k_S = 0$ by Item 1, thus $k = 0$.
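
The base-change isomorphism used here (and again, with $k_\min$ in place of $k_S$, in Item 3 below) is the standard one; a sketch: $$A_S \otimes_{k_S} B_S \;\cong\; (A \otimes_k k_S) \otimes_{k_S} (B \otimes_k k_S) \;\cong\; (A \otimes_k B) \otimes_k k_S \;\cong\; k_S \otimes_k (A \otimes_k B),$$ so $A \otimes_k B = 0$ indeed forces $A_S \otimes_{k_S} B_S = 0$.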

3) General case. We use $k_\min$. Since $k_\min$ is a pp-ring and $k \to k_\min$ is injective ($k$ is reduced), it is sufficient to prove that $k_\min \to k_\min \otimes_k A$ is injective (and the same for $k_\min \to k_\min \otimes_k B$). Indeed, letting $A' = k_\min \otimes_k A$ and $B' = k_\min \otimes_k B$, we have $A' \otimes_{k_\min} B' \simeq k_\min \otimes_k (A \otimes_k B)$, so $A' \otimes_{k_\min} B' = 0$, which implies $k_\min = 0$ by Item 2, which implies $k = 0$.

In view of the construction of $k_\min$, and using the notation of the book, it is sufficient to show that $k_{\{u\}} \to k_{\{u\}} \otimes_k A$ is injective for any $u \in k$.

We write $I^*=(0:I)_k=\{x\in k\,;\, xI=0\}$ for an ideal $I$ of $k$ and $u^*$ for $(u)^*$.

By definition $k_{\{u\}} = k/u^* \times k/(u^*)^*$, thus $k_{\{u\}} \otimes_k A = A/(u^*A) \times A/((u^*)^*A)$. Let $x \in k_{\{u\}}$, say $x = (y \bmod u^*,\; z \bmod (u^*)^*)$ with $y, z \in k$. Let $y' = y \cdot 1_A$ and $z' = z \cdot 1_A$ in $A$. We assume that $y' \bmod u^*A = 0$ and $z' \bmod (u^*)^*A = 0$, and we have to show that $y \bmod u^* = 0$ and $z \bmod (u^*)^* = 0$.

  • $y' = \sum_i y_i$ where $y_i \in u^* A$. We have $y_i = \sum_j a_{ij} y_{ij}$ with $a_{ij} \in k$, $u a_{ij} =_k 0$, $y_{ij} \in A$. Thus $u y_i =_A 0$, thus $u y' =_A 0$. So $u y \cdot 1_A =_A 0$, so $uy =_k 0$ ($k \to A$ being injective), so $y \in u^*$, i.e. $y \bmod u^* = 0$.
  • $z' = \sum_i z_i$ where $z_i \in (u^*)^* A$. We have $z_i = \sum_j b_{ij} z_{ij}$ with $b_{ij} \in (u^*)^*$ and $z_{ij} \in A$. Let $v \in u^*$. We get $v b_{ij} = 0$, thus $v z_i = 0$, thus $v z' = 0$, and hence $vz =_k 0$ ($k \to A$ being injective). So, for any $v \in u^*$ we have $vz = 0$; thus $z \in (u^*)^*$, i.e. $z \bmod (u^*)^* = 0$.

Hence $x = 0$ in $k_{\{u\}}$, and so $k_{\{u\}} \to k_{\{u\}} \otimes_k A$ is injective, as required.

– Henri Lombardi