Prove existence of evaluation points such that the matrix has nonzero determinant

We proceed by contrapositive. Suppose that for all $a\leq x_1 < \cdots < x_n \leq b$, $$ \det \begin{bmatrix} f_{1}(x_{1}) & f_{1}(x_{2}) & \cdots & f_{1}(x_{n})\\ f_{2}(x_{1}) & f_{2}(x_{2}) & \cdots & f_{2}(x_{n}) \\ \vdots & \vdots & \ddots & \vdots \\ f_{n}(x_{1}) & f_{n}(x_{2}) & \cdots & f_{n}(x_{n}) \end{bmatrix} = 0. $$ Equivalently, the determinant vanishes for all choices of $x_1,\dots,x_n \in [a,b]$: if two points coincide the matrix has a repeated column, and reordering the points only changes the sign. Consider the subspace of $\Bbb R^n$ defined by $$ U = \operatorname{span}\{(f_1(x),\dots,f_n(x)) : x \in [a,b]\}. $$ Suppose for the purpose of contradiction that $U = \Bbb R^n$. Since $U$ is spanned by the vectors $(f_1(x),\dots,f_n(x))$, we may extract a basis from this generating set: there exist $x_1,\dots,x_n \in [a,b]$ such that the vectors $v_k = (f_1(x_k),\dots,f_n(x_k))$ span $\Bbb R^n$. Taking these vectors as the columns of a matrix produces an $n \times n$ matrix of the form above whose columns are linearly independent, so its determinant is nonzero. This contradicts our assumption.

So $U$ is necessarily a proper subspace of $\Bbb R^n$. Select any nonzero $c = (c_1,\dots,c_n) \in U^\perp$. By definition, $c^Tv = 0$ for all $v \in U$; that is, for every $x \in [a,b]$, $$ c_1 f_1(x) + \cdots + c_n f_n(x) = 0. $$ In other words, the functions $f_1,\dots,f_n$ are linearly dependent.

The conclusion follows.
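The contrapositive argument can be illustrated numerically. The sketch below is my own example (the functions $1, x, x^2$, the helper name `eval_matrix`, and the sample points are all illustrative choices, not from the question): it evaluates the matrix $[f_i(x_j)]$, finds a nonzero determinant for an independent family, and confirms that a dependent family yields determinant zero for every choice of points.

```python
import numpy as np

# Hypothetical test family on [0, 1]: the functions 1, x, x^2 are
# linearly independent (names and points are my own choices).
fs = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2]

def eval_matrix(fs, xs):
    """Return the matrix M with M[i, j] = f_i(x_j)."""
    xs = np.asarray(xs, dtype=float)
    return np.array([f(xs) for f in fs])

# Independent functions: some choice of points gives a nonzero determinant.
print(np.linalg.det(eval_matrix(fs, [0.0, 0.5, 1.0])))  # 0.25, nonzero

# Replace f_3 by f_1 + f_2: the rows are now dependent, so *every*
# choice of points gives determinant zero.
gs = [fs[0], fs[1], lambda x: np.ones_like(x) + x]
for _ in range(5):
    pts = np.sort(np.random.rand(3))
    print(abs(np.linalg.det(eval_matrix(gs, pts))) < 1e-10)  # True
```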


Here is the proof by induction, since I was curious. Choose $x_1$ with $f_1(x_1) \ne 0$ (such a point exists, since linear independence forces $f_1 \not\equiv 0$), and reduce from the $n$-case to the $(n-1)$-case by noting that

$$ \det \pmatrix{ f_{1}(x_{1}) & f_{1}(x_{2}) & \cdots & f_{1}(x_{n})\\ f_{2}(x_{1}) & f_{2}(x_{2}) & \cdots & f_{2}(x_{n}) \\ \vdots & \vdots & \ddots & \vdots \\ f_{n}(x_{1}) & f_{n}(x_{2}) & \cdots & f_{n}(x_{n}) } = \\ \det\pmatrix{ f_{1}(x_{1}) & f_{1}(x_{2}) & \cdots & f_{1}(x_{n})\\ 0 & f_{2}(x_{2}) - \frac{f_2(x_1)}{f_1(x_1)} f_1(x_2) & \cdots & f_{2}(x_{n}) - \frac{f_2(x_1)}{f_1(x_1)}f_1(x_n) \\ \vdots & \vdots & \ddots & \vdots \\ 0 & f_{n}(x_{2}) - \frac{f_n(x_1)}{f_1(x_1)} f_1(x_2) & \cdots & f_{n}(x_{n}) - \frac{f_n(x_1)}{f_1(x_1)}f_1(x_n) } = \\ f_1(x_1) \det\pmatrix{ f_{2}(x_{2}) - \frac{f_2(x_1)}{f_1(x_1)} f_1(x_2) & \cdots & f_{2}(x_{n}) - \frac{f_2(x_1)}{f_1(x_1)}f_1(x_n) \\ \vdots & \ddots & \vdots \\ f_{n}(x_{2}) - \frac{f_n(x_1)}{f_1(x_1)} f_1(x_2) & \cdots & f_{n}(x_{n}) - \frac{f_n(x_1)}{f_1(x_1)}f_1(x_n) } $$ and defining $g_j(x) = f_{j+1}(x) - \frac{f_{j+1}(x_1)}{f_1(x_1)}f_1(x)$ for $j = 1,\dots,n-1$. The functions $g_1,\dots,g_{n-1}$ are again linearly independent: if $\sum_j c_j g_j \equiv 0$, then $\sum_j c_j f_{j+1}$ is a scalar multiple of $f_1$, which forces every $c_j = 0$. The induction hypothesis then supplies points at which the reduced determinant is nonzero; since each $g_j(x_1) = 0$, these points are automatically distinct from $x_1$.
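This reduction is easy to machine-check. The following sketch is my own illustration (the helper name `rec_det` and the test functions are not from the original): it computes $\det[f_i(x_j)]$ by the recursion above and compares it with a direct determinant.

```python
import numpy as np

# Sketch of the reduction above (helper names are my own): compute
# det[f_i(x_j)] recursively, assuming the pivot f_1(x_1) is nonzero at
# every level of the recursion.
def rec_det(fs, xs):
    if len(fs) == 1:
        return fs[0](xs[0])
    f1 = fs[0]
    pivot = f1(xs[0])  # must be nonzero for the elimination step
    # g_j(x) = f_{j+1}(x) - (f_{j+1}(x_1) / f_1(x_1)) * f_1(x)
    gs = [
        (lambda f, c: (lambda x: f(x) - c * f1(x)))(f, f(xs[0]) / pivot)
        for f in fs[1:]
    ]
    return pivot * rec_det(gs, xs[1:])

fs = [lambda x: 1.0, lambda x: x, lambda x: x * x]  # illustrative example
xs = [0.0, 0.5, 1.0]
M = np.array([[f(x) for x in xs] for f in fs])
print(rec_det(fs, xs), np.linalg.det(M))  # both are 0.25
```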


(I'll call this a recursive algorithm rather than mathematical induction, but one may disagree.)

Let $\mathbf f=(f_1,f_2,\ldots,f_n)^T$.

  • Pick any nonzero vector $v_1$.
  • Since $f_1,\ldots,f_n$ are linearly independent, there exists some $x_1$ such that $v_1^T\mathbf f(x_1)\ne0$ (otherwise $v_1$ would furnish a vanishing linear combination of $f_1,\ldots,f_n$).
  • Pick any nonzero vector $v_2\perp\mathbf f(x_1)$ (that is, $v_2^T\mathbf f(x_1)=0$).
  • Since $f_1,\ldots,f_n$ are linearly independent, there exists some $x_2$ such that $v_2^T\mathbf f(x_2)\ne0$.
  • (Continue in this manner...)
  • Pick any nonzero vector $v_n\perp\{\mathbf f(x_1),\mathbf f(x_2),\ldots,\mathbf f(x_{n-1})\}$.
  • Since $f_1,\ldots,f_n$ are linearly independent, there exists some $x_n$ such that $v_n^T\mathbf f(x_n)\ne0$. Now $$ \pmatrix{v_1^T\\ v_2^T\\ \vdots\\ v_n^T}\pmatrix{\mathbf f(x_1)&\mathbf f(x_2)&\cdots&\mathbf f(x_n)} $$ is an upper triangular matrix with nonzero diagonal entries: its $(i,j)$ entry is $v_i^T\mathbf f(x_j)$, which vanishes for $j<i$ by the choice of $v_i$. Hence the product is invertible, and therefore $\det\pmatrix{\mathbf f(x_1)&\mathbf f(x_2)&\cdots&\mathbf f(x_n)}\ne0$.
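The steps above amount to a greedy algorithm, and it can be run numerically. The sketch below is my own illustration (the candidate grid, tolerance, and the use of an SVD null space to pick each orthogonal vector $v$ are assumptions, not part of the original argument); the functions $1, x, x^2$ again serve as a test case.

```python
import numpy as np

# A numerical sketch of the greedy procedure (my own illustration):
# search a finite grid of candidate points, keeping at each step a unit
# vector v orthogonal to the f-vectors chosen so far.
def greedy_points(fs, candidates, tol=1e-9):
    n = len(fs)
    chosen = []
    F = np.empty((n, 0))  # columns are the f-vectors picked so far
    for _ in range(n):
        if F.shape[1] == 0:
            v = np.eye(n)[:, 0]  # any nonzero vector works initially
        else:
            # a unit vector in the null space of F^T, i.e. v orthogonal
            # to every column of F
            v = np.linalg.svd(F.T)[2][-1]
        for x in candidates:
            fx = np.array([f(x) for f in fs])
            if abs(v @ fx) > tol:  # such an x exists by independence
                chosen.append(x)
                F = np.column_stack([F, fx])
                break
    return chosen

fs = [lambda x: 1.0, lambda x: x, lambda x: x * x]
xs = greedy_points(fs, np.linspace(0.0, 1.0, 11))
M = np.array([[f(x) for x in xs] for f in fs])
print(xs, np.linalg.det(M))  # the determinant is nonzero
```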

By the way, note that neither the continuity of $\mathbf f$ nor the compactness/connectedness of the domain of $\mathbf f$ is relevant. The above proof works as long as $f_1,\ldots,f_n$ are linearly independent (which implies that their common domain has at least $n$ elements, if that matters).