Rank preservation of a Hankel matrix when adding a constrained sample

We denote $$ P = \pmatrix{ x_1&x_2&x_3\\ x_2 & x_3 & x_4\\ x_3 & x_4 & x_5}, \quad a = \pmatrix{x_0\\x_1\\x_2}, \quad b = \pmatrix{x_4\\x_5\\y}, $$ so that $H_0 = [a \ \ P]$ and $H_1 = [P \ \ b]$. If $P$ is invertible, then $H_1$ has full rank for all $y$, so suppose that this is not the case. Since $H_0 = [a \ \ P]$ has full rank and deleting the column $a$ lowers the rank by at most one, $P$ must have rank $2$.
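
For concreteness, here is a small numerical sketch of this setup; the sample values $x_0,\dots,x_5$ below are hypothetical, chosen only so that $H_0$ has full rank while $P$ has rank $2$.

```python
import numpy as np

# hypothetical sample values x_0..x_5, chosen so that H0 has full row rank
# while P is singular of rank 2
x = [0.0, 1.0, 2.0, 4.0, 8.0, 0.0]

H0 = np.array([[x[0], x[1], x[2], x[3]],
               [x[1], x[2], x[3], x[4]],
               [x[2], x[3], x[4], x[5]]])
P = H0[:, 1:]          # P consists of the last three columns of H0
a = H0[:, :1]          # a is the first column of H0

print(np.linalg.matrix_rank(H0))   # 3: H0 has full row rank
print(np.linalg.matrix_rank(P))    # 2: P is singular
```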

Write $b$ in the form $b(y) = b_0 + ye_3$, where $b_0 = (x_4,x_5,0)^T$ and $e_3 = (0,0,1)^T$. Our goal is to show that there exists a value of $y$ for which $[P\ \ b(y)]$ has rank greater than that of $P$, which is to say that the line parameterized by $b(y)$ is not contained in the plane spanned by the columns of $P$.
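
One way to test this membership numerically is to compare $\operatorname{rank}[P\ \ b(y)]$ with $\operatorname{rank}(P)$. A sketch using the same hypothetical sample values as above:

```python
import numpy as np

x = [0.0, 1.0, 2.0, 4.0, 8.0, 0.0]            # hypothetical samples as before
P = np.array([[x[1], x[2], x[3]],
              [x[2], x[3], x[4]],
              [x[3], x[4], x[5]]])
b0 = np.array([x[4], x[5], 0.0])
e3 = np.array([0.0, 0.0, 1.0])

for y in (-1.0, -0.5, 0.0, 0.5, 1.0):
    by = b0 + y * e3
    grew = np.linalg.matrix_rank(np.column_stack([P, by])) > np.linalg.matrix_rank(P)
    print(y, grew)    # True whenever b(y) lies outside the column plane of P
```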

Suppose for contradiction that this is not the case, i.e. that the line lies in the plane spanned by the columns of $P$. Then in particular its direction vector $e_3$ lies in the column space of $P$: there exists a vector $v$ for which $Pv = e_3$. Let $w$ denote a non-zero solution to $Pw = 0$; one exists because $P$ is not invertible. Because $P$ is symmetric, $w$ is orthogonal to $Pv$ for every $v$ (indeed $w^TPv = (Pw)^Tv = 0$), so in particular we must have $e_3^Tw = 0$. In other words, there is a non-zero vector $w = (w_1,w_2,0)^T$ for which $$ Pw = 0, \quad \text{that is,} \quad w_1 \pmatrix{x_1\\x_2\\x_3} = -w_2 \pmatrix{x_2\\x_3\\x_4}. $$ If neither of these columns is zero, then both $w_1$ and $w_2$ must be non-zero, so we may let $\alpha = -w_1/w_2$; the second column of $P$ is then $\alpha$ times the first, i.e. $x_2 = \alpha x_1$, $x_3 = \alpha x_2$, and $x_4 = \alpha x_3$. We have $$ P = x_1 \cdot \pmatrix{1 & \alpha & \alpha^2\\ \alpha & \alpha^2 & \alpha^3\\ \alpha^2 & \alpha^3 & \alpha^4} + \pmatrix{0&0&0\\0&0&0\\0&0&x_5-x_1\alpha^4}. $$ Because $P$ has rank $2$ and the first summand has rank at most $1$, the second matrix in this sum can't be zero. So $x_5 \neq \alpha x_4$, since $\alpha x_4 = x_1 \alpha^4$. On the other hand, $w$ is proportional to $(-\alpha,1,0)^T$ and is orthogonal to the column space of $P$, so every vector $c$ in that column space satisfies $c_2 = \alpha c_1$; that is, the first two coordinates of any such $c$ form a multiple of $(1,\alpha)$. Since $(x_4,x_5)$ is not a multiple of $(1,\alpha)$, the equation $Pv = b_0$ has no solution, and taking $y = 0$ gives us the desired outcome. (In fact, because the line is parallel to the column space, any $y$ yields the desired outcome.)
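
The following sketch instantiates this case with hypothetical values $x_1 = 1$, $\alpha = 2$, $x_5 = 0$, and checks both the kernel vector $w$ and the conclusion that $y = 0$ already raises the rank.

```python
import numpy as np

x1, alpha, x5 = 1.0, 2.0, 0.0        # hypothetical; note x5 != x1*alpha**4
u = np.array([1.0, alpha, alpha**2])
P = x1 * np.outer(u, u)              # rank-one part x1 * (1, a, a^2)(1, a, a^2)^T
P[2, 2] = x5                         # perturb the (3,3) entry: now rank 2

w = np.array([-alpha, 1.0, 0.0])     # kernel vector w = (-alpha, 1, 0)
print(P @ w)                         # zero vector, so w is orthogonal to col(P)

b0 = np.array([x1 * alpha**3, x5, 0.0])                 # b0 = (x4, x5, 0)
print(np.linalg.matrix_rank(P))                         # 2
print(np.linalg.matrix_rank(np.column_stack([P, b0])))  # 3: y = 0 suffices
```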

If the first column is zero, then we have $x_1 = x_2 = x_3 = 0$, but we must have $x_4 \neq 0$ since the rank of $P$ is greater than $1$. In this case, reversing the order of the last three columns of $H_1$ gives a lower-triangular $3\times 3$ block with $x_4$ on the diagonal, so $H_1$ has full row rank for all $y$.
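
A sketch of this triangular case, with hypothetical values for $x_4$ and $x_5$:

```python
import numpy as np

x4, x5 = 3.0, 7.0                    # hypothetical; only x4 != 0 matters
for y in (-1.0, 0.0, 1.0):
    H1 = np.array([[0.0, 0.0, 0.0, x4],
                   [0.0, 0.0, x4,  x5],
                   [0.0, x4,  x5,  y]])
    T = H1[:, [3, 2, 1]]             # last three columns in reverse order
    print(y, np.linalg.matrix_rank(H1), np.diag(T))   # rank 3; diagonal (x4, x4, x4)
```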

If the second column is zero, something similar happens: then $x_2 = x_3 = x_4 = 0$, and this time $\operatorname{rank}(P) = 2$ forces $x_1 \neq 0$ and $x_5 \neq 0$. The $3\times 3$ submatrix formed by the first, third, and fourth columns of $H_1$ has determinant $-x_1x_5^2 \neq 0$, so again $H_1$ has full rank for all $y$.
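
And a sketch of this last case, again with hypothetical non-zero values:

```python
import numpy as np

x1, x5 = 2.0, 5.0                    # hypothetical; rank(P) = 2 forces both nonzero
for y in (-1.0, 0.0, 1.0):
    H1 = np.array([[x1,  0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0, x5],
                   [0.0, 0.0, x5,  y]])
    M = H1[:, [0, 2, 3]]             # columns 1, 3, 4 of H1
    print(y, np.linalg.det(M), np.linalg.matrix_rank(H1))  # det = -x1*x5**2; rank 3
```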


Here is an alternative argument. Suppose on the contrary that $\operatorname{rank}(H_1)\le2$ for all $y\in[-1,1]$. Let $S$ be the square submatrix consisting of the first three columns of $H_1$. Then $S$ must be singular, since every $3\times3$ submatrix of a matrix of rank at most $2$ is singular. Since $S$ also consists of the last three columns of $H_0$, and appending the single column $a$ to $S$ raises the rank by at most one, we have $3=\operatorname{rank}(H_0)\le\operatorname{rank}(S)+1\le\operatorname{rank}(H_1)+1\le3$. Therefore $\operatorname{rank}(S)=\operatorname{rank}(H_1)=2$.
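
As a numerical sanity check of this chain of inequalities (sample values hypothetical, as before; here the example is non-degenerate, so the chain holds without the ranks being pinned to $2$):

```python
import numpy as np

x, y = [0.0, 1.0, 2.0, 4.0, 8.0, 0.0], 0.5    # hypothetical samples
H0 = np.array([[x[0], x[1], x[2], x[3]],
               [x[1], x[2], x[3], x[4]],
               [x[2], x[3], x[4], x[5]]])
H1 = np.array([[x[1], x[2], x[3], x[4]],
               [x[2], x[3], x[4], x[5]],
               [x[3], x[4], x[5], y]])
S = H1[:, :3]
assert np.array_equal(S, H0[:, 1:])           # S is the shared block of H0 and H1
r = np.linalg.matrix_rank
print(r(H0) <= r(S) + 1 <= r(H1) + 1)         # the inequality chain holds: True
```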

Since $\operatorname{rank}(H_1)=\operatorname{rank}(S)$ for every $y\in[-1,1]$, each vector $(x_4,x_5,y)^T$ with $y\in[-1,1]$ lies in the column space of $S$, and hence so does $\operatorname{span}\{(x_4,x_5,y)^T: y\in[-1,1]\}$. In particular, both $(x_4,x_5,0)^T$ and $(0,0,1)^T = \frac{1}{2}\big((x_4,x_5,1)^T-(x_4,x_5,-1)^T\big)$ lie inside the column space of $S$. Therefore the augmented matrix $$ A=\begin{bmatrix} x_1 & x_2 & x_3 & x_4 & 0\\ x_2 & x_3 & x_4 & x_5 & 0\\ x_3 & x_4 & x_5 & 0 & 1 \end{bmatrix} $$ has rank $2$. Subtracting $x_3$, $x_4$ and $x_5$ times the last column from the first, second and third columns respectively, we can reduce $A$ to $$ B=\begin{bmatrix} x_1 & x_2 & x_3 & x_4 & 0\\ x_2 & x_3 & x_4 & x_5 & 0\\ 0 & 0 & 0 & 0 & 1 \end{bmatrix}. $$ Hence $\operatorname{rank}(B)=2$ too. Since the last row of $B$ is linearly independent of the first two, the first two rows of $B$ must be linearly dependent. But their first four entries are precisely the last two rows of $H_0$, so the last two rows of $H_0$ are also linearly dependent. Now we arrive at a contradiction because $H_0$ has full row rank. Thus we conclude that $H_1$ must have full row rank for some $y\in[-1,1]$.
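
To see the column reduction concretely, here is a sketch with hypothetical values verifying that the operations taking $A$ to $B$ zero out the last row's first four entries and preserve the rank (elementary column operations never change rank):

```python
import numpy as np

x1, x2, x3, x4, x5 = 1.0, 2.0, 4.0, 8.0, 0.0   # hypothetical sample values
A = np.array([[x1, x2, x3, x4, 0.0],
              [x2, x3, x4, x5, 0.0],
              [x3, x4, x5, 0.0, 1.0]])
B = A.copy()
B[:, 0] -= x3 * A[:, 4]              # col1 <- col1 - x3*col5
B[:, 1] -= x4 * A[:, 4]              # col2 <- col2 - x4*col5
B[:, 2] -= x5 * A[:, 4]              # col3 <- col3 - x5*col5
print(B[2])                          # last row becomes (0, 0, 0, 0, 1)
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # equal ranks
```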