Stability of linear time-varying systems

From the differential equation

$\dot x = A(t)x \tag 0$

we infer

$\dot {\Vert x \Vert^2} = \dfrac{d}{dt}\langle x, x \rangle = \langle \dot x, x \rangle + \langle x, \dot x \rangle = \langle A(t)x, x \rangle + \langle x, A(t)x \rangle = \langle x, A^T(t)x \rangle + \langle x, A(t)x \rangle = \langle x, (A^T(t) + A(t))x \rangle. \tag 1$
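As a quick numerical sanity check of the identity in (1) at a fixed time $t$, here is a small sketch (assuming NumPy; the matrix and vector below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # plays the role of A(t) at one fixed t
x = rng.standard_normal(3)

# <A x, x> + <x, A x>  should equal  <x, (A^T + A) x>
lhs = (A @ x) @ x + x @ (A @ x)
rhs = x @ ((A.T + A) @ x)
print(np.isclose(lhs, rhs))  # True
```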

Now $A^T(t) + A(t)$ is a symmetric matrix for all $t$, thus its eigenvalues are real, and the condition

$\text{Eig}(A^T(t) + A(t)) < 0 \tag 2$

presumably means said eigenvalues are all negative; since for the symmetric matrix $A^T(t) + A(t)$ we have $\langle x, (A^T(t) + A(t))x \rangle \le \lambda_{\max}(A^T(t) + A(t)) \Vert x \Vert^2$, it follows that for every $x \ne 0$,

$\forall t, \; \langle x, (A^T(t) + A(t))x \rangle < 0; \tag 3$

combining (1) and (3) yields

$2\dot{\Vert x \Vert} \Vert x \Vert = \dot {\Vert x \Vert^2} < 0, \tag 4$

which, as long as

$\Vert x \Vert \ne 0 \tag 5$

implies

$\dot{\Vert x \Vert} < 0, \tag 6$

that is, as long as $x \ne 0$, so that

$\Vert x \Vert > 0, \tag 7$

$\Vert x(t) \Vert$ is monotonically decreasing, and thus the system is stable. $OE\Delta$.
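As a sanity check of this conclusion, here is a minimal numerical sketch (assuming NumPy/SciPy, with the illustrative choice $A(t)=\begin{pmatrix}-1 & \sin t\\ -\sin t & -1\end{pmatrix}$, for which $A^T(t)+A(t)=-2I<0$ for all $t$):

```python
import numpy as np
from scipy.integrate import solve_ivp

def A(t):
    # Illustrative A(t) with A(t)^T + A(t) = -2I < 0 for all t
    return np.array([[-1.0, np.sin(t)], [-np.sin(t), -1.0]])

sol = solve_ivp(lambda t, x: A(t) @ x, (0.0, 5.0), [1.0, -2.0],
                dense_output=True, rtol=1e-8, atol=1e-10)
ts = np.linspace(0.0, 5.0, 100)
norms = np.linalg.norm(sol.sol(ts), axis=0)
print(np.all(np.diff(norms) < 0))  # ||x(t)|| is monotonically decreasing: True
```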


Consider the system $\dot{x}=-xe^{-t}$. This system does not converge to $x(t)=0$ as $t\rightarrow \infty$: the damping coefficient $e^{-t}$ decays to zero so quickly that the total damping $\int_{t_0}^{\infty}e^{-s}\,ds=e^{-t_0}$ is finite, and hence $x(t)$ converges to some finite non-zero value $x^*$ with $0<x^*<x(t_0)$ (for $x(t_0)>0$).

Note that the eigenvalue is $-e^{-t}<0$ for all $t\in[0,\infty)$.

This example tells us that for exponential stability the eigenvalues should not only have negative real parts, but their real parts should also be bounded away from zero, uniformly in $t$.
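A minimal numerical sketch of this example (assuming SciPy; with $x(0)=1$ the exact solution is $x(t)=\exp(e^{-t}-1)$):

```python
import numpy as np
from scipy.integrate import solve_ivp

# dx/dt = -x e^{-t}: the eigenvalue -e^{-t} is negative for every t,
# but the total damping int_0^inf e^{-s} ds = 1 is finite.
sol = solve_ivp(lambda t, x: -np.exp(-t) * x, (0.0, 50.0), [1.0],
                rtol=1e-9, atol=1e-12)
print(sol.y[0, -1])  # ~0.3679: x(t) levels off...
print(np.exp(-1.0))  # ...at the nonzero limit e^{-1} of the exact solution
```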


It is nice to see such an interesting discussion; let me summarize it. The answer to your question is "Yes" if you ask for stability in the Lyapunov sense, and "No" if you ask for asymptotic convergence. In other words, the condition $\operatorname{eig}\left(A(t)+A^\top(t)\right)<0$ is sufficient to conclude that $x(t)$ is bounded, but not sufficient to conclude that $x(t)\to 0$. A nice example was given by @Dmitry. Indeed, for $\dot{x}(t)=-e^{-t}x(t)$ and $x(0)=1$ we have $x(t)=e^{-1+e^{-t}}$, so $x(t)\to e^{-1}\ne 0$. At the same time, the Lyapunov-function analysis above seems to show that the system converges. How is that possible?

Ok, let us check what happens with the Lyapunov function. Define $V(x):=x^\top x$. Then, as shown above, we have $\dot{V}(x,t) = x^\top(t)\left(A(t)+A^\top(t)\right)x(t) \le 0$. So we conclude that $\dot{V}(x,t)<0$ for $x\ne 0$. Well, from this inequality we know for sure that $V(x)$ is bounded and does not increase, so we immediately obtain that $x(t)$ is also bounded. With some standard arguments, we can even show that the equilibrium $x=0$ is Lyapunov stable; however, this does not imply convergence!
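One standard way to see the Lyapunov-stability claim: since $V$ does not increase along trajectories,
$$\Vert x(t)\Vert^2 = V(x(t)) \le V(x(t_0)) = \Vert x(t_0)\Vert^2 \quad \text{for all } t \ge t_0,$$
so $\Vert x(t)\Vert \le \Vert x(t_0)\Vert$, and taking $\delta = \varepsilon$ in the definition of Lyapunov stability does the job.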

Now the tricky part. Is the condition $\dot{V}(x,t)<0$ sufficient to show that $V(x)\to 0$ and $x\to 0$? Yes for autonomous systems, but not for non-autonomous systems! When we have $\dot{x}=f(x)$ and $\dot{V}(x,t)=\dot{V}(x)<0$, then this inequality is uniform in time, and we indeed conclude that $V(x)\to 0$. When we have $\dot{x}=f(x,t)$, then the time derivative of $V(x)$ is a function of time as well, and the inequality $\dot{V}(x,t)<0$ is not necessarily uniform in time. This is exactly what happens with the example above. If we take $V(x):=\frac{1}{2}x^2$, or $$V(t)= \frac{1}{2}\exp\left(-2+2e^{-t}\right),$$ then $\dot{V}(x,t) = -e^{-t}x^2$, or $$\dot{V}(t) = -e^{-t}\exp\left(-2+2e^{-t}\right).$$ We see that the negative definiteness of $\dot{V}(x,t)$ is not uniform in $t$, and Lyapunov's second method for autonomous systems cannot be applied. Indeed, $\dot{V}(t)\to 0$ and $V(t)$ converges to a nonzero constant.
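Evaluating these two expressions numerically makes the loss of uniformity visible (a small sketch assuming NumPy):

```python
import numpy as np

t = np.array([0.0, 1.0, 5.0, 10.0, 50.0])
V = 0.5 * np.exp(-2.0 + 2.0 * np.exp(-t))             # V(t) along the trajectory
Vdot = -np.exp(-t) * np.exp(-2.0 + 2.0 * np.exp(-t))  # dV/dt along the trajectory
print(Vdot)  # tends to 0: the rate of decrease vanishes
print(V)     # tends to exp(-2)/2 ~ 0.0677, not to 0
```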

Ok, so what do we need to claim convergence? We need uniformity. The standard way is to show that there exists a continuous positive definite function $W(x)$ such that $\dot{V}(x,t)\le -W(x)$ for all $t$. Then asymptotic convergence to the origin follows. For a linear time-varying system, a sufficient condition for asymptotic (in fact exponential) convergence is that there exists a constant $\alpha>0$ such that $$\operatorname{eig}\left(A(t)+A^\top(t)+\alpha I\right)\le0,$$ or, equivalently, $A(t)+A^\top(t)\le -\alpha I$, where $I$ is the identity matrix. Then we have $$\dot{V}(x,t) = x^\top(t)\left(A(t)+A^\top(t)\right)x(t) \le -\alpha x^\top x = -W(x).$$
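For completeness, this last bound also gives the rate: since $V(x)=x^\top x$, we get $\dot{V}(x,t)\le -\alpha V(x)$, and the comparison lemma (Grönwall's inequality) yields
$$V(x(t)) \le V(x(t_0))\,e^{-\alpha (t-t_0)}, \qquad \text{i.e.} \qquad \Vert x(t)\Vert \le \Vert x(t_0)\Vert\,e^{-\alpha (t-t_0)/2},$$
which is exponential convergence, uniformly in $t_0$.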