Derivation of Newton's Decrement

We want to minimize $f(\mathbf{x})$. First, consider its second-order truncated Taylor series about a point $\mathbf{x_0}$:

$$f(\mathbf{x}) \approx \hat{f}(\mathbf{x})= f(\mathbf{x_0}) + [\mathbf{x}-\mathbf{x_0}]^T\,\nabla f(\mathbf{x_0}) + \frac{1}{2}\,[\mathbf{x}-\mathbf{x_0}]^T\,\nabla^2 f(\mathbf{x_0})\,[\mathbf{x}-\mathbf{x_0}]$$

Assuming the Hessian matrix $\nabla^2 f(\mathbf{x_0})$ is positive definite, $\hat{f}(\mathbf{x})$ can be minimized in closed form by setting its gradient to zero:

$$\nabla \hat{f}(\mathbf{x^*}) = \nabla f(\mathbf{x_0}) + \nabla^2 f(\mathbf{x_0})\,[\mathbf{x^*}-\mathbf{x_0}] = \mathbf{0} \;\Rightarrow\; \mathbf{x^*} = \mathbf{x_0} - \nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})$$
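The Newton step can be checked numerically. A minimal sketch in Python, using a hypothetical quadratic test function (names `A`, `b`, `grad`, `hess` are illustrative, not from the text); in practice one solves the linear system rather than forming the explicit inverse:

```python
import numpy as np

# Hypothetical test function: f(x) = 1/2 x^T A x - b^T x,
# whose gradient is A x - b and whose Hessian is the constant matrix A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, -1.0])

def grad(x):
    return A @ x - b

def hess(x):
    return A

x0 = np.array([2.0, 2.0])

# Newton step: x* = x0 - H^{-1} g, computed via a linear solve
# instead of an explicit matrix inverse.
x_star = x0 - np.linalg.solve(hess(x0), grad(x0))

print(x_star)  # → [ 0.6 -0.8]
```

Because $f$ here is exactly quadratic, the second-order model is exact and a single Newton step lands on the true minimizer $A^{-1}\mathbf{b}$.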

Substituting $\mathbf{x^*}$ back in gives the minimized value:

$$\hat{f}(\mathbf{x^*})= f(\mathbf{x_0}) - [\nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]^T\nabla f(\mathbf{x_0}) + \frac{1}{2}[\nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]^T\nabla^2 f(\mathbf{x_0})[ \nabla^2 f(\mathbf{x_0})^{-1}\nabla f(\mathbf{x_0})]$$

Since the Hessian (and hence its inverse) is symmetric, both non-constant terms reduce to multiples of $\nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})$, so

$$\hat{f}(\mathbf{x^*})= f(\mathbf{x_0}) - \frac{1}{2}\,\nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})$$

So the decrement in the approximation $\hat{f}(\mathbf{x})$ achieved by the Newton step is:

$$f(\mathbf{x_0})-\hat{f}(\mathbf{x^*}) = \boxed{\frac{1}{2}\,\nabla f(\mathbf{x_0})^T\,\nabla^2 f(\mathbf{x_0})^{-1}\,\nabla f(\mathbf{x_0})}$$

(The Hessian matrix is symmetric for "well behaved" functions — any twice continuously differentiable $f$, by Schwarz's theorem.)
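For a quadratic the second-order model is exact, so the boxed decrement can be verified directly. A sketch continuing the hypothetical quadratic $f(\mathbf{x})=\tfrac12\mathbf{x}^TA\mathbf{x}-\mathbf{b}^T\mathbf{x}$ from before (all names illustrative):

```python
import numpy as np

# Same hypothetical quadratic as before: f(x) = 1/2 x^T A x - b^T x.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

x0 = np.array([2.0, 2.0])
g = A @ x0 - b                       # gradient at x0
x_star = x0 - np.linalg.solve(A, g)  # Newton step (Hessian is A)

# Boxed formula: decrement = 1/2 g^T H^{-1} g.
decrement = 0.5 * g @ np.linalg.solve(A, g)

# For a quadratic, f equals its second-order model, so the actual
# drop in f matches the predicted decrement exactly.
print(f(x0) - f(x_star), decrement)  # → 14.7 14.7
```

For a non-quadratic $f$ the two numbers would agree only to second order, which is precisely why the decrement serves as a local estimate of the progress a Newton step will make.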


This is quite trivial.

Observe that, through a series of Reidemeister moves, we find that

$$\forall \chi \in \mathbb{R}\;\forall f :\mathbb{R}\to\mathbb{R}\;\exists \lambda\Big[\nabla f(\chi)=\Big(\frac{\lambda^2(\chi)}{\nabla^2f^{-1}(\chi)\,\nabla f(\chi)}\Big)^{\frac{1}{T}}\Big]$$

This being a direct variant of the Axiom of Cacatus, we find through a series of inverted hyperoperations that

$$\lambda(\chi)=\sqrt{\nabla f(\chi)^T\nabla^2 f^{-1}(\chi)\nabla f(\chi)}$$

$$\mathbb{Q}\exists\mathfrak{D}$$

This is an example of PBI (proof by intimidation).