What is something (non-trivial) that can be done in Hilbert space but not Banach spaces for optimization problems?

Here is a nice result which can be stated and proven in arbitrary Banach spaces, but whose assumptions hide the fact that the space is already a Hilbert space:

Let $X$ be a Banach space, let $f \colon X \to \mathbb R$ be twice Fréchet differentiable at a given point $\bar x \in X$, and suppose that there exists $\alpha > 0$ such that $$f'(\bar x) = 0, \qquad f''(\bar x)[h,h] \ge \alpha \, \|h\|_X^2 \quad \forall h \in X.$$ Then, for all $\varepsilon > 0$ there is $\delta > 0$ such that $$f(x) \ge f(\bar x) + \frac{\alpha-\varepsilon}{2}\,\|x - \bar x\|_X^2 \quad\forall x \in X, \ \|x-\bar x\|_X \le \delta.$$

I hope this counts as a non-trivial result. In infinite-dimensional optimization, this theorem is quite useful, since it implies some stability of the minimizer $\bar x$ w.r.t. perturbations of the problem (e.g., a discretization can be seen as a perturbation).

The proof uses just a second-order Taylor expansion of $f$ at $\bar x$ and does not need an inner product or a Hilbert space structure.
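For concreteness, here is a sketch of that argument under the stated assumptions: since $f'(\bar x) = 0$, the second-order Taylor expansion gives $$ f(x) = f(\bar x) + \tfrac12\, f''(\bar x)[x-\bar x, x-\bar x] + o(\|x-\bar x\|_X^2) \ge f(\bar x) + \tfrac{\alpha}{2}\,\|x-\bar x\|_X^2 + o(\|x-\bar x\|_X^2), $$ and given $\varepsilon > 0$ one chooses $\delta > 0$ so small that the remainder is bounded in absolute value by $\tfrac{\varepsilon}{2}\,\|x-\bar x\|_X^2$ for all $\|x - \bar x\|_X \le \delta$.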

However, it can easily be checked that $$ (g,h) \mapsto f''(\bar x) [g,h] $$ defines an inner product on $X$ and that the associated norm is equivalent to $\|\cdot\|_X$. Hence, $X$ has to be (isomorphic to) a Hilbert space, and the theorem is not applicable in genuinely non-Hilbert Banach spaces.
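To spell out the norm equivalence: the form is symmetric (symmetry of second Fréchet derivatives) and bounded, so with $C := \|f''(\bar x)\|$ one has $$ \alpha\,\|h\|_X^2 \le f''(\bar x)[h,h] \le C\,\|h\|_X^2 \qquad \forall h \in X. $$ The norm $h \mapsto \sqrt{f''(\bar x)[h,h]}$ is therefore equivalent to $\|\cdot\|_X$, it satisfies the parallelogram law, and $X$ is complete with respect to it, i.e., $X$ is isomorphic to a Hilbert space.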


In Hilbert space there are implementable closed forms for nearest-point projections onto many simple sets, e.g., a hyperplane or the unit ball.
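As a minimal illustration in the Euclidean (hence Hilbert) setting, here are those two formulas in code; the function names are mine, not from any particular library:

```python
import numpy as np

def project_hyperplane(x, a, b):
    """Nearest point to x on the hyperplane {y : <a, y> = b} w.r.t. the inner-product norm."""
    return x - (np.dot(a, x) - b) / np.dot(a, a) * a

def project_unit_ball(x):
    """Nearest point to x in the closed unit ball {y : ||y||_2 <= 1}."""
    return x / max(1.0, np.linalg.norm(x))

# Small sanity check
x = np.array([3.0, 4.0])
print(project_hyperplane(x, a=np.array([1.0, 1.0]), b=1.0))  # -> [0. 1.]
print(project_unit_ball(x))                                   # -> [0.6 0.8]
```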

In "nice" Banach spaces, projections do exist but there are typically no formulas available. This is really no fun when dealing with projection or proximal mappings even in finite-dimensional settings. So optimization in Banach spaces is much harder if you want implementable algorithms.