Differentiable functions with discontinuous derivatives

A trivial premise: asking whether a physical phenomenon, in itself, is continuous or differentiable makes no more sense, of course, than asking whether a physical length is rational or irrational. These are mathematical concepts attached to the mathematical models we choose to describe physical objects. The reason we use real numbers to measure the real world has more to do with the good properties of the real numbers as a foundation for mathematical analysis than with any fine property of the real world, in spite of the homonymy. As I see it, functions that are differentiable but not $C^1$ play a small role in physics for the simple reason that they play a small role in mathematics.

Existence theorems for all sorts of equations in analysis, as well as convergence theorems, produce functions that are either more regular than merely differentiable ($C^k$, $C^{k,\alpha}$, etc.) or less regular (Sobolev classes, etc.). Here's an old question about the hypothesis of continuous differentiability versus mere differentiability in differential calculus.
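For reference, $C^{k,\alpha}$ here denotes the usual Hölder scale: a $C^k$ function belongs to $C^{k,\alpha}$, $\alpha \in (0,1]$, when its $k$-th order derivatives satisfy
$$ \sup_{x \neq y} \frac{|D^k u(x) - D^k u(y)|}{|x-y|^{\alpha}} < \infty, $$
so on this scale $C^{k,\alpha} \subset C^k \subset \{\text{$k$ times differentiable}\}$, with both inclusions strict; the gap between the last two classes is exactly what the rest of this discussion is about.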

This doesn't mean that functions like $x^2\sin(1/x)$ can't provide a suitable description of certain physical motions. But a smooth approximation may be an equally good description, though perhaps formally more complicated, which is a good reason why, after all, we may prefer the former. I vaguely have in mind a ping-pong ball bouncing between the table and a racket that pushes it down appropriately, or a vibrating string whose length is forcibly shortened to zero, and perhaps more relevant phenomena such as the behaviour of an object reaching the sound barrier. In any case, at some microscopic scale it makes no sense to ask which function gives the better model, simply because there is nothing left to measure with real numbers.
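For completeness, here is the standard computation showing that this function (extended by $0$ at the origin) is differentiable everywhere yet not $C^1$. With
$$ f(x) = \begin{cases} x^2 \sin(1/x), & x \neq 0,\\ 0, & x = 0, \end{cases} $$
one has $f'(0) = \lim_{h \to 0} h \sin(1/h) = 0$, while for $x \neq 0$
$$ f'(x) = 2x \sin(1/x) - \cos(1/x), $$
which has no limit as $x \to 0$ because of the oscillating $\cos(1/x)$ term. So $f'$ exists at every point but is discontinuous at $0$.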


Here is an example of a "natural" nonlinear PDE whose solutions are known to be everywhere differentiable and conjectured, but not yet proved, to be $C^1$.

Suppose that $\Omega$ is a smooth bounded domain in $\mathbb R^d$ and $g$ is a smooth function defined on the boundary $\partial \Omega$. Consider the prototypical problem in the "$L^\infty$ calculus of variations", which is to find an extension $u$ of $g$ to the closure of $\Omega$ that minimizes $\| Du \|_{L^\infty(\Omega)}$, or equivalently, the Lipschitz constant of $u$ on $\Omega$. When properly phrased, this leads to the infinity Laplace equation $$ -\Delta_\infty u := -\sum_{i,j=1}^d \partial_{ij} u\, \partial_i u \, \partial_j u = 0, $$ which is the Euler-Lagrange equation of the optimization problem.
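A standard heuristic for where this equation comes from (a formal sketch only, not the rigorous derivation): view it as the $p \to \infty$ limit of the $p$-Laplace equation, the Euler-Lagrange equation of $\int_\Omega |Du|^p$. Expanding the divergence,
$$ \Delta_p u := \operatorname{div}\bigl(|Du|^{p-2} Du\bigr) = |Du|^{p-2}\, \Delta u + (p-2)\, |Du|^{p-4} \sum_{i,j=1}^d \partial_{ij} u\, \partial_i u\, \partial_j u = 0. $$
Dividing by $(p-2)|Du|^{p-4}$ (where $Du \neq 0$) and letting $p \to \infty$ kills the $\Delta u$ term, leaving exactly $\sum_{i,j} \partial_{ij} u\, \partial_i u\, \partial_j u = 0$. Making this limit rigorous is one reason the equation is interpreted in the viscosity sense.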

The (unique, weak) solution of this equation (subject to the boundary condition) characterizes the correct notion of minimal Lipschitz extension. It is known to be everywhere differentiable by a result of Evans and Smart: http://math.mit.edu/~smart/differentiability.ae.pdf.

It is conjectured to be $C^{1,1/3}$, and in any case at least $C^1$. It is known to be $C^{1,\alpha}$ for some $\alpha>0$ in dimension $d=2$ (due to O. Savin), but the problem remains open in dimensions $d\geq 3$.

I am unaware of any other situation in PDE where the regularity gets blocked between differentiability and $C^1$. Typically, if you can prove something is differentiable, the proof can be made quantitative enough to give $C^1$ with a modulus.