Fractional Calculus: Motivation and Foundations.

I think there is more to be found out about this subject. There are many different ways to define fractional derivatives and integrals. I do not know whether these arise from any deep, fundamental facts, but they certainly arise as generalizations of various familiar formulas. Another way to think about the subject is as a collection of applications and tricks involving certain integral transforms that happen to be called "fractional derivatives". But let's think about clear generalizations of integration and differentiation.

For instance, there is indeed the generalization via the Cauchy integral formula, $$f^{(n)}(z) = \frac{n!}{2\pi i} \int_C \frac{f(w)}{(w-z)^{n+1}}\, dw,$$ but there are many more. It is known that for integer $n$, $$\int_a^x \int_a^{x_1} \cdots \int_a^{x_{n-1}} f(x_n)\, dx_n \cdots dx_1 = \frac{1}{(n-1)!} \int_a^x (x-t)^{n-1}f(t)\, dt,$$ where on the left we have integrated $n$ times. Here we may substitute a fractional $q$ for $n$ and the gamma function for the factorial to define a fractional integral. That is, $$I^q f(x) = \frac{1}{\Gamma(q)} \int_a^x (x-t)^{q-1} f(t)\, dt.$$ Note that this integral may not converge for negative $q$. Hence, for fractional derivatives, we have two choices. If we let $I^q$ be the $q$th-order integral operator and $D$ the ordinary differentiation operator, then for a given $q>0$ and integer $n>q$ we could define $$D^q f = D^{n} I^{n-q} f,$$ which is known as the Riemann-Liouville definition, or apply the operators in the opposite order, $$D^q f = I^{n-q} D^n f,$$ which is known as the Caputo definition.

But there are more! By an induction argument, one can show that $$f^{(n)}(x) = \lim_{h \to 0} \frac{1}{h^n} \sum_{k=0}^n (-1)^k {n \choose k} f(x-kh).$$ We can generalize by replacing $n$ with a fractional $q$ and changing the upper limit of the sum to $\infty$: $$f^{(q)}(x) = \lim_{h \to 0} \frac{1}{h^q} \sum_{k=0}^\infty (-1)^k {q \choose k} f(x-kh),$$ where ${q \choose k}$ is defined as usual, thinking in terms of the gamma function: ${q \choose k} = \frac{\Gamma(q+1)}{\Gamma(q-k+1)\, \Gamma(k+1)}$. Surprisingly, it turns out that a very similar formula can be developed from Riemann sums of repeated integrals. This complicates things slightly, but it means integration and differentiation can be united under one formula. This is the Grünwald-Letnikov definition.
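As a quick sanity check of the Riemann-Liouville construction, here is a small numerical sketch (assuming lower limit $a=0$; the function names and the choice of test function $f(x)=x$ are mine). The half-derivative of $x$ should come out to $2\sqrt{x/\pi}$.

```python
# A minimal sketch of the Riemann-Liouville definition above, assuming a = 0.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def frac_integral(f, x, q):
    """I^q f(x) = (1/Gamma(q)) * int_0^x (x-t)^(q-1) f(t) dt, for q > 0.

    The substitution u = (x-t)^q removes the weak singularity at t = x,
    giving I^q f(x) = (1/Gamma(q+1)) * int_0^{x^q} f(x - u^(1/q)) du.
    """
    value, _ = quad(lambda u: f(x - u ** (1.0 / q)), 0.0, x ** q)
    return value / gamma(q + 1.0)

def rl_half_derivative(f, x, h=1e-5):
    """D^{1/2} f = d/dx I^{1/2} f, with the outer d/dx done by a central difference."""
    return (frac_integral(f, x + h, 0.5) - frac_integral(f, x - h, 0.5)) / (2.0 * h)

f = lambda t: t                                  # test function f(t) = t
for x in [0.5, 1.0, 2.0]:
    print(x, rl_half_derivative(f, x), 2.0 * np.sqrt(x / np.pi))   # columns should match
```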

Still more! If we know anything about Fourier analysis, the repeated-integral generalization above is no big surprise, since it is essentially a convolution. We know that Laplace transforms and Fourier transforms both turn differentiation and integration into multiplication or division by the transform variable. So we may also define something like $$D^q f(t) = F^{-1}[k^q F[f(t)]]$$ or $$D^q f(t) = L^{-1}[s^q L[f(t)]].$$ These are the Fourier and Laplace generalizations. I think most people, if they were to guess at a generalization, would pick one of these.
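For the Fourier version, here is a minimal FFT sketch, assuming a periodic signal sampled on $[0,2\pi)$ and using the symbol $(ik)^q$ rather than the bare $k^q$ written loosely above; all names are illustrative.

```python
# A sketch of the Fourier-transform definition above for periodic signals.
import numpy as np

def fourier_frac_derivative(samples, q, period=2.0 * np.pi):
    """Approximate D^q f on a uniform grid via F^{-1}[(i k)^q F[f]]."""
    n = samples.size
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=period / n)   # angular frequencies
    symbol = (1j * k) ** q                               # (i k)^q, principal branch
    symbol[0] = 0.0                                      # k = 0 mode: 0^q = 0 for q > 0
    return np.fft.ifft(symbol * np.fft.fft(samples)).real

x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
half = fourier_frac_derivative(np.sin(x), 0.5)
# With this definition, D^q sin(x) = sin(x + q*pi/2), so q = 1/2 shifts the phase by pi/4.
print(np.max(np.abs(half - np.sin(x + np.pi / 4.0))))    # should be tiny (machine precision)
```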

Some of these definitions are equivalent, some are different, and some are nearly the same up to annoying details. For instance, the Riemann-Liouville definition agrees with the Grünwald-Letnikov definition; I know of a proof in Oldham and Spanier's book (a crude numerical check is sketched below). There is still more to think about: we want to see how many traditional calculus rules still apply. I've seen some new papers on arXiv about fractional calculus. One is a proof that Leibniz's rule can never hold. Another is apparently a new definition of a fractional differintegral operator that I saw mentioned in this paper.
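As a rough numerical illustration of the Riemann-Liouville / Grünwald-Letnikov agreement (not a proof), the following sketch evaluates the Grünwald-Letnikov sum with lower terminal $0$ for $f(t)=t$ and compares it with the Riemann-Liouville value $2\sqrt{x/\pi}$; the truncation level is an arbitrary choice of mine.

```python
# Grünwald-Letnikov sum, truncated at the lower terminal 0, versus the known
# Riemann-Liouville half-derivative of f(t) = t.
import numpy as np

def gl_frac_derivative(f, x, q, n=20000):
    """h^{-q} * sum_{k=0}^{n} (-1)^k C(q,k) f(x - k h), with h = x/n."""
    h = x / n
    w = 1.0                              # w_k = (-1)^k * C(q, k), built by recursion
    total = w * f(x)
    for k in range(1, n + 1):
        w *= 1.0 - (q + 1.0) / k
        total += w * f(x - k * h)
    return total / h ** q

f = lambda t: t
for x in [0.5, 1.0, 2.0]:
    print(x, gl_frac_derivative(f, x, 0.5), 2.0 * np.sqrt(x / np.pi))   # columns should agree closely
```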

Is there some natural definition of fractional calculus? The gamma function, used in several of these definitions, is itself only one of many possible generalizations of the factorial function. We could ask the same question about it, except that the gamma function is the sole log-convex function $f$ on $(0,\infty)$ with $f(1)=1$ and $f(x+1)=x f(x)$; this is the Bohr-Mollerup theorem. So a similar question would be: has anyone come up with an appropriate, reasonable constraint on a fractional integral operator that makes it unique? I don't know. Perhaps some results have been shown, but I am not aware of them.

So my intuitive grasp of the subject is this: we want to understand all possible definitions of fractional derivatives and to consider what properties they have or lack compared to traditional calculus. It is certainly true that fractional derivatives will make sense only for some classes of functions, as in the case of the Fourier transform, so the right choice may be context-specific.

I've heard tell of models that use fractional calculus for materials with fractional dimensions. I've also seen a paper generalizing Newton's second law, not for any real physical consequence, but for the sake of seeing what the mathematics looks like. My impression, though, is that some genuine modeling is done. What is certain is that people are studying fractional differential equations, including the fractional diffusion-wave equation (which interpolates between the heat and wave equations via a fractional time derivative) and the fractional Schrödinger equation, which has a fractional spatial derivative. So we have some interesting equations to solve, which is nice, but I also think there is a world of function identities to compute with fractional derivatives, involving all sorts of special functions, once one sits down to compute things. (For one fun result, the linear fractional differential equation $y^{(\pi)}+y=0$, using the right definition, has an infinite number of linearly independent solutions!)
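To give the flavor of solving such equations, here is a minimal sketch (my own construction, not taken from any of the papers mentioned above) that time-steps the Caputo relaxation problem $D^{1/2}y=-y$, $y(0)=1$, with the Grünwald-Letnikov weights. Its exact solution is the Mittag-Leffler function $E_{1/2}(-\sqrt{t}) = e^{t}\operatorname{erfc}(\sqrt{t})$.

```python
# Implicit Grünwald-Letnikov time stepping for the Caputo problem D^q y = -y, y(0) = y0.
import numpy as np
from scipy.special import erfc

def caputo_relaxation(q=0.5, t_end=1.0, n=4000, y0=1.0):
    h = t_end / n
    # GL weights w_k = (-1)^k * C(q, k), via the usual recursion.
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (1.0 - (q + 1.0) / k)
    B = np.cumsum(w)                             # B_m = sum_{k<=m} w_k
    y = np.empty(n + 1)
    y[0] = y0
    for m in range(1, n + 1):
        history = np.dot(w[1:m + 1], y[m - 1::-1])   # sum_{k>=1} w_k y_{m-k}
        # Caputo via RL of (y - y0):  h^{-q} [sum_k w_k y_{m-k} - y0 B_m] = -y_m
        y[m] = (y0 * B[m] - history) / (1.0 + h ** q)
    return np.linspace(0.0, t_end, n + 1), y

t, y = caputo_relaxation()
exact = np.exp(t) * erfc(np.sqrt(t))             # E_{1/2}(-sqrt(t))
print(y[-1], exact[-1])                          # should agree better and better as h -> 0
```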


In this answer, I explain how Euler defined $\zeta(-n)$ for $n\geq 0$. Namely, he defined

$$\zeta(-n):= (1-2^{n+1})^{-1} \frac{d^n}{dx^n}\frac{e^x}{1+e^x}\biggr|_{x=0}$$

This does not seem to suggest a definition of $\zeta(-s)$ for $\Re s\geq 0$; that is, unless we can make sense of $\frac{d^s}{dx^s}$ for a complex number $s$... and indeed, if you write down the integral suggested by the theory of fractional calculus, you will get the analytic continuation of the zeta function!
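As a quick sanity check of Euler's formula at integer orders (the fractional-order continuation itself is not attempted here), the sketch below compares it against the known special values $\zeta(0)=-\tfrac{1}{2}$, $\zeta(-1)=-\tfrac{1}{12}$, $\zeta(-2)=0$, $\zeta(-3)=\tfrac{1}{120}$.

```python
# Check Euler's formula zeta(-n) = (1 - 2^{n+1})^{-1} d^n/dx^n [e^x/(1+e^x)] at x = 0.
import sympy as sp

x = sp.symbols('x')
g = sp.exp(x) / (1 + sp.exp(x))

known = {0: sp.Rational(-1, 2), 1: sp.Rational(-1, 12), 2: sp.Integer(0), 3: sp.Rational(1, 120)}
for n, zeta_minus_n in known.items():
    deriv = g if n == 0 else sp.diff(g, x, n)
    value = sp.simplify(deriv.subs(x, 0) / (1 - 2 ** (n + 1)))
    print(n, value, zeta_minus_n)                # the two columns should agree
```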


The bibliography on fractional calculus is extensive. After some precursors during the 18th century, the most striking advances are those of J. Liouville in several of his reports to the École Polytechnique in Paris between 1832 and 1835, and then the contribution of B. Riemann in 1847; the names of these two mathematicians thus remain attached to the famous transform that is at the heart of differintegral calculus.

More recently, fractional calculus not only has important applications in pure mathematics (let us cite Erdélyi and Higgins, among many authors), but is also of interest across vast domains of the physical sciences. Heaviside was a brilliant precursor who, from 1920, used fractional calculus in his researches on electromagnetic propagation [Oliver Heaviside, Electromagnetic Theory, 1920; reprinted by Dover Publications, New York, 1950]. Numerous examples are given in the book "The Fractional Calculus", Keith B. Oldham and Jerome Spanier, Academic Press, New York, 1974, for example concerning rheology, diffusion, hydrodynamics, thermodynamics, electrochemistry, etc.

In the paper "The fractionnal derivation" published on Scribd (tranlation pp.7-12) : http://www.scribd.com/JJacquelin/documents a formal generalization of some basic relationships for electrical components, thanks to the fractional derivation, appears on the table below (copy from the referenced paper, p.11). It had remarkable consequences on networks made of associations of these components, and on calculations of equivalent networks. This is more developed in the paper "The Phasance Concept" (same link on Sribd).

[Table from the referenced paper, p. 11: fractional-derivative generalizations of the basic relationships for electrical components.]
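Since the table itself did not survive the copy, here is a hedged sketch of the kind of relationship it describes: assuming a component law of the form $v(t) = K\,D^{\nu} i(t)$ (my notation), the impedance is $Z(j\omega) = K (j\omega)^{\nu}$, whose phase is the frequency-independent value $\nu\pi/2$; the values $\nu = 1, 0, -1$ recover the usual inductor, resistor, and capacitor.

```python
# Impedance of a component obeying v(t) = K * D^nu i(t): Z(j*omega) = K * (j*omega)**nu.
import numpy as np

def impedance(K, nu, omega):
    return K * (1j * omega) ** nu

omega = np.logspace(-2, 2, 5)                           # a few angular frequencies
for nu, label in [(1, "inductor"), (0.5, "fractional"), (0, "resistor"), (-1, "capacitor")]:
    phase = np.angle(impedance(1.0, nu, omega), deg=True)
    print(f"nu = {nu:>4} ({label}): phase = {phase.round(1)} degrees")   # constant at nu*90
```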