Derivative formula

A detailed historical discussion of identities like this one can be found in Warren P. Johnson's paper The Pfaff/Cauchy derivative identities and Hurwitz type extensions, The Ramanujan Journal 13 (2007) pp. 167–201. In particular, his formula (1.3) is $$\frac{d^n\ }{dx^n} \phi^n(x) u(x) v(x) = \sum_{k=0}^n \binom nk \left(\frac{d^k\ }{dx^k} \phi^k(x) u(x)\right) \left(\frac{d^{n-k-1}}{dx^{n-k-1}}\phi^{n-k}(x) v'(x)\right)\tag{1}$$ where for $n=k$, $\frac{d^{n-k-1}}{dx^{n-k-1}}\phi^{n-k}(x) v'(x)$ is to be interpreted as $v(x)$. This identity was given by Cauchy in 1826, but is equivalent to an identity given by Pfaff in 1795.
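As a quick sanity check, formula $(1)$ can be verified symbolically for small $n$; here is a SymPy sketch (the helper name `cauchy_check` is mine):

```python
import sympy as sp

x = sp.symbols('x')
phi = sp.Function('phi')(x)
u = sp.Function('u')(x)
v = sp.Function('v')(x)

def cauchy_check(n):
    """Verify formula (1) for generic phi, u, v at a given n."""
    lhs = sp.diff(phi**n * u * v, x, n)
    rhs = sp.S.Zero
    for k in range(n + 1):
        first = sp.diff(phi**k * u, x, k)
        # for k = n the second factor is to be interpreted as v itself
        if k == n:
            second = v
        else:
            second = sp.diff(phi**(n - k) * sp.diff(v, x), x, n - k - 1)
        rhs += sp.binomial(n, k) * first * second
    # both sides expand to the same polynomial in derivative symbols
    return sp.expand(lhs - rhs) == 0

print(all(cauchy_check(n) for n in (1, 2, 3)))
```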

The OP's identity is equivalent to the case $\phi(x)=f$, $u(x)=1$, $v(x) = x$ of $(1)$.
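Spelling out this specialization (with $v'=1$, and separating the $k=n$ term, which formula $(1)$ interprets as $v=x$), we get $$\frac{d^n}{dx^n}\, x f^n(x) \;=\; x\,\frac{d^n}{dx^n} f^n(x) \;+\; \sum_{k=0}^{n-1} \binom nk \left(\frac{d^k}{dx^k} f^k(x)\right)\left(\frac{d^{n-k-1}}{dx^{n-k-1}} f^{n-k}(x)\right).$$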

There is a related formula in Cayley's paper On the partitions of a polygon, Coll. Math. Papers 13 (1897), 93–113; Proc. London Math. Soc. (1) 22 (1891), 237–262, but I did not see this particular formula there.

Formulas like $(1)$ are discussed in my survey paper Lagrange Inversion, Journal of Combinatorial Theory, Series A 144 (2016), pp. 212–249, section 2.6.

There is also a discussion of a somewhat related formula on Terry Tao's blog at https://terrytao.wordpress.com/2016/10/23/another-problem-about-power-series.


In addition to Ira Gessel's answer, let me explain how such Pfaff–Cauchy–Hurwitz type identities are connected to polynomial interpolation techniques and to Raney's lemma.

Denote ${\bf k}=\{1,\dots,k\}$.

We prove a more general (not polynomial in $f$, but multilinear in $f_1,\dots,f_k$) identity $$ k(f_1\dots f_k)^{(k-1)}=\sum_{\emptyset\ne A\subset{\bf k}}\left(\prod_{i\in A} f_i\right)^{(|A|-1)}\left(\prod_{i\notin A} f_i\right)^{(k-|A|)}. $$ It suffices to prove this identity when all $f_i$'s are power functions $f_i(t)=t^{x_i}$ (by some standard abstract nonsense). In this case we get a polynomial identity in the $x_i$'s (where we denote $(x)_n=x(x-1)\dots(x-n+1)$, and also denote $s(A)=\sum_{i\in A}x_i$): $$ k(s({\bf k}))_{k-1}=\sum_{A\sqcup B={\bf k},A\ne \emptyset}(s(A))_{|A|-1}\,(s(B))_{|B|}. $$ Both sides are polynomials of degree at most $k-1$, and it suffices to check that they coincide on the following combinatorial simplex $\Delta$: the $x_i$ take non-negative integral values such that $\sum x_i\leqslant k-1$.
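The multilinear identity can be confirmed symbolically for small $k$ by brute-forcing over all non-empty subsets $A$; a SymPy sketch (the helper names are mine):

```python
import itertools
from functools import reduce
from operator import mul
import sympy as sp

x = sp.symbols('x')

def multilinear_check(k):
    """Check k*(f_1...f_k)^{(k-1)} against the sum over non-empty A of
    (prod_{i in A} f_i)^{(|A|-1)} * (prod_{i not in A} f_i)^{(k-|A|)}."""
    fs = [sp.Function(f'f{i}')(x) for i in range(1, k + 1)]
    pr = lambda idx: reduce(mul, (fs[i] for i in idx), sp.S.One)
    lhs = k * sp.diff(pr(range(k)), x, k - 1)
    rhs = sp.S.Zero
    for r in range(1, k + 1):
        for A in itertools.combinations(range(k), r):
            B = [i for i in range(k) if i not in A]
            rhs += sp.diff(pr(A), x, r - 1) * sp.diff(pr(B), x, k - r)
    return sp.expand(lhs - rhs) == 0

print(all(multilinear_check(k) for k in (2, 3)))
```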

Note that for any $(x_1,\dots,x_k)\in \Delta$ and any partition ${\bf k}=A\sqcup B$, $A\ne \emptyset$, we have $s(A)+s(B)=s({\bf k})\leqslant k-1$; thus either $s(A)<|A|-1$, or $s(B)<|B|$, or $s(A)=|A|-1$, $s(B)=|B|$ and $s({\bf k})=k-1$. Hence if $s({\bf k})<k-1$ both sides vanish, and if $s({\bf k})=k-1$ we have to prove that $$ k!=\sum_{A:\,s(A)=|A|-1} (|A|-1)!\, (k-|A|)!, $$ the sum being over non-empty $A\subset{\bf k}$. I know this identity from Andrei Zelevinsky (2008). Here is his manuscript containing a clever proof of both the combinatorial and the falling-factorial identities. Andrei asked not to circulate this text, but I think that this request is out of date...
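This identity is easy to confirm by brute force over all lattice points of $\Delta$ with $\sum x_i = k-1$; a small Python sketch (the function name is mine):

```python
import itertools
from math import factorial

def zelevinsky_check(k):
    """For every non-negative integer point with x_1+...+x_k = k-1, check
    that k! equals the sum over non-empty A with s(A) = |A|-1 of
    (|A|-1)! * (k-|A|)!."""
    points = [p for p in itertools.product(range(k), repeat=k)
              if sum(p) == k - 1]
    for xs in points:
        total = 0
        for r in range(1, k + 1):
            for A in itertools.combinations(range(k), r):
                if sum(xs[i] for i in A) == r - 1:
                    total += factorial(r - 1) * factorial(k - r)
        if total != factorial(k):
            return False
    return True

print(all(zelevinsky_check(k) for k in range(1, 6)))
```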

Consider a permutation $\pi=(c_1,\dots,c_k)$ of $1,2,\dots,k$ and choose the minimal $m$ for which $s(\{c_1,\dots,c_m\})<m$. Then by minimality $s(\{c_1,\dots,c_m\})=m-1$. For how many permutations do we have $\{c_1,\dots,c_m\}=A$, where $A$ is a fixed subset of ${\bf k}$ with $s(A)=|A|-1$ and $m=|A|$? The answer is $(|A|-1)!\,|B|!$ with $B={\bf k}\setminus A$: for any fixed cyclic order $(c_1,\dots,c_m)$ of $A$, exactly one of the $m$ rotations $(c_{i+1},c_{i+2},\dots,c_i)$ satisfies $s(\{c_{i+1},\dots,c_{i+t}\})\geqslant t$ for all $t=1,2,\dots,m-1$, while the remaining $k-m$ elements may follow in any order. This is a variant of Raney's lemma, and the identity follows.
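The cyclic-shift step can itself be tested directly: for non-negative integers summing to $m-1$, exactly one rotation keeps every proper prefix sum at least its length. A brute-force sketch (the helper name is mine):

```python
import itertools

def good_rotations(xs):
    """Count rotations of xs for which every proper prefix of length t
    has sum >= t (the Raney / cycle-lemma condition)."""
    m = len(xs)
    good = 0
    for i in range(m):
        rot = xs[i:] + xs[:i]
        if all(sum(rot[:t]) >= t for t in range(1, m)):
            good += 1
    return good

# whenever the entries are non-negative and sum to m - 1,
# exactly one rotation should satisfy the condition
for m in range(1, 6):
    for xs in itertools.product(range(m), repeat=m):
        if sum(xs) == m - 1:
            assert good_rotations(list(xs)) == 1
print("ok")
```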

We may avoid the combinatorial argument with Raney's lemma and remain entirely within algebra. The following trick was suggested by Vlad Volkov: instead, check the points for which $(x_1,\dots,x_{k-1},x_k+1)\in \Delta$. In this case $\sum x_i\in \{-1,0,1,\dots,k-2\}$, and the LHS equals 0 unless $x_1=\dots=x_{k-1}=0$, $x_k=-1$; in this last case the LHS equals $(-1)^{k-1}k!$. We check the same for the RHS. As before, either $s(A)<|A|-1$ or $s(B)<|B|=k-|A|$, so if the corresponding summand of the RHS is non-zero, one of the sets $A$, $B$ has a sum of $x$'s equal to $-1$, i.e., consists of $k$ together with several indices $i$ such that $x_i=0$. After some calculation everything reduces to the Vandermonde–Chu identity.
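Volkov's verification set can also be checked numerically: at every shifted point both sides of the falling-factorial identity agree, and they vanish except at $x_1=\dots=x_{k-1}=0$, $x_k=-1$. A sketch (with falling factorials computed directly; the names are mine):

```python
import itertools
from math import factorial

def ff(s, n):
    """Falling factorial (s)_n = s(s-1)...(s-n+1) for integer s."""
    out = 1
    for j in range(n):
        out *= s - j
    return out

def volkov_check(k):
    # shifted lattice points: (x_1,...,x_k) with (x_1,...,x_{k-1},x_k+1) in Delta
    base = [p for p in itertools.product(range(k), repeat=k) if sum(p) <= k - 1]
    for p in base:
        xs = p[:-1] + (p[-1] - 1,)
        lhs = k * ff(sum(xs), k - 1)
        rhs = 0
        for r in range(1, k + 1):
            for A in itertools.combinations(range(k), r):
                sA = sum(xs[i] for i in A)
                rhs += ff(sA, r - 1) * ff(sum(xs) - sA, k - r)
        if lhs != rhs:
            return False
        special = xs == (0,) * (k - 1) + (-1,)
        expected = (-1) ** (k - 1) * factorial(k) if special else 0
        if lhs != expected:
            return False
    return True

print(all(volkov_check(k) for k in range(1, 6)))
```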

Now a bit about multidimensional interpolation. Consider the space $\Pi(n,k)$ of polynomials $f(x_1,\dots,x_k)$ of degree at most $n$. Its dimension equals the number of monomials $x_1^{c_1}\dots x_k^{c_k}$ where the $c_i$ are non-negative integers with $\sum c_i\leqslant n$; we write this as $(c_1,\dots,c_k)\in \Delta_n^k$. Thus if we manage to find $|\Delta_n^k|$ linearly independent linear forms on $\Pi(n,k)$, then a polynomial $f\in \Pi(n,k)$ is uniquely determined by the values of these forms. I claim that evaluation at the points of $\Delta_n^k$ provides such linear forms. To prove that these evaluations are linearly independent, we may simply construct a polynomial in $\Pi(n,k)$ which vanishes at all points of $\Delta_n^k$ except a given point $(t_1,\dots,t_k)$. Here it is: $$\prod_{i=1}^k \prod_{j=0}^{t_i-1}(x_i-j)\cdot \prod_{\ell=t_1+\dots+t_k+1}^n (x_1+\dots+x_k-\ell).$$
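One can verify this claim directly for small $n,k$: the displayed polynomial has degree $(t_1+\dots+t_k)+(n-t_1-\dots-t_k)=n$, is non-zero at $(t_1,\dots,t_k)$, and vanishes at every other point of $\Delta_n^k$. A quick check (the helper names are mine):

```python
import itertools
from math import prod

def simplex(n, k):
    """Lattice points with non-negative coordinates summing to at most n."""
    return [c for c in itertools.product(range(n + 1), repeat=k)
            if sum(c) <= n]

def interp_poly(t, n, point):
    """Evaluate prod_i prod_{j<t_i}(x_i - j) * prod_{l>s(t)}^{n}(sum x - l)
    at the given point."""
    k = len(t)
    value = prod(point[i] - j for i in range(k) for j in range(t[i]))
    value *= prod(sum(point) - l for l in range(sum(t) + 1, n + 1))
    return value

n, k = 3, 2
for t in simplex(n, k):
    for p in simplex(n, k):
        # non-zero exactly at p == t
        assert (interp_poly(t, n, p) != 0) == (p == t)
print("ok")
```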