How to show this symmetric function inequality

Your expression is secretly a divided difference for the function $x \mapsto x^p$. In general, divided differences for a function $f$ are defined inductively by $f[x_0] = f(x_0)$ and

$f[x_0, ..., x_n] = \frac{f[x_0, ..., x_{n-1}] - f[x_0, ..., x_{n-2}, x_n]}{x_{n-1} - x_{n}}$.

This is a symmetric function of $x_0, ..., x_n$, and it can be expressed explicitly as

$f[x_0, ..., x_n] = \sum_{i=0}^n \frac{f(x_i)}{\prod_{j \ne i} (x_i - x_j)}.$
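
If you want to experiment numerically, here is a minimal Python sketch (the function names and test values are my own, not from the answer; it assumes Python 3.8+ for `math.prod`) that checks the recursion against the explicit formula:

```python
# Quick numerical sanity check: recursion vs. explicit formula for divided differences.
from math import prod

def divided_difference_recursive(f, xs):
    """f[x_0, ..., x_n] via the recursion above (the xs must be distinct)."""
    if len(xs) == 1:
        return f(xs[0])
    left = divided_difference_recursive(f, xs[:-1])             # f[x_0, ..., x_{n-1}]
    right = divided_difference_recursive(f, xs[:-2] + xs[-1:])  # f[x_0, ..., x_{n-2}, x_n]
    return (left - right) / (xs[-2] - xs[-1])

def divided_difference_explicit(f, xs):
    """f[x_0, ..., x_n] via sum_i f(x_i) / prod_{j != i} (x_i - x_j)."""
    return sum(f(xi) / prod(xi - xj for j, xj in enumerate(xs) if j != i)
               for i, xi in enumerate(xs))

f = lambda x: x ** 2.5            # x -> x^p with p = 2.5 (an arbitrary choice)
xs = [0.5, 1.0, 2.0, 3.5]         # arbitrary distinct positive points
print(divided_difference_recursive(f, xs))
print(divided_difference_explicit(f, xs))   # should agree up to rounding
```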

The main useful fact about divided differences is the mean value theorem for divided differences: if $f$ is $n$ times differentiable and $x_0, ..., x_n$ are distinct real numbers, then there is some $\xi$ in the smallest interval containing the $x_i$ such that

$f[x_0, ..., x_n] = \frac{f^{(n)}(\xi)}{n!}.$
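
For instance (a quick check of the statement, not part of the original answer): take $f(x) = x^2$ and two nodes $x_0, x_1$. Then $f[x_0, x_1] = \frac{x_0^2 - x_1^2}{x_0 - x_1} = x_0 + x_1$, and $f'(\xi)/1! = 2\xi$ matches this exactly at $\xi = \frac{x_0 + x_1}{2}$, which indeed lies between $x_0$ and $x_1$.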

So to answer your question, you just need to determine for which $p$ the $n$th derivative of the function $x \mapsto x^p$ is positive for $x > 0$ (keeping in mind that your $n$ is off by one from my $n$), which is a fairly easy exercise.
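
Spelling that exercise out: for $x > 0$,

$$\frac{d^n}{dx^n}\, x^p = p(p-1)\cdots(p-n+1)\, x^{p-n},$$

and $x^{p-n} > 0$, so the sign of the $n$th derivative is the sign of the falling factorial $p(p-1)\cdots(p-n+1)$: it is positive for all $p > n-1$ and changes sign at each of $p = 0, 1, \ldots, n-1$, which, after the off-by-one shift, matches the sign pattern described below.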


The conjecture for $n=4$ is correct. For general $n$ and any pairwise distinct positive reals $x_1,\ldots,x_n$, $$ f(p) := \sum_{i=1}^{n}\dfrac{x^p_i}{\prod_{j\neq i}(x_i-x_j)} $$ is positive for $p > n-2$ and switches sign at $p=0,1,2,\ldots,n-2$ and nowhere else. For example (a numerical spot-check follows the list):
If $n=5$ then $f(p) \geq 0$ for $p \in (-\infty,0]$, $[1,2]$, or $[3,\infty)$;
if $n=6$ then $f(p) \geq 0$ for $p \in [0,1]$, $[2,3]$, or $[4,\infty)$;
if $n=7$ then $f(p) \geq 0$ for $p \in (-\infty,0]$, $[1,2]$, $[3,4]$, or $[5,\infty)$;
etc.
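
As a quick numerical illustration of this sign pattern (my own check, not part of the argument; the sample points and parameters below are arbitrary), one can evaluate $f$ at half-integers in Python:

```python
# Sample f(p) = sum_i x_i^p / prod_{j != i} (x_i - x_j) for random distinct x_i > 0
# and watch the sign pattern: ~0 at p = 0, 1, ..., n-2, alternating in between,
# and positive for p > n-2.
import random
from math import prod

def f(p, xs):
    return sum(xi ** p / prod(xi - xj for j, xj in enumerate(xs) if j != i)
               for i, xi in enumerate(xs))

n = 5
xs = sorted(random.uniform(0.1, 10.0) for _ in range(n))  # distinct with probability 1
for k in range(-2, 2 * n):
    p = k / 2                                              # half-integer sample points
    print(f"p = {p:4.1f}   f(p) = {f(p, xs):+.6e}")
```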

The key fact is the following exponential analogue of "Descartes' rule of signs":

Proposition. Let $x_1,\ldots,x_n$ be pairwise distinct positive real numbers, and let $c_1,\ldots,c_n$ be real numbers, not all zero. Then the function $f(p) := \sum_{i=1}^n c_i x_i^p$ has at most $n-1$ real zeros, counted with multiplicity.

Proof: Induction on $n$. For $n=1$ the function $f$ is a nonzero multiple of $x_1^p$, which is never zero; this establishes the base case. Now let $n>1$, and assume we have proved the case $n-1$ of the Proposition. If $c_1=\cdots=c_{n-1}=0$ then $f(p) = c_n x_n^p$ has no zeros and we are done, so assume the $c_i$ with $i<n$ are not all zero. Let $f_1(p) = x_n^{-p} f(p) = \sum_{i=1}^n c_i (x_i/x_n)^p$, which is of the same form as $f$ with $x_n=1$ and has the same zeros and multiplicities. Its derivative $$ f'_1(p) = \sum_{i=1}^{n-1} \bigl(c_i \log (x_i/x_n)\bigr) \, (x_i/x_n)^p $$ is again of this form with $n-1$ terms (the $i=n$ term drops out because $\log(x_n/x_n)=0$, and the coefficients $c_i \log(x_i/x_n)$ are not all zero since $x_i \neq x_n$), so the inductive hypothesis applies: $f'_1$ has at most $n-2$ zeros, counted with multiplicity. By Rolle's Theorem it follows that $f_1$ has at most $n-1$ zeros, counted with multiplicity. This completes the induction step and the proof. QED
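
To see the Proposition in the simplest nontrivial case $n=2$ (assuming both $c_1, c_2 \neq 0$; if one of them vanishes, $f$ has no zeros at all): a zero of $c_1 x_1^p + c_2 x_2^p$ requires $(x_1/x_2)^p = -c_2/c_1$, which has exactly one solution $p = \log(-c_2/c_1)/\log(x_1/x_2)$ when $-c_2/c_1 > 0$ and none otherwise, so there is indeed at most one zero.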

It remains to check that our $f$ vanishes at $p=0,1,\ldots,n-2$; then we can use our Proposition to deduce that each of these $n-1$ zeros is simple and there are no others. This can be done by recognizing $f(p)$ as the sum of the residues of the differential $x^p \, dx \left/ \prod_{j=1}^n (x-x_j) \right.$ (which is regular at $x=\infty$ for integers $p \leq n-2$). Alternatively we can relate $f(p)$ to a Schur-Vandermonde determinant with a repeated row.

EDIT: Here is a more self-contained argument that $f(0)=f(1)=\cdots=f(n-2)=0$. Let $P(x)$ be any polynomial with $\deg P < n$. Then $P(x) \left/ \prod_{j=1}^n (x-x_j) \right.$ has a partial-fraction decomposition $\sum_{i=1}^n c_i / (x-x_i)$. Taking $x \to x_i$ (or multiplying through by $\prod_{i=1}^n (x-x_i)$ and setting $x = x_i$), we see that $c_i = P(x_i) \left/ \prod_{j\neq i}(x_i - x_j) \right.$. On the other hand, taking $x \to \infty$ (or multiplying through by $\prod_{i=1}^n (x-x_i)$ and comparing $x^{n-1}$ coefficients), we see that the $x^{n-1}$ coefficient of $P$ is $\sum_{i=1}^n c_i$. In particular if $P(x) = x^p$ for some nonnegative integer $p \leq n-2$ then $f(p) = \sum_{i=1}^n c_i = 0$. $\Box$
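
For anyone who wants to see that last step numerically, here is an exact-arithmetic check in Python using only the standard library (the particular $x_i$ below are an arbitrary choice of mine); it also confirms that the sum jumps to the leading coefficient $1$ at $p = n-1$:

```python
# Exact check: sum_i x_i^p / prod_{j != i} (x_i - x_j) equals the x^{n-1}
# coefficient of P(x) = x^p, i.e. 0 for p <= n-2 and 1 for p = n-1.
from fractions import Fraction
from math import prod

xs = [1, 2, 3, 5, 8]          # any pairwise distinct x_i (arbitrary example)
n = len(xs)

for p in range(n):            # p = 0, 1, ..., n-1
    total = sum(Fraction(xs[i] ** p,
                         prod(xs[i] - xs[j] for j in range(n) if j != i))
                for i in range(n))
    print(p, total)           # 0 for p = 0, ..., n-2, and 1 for p = n-1
```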