Relation between the roots of a function and its derivative

$$f(x)=(x-r_1)(x-r_2)\cdots(x-r_n)$$

Taking logarithms on both sides,

$$\ln (f(x))=\ln (x-r_1)+\ln(x-r_2)+\cdots +\ln(x-r_n)$$

Differentiating both sides,

$$\frac{f'(x)}{f(x)}= \frac{1}{(x-r_1)}+\frac{1}{(x-r_2)} +\cdots +\frac{1}{(x-r_n)}$$

Done!
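For instance, with the quadratic $f(x)=(x-1)(x-2)=x^2-3x+2$ this reads

$$\frac{f'(x)}{f(x)}=\frac{2x-3}{x^2-3x+2}=\frac{1}{x-1}+\frac{1}{x-2},$$

which is easy to verify by putting the right-hand side over a common denominator.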


While Jaideep Khare's computation with the logarithm gives a good quick way to memorize or derive the result, there is an issue: the logarithms of negative numbers are not defined if one works over the reals, and care must be taken with branches if one moves to the complex plane. When $r_k$ is fixed and $x$ ranges over all the reals, $x-r_k$ will inevitably take negative values.

Here is a way to prove the desired result without resorting to logarithms. By the fundamental theorem of algebra a polynomial can be written in terms of its roots and a coefficient $a\in\mathbb C$ as
$$ f(x) = a(x-r_1)\cdots(x-r_n). $$
The coefficient of $x^n$ is $a$; in your case you know it to be $1$, but that is unimportant. The derivative can be computed by the product rule, differentiating one factor at a time:
$$ f'(x) = a(x-r_2)\cdots(x-r_n) + a(x-r_1)(x-r_3)\cdots(x-r_n) + \dots + a(x-r_1)\cdots(x-r_{n-1}), $$
where in the $k$th term of the sum the factor $(x-r_k)$ has been dropped. When $f(x)\neq0$, we can divide each term of $f'(x)$ by $f(x)$, and we get
$$ \frac{f'(x)}{f(x)} = \frac{1}{x-r_1} + \frac{1}{x-r_2} +\dots+ \frac{1}{x-r_n}. $$
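If you would rather convince yourself numerically, here is a quick sanity check (a sketch using numpy; the sample roots and the evaluation point are arbitrary choices):

```python
# Numerical sanity check of f'(x)/f(x) = 1/(x-r_1) + ... + 1/(x-r_n)
# for a sample cubic; the roots and the point x are arbitrary choices.
import numpy as np

roots = np.array([1.0, -2.0, 3.5])       # r_1, ..., r_n
f = np.poly1d(np.poly(roots))            # monic polynomial with these roots
x = 0.25                                 # any point that is not a root

lhs = f.deriv()(x) / f(x)
rhs = np.sum(1.0 / (x - roots))
print(lhs, rhs)                          # the two values agree up to rounding error
```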

The result and the proof are also applicable to complex polynomials, whereas with logarithms $f(z)$ will always hit the branch cut of the logarithm, no matter how you choose it. Also, $\log(ab)=\log(a)+\log(b)$ is not true in general for complex $a$ and $b$.

Finally, let me emphasize that this result holds for a polynomial function $f$ (with roots counted with multiplicity), not for all functions in general. In your question you are working with polynomials but you speak about "functions", so beware of generalizing the result beyond what's true.
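To see how multiplicities enter, take $f(x)=(x-1)^2(x+2)$: the root $1$ is counted twice, and indeed

$$\frac{f'(x)}{f(x)}=\frac{3(x+1)}{(x-1)(x+2)}=\frac{2}{x-1}+\frac{1}{x+2}.$$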


Another approach, which is longer but would also work for more general symmetric expressions in the roots, is to use Vieta's formulae.

If a polynomial $f(x) = a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0$ has roots $r_1, r_2, \dots, r_n$ (counted with multiplicity), then the ratio $-\frac{a_{n-1}}{a_n}$ gives us the sum $r_1 + r_2 + \dots + r_n$.
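For example, $f(x) = x^2 - 5x + 6 = (x-2)(x-3)$ has $-\frac{a_1}{a_2} = 5 = 2 + 3$.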

To turn this into the sum we want, we should modify $f(x)$ slightly.

First, $f(t-x)$ has roots $t - r_1, t - r_2, \dots, t - r_n$.

Second, $f(t-1/x)$, though it is not a polynomial, is still zero when $x = 1/(t-r_1), \dots, 1/(t-r_n)$.

However, multiplying through by $x^n$ gives the polynomial $x^n f(t-1/x)$, which has roots at $1/(t-r_1), \dots, 1/(t-r_n)$.

To apply the Vieta's formula we want to use, it remains to figure out the coefficients of $x^n$ and $x^{n-1}$ in
$$ x^n f(t-1/x) = x^n a_n (t-1/x)^n + x^n a_{n-1} (t - 1/x)^{n-1} + \dots + x^n a_1 (t-1/x) + x^n a_0. $$

To get a multiple of $x^n$ from a term $x^n a_k (t-1/x)^k$, we have to choose the $t^k$ term of the binomial expansion of $(t-1/x)^k$. This tells us that the coefficient of $x^n$ in this polynomial is
$$ a_n t^n + a_{n-1} t^{n-1} + \dots + a_0 = f(t). $$

To get a multiple of $x^{n-1}$ from a term $x^n a_k (t-1/x)^k$, we have to choose the $\binom k1 t^{k-1}(-1/x)$ term of the binomial expansion of $(t-1/x)^k$. This tells us that the coefficient of $x^{n-1}$ in this polynomial is
$$ - a_n \cdot n t^{n-1} - a_{n-1} \cdot (n-1)t^{n-2} - \dots - a_1 = -f'(t). $$

Putting these together, Vieta's formula says that the sum of the roots of $x^n f(t-1/x)$ is minus the ratio of these two coefficients, so
$$ \sum_{i=1}^n \frac{1}{t-r_i} = -\frac{-f'(t)}{f(t)} = \frac{f'(t)}{f(t)}, $$
as desired.
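As a concrete check, take $f(x) = x^2 - 5x + 6$ with roots $2$ and $3$. Expanding gives
$$ x^2 f(t-1/x) = (t^2-5t+6)\,x^2 + (5-2t)\,x + 1, $$
whose leading coefficient is $f(t)$ and whose $x$-coefficient is $-(2t-5) = -f'(t)$. The sum of its roots is therefore
$$ \frac{1}{t-2}+\frac{1}{t-3} = \frac{2t-5}{t^2-5t+6} = \frac{f'(t)}{f(t)}. $$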

(For the specific case of the polynomial $f(x) = x^n - 2x + 2$, this would go faster since we wouldn't have to derive the general formula.)
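If you want to check the identity numerically for that particular polynomial, here is a sketch with numpy ($n = 5$ and $t = 1$ are arbitrary sample choices; most of the roots are complex, but the sum still comes out real up to rounding):

```python
# Check sum_i 1/(t - r_i) = f'(t)/f(t) for f(x) = x^n - 2x + 2.
# n = 5 and t = 1.0 are arbitrary sample choices; most roots are complex.
import numpy as np

n = 5
coeffs = np.zeros(n + 1)
coeffs[0], coeffs[-2], coeffs[-1] = 1, -2, 2   # coefficients of x^n - 2x + 2
f = np.poly1d(coeffs)

t = 1.0                                        # any point that is not a root
print(np.sum(1 / (t - f.roots)))               # tiny imaginary part from rounding
print(f.deriv()(t) / f(t))
```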