Why does the Newton-Raphson method not converge for some functions?

Newton's method does not always converge. Its convergence theory is "local": it guarantees convergence only if you start close enough to the root, where "close enough" depends on the function you're dealing with. Far away from the root, the dynamics can be highly nontrivial.

One qualitative property is that, in the 1D case, you should not have an extremum between the root you want and your initial guess. If there is an odd number of extrema in the way, you will start moving away from the root you want, as you see here. If there is an even number of extrema in the way, you will start moving in the right direction, but you may later land in a spot with an odd number of extrema in the way, which causes the same problem.

Of course, you may eventually reach a point with an even number of extrema in the way, manage to skip over all of them, and land on the right side; from there things usually (though not always) work out. With your initial guess that is what eventually happens: the iteration lands just slightly to the right of the right-hand extremum, which sends it far off to the left.


In fact, you gave up too early; the method eventually converges:

1   2.000000000000000000000000000
2   1.500000000000000000000000000
3   0.3333333333333333333333333333
4   2.148148148148148148148148148
5   1.637079608343976160068114091
6   0.9483928480399477528436835979
7   1.910874140183680201544963299
8   1.405089904362402921055022221
9   -1.324018083676046424512855515
10  -0.9614381794507316717924414480
11  -0.8500221808505758631523579893
12  -0.8393807176849843501240483025
13  -0.8392867625049899194321196645
14  -0.8392867552141611764525252322
15  -0.8392867552141611325518525647
16  -0.8392867552141611325518525647
17  -0.8392867552141611325518525647
18  -0.8392867552141611325518525647
19  -0.8392867552141611325518525647
20  -0.8392867552141611325518525647
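For anyone who wants to reproduce this: the iterates above are consistent with $f(x) = x^3 - 2x^2 + 2$ and $x_0 = 2$, so assuming that is indeed the setup from the question, here is a minimal Python sketch. Ordinary double precision shows fewer digits than the table, but the behavior and the limit $\approx -0.83928675521$ are the same.

```python
# Newton's method on the function assumed from the question:
# f(x) = x^3 - 2x^2 + 2, starting guess x0 = 2 (both inferred from the iterates above).
def f(x):
    return x**3 - 2*x**2 + 2

def fprime(x):
    return 3*x**2 - 4*x

x = 2.0
for n in range(1, 21):
    print(f"{n:2d}  {x:.16g}")
    x -= f(x) / fprime(x)   # Newton update: x <- x - f(x)/f'(x)
```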

The other answers are great. I'd just like to add a concrete example of the kind of weird behavior that Ian's answer mentions.

Let's consider the function $f(x) = \operatorname{sgn}(x) \sqrt{|x|}$. According to the algorithm, we iterate $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$ Now for the derivative (away from $x = 0$, $\operatorname{sgn} x$ is locally constant, and $|x| = x \operatorname{sgn} x$): $$f'(x) = \left[\operatorname{sgn}(x) \sqrt{x \operatorname{sgn} x}\right]' = \operatorname{sgn}(x) \cdot \frac{\operatorname{sgn} x}{2\sqrt{x \operatorname{sgn} x}} = \frac{1}{2\sqrt{|x|}}.$$ Plugging in: $$x_{n+1} = x_n - \frac{\operatorname{sgn}(x_n) \sqrt{|x_n|}}{1/\bigl(2\sqrt{|x_n|}\bigr)} = x_n - 2 \operatorname{sgn}(x_n) \left(\sqrt{|x_n|}\right)^2 = x_n - 2 \operatorname{sgn}(x_n)\,|x_n| = x_n - 2 x_n = -x_n.$$

So if we start iterating at $x_0 = a$ (where $a \neq 0$), we get the sequence $a, -a, a, -a, \ldots$ and the method loops forever between those two points, never reaching the root $x = 0$!
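Here is a small Python sketch of this oscillation (the $f$ and $f'$ are exactly the ones derived above; the starting value $a = 1$ is just an arbitrary nonzero choice):

```python
import math

# f(x) = sgn(x) * sqrt(|x|), and its derivative 1/(2*sqrt(|x|)) for x != 0
def f(x):
    return math.copysign(math.sqrt(abs(x)), x)

def fprime(x):
    return 1.0 / (2.0 * math.sqrt(abs(x)))

x = 1.0   # arbitrary nonzero starting point a
for n in range(6):
    print(n, x)
    x -= f(x) / fprime(x)   # each step gives x_{n+1} = -x_n
```

The printed sequence just alternates between $1$ and $-1$, exactly as the algebra predicts.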

Edit: Here's a gnuplot illustration of the iteration. In each step we draw the tangent at the current point (the blue dashed line), and the $x$ at which the tangent crosses zero is taken as the next approximation (so we move along the magenta line to get the starting point for the next iteration).

By the way, have a look at this image from Wikipedia:

[Newton fractal for $z^5 = 1$ (image from Wikipedia)]

It shows the complex plane colored with five colors, one for each root of the complex equation $z^5 = 1$. Each point is given the color of the root to which Newton's method converges when started from that point. The "flowers" are beautiful to behold but totally abhorrent from the numerical point of view.
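If you'd like to generate a picture of that kind yourself, here is a rough Python/NumPy sketch of the Newton basins for $z^5 = 1$ (the grid size, number of iterations, and color map are arbitrary choices of mine, not anything taken from the Wikipedia figure):

```python
import numpy as np
import matplotlib.pyplot as plt

# Newton's method for f(z) = z^5 - 1 applied to a grid of starting points.
roots = np.exp(2j * np.pi * np.arange(5) / 5)   # the five fifth roots of unity

n = 600
t = np.linspace(-1.5, 1.5, n)                   # this grid does not contain z = 0 exactly
z = t[None, :] + 1j * t[:, None]

for _ in range(40):
    z = z - (z**5 - 1) / (5 * z**4)             # Newton update on every grid point at once

# Color each starting point by the root its iteration ended up closest to.
basin = np.argmin(np.abs(z[..., None] - roots), axis=-1)

plt.imshow(basin, extent=(-1.5, 1.5, -1.5, 1.5), origin="lower", cmap="viridis")
plt.title("Basins of attraction of Newton's method for $z^5 = 1$")
plt.show()
```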