I’m starting a new series of blog posts called “**XY in less than 10 lines of Python**”. This first one is about Newton’s method, an old numerical approximation technique that can be used to find the roots of polynomials, or of any differentiable function.

Let’s say we have a complicated polynomial:

$$f(x) = 6x^5 - 5x^4 - 4x^3 + 3x^2$$

and we want to find its roots. Unfortunately, we know from Galois theory that there is no general formula for a 5th degree polynomial, so we’ll have to use numerical methods. Wait a second, what’s going on? We all learned the quadratic formula in school, and there are formulas for cubic and quartic polynomials too, but Galois proved that no such “root-finding” formula exists for fifth or higher degree polynomials that uses only the usual algebraic operations (addition, subtraction, multiplication, division) and the application of radicals (square roots, cube roots, etc.).

Some nice guys pointed out on reddit that I didn’t quite get the theory right. Sorry about that, I’m no mathematician by any means. It turns out that this polynomial can be factored into $x^2(6x^3 - 5x^2 - 4x + 3)$ and solved with the traditional cubic formula. Also, the theorem I referred to is the Abel–Ruffini theorem, and it only applies to the solution of the *general* polynomial of degree five or greater. Nonetheless the example is still valid, and demonstrates how you would apply Newton’s method to any polynomial, so let’s crack on.

So in these cases we have to resort to numerical approximation. A simple way would be to use the intermediate value theorem, which states that if $f$ is continuous on $[a, b]$ and $y$ is any value between $f(a)$ and $f(b)$, then there is an $x$ between $a$ and $b$ so that $f(x) = y$.

We could exploit this by looking for an $a$ where $f(a) < 0$ and a $b$ where $f(b) > 0$; then we could be certain that a root must be between $a$ and $b$. So we could check the midpoint $m = (a + b)/2$: if $f(m)$ turns out to be positive, we continue the search on $[a, m]$, and if it’s negative, on $[m, b]$, so our proposed interval gets closer and closer to the root. This method (known as bisection) however can be pretty slow, so Newton devised a better way to speed things up (when it works).
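As a quick sketch (not part of the original post), here’s what that interval-halving idea looks like in code, assuming we already have an $a$ and a $b$ where $f$ takes opposite signs:

```python
def bisection(f, a, b, e):
    # Assumes f(a) and f(b) have opposite signs, so a root lies in [a, b].
    while b - a > e:
        m = (a + b) / 2.0
        if f(a) * f(m) <= 0:
            # Sign change in [a, m]: the root is in the left half.
            b = m
        else:
            # Otherwise the root is in the right half.
            a = m
    return (a + b) / 2.0
```

Each step halves the interval, so reaching precision $e$ takes about $\log_2((b-a)/e)$ steps, which is exactly the slowness Newton’s method improves on.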

If you look at the figure below, you’ll see the plot of our polynomial. It has three roots: at 0, at 1, and somewhere in between. So how do we find these?

In Newton’s method we take a starting point $x_0$, then draw the tangent line to the curve at $(x_0, f(x_0))$, using the derivative $f'(x_0)$. The point where this tangent line crosses the $x$ axis becomes the next proposal, $x_1$, that we check. We calculate the tangent line at $x_1$ and find $x_2$. We carry on, and as we do, $f(x_n) \to 0$, or in other words we can make our approximation as close to zero as we want, provided we are willing to continue with the number crunching. Using the fact that the slope of the tangent (by the definition of the derivative) at $x_n$ is $\frac{f(x_n) - 0}{x_n - x_{n+1}}$, we can derive the formula for $x_{n+1}$, i.e. where the tangent crosses the $x$ axis:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
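To see one step of this by hand (using the polynomial from the code below, and $x_0 = 0.5$ as the starting guess): $f(0.5) = 0.125$ and $f'(0.5) = -0.625$, so

$$x_1 = x_0 - \frac{f(x_0)}{f'(x_0)} = 0.5 - \frac{0.125}{-0.625} = 0.7$$

which is already closer to the middle root ($\approx 0.6287$) than $0.5$ was.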

With all this in mind, it’s easy to write an algorithm that approximates $f(x)=0$ with an arbitrary error $e$. Obviously, in this example, depending on where we start we might find different roots.

```python
def dx(f, x):
    return abs(0 - f(x))

def newtons_method(f, df, x0, e):
    delta = dx(f, x0)
    while delta > e:
        x0 = x0 - f(x0) / df(x0)
        delta = dx(f, x0)
    print('Root is at:', x0)
    print('f(x) at root is:', f(x0))

```

There you go, we’re done in about 10 lines, even fewer without the print statements. In order to use this, we need two functions: $f$ and its derivative $df$. For our polynomial these are:

```python
def f(x):
    return 6*x**5 - 5*x**4 - 4*x**3 + 3*x**2

def df(x):
    return 30*x**4 - 20*x**3 - 12*x**2 + 6*x
```

Now we can simply find the three roots with:

```python
x0s = [0, .5, 1]
for x0 in x0s:
    newtons_method(f, df, x0, 1e-5)

# Root is at: 0
# f(x) at root is: 0
# Root is at: 0.628668078167
# f(x) at root is: -1.37853879978e-06
# Root is at: 1
# f(x) at root is: 0
```

All of the above code, and some additional comparison tests with the scipy.optimize.newton method, can be found in this Gist. And don’t forget: if you find it too much trouble differentiating your functions by hand, just use SymPy, I wrote about it here.
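For instance, a minimal SymPy sketch (assuming SymPy is installed) that derives `df` for us instead of differentiating by hand:

```python
import sympy as sp

x = sp.symbols('x')
expr = 6*x**5 - 5*x**4 - 4*x**3 + 3*x**2

# Differentiate symbolically, then turn both into plain numeric functions.
f_num = sp.lambdify(x, expr)
df_num = sp.lambdify(x, sp.diff(expr, x))
```

`f_num` and `df_num` can then be passed straight to `newtons_method` in place of the hand-written `f` and `df`.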

Newton’s method is pretty powerful, but there can be problems with the speed of convergence, and awfully wrong initial guesses might keep it from converging at all, see here. Nonetheless, I hope you found this relatively useful. Let me know in the comments.
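One cheap safeguard, sketched here as a hypothetical variant of the function above, is to cap the number of iterations so a bad guess fails loudly instead of looping forever:

```python
def newtons_method_capped(f, df, x0, e, max_iter=100):
    # Same update rule as before, but give up after max_iter steps.
    for _ in range(max_iter):
        if abs(f(x0)) <= e:
            return x0
        x0 = x0 - f(x0) / df(x0)
    raise RuntimeError('no convergence after %d iterations' % max_iter)
```

The classic failure mode this catches is a cycle: with $f(x) = x^3 - 2x + 2$ and $x_0 = 0$, the iterates bounce between 0 and 1 forever.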