

Linear Approximation and Newton's Method

Newton's Method

Many problems in mathematics, science, engineering, and business eventually come down to finding the roots of a nonlinear equation. It is a sad fact of life that many mathematical equations cannot be solved analytically. You already know the formula for solving quadratic polynomial equations. You might not know, however, that there are also formulas for solving cubic and quartic polynomial equations. Unfortunately, these formulas are so cumbersome that they are hardly ever used. Even more unfortunately, it has been proven that no general formula in terms of radicals can exist for finding roots of polynomials of degree five or higher. Furthermore, once trigonometric functions are involved, equations with no analytical solution are even more common. For example, the following simple equation cannot be solved to give a formula for $x$.

\begin{displaymath}\sin(x)=\frac{x}{2} \end{displaymath}

The need to solve nonlinear equations that cannot be solved analytically has led to the development of numerical methods. One of the most commonly used numerical methods is called Newton's method or the Newton-Raphson method. The idea of Newton's method is relatively simple. Suppose you have a nonlinear equation of the form

\begin{displaymath}f(x)=0 \end{displaymath}

where $f$ is a differentiable function. The idea is to start with an initial guess $\displaystyle x_0$ for the root and to use the tangent line to $f$ at $\displaystyle x=x_0$ to approximate $f$. The equation of the tangent line appears below.

\begin{displaymath}y =f(x_0)+f'(x_0)(x-x_0) \end{displaymath}

If the tangent line is a good approximation to $f$, then the $x$ intercept of the tangent line should be a good approximation to the root. Call the value of $x$ where the tangent line intersects the $x$ axis $\displaystyle x_1$. Setting $y=0$ in the tangent line equation and solving for $x$ gives the following formula.

\begin{displaymath}x_1=x_0-\frac{f(x_0)}{f'(x_0)} \end{displaymath}

In practice, unless the starting point $\displaystyle x_0$ is very close to the root, the value $\displaystyle x_1$ is not close enough to the root we are seeking, so Newton's method is applied again using the tangent line at $\displaystyle x=x_1$. This process can be repeated, leading to a sequence of values where the value of $\displaystyle x_{n+1}$ is determined from the equation

\begin{displaymath}x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)},\qquad n=0,1,2,3,\ldots \end{displaymath}
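In code, the iteration above is only a few lines. The following Python sketch is an illustration added here, not part of the Maple lab: the helper name `newton_step` and the example $f(x)=x^2-2$ (whose positive root is $\sqrt{2}$) are our own choices.

```python
import math

# One step of Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)
def newton_step(f, fprime, x):
    return x - f(x) / fprime(x)

# Example (our choice): f(x) = x^2 - 2, whose positive root is sqrt(2)
f = lambda x: x * x - 2
fprime = lambda x: 2 * x

x = 1.5                        # initial guess x_0
for _ in range(6):             # apply the iteration six times
    x = newton_step(f, fprime, x)
print(x)                       # very close to sqrt(2) = 1.41421356...
```

Each step replaces the current guess by the $x$ intercept of the tangent line at that guess, exactly as in the formula above.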

For example, consider the equation from above,

\begin{displaymath}\sin(x)=\frac{x}{2} \end{displaymath}

If you want to solve this equation using Newton's method, the first thing to do is to write it in standard form as

\begin{displaymath}\sin(x)-\frac{x}{2}=0 \end{displaymath}

Then, plot the expression to get an idea of where the roots are. You may have to adjust the plot range to locate all of the roots. The following commands show that there are exactly three roots: one at $x=0$ and two others near $x=2$ and $x=-2$.
> f:=x->sin(x)-x/2;
> plot(f(x),x=-6..6);
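If Maple is not at hand, the plot's information can also be recovered numerically. The Python sketch below is an illustration, not part of the lab; the grid offset and spacing are arbitrary choices (picked so that no sample point lands exactly on the root at $x=0$). It counts sign changes of $f$ on $[-6,6]$; each sign change brackets one root.

```python
import math

# f(x) = sin(x) - x/2, the function from the lab
def f(x):
    return math.sin(x) - x / 2

# Sample f on a grid over [-6, 6]; the grid is offset slightly so that
# no sample point falls exactly on the root at x = 0.
xs = [-6.05 + 0.5 * k for k in range(25)]     # -6.05, -5.55, ..., 5.95
brackets = [(a, b) for a, b in zip(xs, xs[1:]) if f(a) * f(b) < 0]
print(len(brackets))    # three sign changes: roots near -1.9, 0, and 1.9
print(brackets)
```

A sign change only brackets a root; Newton's method (or any other root finder) is still needed to pin the root down precisely.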
It isn't very hard to write a Maple command that will do one step of Newton's method. The examples below show a very simple method for doing so, using the function $f$ defined above and a starting value of $x=2.2$. Further iterations can be obtained by using composition, as shown below.
> newt:=x->x-f(x)/D(f)(x);
> newt(2.2);
> newt(newt(2.2));
> (newt@@3)(2.2);

One of the problems with Newton's method is knowing when to stop. With a numerical method, you can never get a root exactly, but only a numerical approximation to it. There are basically two measures of how good your approximation is. One is the absolute value of $f(x_n)$. If this number is less than a certain tolerance, then you should have a good approximation to the root. The other measure is the change in the value of $\displaystyle x_n$: if the values of $\displaystyle x_n$ and $\displaystyle x_{n+1}$ are very close, then this can also be a criterion for stopping. You should go back to the example above and experiment with changing the number of iterations.
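Both stopping criteria are easy to implement. The following Python sketch is an illustration, not part of the lab: the tolerance $10^{-10}$ and the cap of fifty iterations are arbitrary choices, not values prescribed here. The loop stops as soon as either criterion is satisfied.

```python
import math

f = lambda x: math.sin(x) - x / 2
df = lambda x: math.cos(x) - 0.5

# Iterate Newton's method until either stopping criterion from the text
# holds: |f(x_{n+1})| < tol, or the change |x_{n+1} - x_n| < tol.
# max_iter guards against the case where the method fails to converge.
def newton(x, tol=1e-10, max_iter=50):
    for n in range(max_iter):
        x_new = x - f(x) / df(x)
        if abs(f(x_new)) < tol or abs(x_new - x) < tol:
            return x_new, n + 1      # approximation and iterations used
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")

root, steps = newton(2.2)
print(root, steps)    # the positive root near 1.8955, in only a few steps
```

The iteration cap matters: without it, a starting value for which the method fails to converge would loop forever.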

The worst thing about Newton's method is that it may fail to converge. For example, applying the method to the second function used in this lab, $g(x)=x^5-12x^4+3x^3+7x^2-2x-1$, with the initial guess $x=0$ produces iterates that do not seem to converge.
The key to getting Newton's method to converge is to select a good starting value. The best way to do this is to plot the function and determine approximately where the roots are, and then use those approximate locations to start Newton's method.
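The dependence on the starting value is easy to see numerically. In this Python sketch (an illustration added here, not part of the Maple lab), the same iteration is run from three different starting values and finds each of the three roots of $\sin(x)-\frac{x}{2}$ in turn:

```python
import math

f = lambda x: math.sin(x) - x / 2
df = lambda x: math.cos(x) - 0.5

# Run a fixed number of Newton steps from a given starting value.
def newton(x, n=25):
    for _ in range(n):
        x = x - f(x) / df(x)
    return x

# Which root the method finds depends on where it starts:
print(newton(2.2))     # near the positive root 1.8955
print(newton(0.5))     # near the root at 0
print(newton(-2.2))    # near the negative root -1.8955
```

Starting near a root reliably converges to that root; the exercises below explore starting values for which the behavior is much less predictable.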


  1. Use Newton's method with the function $\displaystyle f(x)=\sin(x)-\frac{x}{2}$ from the background and the starting value $x=-1.1$. First enter the function and plot it.
    Describe what happens when Newton's method is used with this initial value of $-1.1$. Start with just a few iterations, and then increase the number of iterations slowly. The maximum number of iterations you use should not exceed fifty.
    Looking at your output, what value of $n$ is required to guarantee that $\displaystyle \vert f(x_k)\vert<10^{-7}$ for all $k \geq n$?
  2. Consider the second function from the background, $g(x)=x^5-12x^4+3x^3+7x^2-2x-1$.
    Can you explain why Newton's method fails to converge for the initial guess $x=0$? Use graphs with better domain and range choices to help in the explanation.
    Find all the negative roots using Newton's method.

Jane E Bouchard