
Ask Dr. Math - Questions and Answers from our Archives
_____________________________________________

Determining Roots of a Function

Date: 04/03/2003 at 10:16:44
From: Javier
Subject: Determining Roots of a Function

Hello,

I know that under specific conditions we can calculate the roots of a 
function using the Newton-Raphson Method. The Newton-Raphson method 
uses the tangent to approximate the root. I want to know if there is a 
method to approximate the root with a parabola instead of the tangent. 

I tried to approximate the root using a parabola like g(x) = (x-x_0)^2 +
f(x_0), where f is the original function and x_0 is the initial point
(as in Newton-Raphson). The method would be:
   1) calculate the roots of g, r_1 and r_2;
   2) if |f(r_1)| < |f(r_2)|, then the new x_0 = r_1,
      else x_0 = r_2, and repeat.

I don't know if I can find a root using this method, and if I find a
root, how bad is this method compared with Newton-Raphson?
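
For concreteness, here is a rough Python sketch of that iteration (the
function name and the test function x^3 - 2 are only illustrative);
note that g only has real roots when f(x_0) <= 0:

  import math

  def parabola_step(f, x0):
      """One step of the iteration based on g(x) = (x - x0)^2 + f(x0).

      g has real roots only when f(x0) <= 0, namely x0 +/- sqrt(-f(x0));
      keep whichever root gives the smaller |f|."""
      fx0 = f(x0)
      if fx0 > 0:
          return None                # g lies entirely above the x-axis
      s = math.sqrt(-fx0)
      r1, r2 = x0 - s, x0 + s
      return r1 if abs(f(r1)) < abs(f(r2)) else r2

  f = lambda x: x**3 - 2             # illustrative test function
  x = 1.0                            # starting point with f(x) < 0
  for _ in range(10):
      nxt = parabola_step(f, x)
      if nxt is None:
          break
      x = nxt
  print(x, f(x))   # for this f the iteration stalls where f(x) > 0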


Date: 04/05/2003 at 07:37:20
From: Doctor Jacques
Subject: Re: Determining Roots of a Function

Hi Javier,

You can indeed devise a method based on that idea.

First, let us see how Newton-Raphson works. There is a geometric 
interpretation (the tangent line), but we can also picture it in 
another way.

Assume that a is an approximation of a root of f(x) (a would be x_n 
in the final formula). We can expand f(x) around a using the Taylor 
series:

[1] f(a+h) = f(a) + hf'(a) + (h^2/2)f"(a) + ...

In this case, h is a small increment. It is the correction to be added 
to a to get the exact root.

The first step is to approximate the series by retaining only the 
first two terms:

[2] f(a+h) = f(a) + hf'(a) (approximately)

As h is assumed to be small (since a should be "near the root"), this 
approximation can be justified, provided that f"(a) and subsequent 
derivatives are not too large.

As we look for a root, we set f(a+h) = 0:

[3] 0 = f(a) + hf'(a)

from which we get h = -f(a)/f'(a), and the new approximation is:

 a - f(a)/f'(a)

which is Newton's formula.
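
For example, a minimal Python sketch of this iteration (the helper
name and the test function are only illustrative) might be:

  def newton(f, fprime, a, tol=1e-12, max_iter=50):
      """Newton-Raphson: repeatedly replace a by a - f(a)/f'(a)."""
      for _ in range(max_iter):
          h = -f(a) / fprime(a)      # correction from f(a) + h f'(a) = 0
          a += h
          if abs(h) < tol:
              break
      return a

  # square root of 2 via f(x) = x^2 - 2
  print(newton(lambda x: x*x - 2, lambda x: 2*x, a=1.0))   # ~1.41421356...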

If you want a quadratic variant of the method, all you have to do is 
keep one more term in [1], i.e. you solve:

[4] 0 = f(a) + hf'(a) + (h^2/2)f"(a)

for h, and take the next approximation as a + h.
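
As a rough Python sketch of one such step (illustrative names; you
must also decide which of the quadratic's two roots h to keep - here
the one of smaller magnitude, which agrees with Newton's correction
when f"(a) is small):

  import math

  def quadratic_step(f, fp, fpp, a):
      """Solve f(a) + h f'(a) + (h^2/2) f"(a) = 0 for h; return a + h."""
      A, B, C = 0.5 * fpp(a), fp(a), f(a)
      if A == 0:
          return a - C / B                 # falls back to plain Newton
      disc = B * B - 4 * A * C
      if disc < 0:
          raise ValueError("no real correction h; start closer to the root")
      sq = math.sqrt(disc)
      h1, h2 = (-B + sq) / (2 * A), (-B - sq) / (2 * A)
      return a + (h1 if abs(h1) < abs(h2) else h2)

  # f(x) = x^2 - 2, f'(x) = 2x, f"(x) = 2
  a = 1.0
  for _ in range(4):
      a = quadratic_step(lambda x: x*x - 2, lambda x: 2*x, lambda x: 2.0, a)
  print(a)   # sqrt(2); exact after one step here, since f is quadratic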

This method will generally converge more rapidly than Newton's method
(cubically rather than quadratically near a simple root), since we
keep a better approximation of the Taylor expansion. However, there
are a few practical problems to take into account:

* It requires more work to solve a quadratic equation than a linear
  one.

* You must select which of the two roots of the quadratic is
  the "correct one." If you implement this in a computer program,
  this is also some additional work, and you can expect some nasty
  surprises if your initial approximation is not close enough to the
  root.

* You must also compute the second derivative f"(a) - still more work.

The conclusion is that, in practice, it is probably more efficient to 
perform a few more iterations of the normal Newton's method.

Does this help?  Write back if you'd like to talk about this 
some more, or if you have any other questions.

- Doctor Jacques, The Math Forum
  http://mathforum.org/dr.math/ 


Date: 04/05/2003 at 11:47:40
From: Doctor Fenton
Subject: Re: Determining Roots of a Function

Hi Javier,

Thanks for writing to Dr. Math. Yes, there are methods that do not use 
derivatives as Newton's method does, but which employ quadratic 
interpolation instead. 

I'm not sure that you can find a root with the method you mention,
using g(x) = (x-x_0)^2 + f(x_0); that choice of parabola seems rather
arbitrary. The secant method finds two values x_1 and x_2 that
bracket the root r, so that f(x_1) and f(x_2) have opposite signs,
passes a line through (x_1, f(x_1)) and (x_2, f(x_2)), and uses the
x-intercept of the line to replace one of x_1 and x_2. You could use
three values x_1, x_2, and x_3, with f(x_1) and f(x_3) having
opposite signs, interpolate a quadratic through (x_i, f(x_i)) for
i = 1, 2, and 3, and use an x-intercept (root) of the quadratic to
replace one of the three points. However, this procedure can
apparently give poor results in some cases.
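
For illustration, here is a rough Python sketch of the bracketing
line method just described (often called false position or regula
falsi); the quadratic variant would replace the chord by an
interpolating parabola:

  def false_position(f, x1, x2, tol=1e-12, max_iter=100):
      """Keep f(x1), f(x2) of opposite signs; replace one endpoint by
      the x-intercept of the chord through (x1,f(x1)) and (x2,f(x2))."""
      f1, f2 = f(x1), f(x2)
      assert f1 * f2 < 0, "x1 and x2 must bracket a root"
      for _ in range(max_iter):
          r = x2 - f2 * (x2 - x1) / (f2 - f1)   # x-intercept of the chord
          fr = f(r)
          if abs(fr) < tol:
              break
          if f1 * fr < 0:                       # root lies between x1 and r
              x2, f2 = r, fr
          else:                                 # root lies between r and x2
              x1, f1 = r, fr
      return r

  print(false_position(lambda x: x**3 - 2, 1.0, 2.0))   # ~2**(1/3)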

The most popular non-derivative method seems to be the one developed
by Richard P. Brent, based on Dekker's method, given in his book
"Algorithms for Minimization Without Derivatives," which has recently
been reprinted by Dover.

It is also described in Press, et al., _Numerical Recipes_ (in various 
computer languages, along with source code). It actually combines 
bisection and inverse quadratic interpolation to produce rapid 
convergence. Quadratic interpolation alone can sometimes produce poor 
results if the function is not sufficiently smooth, according to 
_Numerical Recipes_.
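
If you would rather use a library than implement this yourself, SciPy
(assuming it is available to you) exposes Brent's method as
scipy.optimize.brentq, which takes the function and a bracketing
interval:

  from scipy.optimize import brentq

  # root of f(x) = x^3 - 2 in the bracket [1, 2]
  root = brentq(lambda x: x**3 - 2, 1.0, 2.0)
  print(root)   # ~1.25992104989...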

I would recommend you check one of those sources for detailed 
information.

- Doctor Fenton, The Math Forum
  http://mathforum.org/dr.math/ 


Date: 04/05/2003 at 23:32:44
From: Javier
Subject: Thank you (Determining Roots of a Function)

Thank you for the answers. I have also found, using Google, that
there is a method called Muller's method that approximates the root
with a parabola.
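
For reference, a rough Python sketch of one Muller step (illustrative
code only): fit a parabola through three points and keep the
parabola's root nearest the newest point, using complex arithmetic so
the square root always exists:

  import cmath

  def muller_step(f, x0, x1, x2):
      """Return the root, nearest x2, of the parabola through
      (x0,f(x0)), (x1,f(x1)), (x2,f(x2))."""
      h1, h2 = x1 - x0, x2 - x1
      d1, d2 = (f(x1) - f(x0)) / h1, (f(x2) - f(x1)) / h2
      a = (d2 - d1) / (h2 + h1)
      b = a * h2 + d2
      c = f(x2)
      sq = cmath.sqrt(b * b - 4 * a * c)
      denom = b + sq if abs(b + sq) > abs(b - sq) else b - sq
      return x2 - 2 * c / denom

  f = lambda x: x**3 - 2             # illustrative test function
  x0, x1, x2 = 0.5, 1.0, 1.5
  for _ in range(8):
      x0, x1, x2 = x1, x2, muller_step(f, x0, x1, x2)
  print(x2)   # ~1.2599 (cube root of 2), as a complex number with zero
              # imaginary part
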
Ask Dr. Math™
© 1994-2013 The Math Forum
http://mathforum.org/dr.math/