5.10 Comparing the Root-Finder Algorithms

Java Number Cruncher: The Java Programmer's Guide to Numerical Computing
By Ronald Mak

Chapter 5. Finding Roots

We've considered six algorithms for finding roots in this chapter: bisection, regula falsi, improved regula falsi, secant, Newton's, and fixed-point iteration. How do we decide which algorithm is the best one to use on any given function?

It should be obvious by now that, before choosing an algorithm, you must know what the graph of the function looks like in the interval in which you want to find a root. You need at least a rough idea of where the roots are and how the function behaves near the roots.

Each algorithm requires the function to be continuous near a root. Before looking for a root, be sure there is at least one root to find. A sign change in the value of a continuous function indicates the presence of a root. Some roots may actually be multiple roots, and a root of even multiplicity produces no sign change.
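The sign-change test amounts to a one-line check. Here is a minimal Java sketch (the function f and the method name bracketsRoot are illustrative, not from the book's code):

```java
public class SignChange {
    // Example function with a root at sqrt(2), about 1.4142.
    static double f(double x) { return x * x - 2; }

    // For a continuous f, opposite signs at a and b guarantee at least
    // one root in [a, b], by the Intermediate Value Theorem.
    static boolean bracketsRoot(double a, double b) {
        return f(a) * f(b) < 0;
    }

    public static void main(String[] args) {
        System.out.println(bracketsRoot(1, 2));   // sign change: a root lies inside
        System.out.println(bracketsRoot(2, 3));   // no sign change
    }
}
```

Note that the test can miss roots: f(x) = x² has a (double) root at 0, yet never changes sign.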

We can use several criteria to judge a root-finding algorithm:

  • Reliability. Will it find the root?

  • Rate of convergence. How quickly does the algorithm converge to a root?

  • Complexity. How hard is it to program?

  • Performance. How much work is necessary during each iteration?

  • Safety. What can go wrong? How likely is it to diverge, cycle, or shoot off into infinity?

Before we compare convergence rates, let's start with some definitions.

DEFINITION: Suppose we have a root-finding algorithm that generates a sequence of successive approximations x_0, x_1, x_2, ..., x_n, ... to a root. Let D_n be the difference between x_n and the root. If

\lim_{n \to \infty} \frac{|D_{n+1}|}{|D_n|^p} = C


where C is a nonzero constant, then p is the algorithm's order of convergence.

DEFINITION: If p = 1, then the algorithm has a linear convergence rate.

DEFINITION: If 1 < p < 2, then the algorithm has a superlinear convergence rate.

DEFINITION: If p = 2, then the algorithm has a quadratic convergence rate.

We won't do the proofs here, but of the algorithms we've examined in this chapter, the bisection algorithm and fixed-point iteration have linear convergence rates, the regula falsi and secant algorithms have superlinear convergence rates (the secant algorithm's order is p = (1 + √5)/2 ≈ 1.618), and Newton's algorithm has a quadratic convergence rate.
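To see where these orders come from, consider two standard estimates (sketched here, not derived in the book's text). Bisection halves the bracketing interval at every step, so with D_n the error bound,

```latex
|D_{n+1}| \approx \tfrac{1}{2}\,|D_n|
\quad\Longrightarrow\quad
\frac{|D_{n+1}|}{|D_n|^{1}} \to \tfrac{1}{2},
\qquad p = 1,\; C = \tfrac{1}{2},
```

which is linear convergence. For Newton's algorithm near a simple root r, a Taylor expansion gives

```latex
|D_{n+1}| \approx \left|\frac{f''(r)}{2\,f'(r)}\right|\,|D_n|^{2},
\qquad p = 2,
```

so the number of correct digits roughly doubles with each iteration.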

The easiest and safest algorithm to program is the bisection algorithm. Once the initial conditions are met (continuous function, different signs at the interval ends), the algorithm is guaranteed to find a root. Unfortunately, its convergence rate is the slowest.
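As a concrete illustration (a minimal sketch, not the book's root-finder classes), bisection repeatedly halves the bracketing interval, keeping the half in which the sign change occurs:

```java
public class Bisection {
    // Example function with a root at sqrt(2).
    static double f(double x) { return x * x - 2; }

    // Requires f(a) and f(b) to have opposite signs.
    static double bisect(double a, double b, double tol) {
        double fa = f(a);
        while (b - a > tol) {
            double mid = 0.5 * (a + b);
            double fm = f(mid);
            if (fa * fm <= 0) {
                b = mid;          // sign change is in [a, mid]
            } else {
                a = mid;          // sign change is in [mid, b]
                fa = fm;
            }
        }
        return 0.5 * (a + b);
    }

    public static void main(String[] args) {
        System.out.println(bisect(1, 2, 1e-12));  // approaches sqrt(2)
    }
}
```

Each iteration costs only one function evaluation and one comparison, which is why the algorithm is so simple and so safe.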

The regula falsi algorithm usually converges faster than the bisection algorithm, and it is very safe. Because it tends to approach a root from only one side, certain pathological functions may cause its convergence to get slower as it gets closer to the root. The regula falsi algorithm requires more computation for each iteration than the bisection algorithm. The improved regula falsi algorithm converges faster, but at the cost of even more computation per iteration.

The secant algorithm requires you to pick two initial points. With good starting points, it can converge even faster than the improved regula falsi algorithm. Bad starting points, however, can send the algorithm wandering before it converges, or it may diverge altogether, so safety can be an issue. As for programming and execution complexity, it is very similar to the regula falsi algorithm.
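A sketch of the secant algorithm (again illustrative, not the book's implementation): each step replaces the derivative in Newton's formula with the slope of the secant line through the last two points.

```java
public class Secant {
    static double f(double x) { return x * x - 2; }

    static double findRoot(double x0, double x1, double tol, int maxIter) {
        for (int i = 0; i < maxIter; i++) {
            double f0 = f(x0);
            double f1 = f(x1);
            // Slope of the secant line stands in for f'(x1).
            double x2 = x1 - f1 * (x1 - x0) / (f1 - f0);
            if (Math.abs(x2 - x1) < tol) return x2;
            x0 = x1;
            x1 = x2;
        }
        return x1;  // may not have converged -- a symptom of bad starting points
    }

    public static void main(String[] args) {
        System.out.println(findRoot(1, 2, 1e-12, 50));  // approaches sqrt(2)
    }
}
```

Note that unlike bisection, nothing guarantees the two points bracket the root, which is where the safety issue comes from.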

Fixed-point iteration requires you to derive an iteration function g ( x ) based on the original function f ( x ). The algorithm converges only if you define g ( x ) carefully to meet certain criteria, as explained earlier.
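As an illustration with a hand-picked iteration function (not one taken from the book), consider solving x = cos x. Here g(x) = cos x satisfies |g'(x)| = |sin x| < 1 near the fixed point, so the iteration converges:

```java
public class FixedPoint {
    // Solve x = cos(x). Because |g'(x)| = |sin x| < 1 near the fixed
    // point, the iteration x <- cos(x) converges linearly to ~0.739085.
    static double iterate(double x, int iterations) {
        for (int i = 0; i < iterations; i++) {
            x = Math.cos(x);
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(iterate(1.0, 100));
    }
}
```

Had we chosen a g with |g'| > 1 at the fixed point, the same loop would wander away from the root instead of settling on it.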

With its quadratic convergence rate, Newton's algorithm is generally the fastest of all the algorithms, especially if the initial value is already close to the root. However, it does require the computation of both the function and its derivative during each iteration, and that can be expensive. Also, as we saw earlier, the algorithm's behavior can be dangerously erratic with certain pathological functions.
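A minimal Newton sketch (illustrative only) shows the per-iteration cost: one evaluation of f and one of its derivative.

```java
public class Newton {
    // Each iteration evaluates both the function and its derivative.
    static double f(double x)      { return x * x - 2; }
    static double fPrime(double x) { return 2 * x; }

    static double findRoot(double x0, double tol, int maxIter) {
        double x = x0;
        for (int i = 0; i < maxIter; i++) {
            double dx = f(x) / fPrime(x);  // diverges wildly if fPrime(x) ~ 0
            x -= dx;
            if (Math.abs(dx) < tol) break;
        }
        return x;
    }

    public static void main(String[] args) {
        // Quadratic convergence: from x0 = 1.5, a handful of iterations suffice.
        System.out.println(findRoot(1.5, 1e-12, 20));
    }
}
```

The division by fPrime(x) is the danger spot: near a flat part of the curve, a single step can fling the iterate far from the root.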

Perhaps the best way to find roots is a combination of two algorithms. Start with the bisection or the regula falsi algorithm to quickly find a narrow interval around the root. Then use Newton's algorithm to refine the search if the function's derivative is not too expensive to compute; otherwise, finish off with the improved regula falsi or the secant algorithm.
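One way such a two-phase strategy might look in Java (a sketch under the assumption that the derivative is cheap, so Newton's algorithm does the polishing):

```java
public class Hybrid {
    static double f(double x)      { return x * x - 2; }
    static double fPrime(double x) { return 2 * x; }

    // Assumes f(a) and f(b) have opposite signs.
    static double findRoot(double a, double b) {
        // Phase 1: a few safe bisection steps narrow the bracket.
        for (int i = 0; i < 10; i++) {
            double mid = 0.5 * (a + b);
            if (f(a) * f(mid) <= 0) b = mid; else a = mid;
        }
        // Phase 2: Newton's algorithm polishes the estimate quickly.
        double x = 0.5 * (a + b);
        for (int i = 0; i < 20; i++) {
            double dx = f(x) / fPrime(x);
            x -= dx;
            if (Math.abs(dx) < 1e-12) break;
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(findRoot(1, 2));  // approaches sqrt(2)
    }
}
```

Phase 1 buys safety (the starting point handed to Newton is already close to the root); phase 2 buys speed (quadratic convergence finishes the job in a few iterations).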

ISBN: 0130460419
Year: 2001
Pages: 141