<P> Although all root-finding algorithms proceed by iteration, an iterative root-finding method generally uses a specific type of iteration, consisting of defining an auxiliary function, which is applied to the last computed approximations of a root to obtain a new approximation. The iteration stops when a fixed point (up to the desired precision) of the auxiliary function is reached, that is, when the newly computed value is sufficiently close to the preceding ones. </P>
<P> Newton's method assumes the function f to have a continuous derivative. Newton's method may not converge if started too far away from a root. However, when it does converge, it is faster than the bisection method, and its convergence is usually quadratic. Newton's method is also important because it readily generalizes to higher-dimensional problems. Newton-like methods with higher orders of convergence are Householder's methods. The first one after Newton's method is Halley's method, with cubic order of convergence. </P>
<P> Replacing the derivative in Newton's method with a finite difference, we get the secant method. This method does not require the computation (nor the existence) of a derivative, but the price is slower convergence (the order is approximately 1.6, the golden ratio). A generalization of the secant method to higher dimensions is Broyden's method. </P>
<P> If we use a polynomial fit to remove the quadratic part of the finite difference used in the secant method, so that it better approximates the derivative, we obtain Steffensen's method, which has quadratic convergence, and whose behavior (both good and bad) is essentially the same as that of Newton's method, but which does not require a derivative. </P>
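The fixed-point view of iterative root finding can be sketched as follows. This is a minimal illustration, not a definitive implementation: the function names, tolerance, and the choice of auxiliary function are assumptions of mine. Here the auxiliary function g(x) = (x + 2/x)/2 is what Newton's method produces for f(x) = x² − 2, so its fixed point is √2.

```python
def fixed_point(g, x0, tol=1e-12, max_iter=100):
    """Iterate x_{n+1} = g(x_n) until successive values agree to tol.

    Stops when a fixed point of the auxiliary function g is reached
    (up to the desired precision), as described above.
    """
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("iteration did not converge")

# Illustrative auxiliary function: sqrt(2) is a fixed point of
# g(x) = (x + 2/x) / 2, obtained by applying Newton's method to x^2 - 2.
root = fixed_point(lambda x: (x + 2.0 / x) / 2.0, 1.0)
print(root)
```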
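Newton's method and its derivative-free variant, the secant method, can be sketched side by side. This is a hedged sketch under my own choices of function names, starting values, tolerance, and the test function x² − 2; only the iteration formulas come from the text above.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n).

    Requires the derivative df; converges quadratically near a root.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")


def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton's method with f' replaced by the finite
    difference (f(x1) - f(x0)) / (x1 - x0). No derivative needed;
    order of convergence is approximately 1.6.
    """
    for _ in range(max_iter):
        fx0, fx1 = f(x0), f(x1)
        x2 = x1 - fx1 * (x1 - x0) / (fx1 - fx0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    raise RuntimeError("secant iteration did not converge")


# Illustrative test problem: the positive root of x^2 - 2 is sqrt(2).
f = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
print(newton(f, df, 1.0))
print(secant(f, 1.0, 2.0))
```

Both calls approach √2; Newton's method typically needs fewer iterations, at the cost of requiring `df`.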
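Steffensen's method, which keeps quadratic convergence while avoiding the derivative, can be sketched as follows. The common formulation below approximates f′(x) by the divided difference g(x) = (f(x + f(x)) − f(x)) / f(x); the function names, tolerance, and test function are my own illustrative choices.

```python
def steffensen(f, x0, tol=1e-12, max_iter=50):
    """Steffensen's method: derivative-free, quadratically convergent.

    Replaces f'(x) in Newton's step with the divided difference
    g(x) = (f(x + f(x)) - f(x)) / f(x).
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if fx == 0.0:
            return x  # exact root found
        g = (f(x + fx) - fx) / fx
        step = fx / g
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Steffensen iteration did not converge")


# Illustrative test problem: the positive root of x^2 - 2 is sqrt(2).
print(steffensen(lambda x: x * x - 2.0, 1.0))
```

Like Newton's method, this iteration is sensitive to the starting point: far from a root the step f(x)/g(x) can behave badly, mirroring the "both good and bad" behavior noted above.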

Numerical methods for finding the roots of a function