
A Globally Convergent Newton Method for Polynomials

Published 1 Mar 2020 in math.NA and cs.NA (arXiv:2003.00372v1)

Abstract: Newton's method for polynomial root finding is one of mathematics' most well-known algorithms. The method also has its shortcomings: it is undefined at critical points, it can exhibit chaotic behavior, and it is only guaranteed to converge locally. Based on the {\it Geometric Modulus Principle} for a complex polynomial $p(z)$, together with a {\it Modulus Reduction Theorem} proved here, we develop the {\it Robust Newton's method} (RNM), defined everywhere with a step-size that guarantees an {\it a priori} reduction in polynomial modulus in each iteration. Furthermore, we prove RNM iterates converge globally, either to a root or to a critical point. Specifically, given $\varepsilon$ and any seed $z_0$, in $t = O(1/\varepsilon^{2})$ iterations of RNM, independent of the degree of $p(z)$, either $|p(z_t)| \leq \varepsilon$ or $|p(z_t) p'(z_t)| \leq \varepsilon$. By adjusting the iterates at {\it near-critical points}, we describe a {\it modified} RNM that necessarily converges to a root. In combination with Smale's point estimation, RNM yields a globally convergent Newton's method with a locally quadratic rate of convergence. We present sample polynomiographs demonstrating how, in contrast with Newton's method, RNM smooths out the fractal boundaries of the basins of attraction of the roots. RNM also shows potential for computing all roots of arbitrary-degree polynomials. A particular consequence of RNM is a simple algorithm for solving cubic equations.
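As context for the termination criteria quoted in the abstract, the sketch below runs the classical Newton iteration on a complex polynomial and stops when either $|p(z_t)| \leq \varepsilon$ (near a root) or $|p(z_t) p'(z_t)| \leq \varepsilon$ (near a critical point). It is a minimal illustration only: the RNM step-size rule that guarantees modulus reduction, and the adjustment at near-critical points, are defined in the paper itself and are not reproduced here. The example polynomial and seed are arbitrary choices for illustration.

```python
import numpy as np

def newton_polynomial(coeffs, z0, eps=1e-8, max_iter=1000):
    """Classical Newton iteration on a complex polynomial p(z).

    coeffs: polynomial coefficients, highest degree first (numpy convention).
    Stops when |p(z)| <= eps (near a root) or |p(z) p'(z)| <= eps
    (near a critical point), mirroring the criteria stated in the abstract.
    NOTE: this is plain Newton's method, not the paper's Robust Newton's
    Method, whose step-size rule guarantees an a priori modulus reduction.
    """
    p = np.poly1d(coeffs)
    dp = p.deriv()
    z = complex(z0)
    for _ in range(max_iter):
        pz, dpz = p(z), dp(z)
        if abs(pz) <= eps:            # close enough to a root
            return z, "root"
        if abs(pz * dpz) <= eps:      # near a critical point; Newton step is ill-defined
            return z, "critical point"
        z = z - pz / dpz              # standard Newton update
    return z, "max iterations reached"

# Example: the cubic z^3 - 1, seeded away from the three roots of unity
root, status = newton_polynomial([1, 0, 0, -1], z0=0.4 + 0.3j)
print(status, root)
```

Unlike this classical iteration, which can cycle or diverge for unlucky seeds, the abstract's claim is that RNM reaches one of the two stopping conditions in $O(1/\varepsilon^{2})$ iterations from any seed.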
