Novel Root-Finding Algorithm
- The paper introduces a randomized method leveraging root radii approximations to achieve near-optimal Boolean complexity in isolating polynomial roots and clusters.
- It employs a two-stage paradigm with a crude approximation via grid intersections followed by Newton-like refinements to attain target precision.
- The approach robustly handles multiple roots and clusters while supporting parallel processing, offering practical advantages for high-degree polynomial computations.
A novel root-finding algorithm refers here to the class of methods introduced in "Simple and Nearly Optimal Polynomial Root-finding by Means of Root Radii Approximation" (Pan, 2017). This approach leverages randomized geometry in the complex plane, high-precision root radii approximation, and grid-based intersection schemes to reliably isolate well-conditioned complex roots and clusters of a univariate polynomial. The algorithm attains nearly optimal Boolean complexity for the crude approximation stage and immediately dovetails with known fast refinement routines, yielding end-to-end near-optimal performance up to polylogarithmic factors.
1. Two-Stage Root-Finding Paradigm
Modern fast root-finding for a univariate polynomial of degree $n$ typically decomposes into two phases:
- Crude Approximation Stage: The goal is to find, for each simple root or isolated cluster, a disc (or rectangle) of small radius that contains the root or cluster; the approximation only needs to be accurate to within this crude radius.
- Refinement Stage: Starting from these discs, a Newton-like or Aberth-like iteration rapidly sharpens each approximation from the crude radius down to the target absolute precision. Nearly optimal multiprecision refinement algorithms (e.g., Kirrinnis '98, Pan–Tsigaridas '13–'16) accomplish this at a Boolean cost that is optimal up to polylogarithmic factors.
The novelty in (Pan, 2017) is a simple and randomized scheme for the initial crude approximation whose cost is already dominated by the subsequent refinement step.
2. Root Radii Approximation Subroutine
Given a polynomial $p(x) = \sum_{i=0}^{n} p_i x^i$ with roots $z_1, \dots, z_n$, the root radii are $r_j = |z_j|$, indexed so that $r_1 \ge r_2 \ge \cdots \ge r_n$ (ordered by decreasing modulus). Approximating these root radii to within a multiplicative factor $1 + \Delta$ for a fixed $\Delta > 0$ can be achieved via Schönhage's algorithm (1982):
- Compute the largest (or all $n$) root radii through repeated Graeffe (Dandelin-Lobachevsky) root-squaring steps and coefficient manipulations.
- Achieve any fixed tolerance $\Delta > 0$ at a nearly optimal Boolean cost governed by the degree $n$ and the maximum bit-length of the coefficients.
- For a shifted polynomial $p(x + c)$, the radii (now the distances of the roots from the shift point $c$) can be approximated at similar cost, with extra operations for computing the shift.
Key steps:
- Apply the root-squaring process to the (possibly shifted) input polynomial, with a target relative tolerance $\Delta$.
- Successive Graeffe iterations square the roots, shrinking the relative error of the coefficient-based radius estimates down to the target tolerance after $O(\log\log n)$ iterations for fixed $\Delta$.
- Take $2^k$-th roots of the radii estimates obtained after $k$ Graeffe steps to recover the original scale.
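The following floating-point sketch illustrates the estimate-by-root-squaring idea. It is not Schönhage's actual routine (which relies on careful multiprecision coefficient manipulation); the coefficient layout, the helper names `graeffe_step` and `largest_root_radius`, and the Fujiwara-style coefficient-ratio bound are illustrative choices, and double precision limits how many squaring steps are usable.

```python
import numpy as np

def graeffe_step(c):
    """One Graeffe (root-squaring) step.

    c[0..n] are the coefficients of p(x) = sum_i c[i] * x**i.  Writing
    p(x) = E(x**2) + x*O(x**2), the polynomial q(y) = E(y)**2 - y*O(y)**2
    equals +/- p(x)*p(-x) with y = x**2, so its roots are the squares of
    the roots of p.  The result is rescaled to curb overflow; rescaling a
    polynomial does not move its roots.
    """
    e, o = c[0::2], c[1::2]                       # even/odd-indexed coefficients
    e2, o2 = np.convolve(e, e), np.convolve(o, o)
    q = np.zeros(len(c))
    q[:len(e2)] += e2
    q[1:len(o2) + 1] -= o2
    return q / np.max(np.abs(q))

def largest_root_radius(coeffs, k=6):
    """Crude estimate of r_1 = max_j |z_j| for p(x) = sum_i coeffs[i] * x**i.

    The Fujiwara-style bound beta = max_j |q[n-j]/q[n]|**(1/j) brackets the
    largest root radius of q within a factor of roughly n; applying it after
    k Graeffe steps and taking the 2**k-th root tightens the relative error
    to about n**(2**-k).
    """
    c = np.asarray(coeffs, dtype=float)
    c = c / np.max(np.abs(c))
    for _ in range(k):
        c = graeffe_step(c)
    n = len(c) - 1
    beta = max((abs(c[n - j] / c[n]) ** (1.0 / j) for j in range(1, n + 1)),
               default=0.0)
    return beta ** (1.0 / 2 ** k)

# Example: p(x) = (x - 3)(x + 2)(x - 0.5) has largest root radius 3.
print(largest_root_radius([3.0, -5.5, -1.5, 1.0]))   # approximately 3.0
```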
3. Randomized Grid Scheme for Initial Isolation
The main initial approximation routine operates as follows:
- Fix a crude radius (the permissible error of the initial approximations) and a required isolation ratio governing root/cluster separation.
- Estimate the largest root radius $r_1$, ensuring that a disc centered at the origin of radius about $r_1$ contains all roots of $p$.
- Select the shift points so that the centers lie well outside this root disc.
Algorithm Steps:
- Pick a random angle $\theta \in [0, 2\pi)$.
- Compute three shifted polynomials, one for each chosen center; the third center is placed at the random angle $\theta$.
- For each of the three shifts, compute approximate root radii of the shifted polynomial using Schönhage's routine, yielding up to $n$ concentric annuli per shift, each of small width controlled by the tolerance $\Delta$.
- Intersecting the annuli of the first two families inside the root disc yields a grid of small, roughly rectangular cells, each of diameter comparable to the annulus width.
- Prune these grid cells with the third annulus family at the random angle $\theta$. With high probability, each sufficiently isolated root or cluster is uniquely "hit" by an annulus from the third family, and the corresponding cell center can be output as a crude approximation.
Randomization ensures that, even in the presence of multiple nearby roots, the chance of ambiguity can be made arbitrarily small as an explicit function of the desired failure probability.
Cost: Three root-radii computations (one per shifted polynomial) plus modest extra work for grid construction and pruning. A toy numerical sketch of this intersect-and-prune scheme follows.
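The sketch below illustrates only the intersect-and-prune geometry under simplifying assumptions: `numpy.roots` serves as a stand-in oracle for the root-radii subroutine, exact circles replace annuli of positive width, and the placement of the three centers (the origin, a point on the imaginary axis, and a point at the random angle $\theta$) is an illustrative choice rather than the paper's prescription.

```python
import numpy as np

def radii_from(coeffs, center):
    """Distances of the roots of p from `center`, sorted decreasingly.
    Stand-in oracle for the Graeffe-based root-radii subroutine applied to
    the shifted polynomial p(x + center); here numpy.roots does the work."""
    roots = np.roots(coeffs[::-1])               # numpy wants highest degree first
    return np.sort(np.abs(roots - center))[::-1]

def circle_intersections(c1, r1, c2, r2):
    """Intersection points of the circles |z - c1| = r1 and |z - c2| = r2."""
    d = abs(c2 - c1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []
    a = (r1**2 - r2**2 + d**2) / (2.0 * d)       # distance from c1 to the chord
    h = np.sqrt(max(r1**2 - a**2, 0.0))
    u = (c2 - c1) / d                            # unit vector from c1 towards c2
    m = c1 + a * u
    return [m + 1j * h * u, m - 1j * h * u] if h > 0 else [m]

def crude_roots(coeffs, R=8.0, tol=1e-6, seed=0):
    """Toy intersect-and-prune scheme: circles about two fixed centers are
    intersected, and candidates are kept only if they also lie (nearly) on a
    circle about a third center placed at a random angle theta."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c1, c2, c3 = 0.0 + 0.0j, 1j * R, R * np.exp(1j * theta)
    f1, f2, f3 = (radii_from(coeffs, c) for c in (c1, c2, c3))
    candidates = []
    for r1 in f1:
        for r2 in f2:
            for z in circle_intersections(c1, r1, c2, r2):
                if np.min(np.abs(np.abs(z - c3) - f3)) < tol:
                    candidates.append(z)         # may contain near-duplicates
    return candidates

# Roots of x^3 - 6x^2 + 11x - 6 are 1, 2, 3.  With high probability over
# theta, the surviving candidates cluster near these three points.
print(crude_roots([-6.0, 11.0, -6.0, 1.0]))
```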
4. Multiplicities, Clusters, and Isolation Requirements
- Multiple roots produce overlapping chains of annuli; each chain can be collapsed into a single thickened annulus with the corresponding multiplicity attached.
- Isolated clusters behave identically: if a well-isolated cluster contains several closely spaced roots, the algorithm recovers a bounding disc for the whole cluster.
- Guarantees: any sufficiently isolated root or cluster is approximated to within the crude radius of its true location.
This randomized approach merges tightly grouped roots (those separated by less than the crude radius) into a single disc/cluster, leaving their subsequent resolution to downstream refinement or deflation.
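As a toy illustration of this collapsing step (the greedy centroid rule and the name `merge_clusters` are illustrative choices, not the paper's procedure), nearby crude candidates can be merged and a multiplicity count attached:

```python
def merge_clusters(points, radius):
    """Greedily merge crude approximations lying within `radius` of an
    existing cluster centroid; returns (centroid, count) pairs.  A toy
    stand-in for collapsing overlapping annuli into one thickened annulus
    with a multiplicity attached."""
    clusters = []                                # list of (centroid, count)
    for z in points:
        for i, (c, m) in enumerate(clusters):
            if abs(z - c) <= radius:
                clusters[i] = ((c * m + z) / (m + 1), m + 1)
                break
        else:
            clusters.append((z, 1))
    return clusters

# The three candidates near 1 merge into one centroid of count 3; 2.5 stays separate.
print(merge_clusters([1.0, 1.0 + 1e-8j, 1.0 - 1e-8, 2.5], radius=1e-6))
```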
5. Local Root-Refinement: Achieving High Precision
Once well-isolated discs are available, any nearly optimal local refinement procedure, such as those of Kirrinnis or Pan and Tsigaridas, can reach the final target accuracy at a Boolean cost that is optimal up to polylogarithmic factors.
This can be performed in parallel with minimal communication. Modern multiprecision iteration achieves quadratic or higher convergence locally, so the complexity is dictated by the size and sparsity of the initial covering.
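A minimal sketch of the refinement stage, assuming each disc contains a simple root: plain Newton iteration in double precision. The cited Kirrinnis and Pan–Tsigaridas refiners are multiprecision procedures with complexity guarantees that this sketch does not attempt to reproduce.

```python
import numpy as np

def refine_newton(coeffs, z0, max_iters=50, rtol=1e-14):
    """Sharpen a crude approximation z0 of a simple root of
    p(x) = sum_i coeffs[i] * x**i by Newton's method; convergence is
    quadratic once z0 lies in the basin of attraction of that root."""
    p = np.polynomial.Polynomial(coeffs)
    dp = p.deriv()
    z = complex(z0)
    for _ in range(max_iters):
        step = p(z) / dp(z)                      # assumes dp(z) != 0 (simple root)
        z -= step
        if abs(step) <= rtol * max(1.0, abs(z)):
            break
    return z
```

Combined with the toy `crude_roots` sketch from Section 3, `[refine_newton(coeffs, z) for z in crude_roots(coeffs)]` gives a rough end-to-end illustration of the two-stage paradigm of Section 1.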
6. Complexity, Near-Optimality, and Comparison with Classical Methods
- Overall Boolean complexity: The total cost of all stages (crude isolation plus refinement) matches the record classical bounds asymptotically, up to polylogarithmic factors, with the refinement stage dominating.
- Classical lower bound: The Wax–Milnor–Yaglom bound supplies the classical lower bound for well-conditioned roots. Practical methods such as Aberth–Ehrlich or classical companion-matrix QR require on the order of $n^2$ arithmetic operations per iteration.
- Comparison: The proposed algorithm is competitive with Pan’s earlier recursive quadtree+Graeffe approach [Pan '95/'02] but much simpler to implement (just three root-radii plus grid filtering). Unlike previous approaches, the new method does not require discriminant logarithms or root deflation.
| Algorithm | Crude Isolation Cost | Local Refinement Cost | Total Cost |
|---|---|---|---|
| Root-radii (this method) | nearly optimal (three root-radii computations plus grid filtering) | nearly optimal (Kirrinnis, Pan–Tsigaridas) | nearly optimal up to polylog factors |
| Pan '95/'02 | comparable, but via recursive quadtree + Graeffe | same | same |
| Aberth–Ehrlich, Jenkins–Traub | N/A (single stage) | N/A (single stage) | at least quadratic arithmetic cost per iteration; no comparable Boolean bound |
Assumptions: The main requirement is that the roots (or clusters) are sufficiently isolated (separated from the remaining roots by at least a fixed ratio of the containing disc's radius), ensuring that the crude approximations resolve each group. The grid-based procedure, with a single random angle, then attains the prescribed success probability.
7. Implementation Considerations and Practical Applicability
- Randomization: Only the angle $\theta$ in the pruning step is randomized. The probability of failure can be made arbitrarily small by tuning the scheme's parameters.
- Parallelization: Local refinement of the discs can be performed independently, with little inter-processor communication (see the sketch after this list).
- Numerical Stability: There is no dependence on the discriminant of $p$. The method is robust to coefficient scaling, and the shift computations add only marginal cost.
- Ill-conditioned cases: Clusters whose root separation falls below the crude radius may not be individually resolved, but they are isolated as single discs. These can be further treated by deflation or local iterative techniques after the initial stage.
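As a small illustration of this independence (assuming the `refine_newton` sketch from Section 5 is available at module level; the choice of executor is incidental), the per-disc refinements can simply be mapped over a process pool:

```python
from concurrent.futures import ProcessPoolExecutor
from functools import partial

def refine_all(coeffs, centers):
    """Refine every crude center independently; the tasks share no state,
    so the only communication is scattering the inputs and gathering the
    refined roots.  refine_newton is the Newton sketch from Section 5."""
    with ProcessPoolExecutor() as pool:
        return list(pool.map(partial(refine_newton, coeffs), centers))
```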
Applicability: The method is suitable as a front-end for almost any root-refinement algorithm, especially in high-degree or numerically challenging contexts where ease of implementation and cost predictability are crucial.
8. Summary of Advances
The novel root-finding algorithm in (Pan, 2017) delivers:
- Simplicity: Crude approximation via just three root-radii computations plus randomized angle pruning.
- Efficiency: Nearly optimal Boolean complexity, matching best-known bounds up to polylog factors.
- Flexibility: Handles isolated roots, multiple roots, and well-separated clusters with no need for deflation, discriminant logs, or recursion.
- Parallelism: Minimal processor synchronization in the local refinement.
- Practicality: Immediate compatibility with Aberth, Newton, or Ehrlich–Aberth refinements; easy implementation and efficient for both well- and moderately ill-conditioned polynomials.
This framework represents a significant development in both the theory and practice of univariate polynomial root isolation, with a strong emphasis on minimizing the complexity of the initial crude approximation stage and preserving global scalability (Pan, 2017).