
Phase Retrieval using Alternating Minimization (1306.0160v2)

Published 2 Jun 2013 in stat.ML, cs.IT, cs.LG, and math.IT

Abstract: Phase retrieval problems involve solving linear equations, but with missing sign (or phase, for complex numbers) information. More than four decades after it was first proposed, the seminal error reduction algorithm of (Gerchberg and Saxton 1972) and (Fienup 1982) is still the popular choice for solving many variants of this problem. The algorithm is based on alternating minimization; i.e. it alternates between estimating the missing phase information, and the candidate solution. Despite its wide usage in practice, no global convergence guarantees for this algorithm are known. In this paper, we show that a (resampling) variant of this approach converges geometrically to the solution of one such problem -- finding a vector $\mathbf{x}$ from $\mathbf{y},\mathbf{A}$, where $\mathbf{y} = \left|\mathbf{A}^{\top}\mathbf{x}\right|$ and $|\mathbf{z}|$ denotes a vector of element-wise magnitudes of $\mathbf{z}$ -- under the assumption that $\mathbf{A}$ is Gaussian. Empirically, we demonstrate that alternating minimization performs similar to recently proposed convex techniques for this problem (which are based on "lifting" to a convex matrix problem) in sample complexity and robustness to noise. However, it is much more efficient and can scale to large problems. Analytically, for a resampling version of alternating minimization, we show geometric convergence to the solution, and sample complexity that is off by log factors from obvious lower bounds. We also establish close to optimal scaling for the case when the unknown vector is sparse. Our work represents the first theoretical guarantee for alternating minimization (albeit with resampling) for any variant of phase retrieval problems in the non-convex setting.

Citations (624)

Summary

  • The paper presents a resampling variant of alternating minimization that guarantees geometric convergence for phase retrieval.
  • It employs a careful initialization using the top singular vector, enhancing computational efficiency over convex methods.
  • The approach requires O(n log³n log(1/ε)) measurements, offering a scalable solution for non-convex phase retrieval challenges.

Overview of "Phase Retrieval using Alternating Minimization"

The paper "Phase Retrieval using Alternating Minimization" by Praneeth Netrapalli, Prateek Jain, and Sujay Sanghavi addresses the well-studied problem of phase retrieval, a scenario where one needs to reconstruct a signal from the magnitude of its linear measurements. This problem is significant in fields like optics and crystallography, where phase information is often lost or hard to measure.

Methodology

The traditional approach for phase retrieval has been the alternating minimization technique, epitomized by algorithms from Gerchberg and Saxton, and Fienup. These algorithms iteratively estimate the missing phase and the signal, yet lack global convergence guarantees.

This paper contributes by demonstrating that a resampling variant of alternating minimization achieves geometric convergence under Gaussian measurement assumptions. The goal of the algorithm is to recover a vector $\mathbf{x}$ from observations $\mathbf{y} = |\mathbf{A}^{\top} \mathbf{x}|$, where $\mathbf{A}$ is the measurement matrix. The authors propose an alternating minimization approach that is initialized with a carefully chosen vector derived from the top singular vector of an aggregate matrix formed by the measurements.
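As a concrete illustration, the overall scheme (spectral initialization from the measurements, then alternating between sign estimation and a least-squares update) can be sketched in NumPy for the real-valued case. This is a simplified sketch, not the paper's exact procedure: it reuses one fixed set of measurements rather than drawing fresh samples each iteration as the analyzed resampling variant requires, and the function name and parameters are our own.

```python
import numpy as np

def altmin_phase(A, y, iters=50):
    """Illustrative alternating minimization for real phase retrieval.

    A : (m, n) measurement matrix (rows are measurement vectors a_i)
    y : (m,) magnitudes, y_i = |a_i @ x|
    Returns an estimate of x, up to a global sign.
    """
    m, n = A.shape
    # Spectral initialization: top eigenvector of (1/m) * sum_i y_i^2 a_i a_i^T,
    # scaled by an estimate of ||x|| (E[y_i^2] = ||x||^2 for Gaussian a_i).
    S = (A * (y ** 2)[:, None]).T @ A / m
    _, V = np.linalg.eigh(S)          # eigenvalues ascending
    x = V[:, -1] * np.sqrt(np.mean(y ** 2))
    for _ in range(iters):
        # Step 1: estimate the missing signs given the current iterate.
        c = np.sign(A @ x)
        # Step 2: least-squares update of x given the estimated signs.
        x, *_ = np.linalg.lstsq(A, c * y, rcond=None)
    return x
```

Each iteration costs one matrix-vector product and one least-squares solve, which is the source of the method's efficiency advantage over lifted convex formulations that operate on $n \times n$ matrix variables.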

Key Results

  • Convergence: The authors provide the first theoretical guarantee for alternating minimization in non-convex settings of phase retrieval. They show that their method converges geometrically under specific conditions when resampling is used.
  • Sample Complexity: Their resampling variant requires a number of measurements $m$ that is $\mathcal{O}(n \log^3 n \log \frac{1}{\epsilon})$, which, though larger by logarithmic factors than the obvious lower bounds and the guarantees of convex relaxation techniques, still constitutes near-optimal scaling.
  • Efficiency: Alternating minimization, particularly with their initialization, is computationally more efficient compared to convex methods like PhaseLift and PhaseCut, significantly reducing computation time for large problems.
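To make the convergence claim concrete: geometric convergence means the distance to the solution contracts by a constant factor at every outer iteration. Schematically (with $\rho < 1$ a constant from the analysis; the exact value is not reproduced here),

$\operatorname{dist}(\mathbf{x}_{t+1}, \mathbf{x}^*) \le \rho \cdot \operatorname{dist}(\mathbf{x}_t, \mathbf{x}^*),$

so reaching accuracy $\epsilon$ takes $T = \mathcal{O}(\log \frac{1}{\epsilon})$ iterations. This is also where the $\log \frac{1}{\epsilon}$ factor in the sample complexity comes from: the analyzed resampling variant draws a fresh batch of measurements at each of the $T$ iterations.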

Implications and Future Directions

The research bridges a crucial gap by providing a convergence guarantee for a non-convex algorithm that is simpler and potentially scalable to larger datasets in practical applications. This positions alternating minimization as a viable competitor to convex approaches for phase retrieval, particularly in scenarios where computational resources are a bottleneck.

Looking ahead, the approach introduces an interesting paradigm for other non-convex optimization problems common in signal processing and machine learning, suggesting that with appropriate resampling or initialization strategies, theoretical guarantees may be achievable.

Future exploration could extend this analysis to other measurement models beyond Gaussian, or adapt the strategy to work without heavy resampling, thereby reducing experimental restrictions. Additionally, investigating its application to real-world data in imaging or crystallography would further validate the practical impact of this method.

Overall, the paper meaningfully advances both theoretical understanding and practical application of phase retrieval using non-convex methods.