
A note on Douglas-Rachford, gradients, and phase retrieval (1911.13179v2)

Published 29 Nov 2019 in cs.IT, eess.SP, and math.IT

Abstract: The properties of gradient techniques for the phase retrieval problem have received considerable attention in recent years. In almost all applications, however, the phase retrieval problem is solved using a family of algorithms that can be interpreted as variants of Douglas-Rachford splitting. In this work, we establish a connection between Douglas-Rachford and gradient algorithms. Specifically, we show that in some cases a generalization of Douglas-Rachford, called relaxed-reflect-reflect (RRR), can be viewed as gradient descent on a certain objective function. The solutions coincide with the critical points of that objective, which, in contrast to standard gradient techniques, are not its minimizers. Using the objective function, we give simple proofs of some basic properties of the RRR algorithm. Specifically, we describe its set of solutions, show local convexity around any solution, and derive stability guarantees. Nevertheless, in its present state, the analysis does not elucidate the remarkable empirical performance of RRR or its global properties.
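
For readers unfamiliar with the algorithm named in the abstract, the following is a minimal sketch of one RRR update for a Fourier-magnitude phase retrieval model with a support constraint. The two projections, the `support` mask, and the relaxation parameter `beta` are illustrative assumptions for this sketch, not the paper's exact measurement model or analysis.

```python
import numpy as np

def proj_magnitude(x, b):
    """Project onto the magnitude constraint {x : |F x| = b}:
    keep the phases of F x, replace the magnitudes by the measured b.
    (Illustrative Fourier model; the paper treats a more general setting.)"""
    y = np.fft.fft(x)
    y = b * np.exp(1j * np.angle(y))
    return np.fft.ifft(y)

def proj_support(x, support):
    """Project onto the support constraint: zero out entries outside `support`."""
    z = np.zeros_like(x)
    z[support] = x[support]
    return z

def rrr_step(x, b, support, beta=0.5):
    """One relaxed-reflect-reflect (RRR) update:
        x <- x + beta * (P_B(2 * P_A(x) - x) - P_A(x)).
    At a fixed point, P_B(2 * P_A(x) - x) = P_A(x), i.e. the two
    constraint sets agree at P_A(x)."""
    pa = proj_magnitude(x, b)
    pb = proj_support(2 * pa - x, support)
    return x + beta * (pb - pa)
```

Iterating `rrr_step` from a random start is the usual way such methods are run in practice; the abstract's point is that, in some cases, this update can also be read as a gradient-descent step on an objective whose critical points (not minimizers) are the algorithm's solutions.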

Citations (4)
