A Relaxed Primal-Dual Hybrid Gradient Method with Line Search (2503.17575v1)

Published 21 Mar 2025 in math.OC and eess.IV

Abstract: The primal-dual hybrid gradient method (PDHG) is useful for optimization problems that commonly appear in image reconstruction. A downside of PDHG is that there are typically three user-set parameters, and the algorithm's performance is sensitive to their values. Toward a parameter-free algorithm, we combine two existing line searches. The first, by Malitsky et al., is over two of the step sizes in the PDHG iterations. We then use the connection between PDHG and the primal-dual form of Douglas-Rachford splitting to construct a line search over the relaxation parameter. We demonstrate the efficacy of the combined line search on multiple problems, including a novel inverse problem in magnetic resonance image reconstruction. The method presented in this manuscript is the first parameter-free variant of PDHG (no line-search hyperparameters were changed across any of the numerical experiments).
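
To make the setting concrete, below is a minimal sketch of a standard relaxed PDHG iteration with fixed step sizes tau, sigma and relaxation parameter rho, applied to a toy LASSO problem. It is illustrative only: it does not implement the paper's line searches, and the function names (relaxed_pdhg_lasso, soft_threshold) and the toy problem are assumptions chosen for the example.

```python
# Minimal sketch of relaxed PDHG with fixed parameters (no line search),
# applied to the toy problem min_x 0.5*||A x - b||^2 + lam*||x||_1.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def relaxed_pdhg_lasso(A, b, lam, tau, sigma, rho, n_iter=500):
    """Relaxed PDHG for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    tau, sigma : primal/dual step sizes (chosen so tau*sigma*||A||_2^2 <= 1)
    rho        : relaxation parameter in (0, 2)
    """
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)
    for _ in range(n_iter):
        # Primal step: prox of tau*lam*||.||_1
        x_new = soft_threshold(x - tau * (A.T @ y), tau * lam)
        # Dual step at the extrapolated point; for f(z) = 0.5*||z - b||^2,
        # prox_{sigma f*}(v) = (v - sigma*b) / (1 + sigma)
        v = y + sigma * (A @ (2.0 * x_new - x))
        y_new = (v - sigma * b) / (1.0 + sigma)
        # Relaxation step (the parameter targeted by the second line search)
        x = x + rho * (x_new - x)
        y = y + rho * (y_new - y)
    return x

# Example usage with step sizes satisfying tau*sigma*||A||_2^2 < 1:
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1)) + 0.01 * rng.standard_normal(40)
L = np.linalg.norm(A, 2)
x_hat = relaxed_pdhg_lasso(A, b, lam=0.1, tau=0.9 / L, sigma=0.9 / L, rho=1.5)
```

The three fixed quantities tau, sigma, and rho in this sketch are the user-set parameters the paper addresses: the first line search adapts the two step sizes, and the second adapts the relaxation parameter.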
