On Convergence of Heuristics Based on Douglas-Rachford Splitting and ADMM to Minimize Convex Functions over Nonconvex Sets (1807.02878v6)

Published 8 Jul 2018 in math.OC

Abstract: Recently, heuristics based on the Douglas-Rachford splitting algorithm and the alternating direction method of multipliers (ADMM) have found empirical success in minimizing convex functions over nonconvex sets, but little has been done to improve their theoretical understanding. In this paper, we investigate the convergence of these heuristics. First, we characterize optimal solutions of minimization problems involving convex cost functions over nonconvex constraint sets. We show that these optimal solutions are related to the fixed point set of the underlying nonconvex Douglas-Rachford operator. Next, we establish sufficient conditions under which the Douglas-Rachford splitting heuristic either converges to a point or its cluster points form a nonempty compact connected set. In the case where the heuristic converges to a point, we establish sufficient conditions for that point to be an optimal solution. Then, we discuss how the ADMM heuristic can be constructed from the Douglas-Rachford splitting algorithm. We show that, unlike in the convex case, the two algorithms in our nonconvex setup are not equivalent and are related in a rather involved way. Finally, we comment on the convergence of the ADMM heuristic and compare it with the Douglas-Rachford splitting heuristic.
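
To make the objects in the abstract concrete, here is a minimal sketch of a Douglas-Rachford splitting heuristic for minimizing a convex cost over a nonconvex set. The specific cost f(x) = ½‖x − b‖², the sparsity constraint set C, and the names prox_f, proj_C, gamma, and iters are illustrative assumptions for this sketch and are not taken from the paper's setup or experiments.

```python
import numpy as np

# Sketch of the Douglas-Rachford splitting heuristic for
#     minimize f(x)  subject to  x in C,
# with f convex and C nonconvex. All concrete choices below are illustrative.


def prox_f(v, b, gamma):
    """Prox of the convex cost f(x) = 0.5 * ||x - b||^2 with step gamma (closed form)."""
    return (v + gamma * b) / (1.0 + gamma)


def proj_C(v, s):
    """One element of the projection onto the nonconvex set C = {x : ||x||_0 <= s}:
    keep the s largest-magnitude entries. The projection may be set-valued;
    the heuristic simply picks one element."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out


def douglas_rachford_heuristic(b, s, gamma=1.0, iters=200):
    """Iterate z^{k+1} = z^k + P_C(2 prox_f(z^k) - z^k) - prox_f(z^k)."""
    z = np.zeros_like(b)
    y = np.zeros_like(b)
    for _ in range(iters):
        x = prox_f(z, b, gamma)        # prox step on the convex cost
        y = proj_C(2.0 * x - z, s)     # (set-valued) projection onto the nonconvex set
        z = z + y - x                  # Douglas-Rachford update
    return y                           # the projected iterate is feasible by construction


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    b = rng.standard_normal(10)
    x_hat = douglas_rachford_heuristic(b, s=3)
    print("sparse iterate:", np.round(x_hat, 3))
```

Because C is nonconvex, this iteration is only a heuristic: the projection onto C is generally set-valued and the sequence need not converge, which is exactly the regime the paper studies by relating fixed points of the nonconvex Douglas-Rachford operator to optimal solutions and by giving sufficient conditions for convergence.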
