A Triple-Bregman Balanced Primal-Dual Algorithm for Saddle Point Problems (2506.07117v1)
Abstract: The primal-dual hybrid gradient (PDHG) method is one of the most popular algorithms for solving saddle point problems. However, when applying the PDHG method and its many variants to real-world models commonly encountered in signal processing, imaging sciences, and statistical learning, there often exists an imbalance between the two subproblems, with the dual subproblem typically being easier to solve than the primal one. In this paper, we propose a flexible triple-Bregman balanced primal-dual algorithm (TBDA) to solve a class of (not necessarily smooth) convex-concave saddle point problems with a bilinear coupling term. Specifically, our TBDA consists mainly of two dual subproblems and one primal subproblem. Moreover, three Bregman proximal terms, each with its own Bregman kernel function, are embedded into the respective subproblems. This design enables us to strike a practical balance between the primal and dual subproblems. More interestingly, it provides a flexible algorithmic framework for understanding some existing iterative schemes and for producing customized structure-exploiting algorithms for applications. Theoretically, we first establish the global convergence and the ergodic convergence rate of the TBDA under mild conditions. In particular, the TBDA allows larger step sizes than the PDHG method under appropriate parameter settings. Then, when stronger requirements are imposed on the objective functions, we introduce two improved versions with better convergence rates than the original TBDA. Numerical experiments on synthetic and real datasets demonstrate that our TBDA performs better in practice than the PDHG method and several other efficient variants.
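For context, the baseline referenced throughout the abstract is the classical PDHG iteration for bilinear saddle point problems of the form min_x max_y f(x) + ⟨Kx, y⟩ − g*(y). The sketch below is an illustrative NumPy implementation of plain PDHG (not the authors' TBDA), instantiated with the assumed choices f = 0 and g*(y) = ½‖y‖² + ⟨b, y⟩, so that the saddle point recovers a least-squares solution of Kx ≈ b; the step-size rule τσ‖K‖² < 1 is the standard PDHG condition.

```python
import numpy as np

def pdhg_least_squares(K, b, n_iter=5000):
    """Classical PDHG for min_x max_y <Kx, y> - g*(y),
    with g*(y) = 0.5*||y||^2 + <b, y> (an illustrative instance)."""
    m, n = K.shape
    x = np.zeros(n)
    x_bar = x.copy()          # extrapolated primal point
    y = np.zeros(m)
    L = np.linalg.norm(K, 2)  # operator norm of K
    tau = sigma = 0.9 / L     # satisfies tau * sigma * ||K||^2 < 1
    for _ in range(n_iter):
        # dual step: y <- prox_{sigma g*}(y + sigma * K @ x_bar)
        y = (y + sigma * (K @ x_bar - b)) / (1.0 + sigma)
        # primal step: x <- prox_{tau f}(x - tau * K.T @ y), with f = 0
        x_new = x - tau * (K.T @ y)
        # extrapolation (the "hybrid gradient" ingredient of PDHG)
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = rng.standard_normal((8, 5))
    b = rng.standard_normal(8)
    x = pdhg_least_squares(K, b)
    # at the saddle point the normal equations K.T (K x - b) = 0 hold
    print(np.linalg.norm(K.T @ (K @ x - b)))
```

The TBDA described in the abstract modifies this template by splitting the dual update into two subproblems and attaching a separate Bregman proximal term (with its own kernel) to each of the three subproblems; the exact update rules are given in the paper itself.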