
An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function (2104.06206v2)

Published 13 Apr 2021 in math.OC

Abstract: In this work we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other, and is not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the convex-concave, convex-strongly concave and strongly convex-strongly concave settings of the saddle point problem under investigation. Regarding iterates we obtain (weak) convergence, a convergence rate of order $ \mathcal{O}(\frac{1}{K}) $ and linear convergence like $ \mathcal{O}(\theta^{K}) $ with $ \theta < 1 $, respectively. In terms of function values we obtain ergodic convergence rates of order $ \mathcal{O}(\frac{1}{K}) $, $ \mathcal{O}(\frac{1}{K^{2}}) $ and $ \mathcal{O}(\theta^{K}) $ with $ \theta < 1 $, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi-kernel support vector machines and a classification problem incorporating minimax group fairness.
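The iteration described in the abstract, an optimistic (extrapolated) gradient ascent plus prox step in the smooth variable, alternated with a proximal step in the nonsmooth variable, can be illustrated on a toy instance. The sketch below is not the paper's experimental setup: the problem $\min_x \max_{y \in [0,1]} y\,|x|$, the step sizes `tau` and `sigma`, and the extrapolation weight `theta` are all assumptions chosen for illustration. Here the coupling $\Phi(x,y) = y\,|x|$ is linear (hence smooth) in $y$ and nonsmooth in $x$, the regulariser is the indicator of $[0,1]$ (whose prox is a projection), and the proximal step in $x$ reduces to soft thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t*|.| for t >= 0: argmin_x { t*|x| + (x - v)**2 / 2 }
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ogaprox_toy(x0=2.0, y0=0.5, tau=0.5, sigma=0.5, theta=1.0, iters=200):
    """Hedged sketch of an OGAProx-style iteration (parameters are
    illustrative assumptions, not the paper's choices) on the toy problem

        min_x max_{y in [0,1]}  y * |x|,

    whose saddle points have x* = 0.
    """
    x_prev, x, y = x0, x0, y0
    for _ in range(iters):
        # optimistic gradient ascent in the smooth variable y, using the
        # extrapolated gradient (1+theta)*grad_k - theta*grad_{k-1},
        # followed by the prox of the regulariser (projection onto [0,1])
        grad = (1 + theta) * abs(x) - theta * abs(x_prev)
        y = float(np.clip(y + sigma * grad, 0.0, 1.0))
        # proximal step in the nonsmooth variable x:
        # argmin_x { y*|x| + (x - x_k)**2 / (2*tau) }  = soft thresholding
        x_prev, x = x, float(soft_threshold(x, tau * y))
    return x, y
```

With the default parameters the $x$-iterate is driven to the saddle value $x^* = 0$ within a handful of steps; the extrapolation in the $y$-update is what distinguishes the optimistic step from plain projected gradient ascent.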
