Self-Concordant Analysis of Frank-Wolfe Algorithms (2002.04320v3)

Published 11 Feb 2020 in math.OC, cs.LG, and stat.CO

Abstract: Projection-free optimization via variants of the Frank-Wolfe (FW), a.k.a. Conditional Gradient, method has become a cornerstone of optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than a projection and sparsity needs to be preserved. In a number of applications, e.g. Poisson inverse problems or quantum state tomography, the loss is given by a self-concordant (SC) function with unbounded curvature, implying the absence of theoretical guarantees for existing FW methods. We use the theory of SC functions to provide a new adaptive step size for FW methods and prove a global convergence rate of O(1/k) after k iterations. If the problem admits a stronger local linear minimization oracle, we construct a novel FW method with a linear convergence rate for SC functions.
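To illustrate the setting of the abstract, below is a minimal sketch of a Frank-Wolfe loop over the unit simplex for a self-concordant, Poisson-/barrier-type loss f(x) = -sum_i log(a_i^T x). The function name `frank_wolfe_sc` and the test matrix are hypothetical, and the curvature-aware step size shown here only follows the general form suggested by the abstract (it uses the Frank-Wolfe gap and the local Hessian norm of the direction); the exact step-size rule and constants are given in the paper itself and are not reproduced here.

```python
# Hedged sketch: Frank-Wolfe over the unit simplex for the self-concordant
# objective f(x) = -sum_i log(a_i^T x). A is assumed entrywise positive so
# that a_i^T x > 0 everywhere on the simplex. The adaptive step below is an
# illustrative curvature-aware rule, not a verbatim reproduction of the
# paper's step size.
import numpy as np

def frank_wolfe_sc(A, x0, iters=200, tol=1e-10):
    """A: (m, n) entrywise-positive matrix; x0: feasible point in the simplex."""
    x = x0.copy()
    for _ in range(iters):
        r = A @ x                                  # residuals a_i^T x > 0
        grad = -(A / r[:, None]).sum(axis=0)       # gradient of -sum_i log(a_i^T x)
        # Linear minimization oracle over the simplex: the best vertex e_j.
        j = int(np.argmin(grad))
        s = np.zeros_like(x)
        s[j] = 1.0
        d = s - x
        gap = -grad @ d                            # Frank-Wolfe gap <grad, x - s>
        if gap <= tol:
            break
        # Local Hessian norm ||d||_x, with H(x) = A^T diag(1/r^2) A here.
        e = np.sqrt(((A @ d) ** 2 / r ** 2).sum())
        # Adaptive, curvature-aware step, clipped to [0, 1] (illustrative form).
        t = min(1.0, gap / (e * (gap + e)))
        x = x + t * d
    return x

# Example usage with synthetic positive data (hypothetical):
# x_hat = frank_wolfe_sc(np.random.rand(50, 10) + 0.1, np.ones(10) / 10)
```

Because the step size is clipped to [0, 1] and the update is a convex combination of the current iterate and a vertex, the iterates remain feasible without any projection, which is the point of the projection-free setting discussed above.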

Authors (5)
  1. Pavel Dvurechensky (92 papers)
  2. Petr Ostroukhov (7 papers)
  3. Kamil Safin (2 papers)
  4. Shimrit Shtern (14 papers)
  5. Mathias Staudigl (32 papers)
Citations (24)
