Accelerated Randomized Mirror Descent Algorithms For Composite Non-strongly Convex Optimization (1605.06892v6)

Published 23 May 2016 in math.OC and stat.ML

Abstract: We consider the problem of minimizing the sum of the average of a large number of smooth convex component functions and a general, possibly non-differentiable, convex function. Although many methods have been proposed to solve this problem under the assumption that the sum is strongly convex, few support the non-strongly convex case. Adding a small quadratic regularization is a common device for tackling non-strongly convex problems; however, it may destroy the sparsity of solutions or degrade the performance of the algorithms. Avoiding this device, we propose an accelerated randomized mirror descent method for solving the problem without the strong convexity assumption. Our method extends the deterministic accelerated proximal gradient methods of Paul Tseng and can be applied even when proximal points are computed inexactly. We also propose a scheme for solving the problem when the component functions are non-smooth.
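
The abstract concerns composite objectives of the form F(x) = (1/n) Σ_i f_i(x) + g(x), where the f_i are smooth and convex and g is convex but possibly non-differentiable, and notes that the usual workaround of adding a small quadratic regularizer can destroy the sparsity of solutions. The sketch below is not the paper's accelerated randomized mirror descent method; it is a minimal illustration of the problem class and of a plain stochastic proximal-gradient step, assuming a least-squares loss for the smooth components and an l1 regularizer so that the proximal map is soft-thresholding. All names (soft_threshold, stochastic_prox_grad, A, b, lam, step) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_prox_grad(A, b, lam=0.1, step=0.01, iters=2000, seed=0):
    """Plain (non-accelerated) stochastic proximal-gradient method for
    F(x) = (1/n) * sum_i 0.5*(a_i @ x - b_i)^2 + lam*||x||_1.
    Illustrative only; the paper's method adds acceleration and a mirror map."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)                       # sample one smooth component f_i
        grad_i = (A[i] @ x - b[i]) * A[i]         # gradient of f_i at x
        x = soft_threshold(x - step * grad_i, step * lam)  # prox step on g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50); x_true[:5] = 1.0       # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = stochastic_prox_grad(A, b)
    print("nonzero entries in solution:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

Because g enters only through its proximal operator, the l1 term keeps iterates sparse; by contrast, replacing g with g(x) + (mu/2)||x||^2 to force strong convexity shifts the minimizer and, as the abstract points out, can cost sparsity.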

Authors (5)
  1. Le Thi Khanh Hien (17 papers)
  2. Cuong V. Nguyen (25 papers)
  3. Huan Xu (83 papers)
  4. Canyi Lu (24 papers)
  5. Jiashi Feng (295 papers)
Citations (19)
