Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Published 15 Sep 2015 in math.OC (arXiv:1509.04609v1)

Abstract: Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging, we present stochastic block dual averaging (SBDA)---a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA requires only a block of subgradients and updates only the corresponding block of variables, and hence has significantly lower iteration cost than traditional subgradient methods. We show that SBDA-based methods exhibit the optimal convergence rate for convex nonsmooth stochastic optimization. More importantly, we introduce randomized stepsize rules and block sampling schemes that are adaptive to the block structures, which significantly improve the convergence rate with respect to the problem parameters. This is in sharp contrast to recent block subgradient methods applied to nonsmooth deterministic or stochastic optimization. For strongly convex objectives, we propose a new averaging scheme that makes the regularized dual averaging method optimal, without resorting to any accelerated schemes.
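The abstract describes an update that touches only one sampled block per iteration while accumulating subgradients in a dual-averaging fashion. The sketch below illustrates that structure under stated assumptions; it is not the paper's algorithm. In particular, the uniform block sampling, the Euclidean prox centered at x0, and the 1/sqrt(t) stepsize are simplifying choices, and the paper's adaptive randomized stepsizes and sampling schemes are omitted. The helper `stoch_subgrad(x, block)` is a hypothetical oracle returning a stochastic subgradient restricted to the coordinates in `block`.

```python
import numpy as np

def sbda_sketch(stoch_subgrad, x0, blocks, n_iters=1000, gamma=1.0, seed=0):
    """Minimal SBDA-style loop (a sketch, not the paper's exact method).

    blocks: list of index arrays partitioning the coordinates of x.
    stoch_subgrad(x, block): assumed to return a stochastic subgradient
    of the objective at x, restricted to the coordinates in `block`.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    z = np.zeros_like(x)        # per-coordinate sum of sampled block subgradients
    x_avg = np.zeros_like(x)    # running average of iterates, returned as the output
    for t in range(1, n_iters + 1):
        b = blocks[rng.integers(len(blocks))]   # sample one block uniformly (assumption)
        g_b = stoch_subgrad(x, b)               # block of a stochastic subgradient at x
        z[b] += g_b                             # dual update touches only the sampled block
        beta = gamma * np.sqrt(t)               # classical dual-averaging stepsize (assumption)
        x[b] = x0[b] - z[b] / beta              # primal update on the sampled block only
        x_avg += (x - x_avg) / t                # uniform iterate averaging
    return x_avg
```

Because dual averaging accumulates subgradients rather than taking per-iteration steps from the current point, only the sampled block's accumulator and variables need to be touched each iteration, which is what gives the low per-iteration cost highlighted in the abstract.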

Citations (6)
