
Optimality of the Subgradient Algorithm in the Stochastic Setting (1909.05007v7)

Published 10 Sep 2019 in math.ST, cs.DS, cs.LG, cs.SY, eess.SY, math.OC, math.PR, stat.ML, and stat.TH

Abstract: We show that the Subgradient algorithm is universal for online learning on the simplex, in the sense that it simultaneously achieves $O(\sqrt N)$ regret for adversarial costs and $O(1)$ pseudo-regret for i.i.d. costs. To the best of our knowledge, this is the first demonstration of a universal algorithm on the simplex that is not a variant of Hedge. Since Subgradient is a popular and widely used algorithm, our results have immediate broad application.
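To make the setting concrete, here is a minimal sketch (not taken from the paper) of online projected subgradient descent on the probability simplex with linear costs. The step schedule $\eta_t = \eta_0/\sqrt{t}$ and the sorting-based Euclidean projection are standard textbook choices; the paper's analysis concerns this style of algorithm, but the specific constants and helper names below are illustrative assumptions.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex
    (standard sort-and-threshold routine; an assumed helper, not from the paper)."""
    n = len(v)
    u = np.sort(v)[::-1]                      # sort coordinates in decreasing order
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, n + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)      # shift so the projection sums to 1
    return np.maximum(v - theta, 0.0)

def subgradient_on_simplex(costs, eta0=1.0):
    """Play online subgradient descent on the simplex against a
    sequence of linear cost vectors; return the total incurred cost."""
    n = costs.shape[1]
    x = np.full(n, 1.0 / n)                   # start at the uniform distribution
    total = 0.0
    for t, c in enumerate(costs, start=1):
        total += float(x @ c)                 # incur the linear cost <x_t, c_t>
        x = project_simplex(x - (eta0 / np.sqrt(t)) * c)  # subgradient step + projection
    return total
```

Against adversarial cost sequences this scheme is the classical $O(\sqrt N)$-regret player; the paper's contribution is that the same iterates also enjoy $O(1)$ pseudo-regret when the costs are i.i.d., with no change to the algorithm.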

Citations (4)
