Ellipsoid method for convex stochastic optimization in small dimension (2011.04462v1)

Published 9 Nov 2020 in math.OC

Abstract: The article considers minimization of the expectation of a convex function. Problems of this type often arise in machine learning and a number of other applications. In practice, stochastic gradient descent (SGD) and similar procedures are often used to solve such problems. We propose to use the ellipsoid method with minibatching, which converges linearly and hence requires significantly fewer iterations than SGD. This is verified by our experiments, which are publicly available. The algorithm requires neither smoothness nor strong convexity of the target function to achieve linear convergence. We prove that the method arrives at an approximate solution with a given probability when using minibatches of size proportional to the desired precision to the power -2. This enables efficient parallel execution of the algorithm, whereas the possibilities for batch parallelization of SGD are rather limited. Despite its fast convergence, the ellipsoid method can result in a greater total number of oracle calls than SGD, which works decently with small batches. The complexity is quadratic in the dimension of the problem, hence the method is suited to relatively small dimensionalities.
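The core idea described in the abstract is to run the classical ellipsoid method, but to replace the exact subgradient at each center with a minibatch-averaged stochastic subgradient whose batch size scales as the inverse square of the target precision, so that each cutting plane is accurate with high probability. Below is a minimal illustrative sketch of that idea, not the authors' exact algorithm: the oracle interface stoch_grad(x, batch_size, rng), the parameter names, and the fixed iteration budget are assumptions made for illustration.

```python
import numpy as np

def minibatch_ellipsoid(stoch_grad, x0, R, n_iters=200, batch_size=1000, rng=None):
    """Ellipsoid method driven by minibatched stochastic subgradients (illustrative sketch).

    stoch_grad(x, batch_size, rng) is a hypothetical oracle returning the average of
    `batch_size` i.i.d. stochastic subgradients of the objective at x.
    x0 is the center and R the radius of a ball assumed to contain a minimizer.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x0.size
    x = x0.astype(float).copy()
    # Ellipsoid E = {y : (y - x)^T H^{-1} (y - x) <= 1}, initialized to the ball B(x0, R).
    H = (R ** 2) * np.eye(n)

    for _ in range(n_iters):
        g = stoch_grad(x, batch_size, rng)   # minibatch-averaged subgradient at the center
        gnorm = np.sqrt(g @ H @ g)
        if gnorm == 0.0:
            break                            # (near-)zero subgradient: stop early
        g_tilde = g / gnorm                  # normalize in the ellipsoid metric
        Hg = H @ g_tilde

        # Standard ellipsoid update: cut the ellipsoid by the half-space defined by g
        # and take the minimum-volume ellipsoid containing the remaining half.
        x = x - Hg / (n + 1)
        H = (n ** 2 / (n ** 2 - 1.0)) * (H - (2.0 / (n + 1)) * np.outer(Hg, Hg))

    # In the stochastic setting one would typically return the iterate with the
    # smallest estimated objective value; returning the last center keeps the sketch short.
    return x
```

Because the averaged subgradient at each center is a mean of independent samples, the per-iteration work parallelizes naturally across the minibatch, which is the parallelization advantage the abstract contrasts with SGD; the O(n^2) cost of the ellipsoid update is also why the approach targets small dimensions.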