Boosting Accelerated Proximal Gradient Method with Adaptive Sampling for Stochastic Composite Optimization (2507.18277v1)
Abstract: We develop an adaptive Nesterov accelerated proximal gradient (adaNAPG) algorithm for stochastic composite optimization problems, boosting the Nesterov accelerated proximal gradient (NAPG) algorithm by integrating an adaptive sampling strategy for gradient estimation. We provide a complexity analysis demonstrating that the new algorithm, adaNAPG, achieves both the optimal iteration complexity and the optimal sample complexity established in the existing literature. Additionally, we establish a central limit theorem for the iteration sequence of adaNAPG, elucidating its convergence rate and efficiency.
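To make the setting concrete, below is a minimal Python sketch (not the authors' code) of an accelerated proximal gradient loop with adaptive mini-batch sizing for a problem of the form min_x E[f(x; xi)] + h(x), with f smooth and h taken here as lam * ||x||_1 so the prox step is soft-thresholding. The variance-based "norm test" used to grow the batch is a standard heuristic assumed for illustration; the paper's precise sampling rule, step sizes, and momentum schedule may differ.

```python
# Sketch of an adaptive-sampling accelerated proximal gradient method.
# Assumptions (not from the paper): L1 regularizer, FISTA-style momentum
# weight (k-1)/(k+2), and a norm-test batch-growth rule with parameter theta.
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ada_napg(grad_sample, x0, L, lam, theta=0.5, n0=8, iters=100, rng=None):
    rng = rng or np.random.default_rng(0)
    x, x_prev = x0.copy(), x0.copy()
    n = n0                                   # current mini-batch size
    for k in range(1, iters + 1):
        beta = (k - 1) / (k + 2)             # Nesterov momentum weight
        y = x + beta * (x - x_prev)          # extrapolation point
        while True:
            g = np.stack([grad_sample(y, rng) for _ in range(n)])
            g_bar = g.mean(axis=0)
            # Adaptive sampling: grow the batch until the estimated variance
            # of the averaged gradient is small relative to its squared norm.
            var = g.var(axis=0).sum() / n
            if var <= theta**2 * np.dot(g_bar, g_bar) or n >= 2**16:
                break
            n *= 2
        x_prev = x
        x = prox_l1(y - g_bar / L, lam / L)  # proximal gradient step at y
    return x

# Toy usage: noisy least squares, f(x) = 0.5 * E||A x - b + noise||^2.
A = np.random.default_rng(1).normal(size=(50, 10))
b = A @ np.ones(10)
grad = lambda x, rng: A.T @ (A @ x - b) + rng.normal(scale=0.1, size=10)
x_hat = ada_napg(grad, np.zeros(10), L=np.linalg.norm(A, 2) ** 2, lam=0.1)
```

The batch size only increases, which mirrors the intuition behind adaptive sampling: early iterates tolerate noisy gradients, while later iterates near the optimum need lower-variance estimates to preserve the accelerated rate.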