Elucidating Subspace Perturbation in Zeroth-Order Optimization: Theory and Practice at Scale (2501.19099v2)
Abstract: Zeroth-order (ZO) optimization has emerged as a promising alternative to gradient-based backpropagation methods, particularly for black-box optimization and LLM fine-tuning. However, ZO methods often suffer from slow convergence due to high-variance stochastic gradient estimators. While subspace perturbations, such as sparsity and low-rank constraints, have been explored to mitigate this issue, their effectiveness remains poorly understood. In this work, we develop a \emph{unified theoretical framework} that analyzes both the convergence and generalization properties of ZO optimization under subspace perturbations. We show that high dimensionality is the primary bottleneck and introduce the notion of \textit{subspace alignment} to explain how subspace perturbations reduce gradient noise and accelerate convergence. Our analysis further shows that a broad class of subspace perturbations exhibits a similar convergence rate, motivating us to prioritize practical considerations in real-world algorithm design. Building on these insights, we propose an efficient ZO method using block coordinate descent (MeZO-BCD), which perturbs and updates only a subset of parameters at each step. Extensive experiments show that MeZO-BCD significantly accelerates optimization, achieving up to a $\mathbf{2.77\times}$ speedup in wall-clock time over MeZO on OPT-13B, while maintaining comparable iteration complexity and fine-tuning performance.
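The abstract describes MeZO-BCD as a zeroth-order method that perturbs and updates only one block of parameters per step. The following is a minimal sketch of that idea, assuming a standard two-point SPSA estimator restricted to the active block; the function name `mezo_bcd_step`, the round-robin block schedule, and the NumPy setting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mezo_bcd_step(params, loss_fn, blocks, block_idx, eps=1e-3, lr=1e-6, rng=None):
    """One block-coordinate ZO step (sketch, not the paper's code):
    perturb only the chosen parameter block, estimate the directional
    derivative with two forward passes (SPSA), and update that block alone."""
    rng = rng or np.random.default_rng()
    idx = blocks[block_idx]                # indices of the active block
    z = rng.standard_normal(idx.size)      # Gaussian perturbation on the block only

    params[idx] += eps * z                 # theta + eps * z
    loss_plus = loss_fn(params)
    params[idx] -= 2 * eps * z             # theta - eps * z
    loss_minus = loss_fn(params)
    params[idx] += eps * z                 # restore theta

    grad_proj = (loss_plus - loss_minus) / (2 * eps)   # scalar projected gradient
    params[idx] -= lr * grad_proj * z      # update only the active block
    return params, 0.5 * (loss_plus + loss_minus)

# Toy usage: minimize a quadratic, cycling over two parameter blocks.
dim = 8
params = np.ones(dim)
blocks = [np.arange(0, 4), np.arange(4, 8)]
loss = lambda p: float(np.sum(p ** 2))
for t in range(1000):
    params, cur_loss = mezo_bcd_step(params, loss, blocks, t % len(blocks), eps=1e-3, lr=1e-2)
```

Because the random perturbation lives only in the active block, each step touches a small slice of the parameters, which is what yields the wall-clock savings the abstract reports relative to full-dimensional MeZO perturbations.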