
Randomised subspace methods for non-convex optimization, with applications to nonlinear least-squares (2211.09873v1)

Published 17 Nov 2022 in math.OC

Abstract: We propose a general random subspace framework for unconstrained nonconvex optimization problems that requires a weak probabilistic assumption on the subspace gradient, which we show to be satisfied by various random matrix ensembles, such as Gaussian and sparse sketching, using Johnson-Lindenstrauss embedding properties. We show that, when safeguarded with trust region or quadratic regularization, this random subspace approach satisfies, with high probability, a complexity bound of order $\mathcal{O}(\epsilon^{-2})$ to drive the (full) gradient below $\epsilon$, matching, in the order of accuracy, the deterministic counterparts of these methods and securing almost sure convergence. Furthermore, no problem-dimension dependence appears explicitly in the projection size of the sketching matrix, allowing the choice of low-dimensional subspaces. We particularise this framework to Random Subspace Gauss-Newton (RS-GN) methods for nonlinear least-squares problems, which only require the calculation of the Jacobian in the subspace and enjoy similar complexity guarantees. Numerical experiments with RS-GN on CUTEst nonlinear least-squares problems are also presented, with some encouraging results.
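To make the RS-GN idea concrete, below is a minimal sketch of one plausible reading of the method: at each iteration a Gaussian sketching matrix $S_k \in \mathbb{R}^{\ell \times d}$ is drawn, the reduced Jacobian $J(x_k) S_k^\top$ is formed, and a quadratically regularised Gauss-Newton step is solved in the $\ell$-dimensional subspace before being lifted back to $\mathbb{R}^d$. All function names (`rs_gn`, `residual`, `jacobian`) and the simple sufficient-decrease safeguard are illustrative assumptions, not the paper's exact update rules; the full Jacobian is formed here only for brevity, whereas the paper stresses that only the subspace Jacobian is needed.

```python
import numpy as np

def rs_gn(residual, jacobian, x0, sketch_dim=5, sigma=1.0,
          max_iter=200, tol=1e-8, seed=None):
    """Illustrative Random Subspace Gauss-Newton sketch.

    Minimises f(x) = 0.5 * ||r(x)||^2 using a quadratically
    regularised Gauss-Newton model restricted to a random
    subspace spanned by the rows of a Gaussian sketch S.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)            # full Jacobian, for simplicity only;
                                   # the method needs just J @ S.T
        g = J.T @ r                # full gradient, used here for the stopping test
        if np.linalg.norm(g) <= tol:
            break
        # Gaussian sketch with variance 1/sketch_dim, so S @ S.T ~ I in expectation
        S = rng.normal(0.0, 1.0 / np.sqrt(sketch_dim), size=(sketch_dim, d))
        Js = J @ S.T               # m x l reduced (subspace) Jacobian
        # Regularised reduced subproblem: min_s 0.5||r + Js s||^2 + 0.5*sigma*||s||^2
        A = Js.T @ Js + sigma * np.eye(sketch_dim)
        s_hat = np.linalg.solve(A, -Js.T @ r)
        step = S.T @ s_hat         # lift the subspace step back to R^d
        # Crude sufficient-decrease test standing in for the paper's
        # trust-region / regularisation safeguard
        r_trial = residual(x + step)
        if r_trial @ r_trial < r @ r:
            x = x + step
            sigma = max(sigma / 2.0, 1e-8)   # successful step: relax regularisation
        else:
            sigma *= 2.0                      # unsuccessful step: tighten it
    return x
```

Note the design point the abstract emphasises: the sketch dimension `sketch_dim` is chosen independently of the ambient dimension $d$, so the per-iteration linear algebra scales with $\ell$, not $d$.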

Citations (22)
