
Attainment of non-asymptotic local minimax bounds in higher dimensions

Determine whether, for constrained stochastic convex optimization problems of the form minimize E_{z∼P_0}[f(x, z)] over a convex set X defined by convex constraints g_i(x) ≤ 0, there exist computationally efficient algorithms in dimensions d > 1 that attain the non-asymptotic local minimax lower bound. In one dimension this bound is known to be attained by a grid-search algorithm, but grid search becomes computationally prohibitive in higher dimensions.
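For concreteness, a compact restatement of the problem in display form; the constraint count m and the feasible-set symbol are notational choices made here, not taken from the excerpt:

\[
  \min_{x \in \mathcal{X}} \; \mathbb{E}_{z \sim P_0}\bigl[f(x, z)\bigr],
  \qquad
  \mathcal{X} = \{\, x \in \mathbb{R}^d : g_i(x) \le 0,\ i = 1, \dots, m \,\},
\]

with f(·, z) and each g_i convex.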


Background

The paper reviews non-asymptotic local minimax lower bounds for constrained stochastic optimization, highlighting a one-dimensional result in which a grid-search algorithm attains these bounds. Grid search, however, becomes computationally infeasible in higher dimensions, and it is not known whether comparable attainment results hold beyond one dimension.

This question asks whether practical algorithms exist that match the non-asymptotic local minimax lower bounds for constrained problems in higher dimensions, retaining instance-dependent optimality without prohibitive computational cost.
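To illustrate why the one-dimensional approach does not scale, the following sketch counts the points in a naive uniform grid of spacing delta over the unit box; the box [0, 1]^d, the spacing, and the dimensions shown are illustrative assumptions, not quantities from the paper.

import math

def grid_size(d: int, delta: float = 1e-3) -> int:
    """Number of points in a uniform grid with spacing `delta` on the box [0, 1]^d."""
    per_axis = math.ceil(1.0 / delta) + 1  # points along one coordinate axis
    return per_axis ** d                   # total count grows exponentially in d

for d in (1, 2, 5, 10):
    print(f"d = {d:2d}: ~{grid_size(d, delta=1e-3):.3e} grid points")

Already at d = 5 such a grid has on the order of 10^15 points, which is why a direct extension of the one-dimensional grid-search attainment result is computationally out of reach.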

References

Unfortunately, in higher dimensions, grid search algorithms become computationally prohibitive. Thus it is not clear whether such results can be extended to higher dimension.

Stochastic Optimization with Constraints: A Non-asymptotic Instance-Dependent Analysis (2404.00042 - Khamaru, 24 Mar 2024) in Subsection "Non-asymptotic bounds"