Adaptive parameter selection for the Optimal Tensor Method

Develop parameter selection and adaptation strategies for the Optimal Tensor Method (the optimal acceleration scheme of Kovalev et al., 2022, and Carmon et al., 2022, implemented here as Algorithm “Optimal”) that improve its empirical efficiency: reducing the number of inner iterations and increasing global progress in regimes where the theoretical parameter settings perform poorly in practice.

Background

In the paper’s experimental comparison of high-order acceleration schemes, the Optimal Tensor Method (referred to there as Optimal Acceleration) showed the weakest practical performance among NATA, Near-Optimal Tensor Acceleration, and the other tested methods.

The authors attribute this underperformance primarily to the internal parameters of the Optimal Tensor Method, which were set to their theoretical values in the implementation. This choice led to many inner-loop iterations with limited global progress, suggesting that practical performance is sensitive to parameter tuning and may benefit from adaptive strategies.
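
To make the open question concrete, the following Python sketch shows one hypothetical adaptive strategy: an outer wrapper that shrinks an inner-loop parameter when a step consumes many inner iterations without global progress, and grows it back toward its theoretical value otherwise. The wrapper `run_with_adaptive_theta`, the callback `step_fn`, and the multiplicative update rule are all illustrative assumptions, not the method of the paper.

```python
import math

def run_with_adaptive_theta(step_fn, x0, theta0, max_outer=50,
                            inner_cap=30, shrink=0.5, grow=2.0, tol=1e-8):
    """Illustrative adaptive schedule for an inner-loop parameter theta.

    Assumes `step_fn(x, theta)` runs one outer iteration of the method
    with inner-loop parameter `theta` and returns a tuple
    (x_next, f_next, inner_iters). Both the interface and the
    multiplicative update rule are hypothetical.
    """
    x, theta = x0, theta0
    f_cur = math.inf
    for _ in range(max_outer):
        x_next, f_next, inner_iters = step_fn(x, theta)
        if inner_iters > inner_cap or f_next >= f_cur:
            # Too many inner iterations or no global progress:
            # move theta away from its theoretical value and retry.
            theta *= shrink
            continue
        if f_cur - f_next < tol:
            # Progress below tolerance: treat as converged.
            return x_next
        # Accept the step; drift theta back toward the theoretical
        # value so the default setting is recovered whenever
        # adaptation is no longer needed.
        theta = min(theta * grow, theta0)
        x, f_cur = x_next, f_next
    return x
```

A strategy of this shape mirrors backtracking line search: it trades a few rejected outer steps for parameter values matched to the observed problem, while the drift back toward `theta0` keeps the theoretical setting as the anchor.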

Consequently, the authors explicitly identify improving these internal parameters as an open question, motivating research into principled, adaptive parameter selection that preserves theoretical guarantees while providing strong empirical performance.

References

We believe the main issue lies in the internal parameters, which need tuning and adaptation, as we used the theoretical parameters in our implementation. This leads to many inner iterations without significant global progress. Improving these parameters presents an open question for future research.

OPTAMI: Global Superlinear Convergence of High-order Methods (arXiv:2410.04083, Kamzolov et al., 5 Oct 2024), Section: Computational Comparison of Acceleration Methods.