
Heteroscedastic Bayesian Optimisation for Stochastic Model Predictive Control (2010.00202v2)

Published 1 Oct 2020 in cs.LG, cs.RO, and stat.ML

Abstract: Model predictive control (MPC) has been successful in applications involving the control of complex physical systems. This class of controllers leverages the information provided by an approximate model of the system's dynamics to simulate the effect of control actions. MPC methods also expose several hyper-parameters whose tuning can be relatively expensive, as it demands interactions with the physical system. Therefore, we investigate fine-tuning MPC methods in the context of stochastic MPC, which presents extra challenges due to the randomness of the controller's actions. In these scenarios, performance outcomes are noisy, and the noise is not homogeneous across the domain of possible hyper-parameter settings but varies in an input-dependent way. To address these issues, we propose a Bayesian optimisation framework that accounts for heteroscedastic noise to tune hyper-parameters in control problems. Empirical results on benchmark continuous control tasks and a physical robot support the proposed framework's suitability relative to baselines that do not take heteroscedasticity into account.
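
The abstract describes tuning stochastic-MPC hyper-parameters with a Bayesian optimisation loop whose surrogate accounts for input-dependent (heteroscedastic) noise. The sketch below is a minimal illustration of that idea, not the authors' implementation: it approximates the heteroscedastic surrogate by fitting a second Gaussian process to log squared residuals and feeding the predicted per-point noise variances back into the objective GP (a "most likely heteroscedastic GP"-style scheme). The noisy closed-loop cost `mpc_cost` and the 1-D hyper-parameter grid are toy stand-ins, assumed for readability.

```python
# Illustrative sketch of Bayesian optimisation with an input-dependent
# (heteroscedastic) noise model. All names below are assumptions, not the
# paper's code: `mpc_cost` is a toy stand-in for a noisy MPC rollout.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def mpc_cost(x):
    """Stand-in for a noisy closed-loop MPC evaluation: the noise level
    depends on the hyper-parameter value (heteroscedastic)."""
    noise_std = 0.05 + 0.5 * np.abs(np.sin(3 * x))
    return np.sin(5 * x) + x ** 2 + noise_std * rng.standard_normal()

def expected_improvement(mu, sigma, best):
    """Standard expected improvement for minimisation."""
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Candidate hyper-parameter values (1-D for readability).
candidates = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)

# Initial design.
X = rng.uniform(-1.0, 1.0, size=(5, 1))
y = np.array([mpc_cost(x.item()) for x in X])

kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)

for _ in range(20):
    # 1) Fit a homoscedastic GP to obtain residuals.
    gp0 = GaussianProcessRegressor(kernel=kernel, alpha=1e-3, normalize_y=True)
    gp0.fit(X, y)
    resid2 = (y - gp0.predict(X)) ** 2

    # 2) Fit a second GP to log squared residuals -> input-dependent noise.
    gp_noise = GaussianProcessRegressor(kernel=kernel, alpha=1e-3, normalize_y=True)
    gp_noise.fit(X, np.log(resid2 + 1e-6))
    noise_var = np.exp(gp_noise.predict(X))

    # 3) Refit the objective GP with per-point noise variances (alpha).
    gp = GaussianProcessRegressor(kernel=kernel, alpha=noise_var, normalize_y=True)
    gp.fit(X, y)

    # 4) Maximise EI over the candidate set and evaluate the controller there.
    mu, sigma = gp.predict(candidates, return_std=True)
    ei = expected_improvement(mu, sigma, best=y.min())
    x_next = candidates[int(np.argmax(ei))]

    X = np.vstack([X, x_next])
    y = np.append(y, mpc_cost(x_next.item()))

print("best hyper-parameter:", X[np.argmin(y)].item(), "cost:", y.min())
```

Treating the predicted noise variances as per-observation `alpha` values is one simple way to let the surrogate trust repeated evaluations less in noisy regions of the hyper-parameter space; the paper's own model and acquisition may differ.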

Citations (13)
