DiffTune$^+$: Hyperparameter-Free Auto-Tuning using Auto-Differentiation (2212.03194v2)

Published 6 Dec 2022 in cs.RO

Abstract: Controller tuning is a vital step in ensuring that a controller delivers its designed performance. DiffTune has been proposed as an automatic tuning method that unrolls the dynamical system and controller into a computational graph and uses auto-differentiation to obtain the gradient for the controller's parameter update. However, DiffTune uses vanilla gradient descent to iteratively update the parameters, so its performance depends heavily on the choice of learning rate (a hyperparameter). In this paper, we propose hyperparameter-free methods for updating the controller parameters. We find the optimal parameter update by maximizing the loss reduction, where a predicted loss based on the approximated state and control is used in the maximization. Two methods for optimally updating the parameters are proposed and compared with related variants in simulations on a Dubins car and a quadrotor. Simulation experiments show that the proposed first-order method outperforms the hyperparameter-based methods and is more robust than the second-order hyperparameter-free methods.
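The core idea in the abstract can be illustrated on a toy scalar system: unroll the closed loop into a computational graph, differentiate the tracking loss with respect to the controller gain through the unroll, and choose the step size by checking actual loss reduction rather than fixing a learning rate. The sketch below is illustrative only, with an assumed system `x_{k+1} = x_k + dt * u_k` and controller `u_k = -theta * x_k`; all names are hypothetical, and the line search stands in for (not reproduces) the paper's predicted-loss maximization.

```python
# Toy DiffTune-style sketch (illustrative, not the paper's algorithm):
# plant x_{k+1} = x_k + dt * u_k with controller u_k = -theta * x_k.
# The chain-rule loop below plays the role of auto-differentiation
# through the unrolled closed loop.

def rollout_loss(theta, x0=1.0, dt=0.1, steps=50):
    """Unrolled closed-loop loss L(theta) = sum_k x_k^2."""
    x, loss = x0, 0.0
    for _ in range(steps):
        loss += x * x
        x = x + dt * (-theta * x)   # x_{k+1} = (1 - dt*theta) * x_k
    return loss

def rollout_grad(theta, x0=1.0, dt=0.1, steps=50):
    """dL/dtheta via the chain rule through the unrolled graph."""
    a = 1.0 - dt * theta            # per-step closed-loop factor
    x, dx = x0, 0.0                 # dx tracks d x_k / d theta
    grad = 0.0
    for _ in range(steps):
        grad += 2.0 * x * dx
        x, dx = a * x, a * dx - dt * x  # product rule on x_{k+1} = a*x_k
    return grad

def tune(theta0, iters=30):
    """Hyperparameter-free flavour: instead of a fixed learning rate,
    backtrack the step until the loss actually decreases."""
    theta = theta0
    for _ in range(iters):
        g = rollout_grad(theta)
        step, base = 1.0, rollout_loss(theta)
        while rollout_loss(theta - step * g) > base and step > 1e-8:
            step *= 0.5
        theta -= step * g
    return theta

theta_star = tune(0.5)
```

Because the step size is chosen by the loss itself, the first update can jump from `theta = 0.5` to near the optimal gain without any learning-rate tuning, which is the qualitative benefit the abstract claims for the hyperparameter-free update.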

Authors (5)
  1. Sheng Cheng (40 papers)
  2. Lin Song (44 papers)
  3. Minkyung Kim (16 papers)
  4. Shenlong Wang (70 papers)
  5. Naira Hovakimyan (114 papers)
Citations (8)
