
Data-Driven Analytic Differentiation via High Gain Observers and Gaussian Process Priors (2210.15528v1)

Published 27 Oct 2022 in eess.SY and cs.SY

Abstract: This paper tackles the problem of modeling an unknown function, and its first $r-1$ derivatives, from scattered, poor-quality data. The considered setting covers a wide range of use cases addressed in the literature and fits especially well the context of control barrier functions, where high-order derivatives of the safe set are required to preserve the safety of the controlled system. The approach builds on a cascade of high-gain observers and a set of Gaussian process regressors trained on the observers' data. The proposed structure provides high robustness against measurement noise and flexibility with respect to the employed sampling law. Unlike previous approaches in the field, which require a large number of samples to correctly fit the unknown function's derivatives, here we assume access only to a small window of samples that slides in time. The paper presents performance bounds on the attained regression error and numerical simulations showing how the proposed method outperforms previous approaches.
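The abstract describes a pipeline in which a high-gain observer reconstructs a signal and its derivatives from noisy samples, and Gaussian process regressors are then fitted on a sliding window of the observer's states. The following is a minimal sketch of that idea, not the authors' implementation: the test signal, the observer order, the gains, the high-gain parameter `eps`, the window length, and the use of scikit-learn's `GaussianProcessRegressor` are all illustrative assumptions.

```python
# Sketch: high-gain observer for derivative estimation from noisy samples,
# followed by GP regressors fitted on a sliding window of observer states.
# All numerical choices below are assumptions made for illustration only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Noisy measurements of an unknown function, here assumed h(t) = sin(t)
dt, T = 1e-3, 10.0
t = np.arange(0.0, T, dt)
y = np.sin(t) + 0.01 * rng.standard_normal(t.size)

# High-gain observer of order r = 3: its states estimate h, h', h''
r = 3
eps = 0.1                       # high-gain parameter (assumed)
k = np.array([3.0, 3.0, 1.0])   # gains placing the error dynamics' poles at -1 (assumed)
z = np.zeros(r)
Z = np.zeros((t.size, r))
for i, yi in enumerate(y):
    e = yi - z[0]                                # output injection error
    dz = np.roll(z, -1)                          # chain-of-integrators part
    dz[-1] = 0.0
    dz += k / eps ** np.arange(1, r + 1) * e     # high-gain innovation term
    z = z + dt * dz                              # forward-Euler integration
    Z[i] = z

# GP regressors trained on a small sliding window of observer data (last 200 samples)
window = 200
t_win, Z_win = t[-window:, None], Z[-window:]
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-4)
gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t_win, Z_win[:, j])
       for j in range(r)]

# Query the fitted regressors: smoothed estimates of h, h', h'' on new time points
t_query = np.linspace(t[-window], T, 50)[:, None]
estimates = np.column_stack([gp.predict(t_query) for gp in gps])
print(estimates[:3])
```

In this reading, the observer supplies derivative surrogates despite measurement noise, while the GPs trained on the short sliding window provide an analytic, differentiable fit with the robustness and sampling-law flexibility the abstract refers to.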
