
Integral Concurrent Learning: Adaptive Control with Parameter Convergence without PE or State Derivatives (1512.03464v1)

Published 10 Dec 2015 in cs.SY

Abstract: Concurrent learning is a recently developed adaptive update scheme that can be used to guarantee parameter convergence without requiring persistent excitation. However, this technique requires knowledge of state derivatives, which are usually not directly sensed and therefore must be estimated. A novel integral concurrent learning method is developed in this paper that removes the need to estimate state derivatives while maintaining parameter convergence properties. A Monte Carlo simulation illustrates improved robustness to noise compared to the traditional derivative formulation.
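
A minimal sketch of the idea described in the abstract, using standard concurrent-learning notation that is assumed here rather than quoted from the paper: for uncertain dynamics linearly parameterized as $\dot{x}(t) = f(x(t),u(t)) + Y(x(t),u(t))\,\theta$, integrating both sides over a sliding window $[t-\Delta t,\, t]$ gives

$x(t) - x(t-\Delta t) - \int_{t-\Delta t}^{t} f(x(\tau),u(\tau))\,d\tau \;=\; \Big(\int_{t-\Delta t}^{t} Y(x(\tau),u(\tau))\,d\tau\Big)\,\theta,$

so the regressor-measurement pairs used to populate the concurrent-learning history stack can be formed from measured states and inputs alone, removing the need to estimate the state derivative $\dot{x}$.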

Citations (123)
