
A Convex Optimization Approach to Learning Koopman Operators (2102.03934v1)

Published 7 Feb 2021 in eess.SY, cs.SY, math.DS, and math.OC

Abstract: Koopman operators provide tractable means of learning linear approximations of non-linear dynamics. Many approaches have been proposed to find these operators, typically based upon approximations using an a priori fixed class of models. However, choosing appropriate models and bounding the approximation error is far from trivial. Motivated by these difficulties, in this paper we propose an optimization-based approach to learning Koopman operators from data. Our results show that the Koopman operator, the associated Hilbert space of observables and a suitable dictionary can be obtained by solving two rank-constrained semi-definite programs (SDPs). While in principle these problems are NP-hard, the use of standard relaxations of rank leads to convex SDPs. Further, these SDPs exhibit chordal sparsity, leading to algorithms that scale linearly with the number of data points.
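
As a rough illustration of the kind of problem involved (not the paper's actual formulation, which solves two rank-constrained SDPs and also recovers the Hilbert space of observables and the dictionary), the sketch below fits a finite-dimensional Koopman approximation from snapshot data, with a nuclear-norm penalty as the standard convex surrogate for a rank constraint. The dictionary psi, the weight lam, and the function name are illustrative assumptions, not part of the paper.

# Minimal sketch (assumptions as noted above): EDMD-style least-squares fit of a
# lifted linear model, regularized by the nuclear norm as a convex surrogate for rank.
import numpy as np
import cvxpy as cp

def fit_koopman_nuclear_norm(X, Y, psi, lam=0.1):
    """X, Y: (n_samples, n_state) snapshot pairs with Y[i] = f(X[i]).
    psi: maps an (n_samples, n_state) array to an (n_samples, n_features)
    array of dictionary evaluations."""
    PX, PY = psi(X), psi(Y)                      # lifted snapshot matrices
    K = cp.Variable((PX.shape[1], PX.shape[1]))  # finite Koopman approximation
    objective = cp.Minimize(cp.sum_squares(PX @ K - PY) + lam * cp.normNuc(K))
    cp.Problem(objective).solve()
    return K.value

# Toy usage: a stable linear system lifted with a simple monomial dictionary.
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
X = rng.standard_normal((200, 2))
Y = X @ A.T + 0.01 * rng.standard_normal((200, 2))
psi = lambda Z: np.hstack([Z, Z**2])
K_hat = fit_koopman_nuclear_norm(X, Y, psi)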

Citations (6)

Authors (1)