Perspectives on Contractivity in Control, Optimization, and Learning (2404.11707v2)

Published 17 Apr 2024 in eess.SY, cs.SY, and math.OC

Abstract: Contraction theory is a mathematical framework for studying the convergence, robustness, and modularity properties of dynamical systems and algorithms. In this opinion paper, we provide five main opinions on the virtues of contraction theory. These opinions are (i) contraction theory is a unifying framework emerging from classical and modern works, (ii) contractivity is computationally-friendly, robust, and modular stability, (iii) numerous dynamical systems are contracting, (iv) contraction theory is relevant to modern applications, and (v) contraction theory can be vastly extended in numerous directions. We survey recent theoretical and applied research in each of these five directions.
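
For readers unfamiliar with the framework, the core contraction condition can be stated in one line. The following is a standard formulation from the contraction literature (stated here as background, not as a result specific to this paper), using the matrix measure (logarithmic norm) induced by a chosen vector norm:

\[
\dot{x} = f(x,t), \qquad
\mu\!\left(\frac{\partial f}{\partial x}(x,t)\right) \le -c < 0 \quad \text{for all } x,\, t
\]
\[
\implies \quad \|x(t) - y(t)\| \le e^{-ct}\,\|x(0) - y(0)\| \quad \text{for any two solutions } x(\cdot),\, y(\cdot),
\]
where \(\mu(\cdot)\) denotes the matrix measure associated with the norm \(\|\cdot\|\). In words: if the Jacobian of the vector field is uniformly contracting in some norm, every pair of trajectories converges to each other exponentially, which underpins the robustness and modularity properties surveyed in the paper.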
