
Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs (2310.02619v2)

Published 4 Oct 2023 in cs.LG

Abstract: Generating realistic time series data is important for many engineering and scientific applications. Existing work tackles this problem using generative adversarial networks (GANs). However, GANs are unstable during training, and they can suffer from mode collapse. While variational autoencoders (VAEs) are known to be more robust to these issues, they are (surprisingly) less considered for time series generation. In this work, we introduce Koopman VAE (KoVAE), a new generative framework that is based on a novel design for the model prior, and that can be optimized for either regular or irregular training data. Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map. Our approach enhances generative modeling with two desired features: (i) incorporating domain knowledge can be achieved by leveraging spectral tools that prescribe constraints on the eigenvalues of the linear map; and (ii) studying the qualitative behavior and stability of the system can be performed using tools from dynamical systems theory. Our results show that KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks. Whether trained on regular or irregular data, KoVAE generates time series that improve both discriminative and predictive metrics. We also present visual evidence suggesting that KoVAE learns probability density functions that better approximate the empirical ground truth distribution.
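The abstract's central mechanism, a learned linear map serving as the latent conditional prior whose eigenvalues can be constrained with spectral tools, lends itself to a short sketch. Below is a minimal, hypothetical PyTorch rendering, not the authors' implementation: the class name `KoopmanPrior`, the unit-disk eigenvalue penalty, and the per-step KL computation are illustrative assumptions only.

```python
# Minimal, illustrative sketch of a Koopman-style VAE prior (not the paper's code).
# Assumption: the prior evolves latents z_t with a learned linear map A, and a
# spectral penalty keeps the eigenvalues of A inside the unit disk (stability).
import torch
import torch.nn as nn


class KoopmanPrior(nn.Module):
    """p(z_{t+1} | z_t) = N(A z_t, diag(sigma^2)) with a linear Koopman map A."""

    def __init__(self, k: int):
        super().__init__()
        # Initialize A near the identity so early dynamics are roughly stable.
        self.A = nn.Parameter(torch.eye(k) + 0.01 * torch.randn(k, k))
        self.log_sigma = nn.Parameter(torch.zeros(k))

    def step(self, z_t: torch.Tensor) -> torch.distributions.Normal:
        mean = z_t @ self.A.T
        return torch.distributions.Normal(mean, self.log_sigma.exp())

    def spectral_penalty(self) -> torch.Tensor:
        # Penalize eigenvalue moduli above 1; other domain-motivated constraints
        # on the spectrum could be prescribed the same way.
        eigvals = torch.linalg.eigvals(self.A)
        return torch.relu(eigvals.abs() - 1.0).sum()


def prior_kl(prior: KoopmanPrior, z_mean: torch.Tensor, z_std: torch.Tensor) -> torch.Tensor:
    """KL between per-step posterior latents (T, k) and the linear prior."""
    # Simplification: conditions the prior on the posterior mean at t-1;
    # in practice one would condition on a sampled z_{t-1}.
    kl = torch.tensor(0.0)
    for t in range(1, z_mean.shape[0]):
        q_t = torch.distributions.Normal(z_mean[t], z_std[t])
        p_t = prior.step(z_mean[t - 1])
        kl = kl + torch.distributions.kl_divergence(q_t, p_t).sum()
    return kl
```

In training, this KL term would be combined with a reconstruction loss and a weighted `spectral_penalty()` term. The paper's feature (i) corresponds to prescribing such eigenvalue constraints, and feature (ii) to analyzing the learned linear map with standard dynamical-systems tools.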

Authors (5)
  1. Ilan Naiman
  2. N. Benjamin Erichson
  3. Pu Ren
  4. Michael W. Mahoney
  5. Omri Azencot
Citations (11)
