DualDynamics: Synergizing Implicit and Explicit Methods for Robust Irregular Time Series Analysis (2401.04979v3)
Abstract: Real-world time series analysis faces significant challenges when dealing with irregular and incomplete data. While Neural Differential Equation (NDE) based methods have shown promise, they struggle with limited expressiveness, scalability issues, and stability concerns. Conversely, Neural Flows offer stability but falter with irregular data. We introduce 'DualDynamics', a novel framework that synergistically combines NDE-based and Neural Flow-based methods. This approach enhances expressive power while balancing computational demands, addressing critical limitations of existing techniques. We demonstrate DualDynamics' effectiveness across diverse tasks: classification under dataset shift, irregularly-sampled series analysis, interpolation of missing data, and forecasting with partial observations. Our results show consistent outperformance over state-of-the-art methods, indicating DualDynamics' potential to advance irregular time series analysis significantly.
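The abstract contrasts two update styles: NDE-based methods, which numerically integrate a learned vector field, and Neural Flows, which directly parameterize the solution map in closed form. The toy sketch below illustrates that distinction and one way the two updates could be blended; it is not the paper's implementation, and all function names, weights, and the blending scheme (`dual_step` with a mixing weight `alpha`) are illustrative assumptions.

```python
import math

def vector_field(h):
    # Toy learned-looking dynamics dh/dt = tanh(a*h + b); the
    # constants stand in for trained parameters (assumption).
    a, b = 0.5, 0.1
    return math.tanh(a * h + b)

def ode_step(h, dt, n_substeps=10):
    # NDE-style update: explicitly integrate the vector field over dt
    # with Euler substeps (a real system would use an adaptive solver).
    sub = dt / n_substeps
    for _ in range(n_substeps):
        h = h + sub * vector_field(h)
    return h

def flow_step(h, dt):
    # Neural-Flow-style update: parameterize the solution map F(h, t)
    # directly, with F(h, 0) = h by construction, so no solver is needed.
    return h + dt * math.tanh(0.5 * h + 0.1)

def dual_step(h, dt, alpha=0.5):
    # One hypothetical way to combine the two updates: a convex blend.
    return alpha * ode_step(h, dt) + (1 - alpha) * flow_step(h, dt)

state = dual_step(1.0, 0.1)
```

The key trade-off sketched here is that `ode_step` pays a solver cost per observation gap `dt` but can represent richer dynamics, while `flow_step` evaluates in constant time but constrains the family of solution maps.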
- YongKyung Oh
- Dongyoung Lim
- Sungil Kim