MADS: Modulated Auto-Decoding SIREN for time series imputation (2307.00868v1)
Abstract: Time series imputation remains a significant challenge across many fields due to the wide variability in the types of data being modelled. Traditional imputation methods often impose strong assumptions on the underlying data generation process, limiting their applicability, and researchers have recently begun to investigate the potential of deep learning for this task, inspired by the strong performance these models have shown in both classification and regression problems across a range of applications. In this work we propose MADS, a novel auto-decoding framework for time series imputation built upon implicit neural representations. Our method leverages the capabilities of SIRENs for high-fidelity reconstruction of signals and irregular data, and combines these with a hypernetwork architecture that allows us to generalise by learning a prior over the space of time series. We evaluate our model on two real-world datasets and show that it outperforms state-of-the-art methods for time series imputation. On the human activity dataset it improves imputation performance by at least 40%, while on the air quality dataset it is competitive across all metrics. When evaluated on synthetic data, our model achieves the best average rank across different dataset configurations over all baselines.
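To make the abstract's description of the architecture concrete, the following is a minimal, illustrative PyTorch sketch of the general idea: a SIREN (sine-activated MLP over time coordinates) whose layers are modulated by a hypernetwork conditioned on a per-series latent code, fitted by auto-decoding on the observed samples and then queried at missing time stamps. All names (`SirenLayer`, `ModulatedSiren`), the shift-only modulation, and the hyperparameters are assumptions for illustration, not the paper's exact design.

```python
import torch
import torch.nn as nn

class SirenLayer(nn.Module):
    """One SIREN layer: sin(omega_0 * (Wx + b)), with an optional additive shift."""
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x, shift=None):
        pre = self.linear(x)
        if shift is not None:          # shift predicted per series by the hypernetwork (assumed modulation scheme)
            pre = pre + shift
        return torch.sin(self.omega_0 * pre)

class ModulatedSiren(nn.Module):
    """Maps a time coordinate t to signal values; a hypernetwork turns a per-series
    latent code z into one modulation vector per SIREN layer."""
    def __init__(self, latent_dim=64, hidden_dim=128, n_layers=3, out_dim=1):
        super().__init__()
        self.layers = nn.ModuleList(
            [SirenLayer(1, hidden_dim)] +
            [SirenLayer(hidden_dim, hidden_dim) for _ in range(n_layers - 1)]
        )
        self.out = nn.Linear(hidden_dim, out_dim)
        self.hyper = nn.Linear(latent_dim, n_layers * hidden_dim)  # hypernetwork (simplified to one linear map)
        self.hidden_dim, self.n_layers = hidden_dim, n_layers

    def forward(self, t, z):
        # t: (batch, n_points, 1) time stamps; z: (batch, latent_dim) per-series code
        shifts = self.hyper(z).view(-1, self.n_layers, self.hidden_dim)
        h = t
        for i, layer in enumerate(self.layers):
            h = layer(h, shift=shifts[:, i].unsqueeze(1))
        return self.out(h)

# Auto-decoding imputation sketch: optimise only the latent code on observed points,
# then query the fitted representation at unobserved time stamps.
model = ModulatedSiren()
z = torch.zeros(1, 64, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
t_obs = torch.rand(1, 50, 1)                  # observed time stamps in [0, 1]
y_obs = torch.sin(6.28 * t_obs)               # toy observed values
for _ in range(200):
    opt.zero_grad()
    loss = ((model(t_obs, z) - y_obs) ** 2).mean()
    loss.backward()
    opt.step()
y_missing = model(torch.rand(1, 10, 1), z)     # imputed values at missing time stamps
```

Because the representation is a continuous function of time, irregularly sampled observations need no gridding or resampling, which is the property the abstract highlights for handling irregular data.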