
Retrieval-Augmented Diffusion Models for Time Series Forecasting (2410.18712v1)

Published 24 Oct 2024 in cs.LG

Abstract: While time series diffusion models have received considerable focus from many recent works, the performance of existing models remains highly unstable. Factors limiting time series diffusion models include insufficient time series datasets and the absence of guidance. To address these limitations, we propose a Retrieval-Augmented Time series Diffusion model (RATD). The framework of RATD consists of two parts: an embedding-based retrieval process and a reference-guided diffusion model. In the first part, RATD retrieves the time series that are most relevant to historical time series from the database as references. The references are utilized to guide the denoising process in the second part. Our approach allows leveraging meaningful samples within the database to aid in sampling, thus maximizing the utilization of datasets. Meanwhile, this reference-guided mechanism also compensates for the deficiencies of existing time series diffusion models in terms of guidance. Experiments and visualizations on multiple datasets demonstrate the effectiveness of our approach, particularly in complicated prediction tasks.
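The first stage of RATD, embedding-based retrieval, can be illustrated with a minimal sketch. This is not the paper's implementation: the real encoder is learned, whereas here a mean-pooled placeholder embedding (`embed`) and a cosine-similarity nearest-neighbor search stand in to show the overall shape of the retrieval step.

```python
import numpy as np

def embed(series: np.ndarray) -> np.ndarray:
    # Placeholder embedding: mean-pool over the time axis.
    # RATD uses a learned encoder; any fixed-size embedding fits this interface.
    return series.mean(axis=0)

def retrieve_references(history: np.ndarray,
                        database: list[np.ndarray],
                        k: int = 3) -> list[np.ndarray]:
    """Return the k database series whose embeddings are most
    cosine-similar to the embedding of the historical series."""
    q = embed(history)

    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

    scores = [cos(q, embed(s)) for s in database]
    top = np.argsort(scores)[::-1][:k]
    return [database[i] for i in top]
```

In the second stage, the retrieved references condition the denoising network, for example by being encoded and injected alongside the noisy target at each reverse-diffusion step; the guidance mechanism itself is specific to the paper's architecture and is not reproduced here.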

