A decoder-only foundation model for time-series forecasting (2310.10688v4)
Published 14 Oct 2023 in cs.CL, cs.AI, and cs.LG
Abstract: Motivated by recent advances in LLMs for NLP, we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder-style attention model on a large time-series corpus, and it works well across different forecasting history lengths, prediction lengths, and temporal granularities.
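The two ingredients the abstract names are patching (slicing the input history into fixed-length patches that act as "tokens") and decoder-only, i.e. causal, attention over those patches. A minimal numpy sketch of both ideas follows; the patch length, the use of raw patches as embeddings, and the single-head attention are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def patchify(series, patch_len):
    """Split a 1-D series into non-overlapping patches (illustrative tokenization)."""
    n = len(series) // patch_len * patch_len  # drop any trailing remainder
    return series[:n].reshape(-1, patch_len)

def causal_attention(x):
    """Toy single-head causal self-attention over patch 'embeddings'.

    Each patch may attend only to itself and earlier patches, which is the
    decoder-only property that lets the model be trained autoregressively.
    """
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)  # True above the diagonal
    scores[mask] = -np.inf                            # block attention to the future
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

series = np.arange(32, dtype=float)
patches = patchify(series, patch_len=8)  # shape (4, 8): 4 patch-tokens of length 8
out = causal_attention(patches)          # shape (4, 8); row 0 sees only patch 0
```

Because the first patch can attend only to itself, its attention output is exactly itself; later rows mix in earlier patches. In the real model each patch would be projected through an MLP into an embedding before stacked attention layers, but the causal masking shown here is what makes one pretrained model usable across different history and prediction lengths.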
Authors: Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou