
Context Matters: Leveraging Contextual Features for Time Series Forecasting (2410.12672v4)

Published 16 Oct 2024 in cs.LG and cs.AI

Abstract: Time series forecasts are often influenced by exogenous contextual features in addition to their corresponding history. For example, in financial settings, it is hard to accurately predict a stock price without considering public sentiments and policy decisions in the form of news articles, tweets, etc. Though this is common knowledge, the current state-of-the-art (SOTA) forecasting models fail to incorporate such contextual information, owing to its heterogeneity and multimodal nature. To address this, we introduce ContextFormer, a novel plug-and-play method to surgically integrate multimodal contextual information into existing pre-trained forecasting models. ContextFormer effectively distills forecast-specific information from rich multimodal contexts, including categorical, continuous, time-varying, and even textual information, to significantly enhance the performance of existing base forecasters. ContextFormer outperforms SOTA forecasting models by up to 30% on a range of real-world datasets spanning energy, traffic, environmental, and financial domains.

Understanding ICLR 2025 Conference Submission Formatting

The paper “Formatting Instructions for ICLR 2025 Conference Submissions” serves as a comprehensive guide for preparing manuscripts for the International Conference on Learning Representations (ICLR) 2025. The document lays out explicit formatting guidelines to ensure uniform, streamlined submissions.

Key Components

  1. Submission Process: The paper emphasizes electronic submission via OpenReview, in line with ICLR’s existing practice. The final, camera-ready version is produced by enabling the \iclrfinalcopy command in the LaTeX source.
  2. Formatting Style: The structure follows the NeurIPS formatting tradition, requiring the conference’s specific LaTeX style files (iclr2025_conference.sty and iclr2025_conference.bst). The meticulous specifications cover page size, margins, font type, and spacing, giving submissions a standardized appearance; a minimal preamble sketch follows this list.
  3. General Formatting: Precise details are provided, such as a 1.5-inch left margin, 10-point Times New Roman type, and specific paragraph spacing. These conventions ensure clarity and readability.
  4. Headings and Sections: The document delineates a detailed hierarchy of headings, from first to third level, with specific instructions on capitalization, alignment, and spacing, which aids in maintaining a logical structure and flow across papers.
  5. Figures and Tables: The guidelines for figures and tables—centered, legible, properly numbered, and captioned—aid in effective data representation. Such attention to detail reduces ambiguities in visual presentation.
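
To make the setup concrete, the sketch below shows a minimal document skeleton in the spirit of these guidelines. It is an illustration, not the official template: the authoritative skeleton ships alongside iclr2025_conference.sty, and details may differ. The \iclrfinalcopy command is left commented out, as required for anonymous review.

```latex
\documentclass{article}                  % ICLR builds on the plain article class
\usepackage{iclr2025_conference,times}   % conference style file plus Times typeface
\usepackage{url}                         % for typesetting URLs in the bibliography

% Uncomment for the camera-ready (de-anonymized) version only:
% \iclrfinalcopy

\title{Formatting Instructions for ICLR 2025 Conference Submissions}
\author{Anonymous authors}               % stays anonymous until \iclrfinalcopy is enabled

\begin{document}
\maketitle

\section{Introduction}                   % first-level heading; the style sets the case
\subsection{Background}                  % second-level heading

Body text goes here in 10-point type.

\bibliography{iclr2025_conference}       % your .bib database
\bibliographystyle{iclr2025_conference}  % uses the conference .bst file

\end{document}
```

Because the style file hard-codes the prescribed dimensions and typography, compiling this skeleton yields the standardized layout automatically; authors never set margins or fonts by hand, which is exactly the uniformity argument the paper makes.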

Practical Implications

  • Uniformity and Accessibility: By following these detailed formatting instructions, authors contribute to a cohesive compendium of conference papers, facilitating easier navigation and comprehension for reviewers and readers.
  • Time Efficiency for Reviewers: Standardized formatting reduces cognitive load for reviewers, enabling them to focus on content rather than deciphering inconsistent formats.

Theoretical Considerations

From a theoretical standpoint, the document serves as a protocol for scholarly communication, emphasizing the importance of clarity and consistency in disseminating scientific knowledge.

Future Directions

While the document serves its purpose within the confines of the ICLR 2025 conference, ongoing developments in LaTeX capabilities and digital publication standards will likely influence future iterations of formatting guidelines. Researchers must remain abreast of these changes to maintain compliance and enhance the presentation of their scientific contributions.

This paper primarily fulfills a procedural role, yet its meticulous attention to detail underscores an essential aspect of academic dissemination. Researchers preparing submissions can significantly benefit from adhering to these guidelines, ensuring a professional and polished presentation of their work to the AI research community.
