CLeaRForecast: Contrastive Learning of High-Purity Representations for Time Series Forecasting (2312.05758v1)
Abstract: Time series forecasting (TSF) is of significant importance in modern society, spanning numerous domains. Previous representation-learning-based TSF algorithms typically adopt a contrastive learning paradigm with segregated trend and periodicity representations. However, these methods disregard the high-impact noise inherent in time series data, which distorts the learned representations and seriously degrades forecasting performance. To address this issue, we propose CLeaRForecast, a novel contrastive learning framework that learns high-purity time series representations through proposed sample, feature, and architecture purifying methods. Specifically, to avoid introducing additional noise when transforming the original samples (series), transformations are applied separately to the trend and periodic components, producing positive samples with markedly less noise. Moreover, we introduce a channel-independent training strategy to mitigate noise originating from unrelated variables in multivariate series. Finally, by employing a streamlined deep-learning backbone and a comprehensive global contrastive loss function, we prevent noise introduced by redundant or uneven learning of periodicity and trend. Experimental results show the superior performance of CLeaRForecast on various downstream TSF tasks.
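The idea of purifying positive samples by augmenting the trend and periodic parts separately, rather than transforming the raw series, can be illustrated with a minimal sketch. This assumes a simple moving-average decomposition and placeholder augmentations (mild trend scaling, a phase shift of the periodic part); the paper's actual decomposition and transformations may differ.

```python
import numpy as np

def decompose(x, period):
    """Split a univariate series into trend and periodic parts via a moving
    average (a stand-in for the paper's decomposition; STL/MSTL also fit here)."""
    kernel = np.ones(period) / period
    trend = np.convolve(x, kernel, mode="same")
    periodic = x - trend
    return trend, periodic

def purified_positive(x, period, rng):
    """Build a positive sample by augmenting trend and periodic components
    separately, then recombining (illustrative augmentations only)."""
    trend, periodic = decompose(x, period)
    trend_aug = trend * rng.uniform(0.9, 1.1)   # mild amplitude scaling of the trend
    shift = rng.integers(0, period)
    periodic_aug = np.roll(periodic, shift)     # phase shift of the periodic part
    return trend_aug + periodic_aug

rng = np.random.default_rng(0)
t = np.arange(256)
x = 0.05 * t + np.sin(2 * np.pi * t / 24)       # synthetic series: trend + daily cycle
pos = purified_positive(x, period=24, rng=rng)
print(x.shape, pos.shape)
```

Because each component is perturbed within its own regime, the positive sample preserves the series' overall trend and periodic structure instead of corrupting both at once, which is what the abstract means by "better positive samples with obviously less noise".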
Authors: Jiaxin Gao, Yuxiao Hu, Qinglong Cao, Siqi Dai, Yuntian Chen