
Simple Contrastive Representation Learning for Time Series Forecasting (2303.18205v2)

Published 31 Mar 2023 in cs.LG

Abstract: Contrastive learning methods have shown an impressive ability to learn meaningful representations for image or time series classification. However, these methods are less effective for time series forecasting, as optimization of instance discrimination is not directly applicable to predicting the future state from the historical context. To address these limitations, we propose SimTS, a simple representation learning approach for improving time series forecasting by learning to predict the future from the past in the latent space. SimTS exclusively uses positive pairs and does not depend on negative pairs or specific characteristics of a given time series. In addition, we show the shortcomings of the current contrastive learning framework used for time series forecasting through a detailed ablation study. Overall, our work suggests that SimTS is a promising alternative to other contrastive learning approaches for time series forecasting.
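To make the core idea concrete, below is a minimal sketch of positive-pair-only latent forecasting in the spirit of SimTS: a shared encoder embeds the history and the future of a series, and a predictor is trained to map the history latent onto the future latent. The encoder/predictor architectures, the stop-gradient target, and all hyperparameters here are illustrative assumptions, not the authors' exact design.

```python
# Hedged sketch: predict the future from the past in latent space using only
# positive (past, future) pairs; no negative pairs are needed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentForecaster(nn.Module):
    def __init__(self, in_channels: int, latent_dim: int = 64):
        super().__init__()
        # Shared encoder maps a segment of shape (B, C, T) to one latent vector.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, latent_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),                      # -> (B, latent_dim)
        )
        # Predictor maps the latent of the past to a forecast of the future latent.
        self.predictor = nn.Sequential(
            nn.Linear(latent_dim, latent_dim),
            nn.ReLU(),
            nn.Linear(latent_dim, latent_dim),
        )

    def loss(self, past: torch.Tensor, future: torch.Tensor) -> torch.Tensor:
        z_past = self.encoder(past)
        with torch.no_grad():                  # stop-gradient on the target branch (assumption)
            z_future = self.encoder(future)
        z_pred = self.predictor(z_past)
        # Negative cosine similarity between predicted and actual future latents.
        return -F.cosine_similarity(z_pred, z_future, dim=-1).mean()

# Usage: split each series into a history window and a future window,
# then minimize the latent forecasting loss.
model = LatentForecaster(in_channels=7)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 7, 96)                     # batch of multivariate series
past, future = x[..., :48], x[..., 48:]
opt.zero_grad()
loss = model.loss(past, future)
loss.backward()
opt.step()
```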

Authors (6)
  1. Xiaochen Zheng (29 papers)
  2. Xingyu Chen (98 papers)
  3. Manuel Schürch (11 papers)
  4. Amina Mollaysa (11 papers)
  5. Ahmed Allam (18 papers)
  6. Michael Krauthammer (32 papers)
Citations (15)
