Hierarchical Multimodal LLMs with Semantic Space Alignment for Enhanced Time Series Classification (2410.18686v1)

Published 24 Oct 2024 in cs.LG

Abstract: Leveraging LLMs has garnered increasing attention and introduced novel perspectives in time series classification. However, existing approaches often overlook the crucial dynamic temporal information inherent in time series data and face challenges in aligning this data with textual semantics. To address these limitations, we propose HiTime, a hierarchical multi-modal model that seamlessly integrates temporal information into LLMs for multivariate time series classification (MTSC). Our model employs a hierarchical feature encoder to capture diverse aspects of time series data through both data-specific and task-specific embeddings. To facilitate semantic space alignment between time series and text, we introduce a dual-view contrastive alignment module that bridges the gap between modalities. Additionally, we adopt a hybrid prompting strategy to fine-tune the pre-trained LLM in a parameter-efficient manner. By effectively incorporating dynamic temporal features and ensuring semantic alignment, HiTime enables LLMs to process continuous time series data and achieves state-of-the-art classification performance through text generation. Extensive experiments on benchmark datasets demonstrate that HiTime significantly enhances time series classification accuracy compared to the most competitive baseline methods. Our findings highlight the potential of integrating temporal features into LLMs, paving the way for advanced time series analysis. The code is publicly available for further research and validation.
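The abstract describes two key components: a hierarchical encoder that yields data-specific and task-specific embeddings, and a dual-view contrastive alignment module that maps both views into the text semantic space. The sketch below is a minimal illustration of that idea only, not the authors' implementation; the module structure, dimensions, and the symmetric InfoNCE formulation are assumptions made for clarity.

```python
# Illustrative sketch (not HiTime's released code): a toy hierarchical
# time-series encoder with two views, and a dual-view contrastive loss that
# aligns both views with paired text embeddings. All names and shapes are
# assumptions for demonstration purposes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalTSEncoder(nn.Module):
    """Toy two-view encoder: a data-specific view capturing temporal dynamics
    and a task-specific view producing a classification-oriented summary."""

    def __init__(self, n_channels: int, d_model: int = 128):
        super().__init__()
        self.data_view = nn.GRU(n_channels, d_model, batch_first=True)
        self.task_view = nn.Sequential(
            nn.Linear(n_channels, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )

    def forward(self, x: torch.Tensor):
        # x: (batch, time, channels)
        _, h = self.data_view(x)                # final hidden state of the GRU
        z_data = h.squeeze(0)                   # data-specific embedding (batch, d_model)
        z_task = self.task_view(x.mean(dim=1))  # task-specific embedding (batch, d_model)
        return z_data, z_task


def dual_view_contrastive_loss(z_data, z_task, z_text, tau: float = 0.07):
    """Symmetric InfoNCE pulling both time-series views toward the paired text
    embedding. Assumes row i of each tensor corresponds to the same sample."""
    def info_nce(a, b):
        a, b = F.normalize(a, dim=-1), F.normalize(b, dim=-1)
        logits = a @ b.t() / tau
        targets = torch.arange(a.size(0), device=a.device)
        return F.cross_entropy(logits, targets)

    return 0.5 * (info_nce(z_data, z_text) + info_nce(z_task, z_text))


if __name__ == "__main__":
    enc = HierarchicalTSEncoder(n_channels=6, d_model=128)
    x = torch.randn(8, 100, 6)        # 8 multivariate series, 100 steps, 6 variables
    z_text = torch.randn(8, 128)      # stand-in for text-side (LLM) embeddings
    z_data, z_task = enc(x)
    loss = dual_view_contrastive_loss(z_data, z_task, z_text)
    loss.backward()
    print(loss.item())
```

In the paper, the aligned time-series embeddings would then be passed to a pre-trained LLM fine-tuned in a parameter-efficient manner (the abstract mentions a hybrid prompting strategy); that stage is omitted here.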

