A Novel Hyperdimensional Computing Framework for Online Time Series Forecasting on the Edge (2402.01999v1)
Abstract: In recent years, both online and offline deep learning models have been developed for time series forecasting. However, offline deep forecasting models fail to adapt effectively to changes in time series data, while online deep forecasting models are often expensive and have complex training procedures. In this paper, we reframe the online nonlinear time series forecasting problem as one of linear hyperdimensional time series forecasting. Nonlinear low-dimensional time series data is mapped to high-dimensional (hyperdimensional) spaces for linear hyperdimensional prediction, allowing fast, efficient, and lightweight online time series forecasting. Our framework, TSF-HD, adapts to time series distribution shifts using a novel co-training scheme for its hyperdimensional mapping and its linear hyperdimensional predictor. TSF-HD is shown to outperform the state of the art, with reduced inference latency, for both short-term and long-term time series forecasting. Our code is publicly available at http://github.com/tsfhd2024/tsf-hd.git
Authors: Mohamed Mejri, Chandramouli Amarnath, Abhijit Chatterjee