Large Language Models for Financial Aid in Financial Time-series Forecasting (2410.19025v1)
Abstract: Given the difficulty of financial time series forecasting in financial aid, much current research focuses on leveraging big data analytics in financial services, particularly on "predictive analysis" of financial trends. However, time series data in Financial Aid (FA) pose unique challenges: historical records are limited and the financial information is high-dimensional, which hinders the development of predictive models that balance accuracy with efficient runtime and memory usage. We employ pre-trained foundation models to address these tasks. Using state-of-the-art time series models, including pre-trained LLMs (with GPT-2 as the backbone), transformers, and linear models, we show that they can outperform traditional approaches even with minimal fine-tuning ("few-shot") or none at all ("zero-shot"). Our benchmark study, which covers financial aid alongside seven other time series tasks, demonstrates the potential of LLMs for scarce financial datasets.
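The setup the abstract describes, adapting a pre-trained GPT-2 backbone to numeric series with little or no fine-tuning, follows the frozen-pretrained-transformer recipe popularized by the "One Fits All" line of work (Zhou et al., 2023). The sketch below is a minimal, hypothetical illustration of that idea, not the authors' code: the input series is cut into patches, each patch is linearly embedded into GPT-2's hidden space, and a linear head reads the forecast off the last hidden state. Only the layer norms and positional embeddings are left trainable, which is what makes few-shot adaptation cheap; all names and hyperparameters (patch length, stride, prediction horizon) are illustrative assumptions.

```python
# Hypothetical sketch of a GPT-2-backbone time series forecaster,
# in the style of "One Fits All" (Zhou et al., 2023). Not the paper's code.
import torch
import torch.nn as nn
from transformers import GPT2Model

class GPT2Forecaster(nn.Module):
    def __init__(self, patch_len=16, stride=8, pred_len=96, d_model=768):
        super().__init__()
        self.patch_len, self.stride = patch_len, stride
        self.gpt2 = GPT2Model.from_pretrained("gpt2")  # 12-layer, 768-dim backbone
        # Freeze the backbone except layer norms and positional embeddings,
        # the few-shot recipe used in the frozen-pretrained-transformer work.
        for name, p in self.gpt2.named_parameters():
            p.requires_grad = ("ln" in name) or ("wpe" in name)
        self.embed = nn.Linear(patch_len, d_model)  # patch -> token embedding
        self.head = nn.Linear(d_model, pred_len)    # last hidden state -> forecast

    def forward(self, x):
        # x: (batch, seq_len) univariate series; slice into overlapping patches.
        patches = x.unfold(-1, self.patch_len, self.stride)  # (B, n_patch, patch_len)
        tokens = self.embed(patches)                         # (B, n_patch, 768)
        hidden = self.gpt2(inputs_embeds=tokens).last_hidden_state
        return self.head(hidden[:, -1])                      # (B, pred_len)

model = GPT2Forecaster()
history = torch.randn(4, 336)   # 4 series, 336 past steps (dummy data)
forecast = model(history)       # -> shape (4, 96)
```

Because almost all backbone weights stay frozen, only the small embedding, head, and layer-norm parameters need gradient updates, which is why this style of model remains usable on scarce datasets like the financial aid records the paper targets.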