
MASTER: Market-Guided Stock Transformer for Stock Price Forecasting (2312.15235v1)

Published 23 Dec 2023 in cs.CE

Abstract: Stock price forecasting has remained an extremely challenging problem for many decades due to the high volatility of the stock market. Recent efforts have been devoted to modeling complex stock correlations toward joint stock price forecasting. Existing works share a common neural architecture that learns temporal patterns from individual stock series and then mixes up temporal representations to establish stock correlations. However, they only consider time-aligned stock correlations stemming from all the input stock features, which suffer from two limitations. First, stock correlations often occur momentarily and in a cross-time manner. Second, the feature effectiveness is dynamic with market variation, which affects both the stock sequential patterns and their correlations. To address the limitations, this paper introduces MASTER, a MArket-Guided Stock TransformER, which models the momentary and cross-time stock correlation and leverages market information for automatic feature selection. MASTER elegantly tackles the complex stock correlation by alternately engaging in intra-stock and inter-stock information aggregation. Experiments show the superiority of MASTER compared with previous works and visualize the captured realistic stock correlation to provide valuable insights.
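The alternating intra-stock and inter-stock aggregation the abstract describes can be sketched in a few lines. This is a hypothetical numpy illustration, not the authors' implementation: the real model uses learned query/key/value projections and market-guided feature gating, whereas here plain scaled dot-product self-attention is applied first along each stock's time axis, then across stocks at each time step.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # x: (..., seq, d); plain scaled dot-product self-attention
    # (no learned projections, for illustration only)
    d = x.shape[-1]
    scores = x @ np.swapaxes(x, -1, -2) / np.sqrt(d)
    return softmax(scores) @ x

# toy batch: S stocks, each a sequence of T steps with d features
S, T, d = 4, 8, 16
rng = np.random.default_rng(0)
h = rng.normal(size=(S, T, d))

# 1) intra-stock aggregation: attend along the time axis of each stock
h = self_attention(h)                  # (S, T, d)

# 2) inter-stock aggregation: attend across stocks at each time step
h = np.swapaxes(h, 0, 1)               # (T, S, d)
h = self_attention(h)
h = np.swapaxes(h, 0, 1)               # back to (S, T, d)

print(h.shape)  # (4, 8, 16)
```

Separating the two attention axes is what lets the inter-stock step form momentary, per-time-step correlations rather than mixing all time steps at once.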

Authors (6)
  1. Tong Li (197 papers)
  2. Zhaoyang Liu (42 papers)
  3. Yanyan Shen (54 papers)
  4. Xue Wang (69 papers)
  5. Haokun Chen (26 papers)
  6. Sen Huang (9 papers)
Citations (9)

