Scaling laws for financial deep learning models

Investigate whether, and in what form, scaling laws govern the performance of financial deep learning models trained on Limit Order Book data for stock price trend prediction: specifically, how predictive accuracy changes with model size, dataset size, and compute for architectures such as TLOB and MLPLOB.

Background

The paper introduces two architectures—MLPLOB and TLOB—for stock price trend prediction using Limit Order Book data and reports strong results across benchmarks. Given that Transformers often benefit from scale and that large volumes of financial data are available, understanding how performance scales with model size, data, and compute is crucial.

Despite prior evidence of scaling behavior in other domains (e.g., large language models), the authors note that it remains unknown whether comparable scaling relations hold for financial deep learning models. They explicitly flag this as an open question in their concluding discussion of future research directions.
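As a concrete illustration of what such a scaling study would estimate, the sketch below fits a power-law relation between model size and validation error, the functional form commonly assumed in LLM scaling-law work. All data here are synthetic and the power-law form is an assumption, not a result from the paper.

```python
import numpy as np

# Hypothetical scaling law: validation error ~ a * N^(-b), where N is
# model size in parameters. The sizes and errors below are synthetic
# stand-ins for what a TLOB/MLPLOB model-size sweep might produce.
rng = np.random.default_rng(0)
sizes = np.array([1e5, 3e5, 1e6, 3e6, 1e7, 3e7])
true_a, true_b = 5.0, 0.25
errors = true_a * sizes ** (-true_b) * np.exp(rng.normal(0, 0.01, sizes.shape))

# In log-log space the power law is linear: log e = log a - b log N,
# so a least-squares line fit recovers the scaling exponent b.
slope, intercept = np.polyfit(np.log(sizes), np.log(errors), deg=1)
b_hat = -slope
print(f"fitted scaling exponent b = {b_hat:.3f}")
```

A clean positive exponent across sizes would indicate predictable gains from scale; a flat or noisy fit would suggest no exploitable scaling law for this task, which is exactly the open question posed above.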

References

The investigation of scaling laws for financial deep learning models remains an open question, as does the development of more robust approaches to handling increased market efficiency and complexity.

TLOB: A Novel Transformer Model with Dual Attention for Price Trend Prediction with Limit Order Book Data (2502.15757 - Berti et al., 12 Feb 2025) in Conclusion (Future works)