
Integrating LLM-based Transformers into portfolio optimization and asset allocation

Develop portfolio optimization and asset allocation frameworks that leverage large language model (LLM)-based Transformers applied to asset pricing and factor investing, and evaluate their incremental benefits relative to forecasting-only applications, following transformer-based allocation approaches such as Ma (2023).


Background

Beyond return prediction, the practical value of Transformer models in portfolio construction and allocation remains to be established. The authors recommend exploring LLM-based portfolio optimization or asset allocation techniques, referencing recent transformer-based allocation work.

This points to extending Transformer usage from forecasting to decision-making: feeding model forecasts into an allocation step and assessing the risk-adjusted performance improvements and robustness of the resulting portfolios, as sketched below.
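The following is a minimal sketch of that forecasting-to-allocation pipeline, not the paper's method: it assumes hypothetical arrays `predicted_returns` (Transformer forecasts) and `realized_returns` (out-of-sample data), uses a standard mean-variance allocation rule, and scores the result with a Sharpe ratio as one possible risk-adjusted metric.

```python
import numpy as np

def mean_variance_weights(expected_returns, cov, risk_aversion=5.0):
    """Unconstrained mean-variance weights, normalized to unit gross exposure."""
    raw = np.linalg.solve(risk_aversion * cov, expected_returns)
    return raw / np.abs(raw).sum()

def backtest(predicted_returns, realized_returns, lookback=60, risk_aversion=5.0):
    """At each date, allocate on the forecast and hold for one period."""
    T, N = realized_returns.shape
    portfolio_returns = []
    for t in range(lookback, T):
        cov = np.cov(realized_returns[t - lookback:t].T)       # trailing covariance estimate
        w = mean_variance_weights(predicted_returns[t], cov, risk_aversion)
        portfolio_returns.append(w @ realized_returns[t])      # next-period realized return
    portfolio_returns = np.array(portfolio_returns)
    sharpe = np.sqrt(12) * portfolio_returns.mean() / portfolio_returns.std()  # monthly data assumed
    return portfolio_returns, sharpe

# Synthetic data standing in for Transformer forecasts and realized returns.
rng = np.random.default_rng(0)
realized = rng.normal(0.01, 0.05, size=(240, 10))
predicted = realized + rng.normal(0, 0.05, size=realized.shape)  # noisy forecasts
_, sharpe = backtest(predicted, realized)
print(f"Annualized Sharpe ratio: {sharpe:.2f}")
```

The open question is whether replacing the placeholder forecasts with LLM-based Transformer outputs, and the mean-variance step with a learned or attention-based allocation rule as in Ma (2023), yields incremental gains over forecasting-only use.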

References

However, some open questions remain for future researchers. Finally, and no less importantly, researchers can further explore the LLM-based portfolio optimization or asset allocation technique as employed in Ma (2023).

Asset Pricing in Pre-trained Transformer (2505.01575 - Lai, 2 May 2025) in Section 6, Conclusion