Deep Learning for Forecasting Stock Returns in the Cross-Section (1801.01777v4)

Published 3 Jan 2018 in q-fin.ST and cs.LG

Abstract: Many studies have been undertaken by using machine learning techniques, including neural networks, to predict stock returns. Recently, a method known as deep learning, which achieves high performance mainly in image recognition and speech recognition, has attracted attention in the machine learning field. This paper implements deep learning to predict one-month-ahead stock returns in the cross-section in the Japanese stock market and investigates the performance of the method. Our results show that deep neural networks generally outperform shallow neural networks, and the best networks also outperform representative machine learning models. These results indicate that deep learning shows promise as a skillful machine learning method to predict stock returns in the cross-section.

Citations (87)

Summary

  • The paper finds that deep neural networks with eight layers outperform shallow networks in predicting stock returns as measured by rank correlation and directional accuracy.
  • The paper compares DNNs to SVR and random forests, showing strong performance over SVR and competitive outcomes with RF, emphasizing deep learning's potential.
  • The paper highlights that ensemble models and portfolio strategies using DNN predictions yield superior risk-adjusted returns, demonstrating practical investment applications.

Analyzing the Efficacy of Deep Learning for Forecasting Stock Returns: A Study in the Japanese Market

This paper presents a methodical investigation into the application of deep learning techniques for the prediction of one-month-ahead stock returns in the cross-section of the Japanese stock market. The authors, Masaya Abe and Hideki Nakayama, conduct a comparative study of deep neural networks (DNNs) against shallow neural networks and traditional machine learning models such as support vector regression (SVR) and random forests (RF).

Methodology and Experimental Setup

The paper uses a dataset of MSCI Japan Index constituents described by 25 factors frequently employed in practice, covering both financial and market data. The task is framed as a regression problem in which DNNs predict next-month stock returns from historical data aggregated over different time intervals. Training is performed on 120 distinct training sets spanning ten years, giving each model an extensive learning phase.
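As a rough illustration of this setup, the sketch below shows one way such a cross-sectional regression dataset could be assembled; the file name, column names, and the ten-year trailing window are assumptions made for illustration, not details taken from the paper.

```python
import pandas as pd

# Illustrative sketch: one row per (stock, month) with 25 factor columns
# plus the realized monthly return. All names here are hypothetical.
panel = pd.read_csv("msci_japan_factors.csv", parse_dates=["month"])
factor_cols = [c for c in panel.columns if c.startswith("factor_")]  # 25 factors

# Target: each stock's one-month-ahead return.
panel = panel.sort_values(["ticker", "month"])
panel["next_return"] = panel.groupby("ticker")["return_1m"].shift(-1)
panel = panel.dropna(subset=["next_return"])

def make_training_set(prediction_month, window_months=120):
    """Build one training set from the trailing window before `prediction_month`.

    The ten-year (120-month) trailing window is an assumption for this sketch.
    """
    start = prediction_month - pd.DateOffset(months=window_months)
    mask = (panel["month"] >= start) & (panel["month"] < prediction_month)
    train = panel.loc[mask]
    return train[factor_cols], train["next_return"]
```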

Performance evaluation relies on calculating the rank correlation coefficient (CORR) and directional accuracy (Direction) of forecasts, providing insights into the agreement between predicted scores and actual stock returns. Additionally, a practical long–short portfolio strategy assesses the applicability of these predictions in realistic investment scenarios.
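The following sketch shows how these per-month metrics might be computed, assuming aligned arrays of predicted and realized returns; the equal-weighted top-versus-bottom-quintile long–short construction is an illustrative choice, not necessarily the paper's exact portfolio specification.

```python
import numpy as np
from scipy.stats import spearmanr

def evaluate_month(predicted, realized, quantile=0.2):
    """Evaluate one month's cross-sectional forecasts.

    Returns the rank correlation (CORR), directional accuracy (Direction),
    and the return of a simple equal-weighted long-short portfolio.
    """
    predicted = np.asarray(predicted)
    realized = np.asarray(realized)

    corr, _ = spearmanr(predicted, realized)                      # CORR
    direction = np.mean(np.sign(predicted) == np.sign(realized))  # Direction

    order = np.argsort(predicted)
    k = max(1, int(len(predicted) * quantile))
    long_leg = realized[order[-k:]].mean()    # highest predicted returns
    short_leg = realized[order[:k]].mean()    # lowest predicted returns
    return corr, direction, long_leg - short_leg
```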

Findings

The experimental results delineate several key findings:

  1. Superiority of Deep Neural Networks: DNNs with deeper architectures (notably eight layers) consistently outperform three-layered shallow networks in both CORR and Direction across multiple configurations (see the architecture sketch after this list). The paper attributes this to the greater representational power of deeper networks, whose stacked nonlinear transformations improve prediction accuracy.
  2. Comparison with Traditional Methods: While DNNs generally surpass SVR in predictive accuracy, their advantage over RFs is narrower. Certain DNN architectures achieved notably higher CORR values and directional accuracies than the other machine learning models tested, but the edge over RFs was not robust across all measures.
  3. Ensemble Effectiveness: When predictions from DNNs, SVR, and RFs are combined, the ensemble model achieves slightly superior CORR without significant gains in Directional accuracy, underlining the modest benefit of ensembling diverse model types in this context.
  4. Portfolio Strategy Performance: In practical investment evaluations, certain DNN configurations yielded the highest risk-adjusted returns (R/R) among the portfolio strategies tested, demonstrating the potential applicability of these models in constructing viable trading strategies.
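As a rough sketch of what such a deep fully connected regressor could look like (referenced from finding 1 above), the following PyTorch snippet stacks eight hidden layers; the layer width, activation, dropout rate, and optimizer settings are assumptions, not the configurations reported in the paper.

```python
import torch
import torch.nn as nn

class DeepReturnRegressor(nn.Module):
    """Fully connected regressor with eight hidden layers (widths assumed)."""

    def __init__(self, n_factors=25, n_hidden_layers=8, hidden=100, dropout=0.5):
        super().__init__()
        layers, in_dim = [], n_factors
        for _ in range(n_hidden_layers):
            layers += [nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(dropout)]
            in_dim = hidden
        layers.append(nn.Linear(in_dim, 1))   # scalar next-month return prediction
        self.net = nn.Sequential(*layers)

    def forward(self, x):                     # x: (batch, n_factors)
        return self.net(x).squeeze(-1)

model = DeepReturnRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimizer choice assumed
loss_fn = nn.MSELoss()
```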

Implications and Future Directions

The research underscores the potential of deep learning to enhance stock return predictions within financial markets, suggesting that deeper architectures effectively leverage complex feature interactions. The examination is particularly pertinent to the Japanese stock market but offers implications that could extend to other regions.

Future research could explore architectural innovations beyond fully connected DNNs. Recurrent neural networks (RNNs), designed to handle temporal dependencies, could better exploit patterns in historical data and potentially refine prediction accuracy. Incorporating additional deep learning paradigms or hybrid models may also yield insights that further bridge the gap between statistical predictions and market realities.
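To make this suggested recurrent direction concrete, the sketch below feeds a short sequence of monthly factor vectors per stock into an LSTM; this is a hypothetical illustration of the proposed extension, not a model implemented in the paper.

```python
import torch
import torch.nn as nn

class RecurrentReturnRegressor(nn.Module):
    """Hypothetical recurrent variant: each stock is described by a sequence
    of its recent monthly factor vectors rather than a single snapshot."""

    def __init__(self, n_factors=25, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_factors, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, seq_len, n_factors)
        _, (h_n, _) = self.lstm(x)     # final hidden state summarizes the factor history
        return self.head(h_n[-1]).squeeze(-1)
```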

In conclusion, while the current paper positions deep learning as a formidable approach in stock return forecasting, it simultaneously calls for continued exploration and optimization to fully realize its potential in the domain of financial prediction.