
Optimizing Time Series Forecasting: A Comparative Study of Adam and Nesterov Accelerated Gradient on LSTM and GRU networks Using Stock Market data (2410.01843v1)

Published 28 Sep 2024 in q-fin.ST

Abstract: Several studies have discussed the impact of different optimization techniques on time series forecasting across neural network architectures. This paper examines the effectiveness of the Adam and Nesterov Accelerated Gradient (NAG) optimization techniques on LSTM and GRU neural networks for time series prediction, specifically stock market time series. We trained LSTM and GRU models with each of the two optimizers and compared and evaluated their performance on Apple Inc.'s closing price data over the last decade. The GRU model optimized with Adam produced the lowest RMSE, outperforming the other model-optimizer combinations in both accuracy and convergence speed. The GRU models outperformed the LSTM models under both optimizers, and Adam outperformed NAG for both architectures. The results suggest that GRU models optimized with Adam are well suited for practitioners in time-series prediction, specifically stock price prediction, producing accurate and computationally efficient models. The code for the experiments in this project can be found at https://github.com/AhmadMak/Time-Series-Optimization-Research

Keywords: Time-series Forecasting, Neural Network, LSTM, GRU, Adam Optimizer, Nesterov Accelerated Gradient (NAG) Optimizer
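
For reference, the two optimizers compared in the paper differ in how they use gradient history: NAG evaluates the gradient at a momentum look-ahead point, while Adam scales each parameter's step by running estimates of the gradient's first and second moments. The standard update rules (textbook definitions, not reproduced from the paper itself) are:

```latex
% Nesterov Accelerated Gradient: gradient taken at the look-ahead point \theta_t + \mu v_t
v_{t+1} = \mu v_t - \eta \, \nabla f(\theta_t + \mu v_t), \qquad \theta_{t+1} = \theta_t + v_{t+1}

% Adam: bias-corrected first/second moment estimates \hat{m}_t, \hat{v}_t
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2
\theta_t = \theta_{t-1} - \eta \, \hat{m}_t \big/ \big(\sqrt{\hat{v}_t} + \epsilon\big)
```

The abstract's experimental grid ({LSTM, GRU} crossed with {Adam, NAG}, scored by RMSE on next-day closing prices) can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation (that is in the linked repository); the hyperparameters, window length, and synthetic stand-in data are assumptions.

```python
# Minimal sketch: compare {GRU, LSTM} x {Adam, NAG} on windowed
# next-day price prediction. NAG is realized as SGD with Nesterov
# momentum. All hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class Forecaster(nn.Module):
    def __init__(self, cell="GRU", input_size=1, hidden_size=64):
        super().__init__()
        rnn_cls = nn.GRU if cell == "GRU" else nn.LSTM
        self.rnn = rnn_cls(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                # x: (batch, seq_len, 1)
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :])  # predict the next closing price

def make_optimizer(name, params):
    if name == "Adam":
        return torch.optim.Adam(params, lr=1e-3)
    return torch.optim.SGD(params, lr=1e-2, momentum=0.9, nesterov=True)

def rmse(pred, target):
    return torch.sqrt(nn.functional.mse_loss(pred, target))

# Synthetic stand-in for a decade of (scaled) closing prices:
# 256 sliding windows of 30 days, each paired with the next-day close.
x = torch.randn(256, 30, 1)
y = torch.randn(256, 1)

for cell in ("GRU", "LSTM"):
    for opt_name in ("Adam", "NAG"):
        model = Forecaster(cell)
        opt = make_optimizer(opt_name, model.parameters())
        for _ in range(100):             # illustrative epoch count
            opt.zero_grad()
            loss = rmse(model(x), y)
            loss.backward()
            opt.step()
        print(f"{cell} + {opt_name}: train RMSE {loss.item():.4f}")
```

In practice the windows would come from real historical closes (typically min-max scaled) with a chronological train/test split, and RMSE would be reported on the held-out test segment.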

Authors (1)
