1st International Conference on Smart Mobility and Vehicle Electrification

Financial Market Forecasting using RNN, LSTM, BiLSTM, GRU and Transformer-Based Deep Learning Algorithms

Temitope Kehinde, Waqar Ahmed Khan & S.H. Chung
Track: Artificial Intelligence
Abstract

In recent years, there has been a notable surge of interest in deep learning techniques due to their potential application in predicting financial market movements. Their proficiency in handling the complex, unpredictable, and dynamic nature of financial markets makes them valuable tools for both investors and scholars. The aim of this study is to conduct a comprehensive assessment of the predictive accuracy of five deep learning models, namely RNN, LSTM, BiLSTM, GRU, and Transformer, in forecasting the performance of prominent global stock indices: the FTSE 100, S&P 500, and HSI. The study demonstrated that the Transformer model achieved higher accuracy and more efficient convergence than the other models across several datasets, as assessed by commonly used evaluation metrics such as Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Huber loss, and Log-Cosh. The Recurrent Neural Network (RNN), despite its relatively straightforward architecture, frequently converged within a comparable range of epochs to the more sophisticated models, yet fell significantly behind in predictive performance. The Long Short-Term Memory (LSTM), Bidirectional LSTM (BiLSTM), and Gated Recurrent Unit (GRU) models performed comparably, with some dependence on the particular dataset. The Transformer model exhibited greater forecasting accuracy than its peers across all datasets and performance criteria. These results underscore the effectiveness of the Transformer model in predicting future returns in financial markets, suggesting that incorporating it into investment strategies can yield significant advantages, such as higher returns.
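For reference, below is a minimal sketch (not the authors' implementation) of the five evaluation metrics named in the abstract, computed on point forecasts against true index values. The Huber threshold delta of 1.0 and the sample closing prices are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error
    return np.mean(np.abs(y_true - y_pred))

def mse(y_true, y_pred):
    # Mean Squared Error
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    # Root Mean Squared Error
    return np.sqrt(mse(y_true, y_pred))

def huber(y_true, y_pred, delta=1.0):
    # Huber loss: quadratic for small errors, linear for large ones
    # (delta=1.0 is a common default, assumed here)
    err = y_true - y_pred
    small = np.abs(err) <= delta
    return np.mean(np.where(small, 0.5 * err ** 2,
                            delta * (np.abs(err) - 0.5 * delta)))

def log_cosh(y_true, y_pred):
    # Log-Cosh loss: behaves like MSE near zero error, like MAE for large errors
    return np.mean(np.log(np.cosh(y_pred - y_true)))

if __name__ == "__main__":
    y_true = np.array([7450.2, 7461.9, 7438.5])  # hypothetical index closes
    y_pred = np.array([7448.0, 7465.3, 7440.1])  # hypothetical model forecasts
    for name, fn in [("MAE", mae), ("MSE", mse), ("RMSE", rmse),
                     ("Huber", huber), ("Log-Cosh", log_cosh)]:
        print(f"{name}: {fn(y_true, y_pred):.4f}")
```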

Published in: 1st International Conference on Smart Mobility and Vehicle Electrification, Southfield, USA

Publisher: IEOM Society International
Date of Conference: October 10-12, 2023

ISBN: 979-8-3507-0550-8
ISSN/E-ISSN: 2169-8767