A Recurrent Neural Network (RNN) is a type of neural network designed to deal with sequential data. However, traditional RNNs have a limited short-term memory and struggle to retain information over longer sequences. They also suffer from the vanishing gradient problem: during backpropagation the gradients become so small that they can no longer update the weights effectively.
In this post, we are going to discuss Long Short-Term Memory (LSTM), a special type of RNN that addresses the problems above. LSTMs were introduced by Sepp Hochreiter and Jürgen Schmidhuber in 1997 1. Since they can retain information over time, LSTMs are widely used in applications like speech recognition, language modeling, and stock prediction.
Specifically, we are going to predict stock prices using LSTM networks built with Keras 2.
Download the NVIDIA stock price dataset from Yahoo! Finance. After that, load the CSV file into your Jupyter Notebook and convert the date column to datetime.
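A minimal sketch of this step might look like the following, assuming the file is saved as `NVDA.csv` and keeps the default Yahoo! Finance column names:

```python
import pandas as pd

# Assumed file name for the CSV exported from Yahoo! Finance
df = pd.read_csv("NVDA.csv")

# Convert the Date column to datetime and use it as a time index
df["Date"] = pd.to_datetime(df["Date"])
df = df.sort_values("Date").set_index("Date")

print(df.head())
```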
Once the CSV file is loaded, scale the data using MinMaxScaler. Scaling brings all features into the same range, which helps the model learn and prevents one feature from dominating the others.
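A possible implementation with scikit-learn's MinMaxScaler, assuming we model only the Close column:

```python
from sklearn.preprocessing import MinMaxScaler

# Use the closing price as the series we want to predict
close_prices = df[["Close"]].values

# Scale values into the [0, 1] range
scaler = MinMaxScaler(feature_range=(0, 1))
scaled_data = scaler.fit_transform(close_prices)
```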
After scaling the data, split the dataset into sequences for the LSTM model.
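One way to write such a helper, assuming a window length of 10 and the `scaled_data` array from the previous step (the name `create_sequences` is just illustrative):

```python
import numpy as np

def create_sequences(data, seq_length=10):
    """Split the scaled series into overlapping input/target pairs.

    Each input is a window of `seq_length` consecutive values and the
    target is the value that immediately follows the window.
    """
    X, y = [], []
    for i in range(len(data) - seq_length):
        X.append(data[i:i + seq_length])
        y.append(data[i + seq_length])
    return np.array(X), np.array(y)

X, y = create_sequences(scaled_data, seq_length=10)
print(X.shape, y.shape)  # (samples, 10, 1) and (samples, 1)
```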
The function above splits the dataset into sequences of length 10, like a sliding window, so the LSTM always sees a short window of recent history and can use that previous information when predicting the next value.
Finally, split the dataset into training and testing sets.
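Since this is time-series data, a chronological split (rather than a shuffled one) makes sense; the 80/20 ratio below is an assumption:

```python
# Keep the split chronological: earlier data for training, later data for testing
split = int(len(X) * 0.8)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]
```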
We are going to build an LSTM model using the Keras library. The model will consist of 3 LSTM layers, each with 50 units.
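A sketch of such a model in Keras might look like this; note that every stacked LSTM layer except the last one needs `return_sequences=True`:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential([
    # The first two LSTM layers return full sequences so the next
    # LSTM layer receives a sequence as input
    LSTM(50, return_sequences=True, input_shape=(X_train.shape[1], 1)),
    LSTM(50, return_sequences=True),
    LSTM(50),
    # Single output unit for the predicted (scaled) closing price
    Dense(1),
])

model.compile(optimizer="adam", loss="mean_squared_error")
model.summary()
```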
We could increase the number of LSTM layers. However, the model would then start fitting the noise in the dataset. In other words, it would overfit and would no longer be good at predicting future values. Besides that, our dataset is small, so having more than 3 LSTM layers would not be beneficial.
We are going to train the model on the scaled data for 100 epochs.
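Putting training, prediction, and plotting together, a sketch might look like the following; the batch size of 32 is an assumption, and `inverse_transform` undoes the earlier scaling before plotting:

```python
import matplotlib.pyplot as plt

# Train for 100 epochs on the training split
model.fit(X_train, y_train, epochs=100, batch_size=32, verbose=1)

# Predict on the test set and undo the scaling for plotting
predictions = scaler.inverse_transform(model.predict(X_test))
actual = scaler.inverse_transform(y_test)

plt.plot(actual, label="Actual")
plt.plot(predictions, label="Predicted")
plt.title("NVIDIA stock price: actual vs. predicted")
plt.legend()
plt.show()
```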
NVIDIA stock price: ground truth vs. predicted values
From the graph above, we can see that the model roughly captures the trend of the stock price, even though it is not perfect. Given that the dataset is small and the information is limited, the model is still able to predict the stock price to some extent.
Sepp Hochreiter and Jürgen Schmidhuber. Long Short-Term Memory. Neural Computation, 1997. ↩