Let’s now jump to the modelling part by splitting the data into train, validation and test sets.
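A common way to do this split is with scikit-learn's `train_test_split`, applied twice. The arrays `X` and `y` below are placeholders standing in for the real padded sequences and labels, and the split ratios are illustrative:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical data: X holds padded word-index sequences, y holds labels.
X = np.random.randint(0, 1000, size=(100, 20))
y = np.random.randint(0, 2, size=(100,))

# First carve out the test set, then split the remainder into train/validation.
X_temp, X_test, y_temp, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X_temp, y_temp, test_size=0.25, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 60 20 20
```

Splitting off the test set first ensures it is never touched while tuning on the validation set.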
Next, let’s build the RNN model. We’re going to use word embeddings to represent the words. While training the model, you can also train the word embeddings along with the network weights; these are often called the embedding weights. During training, the embedding weights are treated like any other weights of the network and are updated in each iteration.
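In Keras, whether the embedding weights get updated is controlled by the layer's `trainable` flag. A minimal sketch, with hypothetical vocabulary and dimension sizes:

```python
from tensorflow.keras.layers import Embedding

# Hypothetical sizes for illustration only.
vocab_size = 10000      # number of words in the vocabulary
embedding_dim = 100     # length of each word vector

# trainable=True lets the embedding weights be updated in each iteration,
# just like the network's other weights; trainable=False freezes them.
embedding_layer = Embedding(input_dim=vocab_size,
                            output_dim=embedding_dim,
                            trainable=True)
```

Setting `trainable=False` instead freezes the layer, which is exactly the distinction the experiments below explore.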
In the next few sections, we will try the following three RNN models.
1. RNN with arbitrarily initialised, untrainable embeddings: In this model, we will initialise the embedding weights arbitrarily. Further, we’ll freeze the embeddings, that is, we won’t allow the network to train them.
2. RNN with arbitrarily initialised, trainable embeddings: In this model, we’ll allow the network to train the embeddings.
3. RNN with trainable word2vec embeddings: In this experiment, we’ll use word2vec word embeddings and also allow the network to train them further.
Let’s start with the first experiment: an RNN with arbitrarily initialised, untrainable embeddings.
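One way to set this up in Keras is shown below. The architecture (a `SimpleRNN` layer and a sigmoid output for binary classification) and all the sizes are assumptions for illustration; the key point is `trainable=False` on the embedding layer, which keeps the randomly initialised embedding weights frozen:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, SimpleRNN, Dense

vocab_size, embedding_dim = 10000, 100   # hypothetical sizes

model_1 = Sequential([
    # Arbitrarily (randomly) initialised embeddings, frozen so the
    # network cannot update them during training.
    Embedding(vocab_size, embedding_dim, trainable=False),
    SimpleRNN(64),
    Dense(1, activation='sigmoid'),      # binary classification head (assumed)
])
model_1.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```

With the embedding frozen, only the RNN and dense layers contribute trainable parameters.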
Next, try the second model: an RNN with arbitrarily initialised, trainable embeddings. Here, we’ll allow the embeddings to get trained along with the network.
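The only change from the previous sketch is flipping the embedding layer's `trainable` flag (again, the architecture and sizes are illustrative assumptions):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, SimpleRNN, Dense

vocab_size, embedding_dim = 10000, 100   # hypothetical sizes

model_2 = Sequential([
    # Same random initialisation, but now the embedding weights are
    # updated in each iteration along with the rest of the network.
    Embedding(vocab_size, embedding_dim, trainable=True),
    SimpleRNN(64),
    Dense(1, activation='sigmoid'),
])
model_2.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
```

Training the embeddings adds `vocab_size * embedding_dim` trainable parameters, so this model takes longer per epoch but lets the word vectors adapt to the task.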
In the next segment, we’ll try the word2vec embeddings and see if that improves our model or not.