In this session, you studied some commonly used variants of RNNs.
First, you studied bidirectional RNNs. You learnt that when a sequence is available offline (i.e., the entire sequence is known in advance), you can also process it in the reverse order. A bidirectional RNN feeds the sequence in both the regular and the reverse order, which gives better results in most cases.
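As a rough illustration of the idea, the sketch below (a minimal NumPy implementation, not production code) runs a simple tanh RNN over a sequence twice, once in the regular order and once reversed, and concatenates the two final hidden states. All weight shapes and the random inputs here are made up for illustration.

```python
import numpy as np

def rnn_pass(x_seq, W_x, W_h, b):
    """Run a simple tanh RNN over a sequence; return the final hidden state."""
    h = np.zeros(W_h.shape[0])
    for x in x_seq:
        h = np.tanh(W_x @ x + W_h @ h + b)
    return h

def bidirectional_rnn(x_seq, fwd_params, bwd_params):
    """Concatenate the final states of a forward pass and a backward pass."""
    h_fwd = rnn_pass(x_seq, *fwd_params)        # regular order
    h_bwd = rnn_pass(x_seq[::-1], *bwd_params)  # reversed order
    return np.concatenate([h_fwd, h_bwd])

rng = np.random.default_rng(0)
n_x, n_h, T = 3, 4, 5  # illustrative sizes
fwd = (rng.normal(size=(n_h, n_x)), rng.normal(size=(n_h, n_h)), np.zeros(n_h))
bwd = (rng.normal(size=(n_h, n_x)), rng.normal(size=(n_h, n_h)), np.zeros(n_h))
seq = [rng.normal(size=n_x) for _ in range(T)]

out = bidirectional_rnn(seq, fwd, bwd)
print(out.shape)  # (8,) — forward and backward states concatenated
```

In deep learning libraries, the same effect is usually achieved with a wrapper layer (e.g. a "bidirectional" option) rather than by reversing the sequence yourself.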
Then you learnt about long short-term memory (LSTM) networks. You learnt the three features of an LSTM cell:
- Presence of an explicit memory
- Gating mechanisms
- Constant error carousel
You also learnt the structure of an LSTM cell and its feedforward equations. The number of parameters in an LSTM layer is 4x the number of parameters in a standard RNN layer.
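To make the feedforward equations concrete, here is a minimal NumPy sketch of one LSTM step in the standard formulation: the forget, input and output gates and the candidate memory are four affine transforms of the concatenated input and previous hidden state, which is exactly why the layer has 4x the parameters of a plain RNN. The weight stacking order below is an illustrative choice.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, W, b):
    """One feedforward step of a standard LSTM cell.

    W has shape (4*n_h, n_x + n_h): the stacked weights of the forget,
    input and output gates and the candidate; b has shape (4*n_h,).
    """
    n_h = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * n_h:1 * n_h])   # forget gate
    i = sigmoid(z[1 * n_h:2 * n_h])   # input gate
    o = sigmoid(z[2 * n_h:3 * n_h])   # output gate
    g = np.tanh(z[3 * n_h:4 * n_h])   # candidate memory
    c = f * c_prev + i * g            # explicit memory update
    h = o * np.tanh(c)                # new hidden state
    return h, c

rng = np.random.default_rng(0)
n_x, n_h = 3, 4  # illustrative sizes
W = rng.normal(size=(4 * n_h, n_x + n_h))
b = np.zeros(4 * n_h)
h, c = lstm_cell(rng.normal(size=n_x), np.zeros(n_h), np.zeros(n_h), W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

Note how the cell state `c` is updated additively (`f * c_prev + i * g`); this is the explicit memory that supports the constant error carousel.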
Finally, you briefly looked at some other variants of the LSTM, such as GRUs and LSTMs with peephole connections. The number of parameters in a GRU layer is 3x the number of parameters in a standard RNN layer.
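The 4x and 3x parameter counts can be checked with a small calculation. The sketch below counts parameters for the classic formulations (one weight matrix over the input, one over the hidden state, and one bias per transform); note that some library implementations, such as GRUs with a separate reset-after bias, add a few extra parameters on top of this.

```python
def rnn_params(n_x, n_h):
    """One transform: W_x (n_h x n_x), W_h (n_h x n_h), and a bias."""
    return n_h * n_x + n_h * n_h + n_h

def lstm_params(n_x, n_h):
    """Four transforms: forget, input and output gates plus the candidate."""
    return 4 * rnn_params(n_x, n_h)

def gru_params(n_x, n_h):
    """Three transforms: update gate, reset gate and the candidate."""
    return 3 * rnn_params(n_x, n_h)

n_x, n_h = 100, 64  # illustrative sizes
print(lstm_params(n_x, n_h) / rnn_params(n_x, n_h))  # 4.0
print(gru_params(n_x, n_h) / rnn_params(n_x, n_h))   # 3.0
```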
In the next section, you’ll attempt the graded questions of this session.