
Summary

In this session, you learned how to build a POS tagger and a text generator using RNNs. The text generation was done at the character level, that is, you predicted the next character given the previous ‘n’ characters. In a similar fashion, you can also build a word-level RNN: given the previous ‘n’ words, you predict the next word, and everything else remains the same. You can check out the following notebook, where we have built a word-level text generator on Valmiki’s Ramayana text.
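To illustrate the difference from the character-level setup, here is a minimal sketch of how the word-level training pairs would be prepared. The toy corpus below is a hypothetical example (not the Ramayana dataset from the notebook); in practice you would feed these integer sequences into an embedding layer followed by an RNN with a softmax over the vocabulary.

```python
# Word-level analogue of the character-level generator:
# given the previous n words, predict the next word.
# Toy corpus for illustration only.

corpus = "the king ruled the land and the king loved the land"
words = corpus.split()

# Build the vocabulary: word <-> integer index mappings
vocab = sorted(set(words))
word2idx = {w: i for i, w in enumerate(vocab)}
idx2word = {i: w for w, i in word2idx.items()}

n = 3  # context length: previous n words
X, y = [], []
for i in range(len(words) - n):
    X.append([word2idx[w] for w in words[i:i + n]])  # input: n words
    y.append(word2idx[words[i + n]])                 # target: next word

print(X[0], "->", idx2word[y[0]])
```

The only change from the character-level version is the unit of prediction: the vocabulary is built over words instead of characters, so the output layer grows with vocabulary size, which is why word-level models typically use an embedding layer rather than one-hot inputs.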

That brings us to the end of the module on recurrent neural networks. We hope that you learned how to apply RNNs to real-world scenarios. The key to applying RNNs is identifying the right data and, of course, getting your hands dirty by experimenting with real-world problems.

You can download the lecture notes for this module from the link below:
