Introduction to the Module

In this module, we are now going to understand the advanced topics in NLP, such as:

  • Building Neural Machine Translation using Attention Mechanisms.
  • Fine-tuning advanced language models such as Transformers.

Welcome to the session on Machine Translation.

So far, in the previous modules, you have come across NLP problems such as predicting the sentiment of a particular text corpus or predicting the POS tag for a word. However, there are many tasks where the output is not just a single class or a word but a sequential output that may vary in length too.

One such task is machine translation, where you have an input sentence in one language and have to translate it into a sentence in a different language. This kind of problem is one of the important applications of deep learning in the domain of natural language understanding.
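The following toy illustration (not a real translation system — the lexicon and helper function are invented purely for this sketch) shows why translation is a sequence-to-sequence problem: a naive word-by-word lookup produces an output whose length can differ from the input's, and it cannot handle word reordering or context, which is exactly what the models in this module address.

```python
# Toy word-by-word "translation" (illustrative only, not real MT).
# One English word may map to multiple French words, so the output
# sequence length can differ from the input sequence length.
toy_lexicon = {
    "where": ["où"],
    "is": ["est"],
    "the": ["le"],
    "nearby": ["à", "proximité"],  # one English word -> two French words
    "hotel": ["hôtel"],
}

def toy_translate(sentence):
    """Naive lookup; real MT must also handle reordering and context."""
    words = sentence.lower().rstrip("?").split()
    out = []
    for w in words:
        out.extend(toy_lexicon.get(w, [w]))
    return " ".join(out)

src = "Where is the nearby hotel?"
print(toy_translate(src))  # 5 input words, 6 output words
```

Note that even with a perfect dictionary, the result is not fluent French ("Où est l'hôtel à proximité ?" reorders the words), which is why translation requires models that generate the whole output sequence rather than mapping words one at a time.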

In this module, you will look at the basics of machine translation and the deep learning-based framework for handling this type of problem. In the next video, Mohit will introduce you to machine translation and the list of elements that we will be covering in this module.

Language translation has become an important tool across different parts of the world. With the progress of digitalisation, the internet has become more accessible across geographies, and language translation plays a key role in breaking language barriers.

Tech giants like Google, Microsoft and Facebook are integrating their applications with machine translation to make the user experience better and help cater to a wider audience. 

On Facebook, to translate a post or comment written in another language, you just have to tap the ‘Translate’ button next to it. Let’s say, on a visit to Paris, you want to ask “Where is the nearby hotel?” in French. You can use your voice in Google Translate to hear the real-time translation and have a better conversation. With such seamless features, we can easily engage with others in our own preferred language.

This topic consists of two sessions. The first session deals with the understanding of machine translation and sequence-to-sequence models. The second session introduces the concept of attention and how it helps in building a better translation model. By the end of this topic, you will be able to build a neural machine translation model that translates from one language to another.

The entire topic can be divided into the following sub-topics:

  • Evolution of machine translation.
  • Traditional sequence-to-sequence models.
  • Attention-based sequence-to-sequence models.

Guidelines for in-module questions

The in-video and in-content questions for this module are not graded.

People you will hear from in this session

Adjunct Faculty

Mohit Bhatnagar

Founder, Tucareers.com

Mohit is the founder of Tucareers.com, a global career assessment and guidance platform. He completed his PhD in Decision Sciences from IIM Lucknow, where his thesis proposed an analytics-driven model for internationalisation in the field of career decisions.

He has 20+ years of experience in the IT industry and is passionate about teaching Business Research, Python, Machine Learning, Deep Learning and NLP to industry professionals. He is also an Adjunct Faculty at IIM Lucknow, teaching NLP and DL as part of the Executive Program for Business Analytics.

To understand machine translation further, let’s go through the next segment and see how it has evolved over the years.

Report an error