Introduction
In the previous session, you briefly studied the idea of distributional semantics: words that occur in similar contexts have similar meanings. In this session, you will study distributional semantics (also sometimes called vector semantics) in detail.
The idea of distributional semantics (implemented through ‘word vectors’) has been used heavily in semantic processing for a wide variety of applications. In this session, you will learn about word vectors, word embeddings and the use of word vectors for practical NLP applications.
In this session
This session will introduce you to the following topics:
- Word vectors (occurrence context and co-occurrence matrices)
- Word embeddings (frequency- and prediction-based embeddings)
- Frequency-based embeddings: Latent Semantic Analysis (LSA)
- Prediction-based embeddings: Word2Vec
- Using word embeddings in Python for practical applications (a minimal preview sketch follows this list)
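As a preview of the last topic, here is a minimal sketch of the kind of thing you will be able to do by the end of this session. It assumes the gensim library (not introduced yet in this course) and a toy corpus chosen purely for illustration; Word2Vec itself is explained in detail later in the session.

```python
# A minimal preview sketch, assuming gensim is installed (pip install gensim).
# The tiny corpus below is purely illustrative.
from gensim.models import Word2Vec

# Tokenised toy corpus: words that occur in similar contexts
# (e.g. 'cat' and 'dog') should end up with similar vectors,
# which is the distributional hypothesis in action.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "mouse"],
    ["a", "dog", "chased", "a", "ball"],
]

# Train a small Word2Vec model; vector_size and window are kept
# deliberately tiny for this toy example.
model = Word2Vec(corpus, vector_size=10, window=2, min_count=1, seed=42)

# The dense vector learned for a word (its 'word embedding')
print(model.wv["cat"])

# Cosine similarity between two word vectors
print(model.wv.similarity("cat", "dog"))
```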
Guidelines for in-module questions
The in-video and in-content questions for this module are not graded. Note that graded questions are given on a separate page labelled ‘Graded Questions’ at the end of this session. The graded questions in this session will adhere to the following guidelines:
| | First Attempt Marks | Second Attempt Marks |
|---|---|---|
| Question with 2 Attempts | 10 | 5 |
| Question with 1 Attempt | 10 | 0 |