
Skip-gram Model

Let’s now study prediction-based approaches for creating word embeddings.

Please note that you will study neural networks in detail later; for now, you only need to understand the input and the output fed to the network. You can ignore how the network is trained, the exact architecture, etc.

Please note that, while constructing the training data for the skip-gram model, we should also consider the pair ([be, be], will).

Let’s understand the implementation of the skip-gram model through an example.

In the skip-gram approach to generating word vectors, the input is your target word and the task of the neural network is to predict the context words (the output) for that target word. The input word is represented in the form of a 1-hot-encoded vector. Once trained, the weight matrix between the input layer and the hidden layer gives the word embeddings for any target word (in the vocabulary).
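The process above can be sketched in code. This is a minimal illustration, not a trained model: the toy sentence, window size, and embedding dimension are assumptions, and the weight matrix is random to show only how a one-hot input selects its embedding row.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative assumption).
sentence = "you will be what you will be".split()
vocab = sorted(set(sentence))
word_to_idx = {w: i for i, w in enumerate(vocab)}
V = len(vocab)

def one_hot(word):
    """1-hot-encoded vector representing the input (target) word."""
    vec = np.zeros(V)
    vec[word_to_idx[word]] = 1.0
    return vec

def skipgram_pairs(words, window=2):
    """(target, context) training pairs: the target word is the input,
    and each context word within the window is an output to predict."""
    pairs = []
    for i, target in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                pairs.append((target, words[j]))
    return pairs

# After training, the input-to-hidden weight matrix W (V x N) holds the
# word embeddings. Here W is random, purely to show the lookup mechanics.
N = 5  # embedding dimension (assumption)
W = np.random.rand(V, N)

# Multiplying a 1-hot vector by W simply selects one row of W:
# that row is the embedding of the target word.
embedding = one_hot("will") @ W
```

Note that the one-hot multiplication is equivalent to indexing `W[word_to_idx["will"]]`, which is why the weight matrix itself serves as the embedding lookup table once training is done.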
