Recall for Forward Pass

In the previous segment, you learnt that neural networks are used by the Word2Vec model to produce word embeddings. Before you learn in detail how the Word2Vec model works, let’s revise the concept of neural networks.

A general representation of the input data matrix is given below, wherein n represents the number of features and m represents the number of samples. In the Iris data set, you saw that the number of features n is 4: sepal length, sepal width, petal length and petal width.
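
For concreteness, here is a minimal sketch that prints m and n for this matrix. It assumes scikit-learn's built-in copy of the Iris data set; the original course may load the data differently.

Python
# Input data matrix: m samples, each described by n features.
# Assumption: scikit-learn's built-in copy of the Iris data set.
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data                 # shape (m, n)
m, n = X.shape
print(f"m (samples) = {m}, n (features) = {n}")   # m = 150, n = 4
print(iris.feature_names)     # sepal length, sepal width, petal length, petal width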

We designed a neural network with 4 input neurons, as we had 4 features in the Iris data set, and 3 neurons in the output layer, as we had 3 classes: setosa, virginica and versicolor. The number of neurons in the hidden layer is chosen arbitrarily.

Now, you gained an understanding of the different components in a fully connected neural network. These are as follows, with a framework-level sketch after the list:

  • Input layer
  • Weight matrix (W1)
  • Hidden layer 1
  • Activation function of layer 1 (f1)
  • Weight matrix (W2)
  • Hidden layer 2
  • Activation function of layer 2 (f2)
  • Weight matrix (Wo)
  • Output layer
  • Activation function of the output layer (fo)
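
The sketch below assumes Keras as the framework; the hidden-layer sizes and activation functions are illustrative choices, not fixed by the text above.

Python
# The listed components assembled into a fully connected network.
# Assumptions: Keras, sigmoid/softmax activations, and 5 neurons
# per hidden layer (chosen arbitrarily).
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),             # input layer: 4 features
    layers.Dense(5, activation="sigmoid"),  # W1, hidden layer 1, f1
    layers.Dense(5, activation="sigmoid"),  # W2, hidden layer 2, f2
    layers.Dense(3, activation="softmax"),  # Wo, output layer, fo
])
model.summary()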

The dimensions of a weight matrix are deduced from the number of neurons in the previous layer and in the next layer.

Number of rows of the weight matrix = Number of neurons in the previous layer

Number of columns of the weight matrix = Number of neurons in the next layer
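
As a quick check of this rule, here is a minimal NumPy sketch; the hidden-layer sizes are assumptions made purely for illustration.

Python
# Weight-matrix shapes implied by the rule above for the Iris network:
# 4 input neurons, two hidden layers (sizes assumed), 3 output neurons.
import numpy as np

n_input, n_h1, n_h2, n_output = 4, 5, 5, 3

W1 = np.zeros((n_input, n_h1))   # rows = previous layer, columns = next layer
W2 = np.zeros((n_h1, n_h2))
Wo = np.zeros((n_h2, n_output))

print(W1.shape, W2.shape, Wo.shape)   # (4, 5) (5, 5) (5, 3)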

You learnt how one input sample passes through the different layers of the neural network, and this is given below:

$$h_1 = f_1(W_1^T x)$$

$$h_2 = f_2(W_2^T h_1)$$

$$y = f_o(W_o^T h_2)$$

Here, x is one input sample, h1 and h2 are the outputs of the two hidden layers, and y is the output of the network.
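
The following is a minimal NumPy sketch of this forward pass. The activation choices (sigmoid for the hidden layers, softmax for the output) and the hidden-layer sizes are illustrative assumptions, not fixed by the text above.

Python
# Forward pass for one input sample, matching the equations above.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())       # subtract the max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
n_input, n_h1, n_h2, n_output = 4, 5, 5, 3   # sizes as in the Iris network

W1 = rng.standard_normal((n_input, n_h1))
W2 = rng.standard_normal((n_h1, n_h2))
Wo = rng.standard_normal((n_h2, n_output))

x = rng.standard_normal(n_input)   # one input sample with 4 features

h1 = sigmoid(W1.T @ x)             # hidden layer 1
h2 = sigmoid(W2.T @ h1)            # hidden layer 2
y = softmax(Wo.T @ h2)             # output: probabilities over the 3 classes

print(y, y.sum())                  # y sums to 1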