
Loss Function

Now that we know how to calculate the predicted output from a neural network when given an input, we want to check if the neural network predicted it correctly. We will revisit the calculations we had done in the previous segment on the housing price prediction problem.

| Std. Number of Rooms | Std. House Size (sq. ft.) | Predicted Price | Actual Price |
|----------------------|---------------------------|-----------------|--------------|
| -0.32                | -0.66                     | 0.63            | -0.54        |

As you can see in the table above, the predicted price is not even close to the actual price. We therefore want to quantify how wrong the network's prediction is. A loss function, or cost function, helps us quantify such errors.

A loss function or cost function is a function that maps an event or the values of one or more variables onto a real number, intuitively representing some 'cost' associated with the 'event', as shown below:

$$L(y, \hat{y}) = f : (y, \hat{y}) \rightarrow \mathbb{R}$$

Neural networks minimise the error in the prediction by optimising the loss function with respect to the parameters in the network. In other words, this optimisation is done by adjusting the weights and biases. We will see how this adjustment is done in subsequent sessions. For now, we will concentrate on how to compute the loss. 

In the case of regression, the most commonly used loss function is the mean squared error (MSE) / residual sum of squares (RSS).

In the case of classification, the most commonly used loss function is cross entropy / log loss.
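As a quick illustration of these two losses, here is a minimal NumPy sketch. The function names, and the assumption of binary (0/1) labels for the cross-entropy case, are our own choices for this example rather than anything prescribed in the text.

```python
import numpy as np

def mse_loss(y_actual, y_predicted):
    """Mean squared error: the average squared difference (regression)."""
    y_actual, y_predicted = np.asarray(y_actual), np.asarray(y_predicted)
    return np.mean((y_actual - y_predicted) ** 2)

def log_loss(y_actual, y_predicted, eps=1e-12):
    """Binary cross entropy / log loss (classification).
    y_actual holds 0/1 labels; y_predicted holds predicted probabilities."""
    y_actual, y_predicted = np.asarray(y_actual), np.asarray(y_predicted)
    y_predicted = np.clip(y_predicted, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_actual * np.log(y_predicted)
                    + (1 - y_actual) * np.log(1 - y_predicted))
```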

Let’s consider the regression problem where we predict the house price, given the number of rooms and the size of the house. Here, we will use the RSS method to calculate the loss.

| Std. Number of Rooms | Std. House Size (sq. ft.) | Predicted Price | Actual Price |
|----------------------|---------------------------|-----------------|--------------|
| -0.32                | -0.66                     | 0.63            | -0.54        |

In this example, we get a prediction of 0.63, but the expected output is -0.54. Let's calculate the loss using RSS:
$$\text{Loss}(L) = \frac{1}{2}(\text{actual} - \text{predicted})^2 = \frac{1}{2}(-0.54 - 0.63)^2 = 0.68445$$
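The same arithmetic can be checked with a couple of lines of Python, using the values from the table above:

```python
actual, predicted = -0.54, 0.63

# RSS-style loss for a single sample: 1/2 * (actual - predicted)^2
loss = 0.5 * (actual - predicted) ** 2
print(loss)  # ≈ 0.68445
```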

The calculation above is the squared error for a single sample; the MSE is the mean of such squared errors over all the samples in the given data. This gives us a quantitative way of measuring how well the neural network is predicting the output.
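For a whole dataset, the same idea is simply averaged over every sample. The numbers below are made up purely for illustration:

```python
import numpy as np

# Hypothetical standardised predicted and actual prices for three houses
predicted = np.array([0.63, -0.10, 0.25])
actual    = np.array([-0.54, 0.05, 0.30])

# MSE: mean of the squared errors over all samples
mse = np.mean((actual - predicted) ** 2)
print(mse)  # mean squared error for this small batch
```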

Now, let's turn to the loss function for classification. In the next video, you will learn how to quantify the loss for a classification problem.

Now that we have learnt about the forward pass and the loss function for regression and classification problems, we know that given any input and its actual output, we can assess the behaviour of the neural network. 

Let’s now attempt a few questions based on this topic and then proceed to the next segment to understand how neural networks are trained in order to minimise the loss. 
