
Summary

In this session, you learnt how forward propagation and backpropagation occur in neural networks and how the parameters are updated using the gradient descent algorithm.
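
In symbols, the update at the heart of gradient descent moves every parameter a small step against the gradient of the loss. A minimal statement of the rule (the learning-rate symbol α below is our notation, not necessarily the session's):

```latex
\theta \leftarrow \theta - \alpha \, \frac{\partial L}{\partial \theta}
\qquad \text{for every weight and bias } \theta
```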

You understood that the task is to minimise the loss function with respect to a large number of parameters, and that this can be done efficiently using gradient descent. You then learnt how to derive the expressions for the gradients of the loss with respect to the variables Z, W, b and H of each layer for a single data point, and how to use these gradients to repeatedly update the weights and biases of the network.
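
To make the flow of gradients concrete, here is a minimal NumPy sketch of one forward and backward pass for a single data point; the two-layer sigmoid network, squared-error loss and learning rate are illustrative assumptions, not the exact setup used in the session.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative network: 3 inputs, 4 hidden units, 1 output, sigmoid activations.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 3)), np.zeros((4, 1))
W2, b2 = rng.standard_normal((1, 4)), np.zeros((1, 1))

x = rng.standard_normal((3, 1))   # a single data point (column vector)
y = np.array([[1.0]])             # its illustrative target
lr = 0.1                          # illustrative learning rate

# Forward propagation: cumulative input Z and output H of each layer.
Z1 = W1 @ x + b1
H1 = sigmoid(Z1)
Z2 = W2 @ H1 + b2
H2 = sigmoid(Z2)                  # network prediction

# Backpropagation of the squared-error loss L = 0.5 * (H2 - y)**2.
dH2 = H2 - y                      # dL/dH2
dZ2 = dH2 * H2 * (1 - H2)         # dL/dZ2 (sigmoid derivative is H * (1 - H))
dW2 = dZ2 @ H1.T                  # dL/dW2
db2 = dZ2                         # dL/db2
dH1 = W2.T @ dZ2                  # dL/dH1, flowing back to the previous layer
dZ1 = dH1 * H1 * (1 - H1)         # dL/dZ1
dW1 = dZ1 @ x.T                   # dL/dW1
db1 = dZ1                         # dL/db1

# Gradient descent update: step every parameter against its gradient.
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Note how each layer's dH is all the previous layer needs to continue the chain; this reuse of intermediate gradients is what keeps backpropagation efficient even in deep networks.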

Then, you explored the basics of TensorFlow, an extensive library that makes it easy to build and train neural networks, and used it to implement the housing price prediction example. We hope you experiment with the code notebooks provided and go on to build more interesting neural networks of your own!
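
If you want a skeleton to start from, the sketch below sets up a bare-bones linear regression in TensorFlow in the spirit of the housing price example; the synthetic data, single linear layer and hyperparameters are placeholders rather than the values used in the session's notebook.

```python
import tensorflow as tf

# Synthetic stand-in data: 100 "houses", 3 features each (placeholder values).
X = tf.random.normal((100, 3))
y = tf.random.normal((100, 1))

# Trainable parameters of a single linear layer (illustrative architecture).
W = tf.Variable(tf.random.normal((3, 1)))
b = tf.Variable(tf.zeros((1, 1)))

lr = 0.01  # illustrative learning rate

for step in range(200):
    with tf.GradientTape() as tape:               # records ops for differentiation
        y_pred = X @ W + b                        # forward propagation
        loss = tf.reduce_mean((y_pred - y) ** 2)  # mean squared error
    dW, db = tape.gradient(loss, [W, b])          # backpropagation
    W.assign_sub(lr * dW)                         # gradient descent update
    b.assign_sub(lr * db)
```

Here, tf.GradientTape computes the same gradients you derived by hand earlier in the session, which is precisely the convenience the library provides.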

In the next session, we will explore the implementation of neural networks using Keras, along with some commonly used best practices for training neural networks, such as dropout and batch normalisation.
