
Summary

In this session, we learned the intuition behind boosting and studied the AdaBoost algorithm in detail.

  • AdaBoost starts with a uniform distribution of weights over the training examples.
  • Each weight indicates how important the corresponding data point is in the current round.
  • Training begins with a weak learner h1(x), which produces the initial prediction.
  • Examples that the previous models misclassified are given larger weights, so capturing those patterns becomes the goal of the next model.
  • The next model (weak learner) trains on this reweighted (or resampled) data to produce the next prediction.
  • In the end, we take a weighted sum of all the weak classifiers to form a strong classifier.
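The steps above can be sketched as a minimal from-scratch AdaBoost using one-dimensional decision stumps as the weak learners. The toy dataset, the exhaustive stump search, and the function names here are illustrative assumptions, not part of the session:

```python
import numpy as np

def adaboost(X, y, n_rounds=3):
    """Minimal AdaBoost sketch: X is a (n,) array of 1-D features,
    y holds labels in {-1, +1}. Returns (threshold, polarity, alpha) stumps."""
    n = len(X)
    w = np.full(n, 1.0 / n)          # uniform initial weights
    stumps = []
    for _ in range(n_rounds):
        # Exhaustively pick the stump with the lowest weighted error.
        best, best_err = None, np.inf
        for thr in X:
            for pol in (1, -1):
                pred = pol * np.sign(X - thr + 1e-12)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (thr, pol)
        thr, pol = best
        eps = max(best_err, 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)    # this stump's vote weight
        pred = pol * np.sign(X - thr + 1e-12)
        # Up-weight misclassified points so the next stump focuses on them.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append((thr, pol, alpha))
    return stumps

def predict(stumps, X):
    # Weighted vote of all weak learners -> strong classifier.
    score = sum(alpha * pol * np.sign(X - thr + 1e-12)
                for thr, pol, alpha in stumps)
    return np.sign(score)

# Toy 1-D problem no single stump can solve, but three boosted stumps can.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1, 1, -1, -1, 1, 1])
model = adaboost(X, y, n_rounds=3)
print(predict(model, X))  # → [ 1.  1. -1. -1.  1.  1.]
```

Note how the final prediction is exactly the weighted sum described in the last bullet: each stump votes with weight alpha, and larger alphas go to stumps with lower weighted error.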

