Now that we have understood the intuition behind the algorithm, let us look at its pseudo-code.

In the next video, Anjali will explain the pseudo-code in conjunction with the numerical example we saw earlier.

Here is the summary of the AdaBoost algorithm we have studied until now:

- Initialize the probabilities of the distribution as $D_1(i) = \frac{1}{n}$, where $n$ is the number of data points.
- For $t = 1$ to $T$, repeat the following ($T$ is the total number of trees):

1. Fit a tree $h_t$ on the training data using the respective probabilities $D_t$.

2. Compute the weighted error

$$\epsilon_t = \sum_{i=1}^{n} D_t(i)\,\big[\,h_t(x_i) \neq y_i\,\big]$$

3. Compute the weight of the weak learner

$$\alpha_t = \frac{1}{2}\ln\!\left(\frac{1-\epsilon_t}{\epsilon_t}\right)$$

4. Update the distribution

$$D_{t+1}(i) = \frac{D_t(i)\,e^{-\alpha_t y_i h_t(x_i)}}{Z_t}$$

where $Z_t$ is the normalization factor that makes $D_{t+1}$ sum to 1.

Final model:

$$H(x) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t h_t(x)\right)$$
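As a concrete illustration, these steps can be sketched in Python using decision stumps (depth-1 trees) from scikit-learn as the weak learners. The dataset, the number of rounds, and the helper names (`adaboost_fit`, `adaboost_predict`) are illustrative assumptions, not part of the lecture:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, T=10):
    """Fit AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n = len(X)
    D = np.full(n, 1.0 / n)                       # initialize D_1(i) = 1/n
    stumps, alphas = [], []
    for t in range(T):
        h = DecisionTreeClassifier(max_depth=1)
        h.fit(X, y, sample_weight=D)              # step 1: fit using D_t
        pred = h.predict(X)
        eps = np.sum(D * (pred != y))             # step 2: weighted error
        eps = np.clip(eps, 1e-10, 1 - 1e-10)      # guard against division by zero
        alpha = 0.5 * np.log((1 - eps) / eps)     # step 3: learner weight
        D = D * np.exp(-alpha * y * pred)         # step 4: reweight the points
        D /= D.sum()                              # normalize (the Z_t factor)
        stumps.append(h)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Final model: sign of the alpha-weighted sum of stump predictions."""
    agg = sum(a * h.predict(X) for h, a in zip(stumps, alphas))
    return np.sign(agg)

# Demo on a toy dataset with a diagonal decision boundary (assumed data)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps, alphas = adaboost_fit(X, y, T=20)
train_acc = np.mean(adaboost_predict(stumps, alphas, X) == y)
```

No single axis-aligned stump can fit the diagonal boundary here, but the boosted ensemble approximates it well, which is exactly the point of combining weak learners.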

We see here that with each new weak learner, the distribution of the data changes, i.e., the weight given to each observation changes.

Observe the factor $e^{-\alpha_t y_i h_t(x_i)}$ in the distribution update.

If the model misclassifies a point, then $y_i \, h_t(x_i) = -1$, because the prediction and the true label have opposite signs.

So the exponent is positive, and the factor equals $e^{\alpha_t} > 1$ (assuming $\alpha_t > 0$, i.e., the weak learner does better than random guessing).

This indicates that the weight will increase for all misclassified points.

Otherwise, if the point is correctly classified, then $y_i \, h_t(x_i) = +1$.

The exponent is then negative, so the factor equals $e^{-\alpha_t} < 1$ and the weight decays. This indicates that the weight will decrease for all correctly classified points.
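To make this concrete, here is a tiny numeric check. The value $\alpha_t = 0.5$ is an assumed example, not taken from the lecture:

```python
import numpy as np

alpha_t = 0.5                          # assumed learner weight, for illustration only
factor_mis = np.exp(-alpha_t * (-1))   # y_i * h_t(x_i) = -1: misclassified point
factor_cor = np.exp(-alpha_t * (+1))   # y_i * h_t(x_i) = +1: correctly classified point
# factor_mis > 1 inflates the weight; factor_cor < 1 shrinks it
```

Note that the two factors multiply to 1: before normalization by $Z_t$, the update inflates errors and shrinks successes symmetrically on the log scale.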

The model continues adding weak learners until a preset number of weak learners has been added.

We then make the final prediction by adding up the weighted prediction for every classifier.
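For instance, with three hypothetical weak learners (the weights and predictions below are made up for illustration), the final prediction is the sign of the $\alpha$-weighted vote:

```python
import numpy as np

# Hypothetical weights and predictions of three weak learners on one point
alphas = np.array([0.2, 0.9, 0.3])
preds = np.array([+1, -1, +1])         # each h_t(x) is in {-1, +1}
H_x = np.sign(np.dot(alphas, preds))   # weighted vote: 0.2 - 0.9 + 0.3 = -0.4
```

Even though two of the three learners vote $+1$, the learner with the largest weight overrules the simple majority and the final prediction is $-1$.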

Note: Summarizing the notation used in the lecture: at iteration $t$, we have a distribution $D_t$ over the training data, on which we fit a model $h_t$ and then use the results to create a new distribution $D_{t+1}$.

The final model $H(x)$ we built is an ensemble of all the individual models $h_i$ with weights $\alpha_i$.

In the next segment, you will learn how these steps are performed. Before that, try applying the ideas we have just learnt to the following questions.
