Introduction

Until now, you have learned the original boosting algorithm, AdaBoost.

In this session, we will learn another popular boosting algorithm, Gradient Boosting, along with XGBoost, an extension of Gradient Boosting that is widely used in industry. We will also briefly discuss some of the more advanced boosting algorithms introduced recently.

Towards the end, we will go through the implementation of these algorithms on a Kaggle dataset in Python.
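As a preview of what the implementation looks like, here is a minimal sketch using scikit-learn's `GradientBoostingClassifier` on a synthetic dataset (the dataset and hyperparameters here are illustrative only, not the Kaggle data used later in the session):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the Kaggle dataset used later in the session
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Typical starting hyperparameters: 100 shallow trees, modest learning rate
model = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=42
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

The `learning_rate` and `n_estimators` parameters trade off against each other; the session discusses how to tune them.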

In this session

Let’s look at the broad flow of this session.

Prerequisites

Understanding how gradient descent works is a prerequisite for this session; you have already covered this topic in the Linear Regression module.
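As a quick refresher (this toy example is ours, not part of the module content), gradient descent repeatedly steps against the gradient to minimise a function. Here we minimise f(w) = (w - 3)^2, whose gradient is f'(w) = 2(w - 3):

```python
def gradient_descent(lr=0.1, steps=100):
    """Minimise f(w) = (w - 3)^2 starting from w = 0."""
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # derivative of (w - 3)^2
        w -= lr * grad      # step in the direction of steepest descent
    return w

print(round(gradient_descent(), 4))  # converges to the minimum at w = 3.0
```

Gradient Boosting applies this same idea in function space: each new tree takes a step against the gradient of the loss.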

Guidelines for in-module questions

The in-video and in-content questions for this module are not graded.

People you will hear from in this session

Subject Matter Expert:

Prof G Srinivasaraghavan

Professor, IIIT-B

The International Institute of Information Technology, Bangalore, commonly known as IIIT Bangalore, is a premier national graduate school in India. Founded in 1999, it offers Integrated M.Tech., M.Tech., M.S. (Research) and PhD programs in the field of Information Technology.

Anjali Rajvanshi

Sr. Subject Matter Expert, upGrad

Anjali has around 11 years of experience and has worked as a software engineer, data scientist, and project lead for companies such as Infosys and Evalueserve, across geographies. She is currently a Senior Subject Matter Expert at upGrad.

Snehansu Sekhar Sahu

Sr. Research Engineer in ML & AI at American Express

Snehanshu has more than six years of experience and has worked with companies such as Qualcomm, Infoedge, and American Express. He is currently part of the ML & AI Research Group at American Express.
