Simple linear regression is a useful approach for predicting a response on the basis of a single predictor variable. In practice, however, we often have more than one predictor. With multiple predictors, the multiple linear regression model takes the form:

Y_i = \beta_0 + \beta_1 X_{i1} + \beta_2 X_{i2} + \dots + \beta_k X_{ik} + \epsilon_i

where k is the number of predictors in the model and i = 1, 2, …, n, with n being the total number of observations. In the upcoming video, you will learn how the multiple linear regression model can be represented in matrix form.
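The model above can be sketched in plain Python. This is a minimal illustration with made-up numbers (k = 2 predictors, n = 3 observations), not data from the course:

```python
# Multiple linear regression, computed observation by observation:
# y_i = beta_0 + beta_1 * x_i1 + ... + beta_k * x_ik + e_i
# (all values below are made up for illustration)

beta = [5.0, 2.0, -1.0]        # beta_0 (intercept), beta_1, beta_2
x = [[1.0, 4.0],               # observation 1: x_11, x_12
     [2.0, 3.0],               # observation 2: x_21, x_22
     [3.0, 1.0]]               # observation 3: x_31, x_32
e = [0.1, -0.2, 0.05]          # error term for each observation

# For each observation i, sum the intercept, the weighted predictors,
# and the error term to get the response y_i.
y = [beta[0] + sum(b * xij for b, xij in zip(beta[1:], xi)) + ei
     for xi, ei in zip(x, e)]
print(y)
```

Each element of `y` is one observation's response, built from the same k + 1 coefficients.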

In the video, Anjali explained how the equations for the ‘n’ observations can be converted into their matrix equivalent:

\begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix} = \begin{pmatrix} 1 & X_{11} & \dots & X_{1k} \\ 1 & X_{21} & \dots & X_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & X_{n1} & \dots & X_{nk} \end{pmatrix} \begin{pmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{pmatrix} + \begin{pmatrix} \epsilon_1 \\ \epsilon_2 \\ \vdots \\ \epsilon_n \end{pmatrix}

This matrix can be written in a very concise notation as:

Y = X\beta + \epsilon

Here, each row of the X matrix corresponds to one of the ‘n’ observations, and each column corresponds to one predictor, along with a first column of 1’s for the intercept. Since there are k predictors, the X matrix has k + 1 columns and the coefficient vector β has k + 1 elements. Finally, we add the product of the design matrix X and the coefficient vector β to the error vector ε to get the Y vector.
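The same computation can be sketched in matrix form with NumPy. This is a minimal illustration with made-up numbers (n = 3 observations, k = 2 predictors), where the design matrix X carries a first column of 1's for the intercept:

```python
import numpy as np

# Design matrix X: first column of 1's (intercept), then one column
# per predictor, giving k + 1 = 3 columns for n = 3 observations.
X = np.array([[1.0, 1.0, 4.0],
              [1.0, 2.0, 3.0],
              [1.0, 3.0, 1.0]])

beta = np.array([5.0, 2.0, -1.0])      # k + 1 coefficients: beta_0 ... beta_k
epsilon = np.array([0.1, -0.2, 0.05])  # error vector, one entry per observation

# Y = X * beta + epsilon: the product of the design matrix and the
# coefficient vector, plus the error vector.
Y = X @ beta + epsilon
print(X.shape)  # (3, 3): n rows, k + 1 columns
print(Y.shape)  # (3,): one response per observation
```

Note that one matrix product replaces the n separate scalar equations, which is why the matrix notation is so concise.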

In the next segment, you will learn about the assumptions of linear regression.