In the previous segments of this session, you learned about the two basic requirements for building an Auto Regressive model: stationarity and autocorrelation. In this segment and the next session, you will study the following Auto Regressive models:

- Auto Regressive (AR)
- Moving Average (MA)
- Auto Regressive Moving Average (ARMA)
- Auto Regressive Integrated Moving Average (ARIMA)
- Seasonal Auto Regressive Integrated Moving Average (SARIMA)
- Seasonal Auto Regressive Integrated Moving Average with Exogenous variables (SARIMAX)

In this session, we will cover the Simple Auto Regressive (AR) model and the Moving Average (MA) model.

Let us start with the first model, i.e., the **Simple Auto Regressive (AR) model**.

The Simple Auto Regressive model predicts a future observation as a linear regression of one or more past observations. In simpler terms, the model forecasts the dependent variable (the future observation) from one or more known independent variables (the past observations). This model has a parameter **‘p’**, called the **lag order**, which is the maximum lag included in the model; in other words, up to ‘p’ past data points are used to predict a future data point.
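To make the "linear regression of past observations" idea concrete, here is a minimal sketch using only NumPy. The series and the choice of p = 2 are illustrative, not taken from the course data:

```python
# Sketch: an AR(p) forecast viewed as ordinary linear regression on
# lagged values. Illustrative series, p = 2.
import numpy as np

series = np.array([3.0, 4.0, 3.5, 5.0, 4.8, 5.5, 5.2, 6.1, 6.0, 6.8])
p = 2

# Design matrix: each row holds the p previous observations for one target.
X = np.column_stack(
    [series[p - k - 1 : len(series) - k - 1] for k in range(p)]
)
X = np.column_stack([np.ones(len(X)), X])  # add an intercept column
y = series[p:]                             # targets: observations from t = p on

# Least-squares fit gives the intercept c and coefficients phi_1, phi_2.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast from the two most recent observations.
next_value = coef @ np.array([1.0, series[-1], series[-2]])
```

In practice you would use a library such as statsmodels rather than fitting the regression by hand, but the underlying computation is exactly this.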

**Example**

Consider the example of forecasting the monthly sales of ice cream for the year 2021 on the basis of the previous 3 years’ monthly sales data. This is a simple Auto Regressive model.

#### Determining the value of the parameter ‘p’

- Plot partial autocorrelation function
- Select p as the highest lag where partial autocorrelation is significantly high

Here, the lags 1, 2, 4 and 12 have significant partial autocorrelation, i.e., a significant influence on the future observation (these are the bars that cross the red significance line in the plot). Hence, ‘p’ is set to 12, since that is the highest lag at which the partial autocorrelation is significant.

- Build the Auto Regression model equation:

The lags with significant partial autocorrelation are 1, 2, 4 and 12. Therefore, the regression model takes the past observations $y_{t-1}$, $y_{t-2}$, $y_{t-4}$ and $y_{t-12}$ as independent variables to predict the dependent variable $\hat{y}_t$.
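Written out, this regression takes the following form, where the intercept $c$ and the coefficients $\phi_i$ are estimated from the training data:

```latex
\hat{y}_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \phi_4 y_{t-4} + \phi_{12} y_{t-12}
```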

Now let us return to the airline passenger dataset and build an AR model on it. To build the AR model, the stationary series is first divided into training and test data.

You have built the model, but you do not yet have any idea what the forecast looks like. Let us look at that, and at the model's accuracy, in the video below. Also, recall that you performed a Box-Cox transformation and differencing in order to convert the airline passenger data into a stationary time series. To recover the forecast on the original scale, you will have to reverse these transformations. Let’s learn how to do that as well from Chiranjoy.
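The reversal works in the opposite order of the original transformations: a cumulative sum undoes first-order differencing, and `scipy.special.inv_boxcox` undoes the Box-Cox transform. A self-contained sketch on a few illustrative values (not the full dataset):

```python
# Sketch: reversing differencing and Box-Cox to recover the original
# scale. Assumes first-order differencing was applied after Box-Cox.
import numpy as np
from scipy.special import inv_boxcox
from scipy.stats import boxcox

original = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])

transformed, lmbda = boxcox(original)  # Box-Cox, lambda found automatically
differenced = np.diff(transformed)     # first-order differencing

# Reverse step 1: cumulative sum from the first transformed value
# restores the Box-Cox series.
restored_transformed = np.concatenate(
    ([transformed[0]], transformed[0] + np.cumsum(differenced))
)
# Reverse step 2: invert the Box-Cox transform.
restored = inv_boxcox(restored_transformed, lmbda)
# restored now matches `original` up to floating-point precision.
```

For real forecasts, the cumulative sum is seeded with the last observed value of the transformed training series rather than the first, but the mechanics are the same.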