5 Modeling an autoregressive process

 

This chapter covers

  • Illustrating an autoregressive process
  • Defining the partial autocorrelation function (PACF)
  • Using the PACF plot to determine the order of an autoregressive process
  • Forecasting a time series using the autoregressive model

In the previous chapter, we covered the moving average process, also denoted as MA(q), where q is the order. You learned that in a moving average process, the present value is linearly dependent on the current and past error terms. Therefore, if you predict more than q steps ahead, the prediction falls flat and returns only the mean of the series, because the error terms are not observed in the data and must be recursively estimated. Finally, you saw that you can determine the order of a stationary MA(q) process by studying the ACF plot; the autocorrelation coefficients are significant up until lag q. If the autocorrelation coefficients instead decay slowly or exhibit a sinusoidal pattern, you are likely in the presence of an autoregressive process.
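As a quick refresher, the short sketch below (an illustrative example, not one of this book's listings; it assumes the statsmodels and matplotlib libraries are installed) simulates a stationary MA(2) process and plots its ACF. The autocorrelation coefficients should be significant only up to lag 2, whereas a slow decay or sinusoidal pattern in such a plot would instead point to an autoregressive process.

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.graphics.tsaplots import plot_acf

np.random.seed(42)                      # reproducible simulation
ar = np.array([1])                      # no autoregressive component
ma = np.array([1, 0.9, 0.3])            # MA(2) with theta_1 = 0.9 and theta_2 = 0.3
MA2 = ArmaProcess(ar, ma).generate_sample(nsample=1000)

plot_acf(MA2, lags=20)                  # expect significant coefficients only up to lag 2
plt.show()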

In this chapter, we will first define the autoregressive process. Then, we will define the partial autocorrelation function and use it to find the order of the underlying autoregressive process of a dataset. Finally, we will use the AR(p) model to produce forecasts.
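To preview how this works in practice, here is a minimal sketch (again an illustrative example assuming statsmodels and matplotlib, not one of the chapter's listings) that simulates a stationary AR(2) process and plots its PACF. The partial autocorrelation coefficients should be significant only up to lag 2, which is how the PACF plot reveals the order p.

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.graphics.tsaplots import plot_pacf

np.random.seed(42)                      # reproducible simulation
ar = np.array([1, -0.5, -0.3])          # AR(2): y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + noise
ma = np.array([1])                      # no moving average component
AR2 = ArmaProcess(ar, ma).generate_sample(nsample=1000)

plot_pacf(AR2, lags=20)                 # expect significant coefficients only up to lag 2
plt.show()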

5.1 Predicting the average weekly foot traffic in a retail store

5.2 Defining the autoregressive process

5.3 Finding the order of a stationary autoregressive process

5.3.1 The partial autocorrelation function (PACF)

5.4 Forecasting an autoregressive process

5.5 Next steps

5.6 Exercises

5.6.1 Simulate an AR(2) process and make forecasts

5.6.2 Simulate an AR(p) process and make forecasts

Summary