3 Forecasting with TimeGPT

 

This chapter covers

  • Defining generative models
  • Exploring the architecture and inner workings of TimeGPT
  • Forecasting with TimeGPT
  • Anomaly detection with TimeGPT

In chapter 2, we built our own tiny foundation forecasting model and lived through the challenges of building such models. For example, because we trained only on monthly data, the model struggled to make accurate predictions on daily data. Thus, unless we are willing to spend many months collecting varied data and pretraining large models, we should use existing large time series models.

From this chapter through chapter 9, we use a dataset tracking the weekly sales of Walmart stores from 2010-02-05 to 2012-10-26 as the forecasting scenario for applying the different foundation models we explore in this book.

This dataset is released under the Creative Commons 0 license. While the original dataset tracked thousands of stores, we use a subset with only four stores. The dataset also includes exogenous variables for each store location, such as a holiday indicator, the average temperature, the average fuel price, the average consumer price index (CPI), and the average unemployment rate.
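To make the dataset concrete before we start forecasting, the following is a minimal sketch of loading and inspecting it with pandas. The file name (walmart_sales.csv), the store IDs, and the exact column names are assumptions for illustration; adjust them to match the files provided with the book's code repository.

```python
import pandas as pd

# Assumed file name and column names -- adjust to the actual files
# shipped with the book's code repository.
df = pd.read_csv("walmart_sales.csv", parse_dates=["Date"])

# Keep the four stores used throughout chapters 3-9 (store IDs assumed here).
df = df[df["Store"].isin([1, 2, 3, 4])]

# The target is the weekly sales column; the remaining columns hold the
# exogenous variables described above (holiday indicator, temperature,
# fuel price, CPI, unemployment).
print(df.head())

# The dates should span 2010-02-05 to 2012-10-26 at a weekly frequency.
print(df["Date"].min(), df["Date"].max())
```

We will reuse this dataset in every chapter up to chapter 9, so it is worth running a quick inspection like this once to confirm the date range, the frequency, and the available exogenous columns.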

3.1 Defining generative pretrained Transformers

3.2 Exploring TimeGPT

3.2.1 Training TimeGPT

3.2.2 Uncertainty quantification in TimeGPT

3.3 Forecasting with TimeGPT

3.3.1 Fine-tuning with TimeGPT

3.3.2 Forecasting with exogenous variables

3.3.3 Cross-validation with TimeGPT

3.3.4 Forecasting on a long horizon with TimeGPT

3.4 Anomaly detection with TimeGPT

3.5 Next steps

3.6 Summary

3.7 References