13 Timeseries forecasting


This chapter covers

  • An overview of machine learning for timeseries
  • Understanding recurrent neural networks (RNNs)
  • Applying RNNs to a temperature forecasting example

This chapter tackles timeseries, where temporal order is everything. We’ll focus on the most common and valuable timeseries task: forecasting. Using the recent past to predict the near future is a powerful capability, whether you’re trying to anticipate energy demand, manage inventory, or simply forecast the weather.

13.1 Different kinds of timeseries tasks

A timeseries can be any data obtained via measurements at regular intervals, like the daily price of a stock, the hourly electricity consumption of a city, or the weekly sales of a store. Timeseries are everywhere, whether we’re looking at natural phenomena (like seismic activity, the evolution of fish populations in a river, or the weather at a location) or human activity patterns (like visitors to a website, a country’s GDP, or credit card transactions). Unlike the types of data you’ve encountered so far, working with timeseries involves understanding the dynamics of a system—its periodic cycles, how it trends over time, its regular regime, and its sudden spikes. By far the most common timeseries task is forecasting: predicting what comes next in a series. But there are other kinds of tasks too, such as classifying a series, detecting a specific event within it, or spotting anomalies in a continuous stream of measurements.
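To make these ideas concrete, here is a minimal sketch of what a timeseries and a forecasting setup look like in code. The synthetic series below (its trend, cycle length, noise level, and the window and horizon values are illustrative assumptions, not the chapter's weather data) combines a trend, a periodic cycle, and noise, and is then framed as pairs of "recent past" windows and the value to predict.

import numpy as np

# Illustrative synthetic series: a slow upward trend, a repeating 24-step
# cycle, and random noise. All component shapes are assumptions for the sketch.
steps = np.arange(1000)
trend = 0.01 * steps                          # gradual drift upward
cycle = 5 * np.sin(2 * np.pi * steps / 24)    # periodic cycle
noise = np.random.normal(scale=0.5, size=steps.shape)
series = trend + cycle + noise

# Frame a forecasting task: use the previous `window` values to predict
# the value `horizon` steps ahead.
window, horizon = 48, 1
inputs = np.stack([series[i : i + window]
                   for i in range(len(series) - window - horizon + 1)])
targets = series[window + horizon - 1 :]

print(inputs.shape)   # (952, 48) -- one sample per sliding window of the past
print(targets.shape)  # (952,)    -- the future value paired with each window

Every forecasting model in this chapter, from the commonsense baseline to the recurrent networks, consumes data framed in essentially this way: a window of past observations as input, a future value as the target.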

13.2 A temperature forecasting example

13.2.1 Preparing the data

13.2.2 A commonsense, non-machine-learning baseline

13.2.3 Let’s try a basic machine learning model

13.2.4 Let’s try a 1D convolutional model

13.3 Recurrent neural networks

13.3.1 Understanding recurrent neural networks

13.3.2 A recurrent layer in Keras

13.3.3 Getting the most out of recurrent neural networks

13.3.4 Using recurrent dropout to fight overfitting

13.3.5 Stacking recurrent layers

13.3.6 Using bidirectional RNNs

13.4 Going even further

Summary