In this chapter, we will look at how to customize a sequential search method that iteratively explores the hyperparameter search space to discover better hyperparameters. You will learn how to implement different sequential search methods for selecting pipelines from the search space in each trial. These search methods fall into two categories:
- History-independent sequential search methods do not use the results of previous trials to guide later ones. For example, grid search, which we looked at in chapter 2, traverses all possible combinations of values in the candidate hyperparameter sets, and in chapter 6, we used the random search method to select hyperparameter combinations from the search space at random. These are the two most representative history-independent methods. Some more advanced random search methods do make use of history, such as the quasi-random search method using Sobol sequences (http://mng.bz/6Z7A), which spreads samples more evenly by accounting for the positions of previously drawn points, but here we'll consider only the vanilla uniform random search method.
- History-dependent sequential search methods, such as Bayesian optimization, leverage the results of previous trials to improve the effectiveness of the search.
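The contrast between the two categories can be sketched with two toy search loops. The objective function and the region-shrinking strategy below are hypothetical stand-ins (not the implementations used in this book): the objective plays the role of a pipeline's validation score, and the shrinking loop is only a crude illustration of how a history-dependent method exploits previous results, far simpler than real Bayesian optimization.

```python
import random

def objective(x):
    # Toy stand-in for a pipeline's validation score (hypothetical);
    # the best hyperparameter value here is x = 0.3.
    return -(x - 0.3) ** 2

def random_search(trials=50, seed=0):
    # History-independent: every trial samples the search space
    # uniformly, ignoring the scores of all previous trials.
    rng = random.Random(seed)
    best_x, best_score = None, float("-inf")
    for _ in range(trials):
        x = rng.uniform(0.0, 1.0)
        score = objective(x)
        if score > best_score:
            best_x, best_score = x, score
    return best_x, best_score

def history_dependent_search(trials=50, seed=0):
    # History-dependent (illustrative only): after each improvement,
    # shrink the sampling region around the best point found so far,
    # so later trials are guided by earlier results.
    rng = random.Random(seed)
    low, high = 0.0, 1.0
    best_x, best_score = None, float("-inf")
    for _ in range(trials):
        x = rng.uniform(low, high)
        score = objective(x)
        if score > best_score:
            best_x, best_score = x, score
            width = (high - low) * 0.9  # narrow the region by 10%
            low = max(0.0, best_x - width / 2)
            high = min(1.0, best_x + width / 2)
    return best_x, best_score
```

Both loops have the same shape: propose a trial, evaluate it, and keep the best result. The only structural difference is whether the proposal step reads the search history, and that difference is exactly what the rest of this chapter builds on.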