6 Bayesian optimization: Automating experimental optimization

 

This chapter covers

  • Combining ideas from RSM and MAB into one optimization method
  • Automating response surface modeling with Gaussian process regression
  • Automating experiment design by optimizing over an acquisition function

Before we begin, let’s review:

  • In chapter 2 (A/B testing), we talked about how to take a measurement of a business metric.
  • In chapters 3 (multi-armed bandits) and 5 (contextual bandits), we saw that if you adapt your experiments based on uncertainty estimates, you can improve your business metric while your experiment is running. We said we were “balancing exploration with exploitation.”
  • In chapter 4 (response surface methodology), we showed how to use estimates of a business metric—a surrogate function—to reduce the number of measurements required to optimize parameters.

Bayesian optimization (BO) integrates all these ideas—taking measurements, building a surrogate, and balancing exploration with exploitation—into one optimization method that automatically designs a sequence of experiments that optimizes system parameters.
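To make those pieces concrete before we dig in, here is a minimal sketch of a BO loop. It is not the chapter's code: it assumes a toy cpu_time() function standing in for the compiler simulator, uses scikit-learn's GaussianProcessRegressor as the surrogate, and uses expected improvement as one possible acquisition function. The chapter builds up each of these pieces step by step.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cpu_time(x):
    # Hypothetical stand-in for the compiler simulator: a noisy CPU-time
    # measurement at parameter value x (lower is better).
    return (x - 0.7) ** 2 + 0.05 * np.random.normal()

def expected_improvement(mean, std, best):
    # For minimization: expected amount by which a new measurement
    # would beat the best CPU time observed so far.
    std = np.maximum(std, 1e-9)
    z = (best - mean) / std
    return (best - mean) * norm.cdf(z) + std * norm.pdf(z)

# Initial experiment: a few measurements spread over the parameter range.
rng = np.random.default_rng(0)
x_obs = list(rng.uniform(0, 1, size=3))
y_obs = [cpu_time(x) for x in x_obs]

grid = np.linspace(0, 1, 200).reshape(-1, 1)
for _ in range(10):
    # Analyze: fit a GPR surrogate to the measurements taken so far.
    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=0.05**2)
    gpr.fit(np.array(x_obs).reshape(-1, 1), np.array(y_obs))
    mean, std = gpr.predict(grid, return_std=True)

    # Design: measure next where the acquisition function is largest.
    ei = expected_improvement(mean, std, best=min(y_obs))
    x_next = float(grid[np.argmax(ei), 0])

    # Measure: run the experiment at the chosen parameter value.
    x_obs.append(x_next)
    y_obs.append(cpu_time(x_next))

print("best parameter value found:", x_obs[int(np.argmin(y_obs))])
```

Notice the analyze-design-measure rhythm inside the loop: the surrogate model supplies both an estimate and an uncertainty, and the acquisition function trades the two off to choose the next measurement, which is exactly the exploration-exploitation balance we built by hand in earlier chapters.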

6.1 Optimizing a single compiler parameter, a visual explanation

6.1.1 Simulate the compiler

6.1.2 Run the initial experiment

6.1.3 Analyze: Model the response surface

6.1.4 Design: Select the parameter value to measure next

6.1.5 Design: Balance exploration with exploitation

6.2 Model the response surface with Gaussian process regression

6.2.1 Estimate the expected CPU time

6.2.2 Estimate uncertainty with GPR

6.3 Optimize over an acquisition function

6.3.1 Minimize the acquisition function

6.4 Optimize all seven compiler parameters

6.4.1 Random search