This chapter covers
- Prefetching data for hiding read latency
- Optimistic updates for hiding write latency
- Speculative execution for hiding execution latency
- Predictive resource allocation for hiding provisioning latency
 
When building for low latency, you’ll sometimes encounter operations that take a long time to complete and that you cannot make run faster. You therefore need to look for ways to hide the latency. In the previous chapter, we discussed asynchronous processing, which lets you hide latency by performing work in the background. However, even that is not always enough, and sometimes you need to be more proactive and perform long-running operations ahead of time.
Predictive techniques are all about figuring out good times to perform long-running operations ahead of time so that their results are already available when they’re needed, effectively hiding the latency. Prefetching, optimistic updates, speculative execution, and predictive resource allocation all work by predicting the future: they start expensive operations before their results are actually requested. Of course, as with all latency optimization techniques, each comes with an underlying tradeoff and cost, which you need to factor in when applying them to your applications.
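To make the idea concrete before we dive into the individual techniques, here is a minimal prefetching sketch in Python. The `fetch_profile` function and the scenario around it are hypothetical; the point is only to show the shape of the pattern: the slow read is started as soon as we predict it will be needed, so its result is ready by the time it is actually consumed.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def fetch_profile(user_id):
    """Hypothetical slow read we cannot make faster, such as a remote fetch."""
    time.sleep(0.5)  # simulate network or storage latency
    return {"id": user_id, "name": f"user-{user_id}"}

executor = ThreadPoolExecutor(max_workers=4)

# Prefetch: start the slow read as soon as we predict it will be needed,
# for example when the user hovers over a link to the profile page.
future = executor.submit(fetch_profile, 42)

# ... the application keeps doing other work while the fetch runs ...

# By the time the result is actually needed, it is (ideally) already
# available, so this call returns immediately instead of blocking for
# the full duration of the read.
profile = future.result()
print(profile["name"])
```

The latency of the read hasn’t changed; it has merely been overlapped with other work, so the user never observes it. The tradeoff is that a prediction can be wrong, in which case the prefetched work is wasted. The rest of this chapter looks at this balance for each technique in turn.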