11 Predictive techniques
This chapter covers
- Prefetching data for hiding read latency
- Optimistic updates for hiding write latency
- Speculative execution for hiding execution latency
- Predictive resource allocation for hiding provisioning latency
Welcome to this final chapter of the book!
When building for low latency, you sometimes encounter operations that take a long time to complete and that you cannot make run faster. You therefore need to look at ways to hide the latency instead. In the previous chapter, we discussed asynchronous processing, which allows you to hide latency by performing work in the background. However, sometimes even that is not enough, and you need to hide latency more proactively by performing long-running operations ahead of time.
Predictive techniques are all about figuring out a good time to perform long-running operations ahead of time so that their results are already available when needed, effectively hiding the latency. Techniques such as prefetching, optimistic updates, speculative execution, and predictive resource allocation hide the latency of slow operations by predicting the future. Of course, as with all latency optimization techniques, each of them has an underlying trade-off and cost, which you need to factor in when applying them in your application.
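To make the idea concrete before we look at each technique in detail, here is a minimal sketch of prefetching in Python. It is not taken from the chapter's code: the `slow_read` backend and the "next page is N + 1" prediction are illustrative assumptions, standing in for whatever data source and access pattern your application has.

```python
import threading
import time

# Hypothetical slow backend read; stands in for a database or remote call.
def slow_read(page: int) -> str:
    time.sleep(0.1)  # simulate 100 ms of read latency
    return f"contents of page {page}"

class PrefetchingReader:
    """Serves reads from a small cache and predicts the next read.

    The prediction here is deliberately naive: after serving page N, it
    assumes page N + 1 will be requested next and fetches it in the
    background, so the follow-up read finds the data already cached.
    """

    def __init__(self) -> None:
        self._cache: dict[int, str] = {}
        self._lock = threading.Lock()

    def read(self, page: int) -> str:
        with self._lock:
            value = self._cache.get(page)
        if value is None:
            value = slow_read(page)  # cache miss: pay the full read latency
            with self._lock:
                self._cache[page] = value
        # Predict the next access and prefetch it off the critical path.
        threading.Thread(target=self._prefetch, args=(page + 1,), daemon=True).start()
        return value

    def _prefetch(self, page: int) -> None:
        with self._lock:
            if page in self._cache:
                return
        value = slow_read(page)
        with self._lock:
            self._cache[page] = value

reader = PrefetchingReader()
reader.read(1)   # cold read: pays ~100 ms
time.sleep(0.2)  # give the background prefetch time to complete
start = time.perf_counter()
reader.read(2)   # predicted and prefetched: returns almost immediately
print(f"second read took {time.perf_counter() - start:.4f} s")
```

The trade-off is already visible in this sketch: if the prediction is wrong, the prefetched page is wasted work that consumed backend capacity and cache space for nothing.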
With that in mind, let’s explore predictive techniques for hiding latency and complete our journey of optimizing for low latency!