Chapter 8. Cache me if you can


This chapter covers

  • Managing client-side caching
  • Introducing Catbox
  • Server-side caching with hapi

There are two things that we always want to do in web applications:

  • Give users the most up-to-date, relevant data that we can get our hands on
  • Make the experience as quick as possible

We can’t always have both at the same time, so we compromise. Caching is the name of that compromise: it lets us serve data that is fresh enough, with the benefit that we can serve it a lot faster. Often that isn’t much of a compromise to make at all, as figure 8.1 illustrates, particularly when your data changes infrequently.

Figure 8.1. The server-side caching trade-off

It’s faster to serve from a cache for a few reasons. The obvious one is that pulling data from a cache is usually physically much quicker than fetching it from its source. Either the cache is located closer to you (your browser cache, for example, lives on the same machine as your browser), or it’s stored in a form that’s faster to read from. For instance, RAM is roughly 80 times faster to read from than an optical disk, and probably several thousand times faster than fetching the same amount of data over a network.

8.1. Client-side caching
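
Client-side caching comes down to telling browsers (and any other HTTP clients) how long they may reuse a response before asking the server again, and hapi lets you express that declaratively on a route. The listing below is a minimal sketch, assuming the callback-style hapi API with a config.cache route option; the /stats path, the port, the visitor count, and the 30-second lifetime are invented for illustration, and newer hapi releases accept the same cache settings on async routes.

const Hapi = require('hapi');

const server = new Hapi.Server();
server.connection({ port: 4000 });

server.route({
    method: 'GET',
    path: '/stats',                      // hypothetical route, for illustration only
    config: {
        cache: {
            expiresIn: 30 * 1000,        // clients may reuse the response for 30 seconds
            privacy: 'private'           // ...and only the end user may cache it
        }
    },
    handler: function (request, reply) {

        reply({ visitors: 1024 });       // hapi derives the Cache-Control header from the options above
    }
});

server.start((err) => {

    if (err) {
        throw err;
    }

    console.log('Server running at', server.info.uri);
});

Requesting /stats with curl -i would show the Cache-Control header that hapi generates from those options; the handler itself never has to think about caching.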

8.2. Introducing Catbox: a multi-strategy object-caching library
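
Catbox is the caching library that hapi uses internally, and you can also use it on its own: you pick a strategy (an adapter such as in-memory, Redis, or Memcached), start a client, and then get and set items by a segment/id key with a time-to-live. Here is a minimal sketch using the callback-style catbox and catbox-memory packages; the partition, segment, and key names are just examples, and more recent @hapi/catbox releases expose the same ideas through a promise-based API.

const Catbox = require('catbox');
const Memory = require('catbox-memory');            // the in-memory strategy

// A client wraps one strategy; 'examples' is an arbitrary partition name
const client = new Catbox.Client(Memory, { partition: 'examples' });

client.start((err) => {

    if (err) {
        throw err;
    }

    const key = { segment: 'countries', id: 'gb' };

    // Store an item for 60 seconds
    client.set(key, { name: 'United Kingdom' }, 60 * 1000, (err) => {

        if (err) {
            throw err;
        }

        // Read it back: cached is null once the TTL has expired
        client.get(key, (err, cached) => {

            if (err) {
                throw err;
            }

            console.log(cached.item);                // { name: 'United Kingdom' }
        });
    });
});

Because every strategy implements the same client interface, swapping the in-memory adapter for Redis or Memcached is essentially a one-line change, plus whatever connection options that adapter needs.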

8.3. Server-side caching in hapi applications
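
Inside a hapi application you rarely talk to Catbox directly; instead you attach a cache policy to things like server methods, and hapi handles the get/set plumbing for you. The sketch below again assumes the callback-style hapi API; slowAdd is an imaginary function standing in for some expensive work, and the expiresIn and generateTimeout values are arbitrary.

const Hapi = require('hapi');

const server = new Hapi.Server();
server.connection({ port: 4000 });

// A deliberately slow function standing in for expensive work
const slowAdd = function (a, b, next) {

    setTimeout(() => {

        next(null, a + b);
    }, 2000);
};

// Register it as a server method with a cache policy attached
server.method('add', slowAdd, {
    cache: {
        expiresIn: 60 * 1000,       // keep each result for a minute
        generateTimeout: 5000       // give up if generating a value takes longer than 5s
    }
});

server.route({
    method: 'GET',
    path: '/add/{a}/{b}',
    handler: function (request, reply) {

        // The first call for a given pair of arguments is slow; repeats hit the cache
        server.methods.add(Number(request.params.a), Number(request.params.b), (err, result) => {

            if (err) {
                return reply(err);
            }

            reply({ result });
        });
    }
});

server.start((err) => {

    if (err) {
        throw err;
    }

    console.log('Server running at', server.info.uri);
});

Hitting /add/3/4 twice makes the difference obvious: the first response takes about two seconds, while the second comes back almost instantly because hapi serves the result from its default in-memory cache.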

8.4. Summary
