8 Caching techniques
This chapter covers
- Caching overview
- HTTP Response Caching (client-side, intermediate, and server-side)
- In-Memory Caching
- Distributed Caching (using SQL Server and Redis)
In information technology, the term "cache" describes a hardware component or a software mechanism used to store data so that future requests for that data can be served faster and - most importantly - without having to retrieve it from scratch. Good caching practices often result in considerable performance benefits: lower latency, less CPU overhead, reduced bandwidth usage, and decreased costs.
From this definition, we can easily understand how adopting and implementing a caching strategy can bring many invaluable optimization advantages. This is especially true for web applications and services (including Web APIs), since they often have to deal with recurring requests targeting the same resources - such as the same HTML page(s) or JSON result(s) accessed by multiple users, to name some common examples. However, introducing a caching mechanism also adds some complexity to our code and can easily cause unwanted side effects when not implemented properly.
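The core idea behind every caching strategy covered in this chapter is the same: a store keyed by request, so that repeated lookups can skip the expensive retrieval. As a minimal, language-agnostic sketch (the `SimpleCache` class, its `get_or_create` method, and the TTL value are illustrative assumptions, not an API from this chapter), it looks like this:

```python
import time

class SimpleCache:
    """A minimal in-memory cache sketch: stores each value together
    with an absolute expiration time (a "time to live", or TTL)."""

    def __init__(self, ttl_seconds: float = 60.0):
        self._ttl = ttl_seconds
        # key -> (expiration timestamp, cached value)
        self._store: dict[str, tuple[float, object]] = {}

    def get_or_create(self, key: str, factory):
        """Return the cached value for `key`, or build it with
        `factory()` and cache it if missing or expired."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[0] > now:
            return entry[1]  # cache hit: skip the expensive retrieval
        value = factory()    # cache miss: retrieve from scratch
        self._store[key] = (now + self._ttl, value)
        return value


cache = SimpleCache(ttl_seconds=30)
calls = 0

def expensive_query():
    """Stand-in for a costly operation (DB query, remote call)."""
    global calls
    calls += 1
    return "result"

first = cache.get_or_create("report", expensive_query)
second = cache.get_or_create("report", expensive_query)
print(calls)  # the second lookup is served from the cache -> 1
```

The trade-offs hinted at above are already visible here: the second request never sees data newer than the TTL allows (staleness), and the cache itself is extra state that must be kept consistent with the source.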