10 Microservice APIs in Kubernetes

 

This chapter covers

  • Deploying an API to Kubernetes
  • Hardening Docker container images
  • Setting up a service mesh for mutual TLS
  • Locking down the network using network policies
  • Supporting external clients with an ingress controller

In the chapters so far, you have learned how to secure user-facing APIs against a variety of threats using security controls such as authentication, authorization, and rate-limiting. It’s increasingly common for applications themselves to be structured as a set of microservices, communicating with each other through internal APIs intended to be called by other microservices rather than directly by users. The example in figure 10.1 shows a set of microservices implementing a fictional web store. A single user-facing API provides an interface for a web application and, in turn, calls several backend microservices to handle stock checks, process payment card details, and arrange for products to be shipped once an order is placed.

Definition

A microservice is an independently deployed service that is a component of a larger application. Microservices are often contrasted with monoliths, in which all the components of an application are bundled into a single deployed unit. Microservices communicate with each other using APIs over a protocol such as HTTP.
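
To make this concrete, the sketch below shows roughly what one such internal API call might look like in Java. It is only an illustration: the class name, the stock-service hostname, and the /stock/{id} path are assumptions, not part of the Natter code. Inside a Kubernetes cluster, a hostname like this would typically be resolved through a Service name rather than a public DNS entry.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal sketch of one microservice calling another's internal API over HTTP.
// The hostname "stock-service" and the /stock/{id} path are hypothetical.
public class StockClient {
    private final HttpClient client = HttpClient.newHttpClient();

    public String checkStock(String productId) throws Exception {
        var request = HttpRequest.newBuilder()
                .uri(URI.create("http://stock-service/stock/" + productId))
                .header("Accept", "application/json")
                .GET()
                .build();
        // Send the request and return the raw JSON body from the backend service
        var response = client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}

Later sections of the chapter deal with what this plain-HTTP call leaves unprotected: encrypting the connection with TLS (or a service mesh) and restricting which services are allowed to talk to each other at all.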

10.1 Microservice APIs on Kubernetes

10.2 Deploying Natter on Kubernetes

10.2.1 Building H2 database as a Docker container

10.2.2 Deploying the database to Kubernetes

10.2.3 Building the Natter API as a Docker container

10.2.4 The link-preview microservice

10.2.5 Deploying the new microservice

10.2.6 Calling the link-preview microservice

10.2.7 Preventing SSRF attacks

10.2.8 DNS rebinding attacks

10.3 Securing microservice communications

10.3.1 Securing communications with TLS

10.3.2 Using a service mesh for TLS

10.3.3 Locking down network connections

10.4 Securing incoming requests

Answers to pop quiz questions

Summary
