21 Asynchronous communication with a message queue

 

This is the final full chapter of the book, and it introduces a new way for the components of a system to communicate: sending and receiving messages using a queue. Message queues have been around for a very long time--they're a way of decoupling components: instead of making direct connections to communicate with each other, components send messages to the queue and receive messages from it. The queue can deliver messages to one or many recipients, and that adds a lot of flexibility to your architecture.
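As a rough sketch of the idea, here are a few lines of Go using a hypothetical in-memory Queue type (not any real message queue API) where the sender only knows about the queue, and the queue fans each message out to every registered recipient:

package main

import "fmt"

// Queue fans each published message out to every registered subscriber.
type Queue struct {
    subscribers []func(msg string)
}

// Subscribe registers a handler that will receive every future message.
func (q *Queue) Subscribe(handler func(msg string)) {
    q.subscribers = append(q.subscribers, handler)
}

// Publish delivers the message to all subscribers - the sender never
// makes a direct connection to any of them.
func (q *Queue) Publish(msg string) {
    for _, handler := range q.subscribers {
        handler(msg)
    }
}

func main() {
    q := &Queue{}
    q.Subscribe(func(msg string) { fmt.Println("audit component received:", msg) })
    q.Subscribe(func(msg string) { fmt.Println("web component received:", msg) })
    q.Publish("new order created") // one message, two recipients
}

A real message queue adds network transport, persistence, and delivery guarantees on top of this shape, but the decoupling benefit is the same: the publisher doesn't change when you add or remove recipients.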

In this chapter we’ll focus on two scenarios that are enabled when you add a message queue to your application: improving system performance and scalability, and adding new features with zero downtime. We’ll use two modern message queues that run very nicely in Docker: Redis and NATS.
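To give a flavor of what that looks like, here is a hedged sketch of publishing a message with the NATS Go client, assuming a NATS server is already running in a local Docker container on the default port (the subject name events.newOrder is made up for this example; the chapter's own demo app and commands may differ):

// Assumes a local server started with something like:
//   docker run -d -p 4222:4222 nats
package main

import (
    "log"

    "github.com/nats-io/nats.go"
)

func main() {
    // nats.DefaultURL points at nats://127.0.0.1:4222
    nc, err := nats.Connect(nats.DefaultURL)
    if err != nil {
        log.Fatal(err)
    }
    defer nc.Close()

    // Publish is fire-and-forget: it returns as soon as the message is
    // handed off, without waiting for any consumer to process it.
    if err := nc.Publish("events.newOrder", []byte("order 1234")); err != nil {
        log.Fatal(err)
    }
    nc.Flush() // make sure the message is on the wire before exiting
}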

21.1 What is asynchronous messaging?

Software components usually communicate synchronously--the client makes a connection to the server, sends a request, waits for the server to send a response, and then closes the connection. That's true for REST APIs, SOAP web services, and gRPC, which all run over HTTP connections (gRPC uses HTTP/2).
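For comparison with the messaging approach, here is a minimal sketch of that synchronous pattern in Go: the http.Get call blocks until the server has returned its whole response (the URL is just a placeholder):

package main

import (
    "fmt"
    "io"
    "log"
    "net/http"
)

func main() {
    // The client is tied up for the whole round trip: connect, send the
    // request, wait for the response, then close the connection.
    resp, err := http.Get("http://localhost:8080/api/orders")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    body, err := io.ReadAll(resp.Body)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(resp.Status, string(body))
}

If the server is slow, the client waits; if the server is unavailable, the call fails. Asynchronous messaging removes that direct dependency, which is what the rest of the chapter builds on.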

21.2 Using a cloud-native message queue

 
 
 

21.3 Consuming and handling messages

 
 
 

21.4 Adding new features with message handlers

 

21.5 Understanding async messaging patterns

 

21.6 Lab

 