Part 2

In part 1, you were introduced to the Apache Kafka event streaming platform and learned, at a high level, about the various components that make up the platform. From there, you learned how the Kafka broker operates and the functions it performs as a central nervous system for your data. In this part, you'll dive in and learn in detail about getting data into Kafka.

First up is Schema Registry. Schema Registry helps enforce the implied contract between Kafka producers and consumers (if you don't know what I mean by an implied contract, don't worry; I'll explain it). But you might say, "I don't use schemas." Well, here's the rub: you're always using a schema; it's just a question of whether that schema is explicit or implicit. By the end of chapter 3, you'll fully understand what I mean by that statement and how Schema Registry solves the problem.
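To make the idea of an implicit schema concrete, here's a minimal sketch (plain Java, not real Kafka or Schema Registry code; the record layout and method names are invented for illustration). The producer and consumer never share a schema object, yet both must agree on the exact byte layout of the record. That unspoken agreement is the implied contract:

```java
import java.nio.charset.StandardCharsets;

public class ImplicitSchemaSketch {

    // "Producer" side: serialize a user record as "name|age" in UTF-8.
    // The field order and the "|" delimiter ARE the schema -- it's just
    // never written down anywhere.
    static byte[] serialize(String name, int age) {
        return (name + "|" + age).getBytes(StandardCharsets.UTF_8);
    }

    // "Consumer" side: this code silently assumes the same layout.
    // If the producer reorders the fields or changes the delimiter,
    // this method breaks -- and nothing warned either party.
    static String[] deserialize(byte[] value) {
        return new String(value, StandardCharsets.UTF_8).split("\\|");
    }

    public static void main(String[] args) {
        byte[] record = serialize("sam", 42);
        String[] fields = deserialize(record);
        System.out.println(fields[0] + " is " + fields[1]); // prints "sam is 42"
    }
}
```

Schema Registry's job, as you'll see in chapter 3, is to make that contract explicit, so a change on the producer side can be checked for compatibility before it silently breaks consumers.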

Next, you'll move on to the workhorses of the Kafka platform: the producer and consumer clients. You'll see how producers get data into Kafka and how consumers get it out. Learning about the Kafka clients is essential because several important tools in the Kafka platform are abstractions built on top of the producer and consumer clients, so understanding how the clients work gives you a head start on understanding those tools as well.