Chapter 2. The unified log

 

This chapter covers

  • Understanding the key attributes of a unified log
  • Modeling events using JSON
  • Setting up Apache Kafka, a unified log
  • Sending events to Kafka and reading them from Kafka

The previous chapter introduced the idea of events and continuous streams of events, and showed that many familiar software platforms and tools have event-oriented underpinnings. We recapped the history of business intelligence and data analytics before introducing an event-centric data processing architecture built around something called a unified log. We started to show the why of the unified log with some use cases, but we stopped short of explaining what a unified log is.

In this chapter, we will start to get hands-on with unified log technology. We will take a simple Java application and show how to update it to send events to a unified log. Understanding the theory and design of unified logs is important too, so we’ll introduce the core attributes of the unified log first.
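
To give a flavor of what we are building toward, a single event from a simple application could be expressed in JSON along the following lines. The event name and fields here are illustrative guesses only; we will define our actual event format later in the chapter.

  {
    "event": "SHOPPER_ADDED_ITEM_TO_CART",
    "shopper": "123",
    "item": "aa-battery",
    "timestamp": "2018-10-30T12:34:56Z"
  }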

We have a few unified log implementations to choose from. We'll pick Apache Kafka, an open source, self-hosted unified log, to get us started. With the scene set, we will code up our simple Java application, start configuring Kafka, and then code the integration between our app and Kafka. This process has a few discrete steps, sketched in code after the following list:

  1. Defining a simple format for our events
  2. Setting up and configuring our unified log
  3. Writing events into our unified log
  4. Reading events from our unified log
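
To make these steps concrete, here is a rough preview in Java, assuming a Kafka broker is already running locally (setting one up is the subject of section 2.3). It models an event like the JSON preview above as a plain class, serializes it with the Jackson library, and writes it to a Kafka topic using the standard Kafka producer client. The class name, field names, topic name (shopper-events), and broker address are all assumptions for illustration, not the exact code we will end up with.

  import java.util.Properties;

  import com.fasterxml.jackson.databind.ObjectMapper;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.Producer;
  import org.apache.kafka.clients.producer.ProducerRecord;

  public class EventWriterPreview {

    // Hypothetical event: a shopper adds an item to their cart.
    // Public fields are picked up by Jackson's default field visibility.
    public static class ShopperAddedItemToCart {
      public String shopper;    // who performed the action
      public String item;       // what was acted upon
      public String timestamp;  // when it happened (ISO 8601)

      public ShopperAddedItemToCart(String shopper, String item, String timestamp) {
        this.shopper = shopper;
        this.item = item;
        this.timestamp = timestamp;
      }
    }

    public static void main(String[] args) throws Exception {
      // Step 1: define a simple format for the event - here, JSON via Jackson
      ShopperAddedItemToCart event =
          new ShopperAddedItemToCart("123", "aa-battery", "2018-10-30T12:34:56Z");
      String json = new ObjectMapper().writeValueAsString(event);

      // Step 3: write the event into the unified log (broker address is an assumption)
      Properties props = new Properties();
      props.put("bootstrap.servers", "localhost:9092");
      props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
      props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

      try (Producer<String, String> producer = new KafkaProducer<>(props)) {
        producer.send(new ProducerRecord<>("shopper-events", json));
      } // close() flushes any buffered records before the program exits
    }
  }

Reading the event back (step 4) follows the same pattern in reverse, using the standard Kafka consumer client. Again, the consumer group and topic names are placeholders:

  import java.time.Duration;
  import java.util.Collections;
  import java.util.Properties;

  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;

  public class EventReaderPreview {

    public static void main(String[] args) {
      Properties props = new Properties();
      props.put("bootstrap.servers", "localhost:9092");
      props.put("group.id", "event-reader-preview");   // placeholder consumer group
      props.put("auto.offset.reset", "earliest");      // start from the beginning of the topic
      props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
      props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

      try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
        consumer.subscribe(Collections.singletonList("shopper-events"));
        // Poll once for demonstration; a long-running reader would loop here
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
        for (ConsumerRecord<String, String> record : records) {
          System.out.println(record.value());   // prints the JSON event
        }
      }
    }
  }

The rest of the chapter works through these steps properly, starting with the anatomy of a unified log in section 2.1.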

2.1. Understanding the anatomy of a unified log

 
 

2.2. Introducing our application

 
 
 

2.3. Setting up our unified log

 
 
 
 

Summary

 
 
 