4 Creating Consumer Applications


This chapter covers

  • Receiving messages from Kafka
  • Principles of parallel message reception
  • Common challenges in Kafka consumer handling
  • Accessing Kafka via HTTP
  • Utilizing data compression in Kafka

Gathered once more in their meeting room, Max Sellington, Rob Routine, and Eva Catalyst focus on the next stage of their project: developing the consumer application. With topics set for customer profile data and transactions, they now delve into the intricacies of data aggregation, determining how best to handle and merge the incoming streams.

MAX: Alright, let's dive into the consumer application for our project. We've got those two input topics—one for customer profile data and the other for transactions. Remember how we settled on using customer ID as the key for profile data and transaction ID for transactions? Now, we're looking at aggregating data by customer ID. So, what's the plan for the consumer service? How's it going to handle and mash up all that data?

EVA: Easy, it's just going to stash the data away in the relational database.

ROB: The consumer side isn't straightforward at all. How many instances of our aggregation service should we run? Each instance will pull data, but how much should it receive in one request to avoid memory issues? And if an instance fails, how quickly does the cluster detect it?

EVA: Just like with the producers, there’s a lot of configuration required to make the application work properly. We need to examine all these properties.
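Rob's questions map directly onto a handful of consumer configuration properties that this chapter examines in detail. As a preview, a minimal consumer configuration might look like the following sketch (the broker address, group name, and tuning values are illustrative assumptions, not settings chosen for the project):

```properties
# Where to find the cluster and which consumer group this instance joins
bootstrap.servers=localhost:9092
group.id=customer360-aggregator

# How record keys and values are turned back into objects
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer

# Rob's "how much data in one request?" question:
# cap the number of records returned by a single poll()
max.poll.records=500

# Rob's "how quickly is a failure detected?" question:
# the group coordinator declares a consumer dead after session.timeout.ms
# without a heartbeat; heartbeats are sent every heartbeat.interval.ms
session.timeout.ms=45000
heartbeat.interval.ms=3000

# Where to start reading when the group has no committed offset yet
auto.offset.reset=earliest
```

Adding more instances to the same `group.id` is how the consumer side scales out: the cluster spreads the topic's partitions across the group's members, a mechanism covered in the sections on parallel reading and partition assignment below.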

4.1 Organizing consumer applications

4.2 Receiving a message

4.2.1 Reading data in parallel

4.2.2 Setting initial consumer configuration for the Customer 360 project

4.2.3 Group leader and group coordinator

4.2.4 Committing the offsets

4.2.5 Specifying the strategy for committing offsets for the Customer 360 project

4.2.6 Creating batches

4.2.7 Timeouts and partition rebalance

4.2.8 Static group membership

4.2.9 Partition assignment strategies

4.2.10 The next-gen consumer rebalance protocol

4.2.11 Subscriptions and assignment

4.2.12 Reading data from compacted topics

4.2.13 Consumer considerations for the Customer 360 project

4.3 Common consumer issues

4.4 Data compression

4.5 Accessing Kafka through REST Proxy

4.6 Online resources

4.7 Summary