4 Creating Consumer Applications
This chapter covers
- Receiving messages from Kafka
- Principles of parallel message reception
- Common challenges in handling Kafka consumers
- Accessing Kafka via HTTP
- Utilizing data compression in Kafka
Here we shift from producers to the other half of the pipeline: consumers. We need to understand how they read at scale, coordinate, and stay correct. This chapter explains how messages are received and processed in parallel, how applications subscribe to topics or explicitly position themselves in a stream, and how batching and timeouts shape throughput and latency. We make consumer groups concrete, explore rebalances, and highlight the most common issues (lag, duplicates, ordering). We also touch on Kafka REST Proxy as a lightweight integration option. By the end, you’ll know how to design consumer logic that is reliable, efficient, and easy to operate.
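To make these ideas concrete before we dig into the details, the sketch below shows the minimal receive loop that every consumer in this chapter builds on: configure, subscribe, poll, process. The broker address, group id, and the `transactions` topic name are placeholder values chosen for illustration, not settings from the team's project.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class TransactionsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("group.id", "transactions-app");          // consumers sharing a group.id share the topic's partitions
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("auto.offset.reset", "earliest");         // where to start when no committed offset exists

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // subscribe() joins a consumer group; partition assignment and rebalances are handled for us
            consumer.subscribe(List.of("transactions"));     // placeholder topic name
            while (true) {
                // poll() returns whatever batch is available within the timeout; batch size and
                // timeout together shape the throughput/latency trade-off discussed above
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                            record.partition(), record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```

Calling subscribe() rather than assigning partitions explicitly lets the group coordinator distribute partitions among the group's members, which is what makes parallel reception and rebalancing possible; we return to that distinction later in the chapter.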
4.1 Field notes: Consumer patterns and trade-offs
Gathered once more in their meeting room, Max Sellington, Rob Routine, and Eva Catalyst focus on the next stage of their project: developing the consumer application. With topics in place for customer profile data and transactions, they now turn to data aggregation, working out how best to receive and merge the incoming streams.