10 Organizing a Kafka Project


This chapter covers

  • Defining project requirements across environment setup, non-functional requirements, infrastructure sizing, and resource quotas.
  • Maintaining a Kafka cluster using tools, GitOps, and the Kafka Admin API.
  • Testing Kafka applications.

When a team considers adopting a new technology for a project, they often focus intensely on technical considerations—prototyping, assessing feasibility, estimating costs, and reviewing infrastructure demands. Yet one critical factor frequently underestimated is how the technology aligns with the organization’s project workflow. Key questions arise: What specific requirements must be met? Who owns gathering them? How will deliverables be tracked? Can the technology seamlessly integrate into the CI/CD pipeline? Are QA teams equipped to test for integration risks?

In this chapter, we’ll address these challenges and present best practices for structuring a Kafka project to ensure smooth adoption, governance, and long-term success.

10.1 Defining Kafka Project Requirements

Max sighed as he sat down, shaking his head.

Max: Alright, team. Our project is starting to attract a lot of attention. Every time I go for lunch, someone stops me and asks if they can also use Kafka for their use case. It’s like we’ve created a new buzzword in the company.

Rob: Well, before we let everyone jump on board, they need to understand whether Kafka is actually suitable for their use case.

10.1.1 Identifying event-driven workflows

10.1.2 Turning business workflows into events

10.1.3 Gathering functional requirements for Kafka topics

10.1.4 Identifying non-functional requirements

10.2 Maintaining Cluster Structure

10.2.1 Using tools

10.2.2 Using GitOps for Kafka configurations

10.2.3 Using the Kafka Admin API

10.2.4 Setting up environments

10.2.5 Choosing a solution for the Customer360 project

10.3 Testing Kafka Applications

10.3.1 Unit testing

10.3.2 Integration testing

10.3.3 Performance testing

10.4 Online Resources

10.5 Summary