11 Schema registry


This chapter covers

  • Developing a proposed Kafka maturity model
  • The value schemas can provide for your data as it changes
  • Reviewing Avro and data serialization
  • Compatibility rules for schema changes over time

As you have discovered the various ways to use Apache Kafka, it is worth thinking about how your view of Kafka evolves the more you use it. As enterprises (or even tools) grow, they can sometimes be modeled with maturity levels. Martin Fowler provides a great explanation of this at https://martinfowler.com/bliki/MaturityModel.html [1]. Fowler also has a good example that explains the Richardson Maturity Model, which looks at REST [2]. For further reference, the original talk, “Justice Will Take Us Millions of Intricate Moves: Act Three: The Maturity Heuristic” by Leonard Richardson, can be found at https://www.crummy.com/writing/speaking/2008-QCon/act3.html.

11.1 A proposed Kafka maturity model

In the following sections, we focus our discussion on maturity levels specific to Kafka. For comparison, check out the Confluent white paper titled “Five Stages to Streaming Platform Adoption,” which presents a different perspective: a five-stage streaming maturity model with distinct criteria for each stage [3]. Let’s look at our first level (of course, as programmers we start with level 0).

11.1.1 Level 0

11.1.2 Level 1

11.1.3 Level 2

11.1.4 Level 3

11.2 The Schema Registry

11.2.1 Installing the Confluent Schema Registry

11.2.2 Registry configuration

11.3 Schema features

11.3.1 REST API

11.3.2 Client library

11.4 Compatibility rules

11.4.1 Validating schema modifications

11.5 Alternative to a schema registry

Summary

References
