11 Schema registry

 

This chapter covers:

  • Developing an ideal Kafka Maturity Model
  • Focusing on the value schemas can provide for your data as it changes
  • Reviewing Avro and its use in data serialization
  • Determining compatibility rules for schema changes over time

As we have explored the various ways to use Apache Kafka, it is worth stepping back to consider how your view of Kafka changes the more you use it.

As enterprises (or even tools) grow, they can sometimes be modeled with maturity levels. Martin Fowler provides a great explanation at martinfowler.com/bliki/MaturityModel.html [175]. Fowler also gives a clear walkthrough of the Richardson Maturity Model, which applies this idea to REST [176]. For further reference, the original talk writeup, "Justice Will Take Us Millions Of Intricate Moves: Act Three: The Maturity Heuristic" by Leonard Richardson, can be found at www.crummy.com/writing/speaking/2008-QCon/act3.html [177]. The following model reflects our opinion of maturity levels specific to Kafka. For a comparison against a different experienced perspective, check out the Confluent whitepaper "Five Stages to Streaming Platform Adoption," which defines five stages of streaming maturity, each with its own criteria [178].

Let’s look at our first level: of course, as programmers we’re starting with Level 0.

11.1 Schema Registry

 
 

11.1.1 Installing Schema Registry
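
Schema Registry ships as part of the Confluent Platform download. As a minimal sketch, assuming an unpacked Confluent Platform directory and a Kafka cluster already running locally (exact paths vary by version):

    # Start Schema Registry with the bundled default configuration
    bin/schema-registry-start etc/schema-registry/schema-registry.properties

    # Confirm it is up; the REST API listens on port 8081 by default
    curl http://localhost:8081/subjects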

 
 
 

11.1.2 Registry configuration
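
As a configuration sketch, assuming a single local broker (the key names come from the Confluent Schema Registry; the values are illustrative):

    # schema-registry.properties
    # Where the registry's REST API listens
    listeners=http://0.0.0.0:8081
    # The Kafka cluster that backs the registry's storage
    kafkastore.bootstrap.servers=PLAINTEXT://localhost:9092
    # Compacted topic in which registered schemas are kept
    kafkastore.topic=_schemas

Note that the registry keeps its state in Kafka itself: schemas live in a compacted topic, so the registry process stays lightweight and can be restarted without losing data.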

 
 
 

11.2 Defining a schema
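
To make the discussion concrete, here is a sketch of an Avro schema in its JSON form. The Alert record, its fields, and the file name alert.avsc are hypothetical examples, not anything the registry requires:

    {
      "type": "record",
      "name": "Alert",
      "namespace": "org.kafkainaction.avro",
      "doc": "Alert raised by a sensor",
      "fields": [
        {"name": "sensor_id", "type": "long", "doc": "Unique id of the sensor"},
        {"name": "time", "type": "long", "doc": "Epoch milliseconds when the alert fired"},
        {"name": "status", "doc": "Severity of the alert",
         "type": {"type": "enum", "name": "AlertStatus",
                  "symbols": ["Critical", "Major", "Minor", "Warning"]}}
      ]
    }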

 
 

11.2.1 A new schema
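
Once a schema file exists, a concrete class can be generated from it. A sketch using the Avro tools jar (the jar version and file paths are illustrative):

    # Generate a Java class from alert.avsc into the source tree
    java -jar avro-tools-1.11.3.jar compile schema alert.avsc src/main/java/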

 
 
 

11.3 Schema features

 

11.3.1 REST API
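
The registry is managed over HTTP. The endpoints below are part of the Confluent Schema Registry REST API; the subject name kinaction_alert-value is a hypothetical example following the default naming strategy (topic name plus a -value suffix):

    # List every registered subject
    curl http://localhost:8081/subjects

    # Register a new schema version under a subject
    curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
      --data '{"schema": "{\"type\":\"record\",\"name\":\"Alert\",\"fields\":[{\"name\":\"sensor_id\",\"type\":\"long\"}]}"}' \
      http://localhost:8081/subjects/kinaction_alert-value/versions

    # Fetch the latest schema registered under that subject
    curl http://localhost:8081/subjects/kinaction_alert-value/versions/latest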

 
 

11.3.2 Client library
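
Most applications never call the REST API directly; the serializer does it for them. A producer sketch, assuming the Confluent Avro serializer is on the classpath (the serializer class and the schema.registry.url property are real Confluent client settings; the topic name and record contents are illustrative):

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AlertProducer {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.LongSerializer");
        // The Avro serializer registers/looks up schemas on our behalf
        props.put("value.serializer",
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Tells the serializer where to find Schema Registry
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Alert\","
            + "\"fields\":[{\"name\":\"sensor_id\",\"type\":\"long\"}]}");

        try (Producer<Long, GenericRecord> producer = new KafkaProducer<>(props)) {
          GenericRecord alert = new GenericData.Record(schema);
          alert.put("sensor_id", 12345L);
          producer.send(new ProducerRecord<>("kinaction_alert", 12345L, alert));
        }
      }
    }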

 
 

11.4 Compatibility rules
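
The registry enforces compatibility with a small set of levels: BACKWARD (the default, where consumers on the new schema can still read data written with the previous one), FORWARD, FULL, their *_TRANSITIVE variants that check against all earlier versions rather than only the latest, and NONE. A sketch of setting the level for a single subject over the REST API:

    # Pin one subject to BACKWARD compatibility
    curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
      --data '{"compatibility": "BACKWARD"}' \
      http://localhost:8081/config/kinaction_alert-value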

 

11.4.1 Validate schema modifications
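
A proposed change can be tested against what is already registered before any client is upgraded. The compatibility endpoint below is part of the registry's REST API; the candidate schema here adds a field with a default value, which is a backward-compatible change:

    # Check a candidate schema against the latest registered version
    curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
      --data '{"schema": "{\"type\":\"record\",\"name\":\"Alert\",\"fields\":[{\"name\":\"sensor_id\",\"type\":\"long\"},{\"name\":\"recovered\",\"type\":\"boolean\",\"default\":false}]}"}' \
      http://localhost:8081/compatibility/subjects/kinaction_alert-value/versions/latest
    # A successful check returns: {"is_compatible":true}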

 
 
 
 

11.5 Alternative to a Schema Registry

 
 

11.5.1 Create new topics
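
When a change is too breaking to evolve in place, one option is to publish the new format to a brand-new topic and let consumers migrate at their own pace. A sketch with the standard topic tooling (the topic name, partition, and replication values are illustrative):

    # Create a second topic for the incompatible v2 format, leaving v1 intact
    bin/kafka-topics.sh --create --topic kinaction_alert_v2 \
      --bootstrap-server localhost:9092 \
      --partitions 3 --replication-factor 3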

 
 

11.5.2 Confluent Cloud Schema Registry
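
Clients pointed at a hosted registry authenticate with an API key instead of talking to a local process. The property names below are real Confluent client settings; the endpoint and credentials are placeholders to fill in from your cloud account:

    # Added to an existing producer/consumer properties file
    schema.registry.url=https://<registry-endpoint>.confluent.cloud
    basic.auth.credentials.source=USER_INFO
    basic.auth.user.info=<api-key>:<api-secret>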

 
 
 

11.6 Summary

 