5 Message handling with Event Hubs

 

This chapter covers

  • Creating an Event Hub
  • Configuring partitions and throughput units
  • Saving messages to disk
  • Accessing Event Hubs

In the previous chapters, you learned about services that can store the potentially limitless volumes of data generated by modern applications. These services support the speed and batch layers of the Lambda architecture. Data storage provides both the sources and the outputs for the queries that answer user questions.

In this chapter, you’ll learn about another Azure data source. Event Hubs exposes a high-throughput endpoint for ingesting and serving event messages. Event messages record the activities of modern applications as a time-ordered series of event data. Producers generate event messages, and consumers process them; Event Hubs forms the bridge between the two. By decoupling ingestion from consumption, Event Hubs lets multiple producers communicate with multiple consumers.
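To make the producer side concrete, here is a minimal sketch of sending events with the azure-eventhub Python SDK (v5). The connection string, Event Hub name, and sensor-style payloads are placeholder assumptions, not values from this chapter; you would copy the real connection string from a shared access policy, covered later in section 5.4.3.

# Minimal producer sketch (assumed names and connection string).
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>",
    eventhub_name="<your-event-hub>",
)

with producer:
    # Group events into a batch and send them to the Event Hub endpoint.
    batch = producer.create_batch()
    batch.add(EventData('{"sensor": "station-1", "reading": 21.5}'))
    batch.add(EventData('{"sensor": "station-2", "reading": 19.8}'))
    producer.send_batch(batch)

The producer never needs to know which consumers, if any, will read these events; it only talks to the Event Hub endpoint.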

5.1 How does an Event Hub work?

An Event Hub ingests messages from applications. It records the details of each message in a journal and saves the message data for later retrieval; the message data itself can be simple or complex. The Event Hub serves messages on request. Each consumer tracks the last message it has read from the journal, so multiple consumers can read from the same journal independently.
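The consumer side of this journal model can be sketched with the same Python SDK. The following illustrative consumer reads from the default consumer group and records its position after each event; the connection string and hub name are assumptions, and with no external checkpoint store configured the position is only tracked in memory for the life of the client.

# Minimal consumer sketch (assumed names and connection string).
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    # Process the event, then record the last message read so this
    # consumer can resume from the same position in the journal.
    print(partition_context.partition_id, event.body_as_str())
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>",
    consumer_group="$Default",
    eventhub_name="<your-event-hub>",
)

with consumer:
    # starting_position="-1" reads the journal from the beginning.
    consumer.receive(on_event=on_event, starting_position="-1")

Because each consumer keeps its own position, a second consumer using a different consumer group could replay the same journal from the start without disturbing this one.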

5.2 Collecting data in Azure

5.3 Creating an Event Hubs namespace

5.3.1 Using Azure PowerShell

5.3.2 Throughput units

5.3.3 Event Hub geo-disaster recovery

5.3.4 Failover with geo-disaster recovery

5.4 Creating an Event Hub

5.4.1 Using Azure portal

5.4.2 Using Azure PowerShell

5.4.3 Shared access policy

5.5 Event Hub partitions

5.5.1 Multiple consumers

5.5.2 Why specify a partition?

5.5.3 Why not specify a partition?

5.5.4 Event Hubs message journal
