This chapter covers:
- Creating an Event Hub
- Configuring partitions and throughput units
- Saving messages to disk
- Accessing Event Hubs
In the previous chapters, you learned how to create storage services that can hold the potentially limitless volumes of data generated by modern applications. These services support the speed and batch layers of the Lambda architecture. Data storage provides both the sources of data and the outputs of the queries that answer user questions.
In this chapter, you’ll learn about another data source in the Azure-based analytics system. Event Hubs exposes a high-throughput endpoint for ingesting and serving event messages. Event messages record the activities of modern applications as a time-based series of event data. Producers generate event messages, and consumers process them. Event Hubs forms the bridge between the two, decoupling producers from consumers. By decoupling event message ingestion from consumption, Event Hubs allows multiple producers to communicate with multiple consumers.
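To make the producer side concrete, here is a minimal sketch of sending event messages with the azure-eventhub Python SDK. The connection string, hub name, and JSON sensor payloads are placeholders for illustration; they are not part of this chapter's examples.

```python
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder values; a real application reads these from configuration.
CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    eventhub_name=EVENTHUB_NAME,
)

with producer:
    # Batch events so a single send stays within the maximum message size.
    batch = producer.create_batch()
    batch.add(EventData('{"sensor": "temp-01", "reading": 21.5}'))
    batch.add(EventData('{"sensor": "temp-02", "reading": 19.8}'))
    producer.send_batch(batch)  # One round trip delivers both events.
```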
An Event Hub ingests messages from applications. It records the details of each message in a journal and saves the message data for retrieval. The message data can be simple or complex. The Event Hub serves messages on request. Each consumer records the last message it has read from the journal, so multiple consumers can read from the same journal of messages.
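On the consumption side, a sketch of reading from the journal with the same azure-eventhub Python SDK might look like the following; the values are again placeholders. Passing "-1" as the starting position asks for all messages retained in the journal, and recording the last message read (checkpointing) requires a separate checkpoint store, so it appears here only as a comment.

```python
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"

def on_event(partition_context, event):
    # Each event carries its partition and its position in the journal.
    print(partition_context.partition_id, event.body_as_str())
    # With a checkpoint store configured, the consumer records the last
    # message read so it can resume from that point after a restart:
    # partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    conn_str=CONNECTION_STR,
    consumer_group="$Default",  # Other consumer groups read the same journal independently.
    eventhub_name=EVENTHUB_NAME,
)

with consumer:
    # starting_position="-1" reads from the beginning of the retained journal.
    consumer.receive(on_event=on_event, starting_position="-1")
```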