In the next video, Ajay will summarise your learnings from this module. Your learnings can be summarised as follows:
- In the first session, you learnt about batch and real-time processing and the differences between them. You then learnt about traditional messaging systems and some of their challenges. Next, you learnt about Kafka and the pub-sub model, and you understood how Kafka brings together the best of both message queues and the pub-sub model. You also looked at some of Kafka's use cases and understood its architecture.
- In the second session, you understood the core concepts of Kafka, including topics, partitions, producers, consumers and consumer groups. All messages are stored in topics, and a topic can have multiple partitions. Producers write messages to a topic, whereas consumers read messages from a topic. Consumers can be grouped together to form consumer groups; whenever a consumer joins or leaves a consumer group, partitions are rebalanced across the available consumers. A topic can be replicated across different brokers to ensure fault tolerance.
- In the third session, you saw practical demonstrations of creating topics, pushing messages to a topic using producers and reading messages using a consumer. The demonstrations covered both the command line and Python code; a minimal Python sketch of this flow appears after this list.
- In the fourth session, you learnt about Kafka Connect and Kafka Streams. You fetched data from Twitter and stored the data in a Kafka topic. You also saw a demonstration of a simple word count using Kafka Streams; an illustrative word-count sketch also appears after this list.
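As a recap of the producer-consumer flow from the third session, here is a minimal sketch using the kafka-python client library. This is only one of several Python clients and may not be the one used in the demonstrations; the broker address `localhost:9092`, the topic name `demo-topic` and the group id `demo-group` are assumptions for illustration.

```python
from kafka import KafkaConsumer, KafkaProducer

# Write a message to the topic (assumed broker and topic names).
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("demo-topic", b"hello, kafka")
producer.flush()  # block until the message is actually sent

# Read messages back, starting from the earliest available offset.
consumer = KafkaConsumer(
    "demo-topic",
    bootstrap_servers="localhost:9092",
    group_id="demo-group",      # consumers sharing this id form a consumer group
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,   # stop iterating after 5 s without new messages
)
for message in consumer:
    print(message.partition, message.offset, message.value.decode("utf-8"))
```

Because the consumer specifies a `group_id`, starting a second copy of this script would trigger a rebalance, with the topic's partitions split across the two consumers, as described in the second session.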
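Kafka Streams itself is a Java library, so the Python sketch below is not Kafka Streams; it only illustrates the word-count idea from the fourth session by consuming messages with a plain consumer and tallying words in memory. The topic name `word-count-input` is an assumption, and unlike a real Streams application, this version keeps no fault-tolerant state and does not write updated counts to an output topic.

```python
from collections import Counter

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "word-count-input",                 # hypothetical input topic
    bootstrap_servers="localhost:9092", # assumed broker address
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,           # stop after 5 s without new messages
)

counts = Counter()
for message in consumer:
    # Split each message into lowercase words and tally them, mirroring
    # the flatMap -> groupBy -> count pipeline of the Streams word count.
    for word in message.value.decode("utf-8").lower().split():
        counts[word] += 1

for word, count in counts.most_common():
    print(word, count)
```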
Following are the lecture notes for this module.