Now that you have learnt how to create topics, push messages to them and consume them from the command line, you will learn how to create Producers and Consumers using Python. These techniques are generally used to continuously push data to, or consume data from, Kafka.
The Jupyter Notebook used in this segment is attached below.
Before starting with the next video, make sure that you already have a topic named ‘test’ created in your Kafka server.
If you deleted it in the previous segment, run the following command to create a topic named test:
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic test
Note: Ensure that you run this command from inside the Kafka folder that is present in the downloads directory.
Once you have created this topic, the next video walks you through the Python code to create a simple Producer.
You can use the following command to install the Kafka client library on your EC2 machine: pip install kafka-python
In the previous video, you learnt how to create a simple producer and then push some messages to the topic named ‘test’.
We first pushed a message without a key, and then one with both a key and a value. Each time we sent a message, we received an acknowledgement, which we stored in a variable named ack.
For each message sent, we also printed the topic and the partition to which this message was sent.
In the next segment, you will learn how to write a Python script to consume data from a Kafka topic.