Welcome to the session on ‘Spark Integration – Apache Kafka’.
In the previous three sessions, you learnt about the various features of the Spark Streaming API, the general code flow and the architecture of a Spark Streaming application. You also looked at a few transformations, followed by window operations.
Now, you will look at integrating Spark with Apache Kafka to understand the end-to-end working of an application. In this session, you will learn how Spark can be integrated with Apache Kafka to read and write messages.
Let’s get started!