The prerequisite for this blog is a basic understanding of Kafka concepts. If you don’t have that yet, please go through the first post, Wiring up Apache Kafka.
In the previous blog, we discussed the installation steps for Apache Kafka. Now we’ll walk through a real example of publishing and consuming live data streams. Live data streams means that as soon as data is published, you can consume it at that very same moment. Isn’t that interesting?
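Before touching Kafka itself, the "consume as soon as published" idea can be sketched in plain Python. This is an illustrative sketch only (not from the post): an in-process queue stands in for a Kafka topic, and a consumer thread receives each message the moment the producer publishes it.

```python
import queue
import threading

topic = queue.Queue()   # stands in for the Kafka topic "test"
received = []

def consumer():
    # Blocks on get() until a message is published, just as a Kafka
    # consumer waits for new records on the topic.
    while True:
        msg = topic.get()
        if msg is None:     # sentinel: producer is done
            break
        received.append(msg)

t = threading.Thread(target=consumer)
t.start()

for msg in ["first", "second", "third"]:
    topic.put(msg)          # "publish" a message
topic.put(None)             # signal the consumer to stop
t.join()

print(received)  # -> ['first', 'second', 'third']
```

The consumer sees messages in publish order with no extra coordination, which is the behaviour you will observe below with the Kafka console producer and consumer.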
1. Let’s start the ZooKeeper server first.
- Open a terminal with Ctrl+Alt+T
- Go to the Kafka directory with cd kafka_0.10 and start ZooKeeper with this command: bin/zookeeper-server-start.sh config/zookeeper.properties
2. With Ctrl+Shift+T, open a new tab in the terminal and start the Kafka server with the following command: bin/kafka-server-start.sh config/server.properties
3. Let’s create a topic named test with 1 partition and 1 replica, which will store the feed of messages. The command required to create the topic is: bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
Additionally, you can see the list of topics with this command: bin/kafka-topics.sh --list --zookeeper localhost:2181
It will be helpful to open a new terminal so that you can observe publishing and consuming data simultaneously.
4. It’s time to publish data with the producer using the command below; you can publish any sort of data, numeric or text. Make sure to publish data on the topic created above, otherwise you will not be able to consume it: the topic must be the same for publishing and consuming.
- bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
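The console producer above can also be sketched in code. This is a hypothetical sketch, not part of the original post: the `publish_lines` helper mirrors what kafka-console-producer.sh does (send each line to a topic), and `FakeProducer` is a stand-in class so the sketch runs without a broker. With the kafka-python package and a broker on localhost:9092, a real `KafkaProducer` could be passed in instead.

```python
class FakeProducer:
    """Stand-in for a Kafka producer; records messages instead of sending.

    A real producer object would ship each message to the broker on
    localhost:9092; this one just remembers (topic, value) pairs.
    """
    def __init__(self):
        self.sent = []

    def send(self, topic, value):
        self.sent.append((topic, value))

    def flush(self):
        pass  # a real producer would block until all sends complete

def publish_lines(producer, topic, lines):
    """Publish each line to the topic, mirroring kafka-console-producer.sh."""
    for line in lines:
        producer.send(topic, line.encode("utf-8"))
    producer.flush()
    return len(lines)

producer = FakeProducer()
n = publish_lines(producer, "test", ["hello", "kafka streams"])
print(n)  # -> 2
```

Note that the topic name ("test") is the one created in step 3; as stressed above, the consumer must subscribe to the same topic to see these messages.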
5. Let’s consume the data. Isn’t it interesting that as soon as you publish data, you can consume it as well? The command required to consume data is: bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning
That’s all for this example of publishing and consuming live data streams. If you have any doubts, feel free to comment below.