Apache Kafka Stream with Spring Boot

Caner Kaya
3 min read · Mar 16, 2021

In this article, I will implement a service that is responsible for processing streaming data. By the end of the article, we will have an ecosystem like this:

Architectural Design

As you can see, we need to start an Apache Kafka broker to store and process the data. We will run Kafka with Docker. Since Apache Kafka depends on Apache Zookeeper, the docker-compose file below starts Zookeeper together with the Kafka broker.

docker-compose.yml
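A minimal sketch of such a compose file, assuming the wurstmeister images and the default ports (the exact images and versions are an assumption, not taken from the original project):

```yaml
version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      # Advertise the broker on localhost so the Spring Boot services can reach it
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
```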

Go to the directory that contains the file and run the command:

docker-compose up -d

Now there is a Kafka broker running on port 9092. We can start implementing the Product Service, which is responsible for producing messages about sold products. Let's create the project from start.spring.io.

First, we need some configuration for connecting to the Kafka broker.

KafkaConfiguration.java
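A minimal sketch of such a configuration, assuming JSON serialization of the DTO and the broker from docker-compose on localhost:9092 (the class and constant names are illustrative, not necessarily the author's exact code):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class KafkaConfiguration {

    // Broker address from the docker-compose setup above
    private static final String BOOTSTRAP_SERVERS = "localhost:9092";

    @Bean
    public ProducerFactory<String, SoldProduct> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        // Skip type headers so the consumer can map the JSON onto its own SoldProduct class
        props.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
        return new DefaultKafkaProducerFactory<>(props);
    }

    @Bean
    public KafkaTemplate<String, SoldProduct> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```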

The next job is to create a data transfer object to send to the Kafka broker.

SoldProduct.java
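A simple DTO with a product name and a price is enough for this example; the exact fields are an assumption:

```java
public class SoldProduct {

    // Illustrative fields; the original DTO may differ
    private String name;
    private double price;

    public SoldProduct() {
    }

    public SoldProduct(String name, double price) {
        this.name = name;
        this.price = price;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public double getPrice() {
        return price;
    }

    public void setPrice(double price) {
        this.price = price;
    }
}
```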

In the KafkaConfiguration class, we created a KafkaTemplate bean that is responsible for communication with the Kafka broker. Now we will create a component that uses that template to send SoldProduct objects to the broker.
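A minimal sketch of such a component, assuming a topic called sold-products (the topic name is an assumption, not taken from the original project):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class ProductProducer {

    // Topic name is an assumption; it must match the topic the analysis service reads from
    private static final String TOPIC = "sold-products";

    private final KafkaTemplate<String, SoldProduct> kafkaTemplate;

    public ProductProducer(KafkaTemplate<String, SoldProduct> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(SoldProduct soldProduct) {
        // Publish the DTO; the JsonSerializer from the configuration handles serialization
        kafkaTemplate.send(TOPIC, soldProduct.getName(), soldProduct);
    }
}
```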

Now we need a scheduler to send a randomly generated SoldProduct every 2 seconds.
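One way to do this is a @Scheduled method that picks a random product and price and hands it to the producer; the product names and price range below are purely illustrative. Note that @EnableScheduling must be present on the application class (or a configuration class) for @Scheduled to take effect.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Random;

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ProductScheduler {

    // Illustrative sample products; any set of names will do
    private static final List<String> PRODUCTS = Arrays.asList("laptop", "phone", "monitor");

    private final ProductProducer productProducer;
    private final Random random = new Random();

    public ProductScheduler(ProductProducer productProducer) {
        this.productProducer = productProducer;
    }

    // Runs every 2 seconds and sends a randomly generated SoldProduct
    @Scheduled(fixedRate = 2000)
    public void produce() {
        String name = PRODUCTS.get(random.nextInt(PRODUCTS.size()));
        double price = 10 + random.nextInt(90);
        productProducer.send(new SoldProduct(name, price));
    }
}
```

With the scheduler in place, go to the product-service folder and run the service: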

mvn spring-boot:run

We are done with the producing part. Now we need to implement the processor. This service won't be complicated: it will consume the messages and increase a running total price for each product. It will also have a controller to check the current total prices. You can create the project from start.spring.io.

Let's start with the Kafka configuration.

The Streams API is built on top of the producer and consumer APIs, so we need configuration for both the producing and consuming sides.
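A minimal sketch of the analysis-service configuration, using spring-kafka's Kafka Streams support and assuming the spring-kafka and kafka-streams dependencies are on the classpath (the application id and class names are assumptions; the original project may wire separate producer and consumer factories instead):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.annotation.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;

@Configuration
@EnableKafkaStreams
public class KafkaConfiguration {

    // Broker started by docker-compose
    private static final String BOOTSTRAP_SERVERS = "localhost:9092";

    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    public KafkaStreamsConfiguration kafkaStreamsConfiguration() {
        Map<String, Object> props = new HashMap<>();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "analysis-service");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        return new KafkaStreamsConfiguration(props);
    }
}
```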

Then we need a controller to get the current value of the total price for each product.

AnalysisController.java
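A sketch of such a controller, assuming it reads the totals from the processor component shown further below (the names are illustrative):

```java
import java.util.Map;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class AnalysisController {

    private final SoldProductProcessor soldProductProcessor;

    public AnalysisController(SoldProductProcessor soldProductProcessor) {
        this.soldProductProcessor = soldProductProcessor;
    }

    // Returns the current total price per product, e.g. for `curl localhost:8081`
    @GetMapping
    public Map<String, Double> getTotals() {
        return soldProductProcessor.getTotals();
    }
}
```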

Now we will create the DTO for the messages produced by product-service; it is the same SoldProduct class shown earlier.

Finally, we will implement the processor. The processor will store the total price for each product in a map.
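A minimal sketch of the processor, assuming the sold-products topic and the Kafka Streams configuration above: it consumes each SoldProduct and adds its price to the running total for that product name.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.support.serializer.JsonSerde;
import org.springframework.stereotype.Component;

@Component
public class SoldProductProcessor {

    // Topic name is an assumption and must match the producer's topic
    private static final String TOPIC = "sold-products";

    // In-memory store of the running total price per product name
    private final Map<String, Double> totals = new ConcurrentHashMap<>();

    @Bean
    public KStream<String, SoldProduct> soldProductStream(StreamsBuilder streamsBuilder) {
        KStream<String, SoldProduct> stream = streamsBuilder.stream(
                TOPIC, Consumed.with(Serdes.String(), new JsonSerde<>(SoldProduct.class)));
        // Increase the total price for each incoming product
        stream.foreach((key, product) ->
                totals.merge(product.getName(), product.getPrice(), Double::sum));
        return stream;
    }

    public Map<String, Double> getTotals() {
        return totals;
    }
}
```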

So we are also done with analysis-service. Just make sure that you change the port to 8081 in application.properties, because product-service is already running on 8080.
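In application.properties this is a single line:

```properties
server.port=8081
```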

Let's go to the analysis-service folder and run it:

mvn spring-boot:run

Now we have product-service, analysis-service, and the Kafka broker all working properly. Let's test it:

curl localhost:8081

Thank you for reading this article :). I know the article does not contain much detail about Kafka and the Streams API; it is a kind of memo for me. Still, I wanted to write it, and maybe it will help you with some topics. You can access the source code here:

https://github.com/canerky96/apache-kafka-stream
