1. Introduction
Are you looking for a practical example of how to use Spring Boot and Apache Kafka together? Get ready, because this blog post dives into the nitty-gritty of setting up a producer and a consumer in a Spring Boot Kafka application. When you finish reading, you should have a basic understanding of the Spring for Apache Kafka dependency and how to use its capabilities when combined with Spring Boot. Whether your goal is to become proficient in both technologies or simply to understand them better, let’s get started!
2. Prerequisites and Setup
You may skip this section if you are not following this tutorial step by step and only want to look at the code examples.
If you are following this tutorial in your IDE, I assume you already have a Kafka cluster running in Docker. If you don’t, you may want to check out how to run Kafka locally.
2.1. Generate Project Template
Let’s generate a Spring Boot project with Spring Initializr, using a template that includes the Lombok and Spring for Apache Kafka dependencies. Click Generate and import the project into your IDE.

3. Create Kafka Topic Programmatically
Though it is usually best to avoid creating topics programmatically in production, it can be helpful when no other options are available. Let’s see how we can do it with Java and Spring.
Spring’s support for Kafka is excellent. First, we create a KafkaConfig class that receives the Kafka properties. The sample topic bean, combined with the @Configuration annotation, will create the topic on application startup.
@Configuration
public class KafkaConfig {

    private final KafkaProperties kafkaProperties;

    public KafkaConfig(KafkaProperties kafkaProperties) {
        this.kafkaProperties = kafkaProperties;
    }

    @Bean
    public NewTopic sampleTopic() {
        return TopicBuilder
                .name("sampleTopic")
                .partitions(1)
                .replicas(1)
                .compact()
                .build();
    }
}
If we want to pass extra topic configuration, we can hand a key-value map to the topic builder, as in the example below.
@Bean
public NewTopic sampleTopicWithMapConfig() {
    var map = new HashMap<String, String>();
    map.put(
            TopicConfig.CLEANUP_POLICY_CONFIG,
            TopicConfig.CLEANUP_POLICY_COMPACT
    );
    return TopicBuilder
            .name("sampleTopicTwo")
            .partitions(1)
            .replicas(1)
            .configs(map)
            .build();
}
For more advanced topic options, we can use the AdminClient class, which, on top of what the topic builder offers, accepts additional creation options when the builder’s choices are not adequate for your specific use case.
Please note that createSampleTopicWithOptions is a method rather than a Spring bean; therefore, it won’t run automatically unless invoked somewhere in the application.
public void createSampleTopicWithOptions() {
    var sampleTopic = TopicBuilder
            .name("sampleTopicThree")
            .partitions(1)
            .build();

    var createTopicOptions = new CreateTopicsOptions()
            .validateOnly(false)
            .retryOnQuotaViolation(true)
            .timeoutMs(1000);

    var adminProperties = kafkaProperties.buildAdminProperties();
    try (AdminClient client = AdminClient.create(adminProperties)) {
        client.createTopics(
                singletonList(sampleTopic),
                createTopicOptions
        );
    }
}
Starting the application should create the declared topics automatically on the Kafka server. Verify that by entering the Kafka container and listing all the Kafka broker topics:
(ADD winpty TO THE START IF ON WINDOWS)
[1] docker exec -it broker bash
[2] kafka-topics --bootstrap-server broker:9092 --list
In the following steps, we will create producer and consumer applications.
4. Spring Boot Kafka Producer
We create a SampleProducer class with a @Service annotation; like any Spring bean, it is a singleton by default. This class contains the logic to send messages to an Apache Kafka topic using KafkaTemplate.
We will send simple Java String objects rather than custom message payloads. A more complex producer configuration example is available in this Spring Boot project.
@Service
public class SampleProducer {

    private final KafkaTemplate<String, String> template;

    public SampleProducer(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    public void sendString(String userMessage) {
        template.send("sampleTopic", userMessage);
    }
}
Because we have no other application logic yet, let’s autowire the producer into the automatically generated SpringBootKafkaApplication class. In the postConstruct method, we send two Strings to the sampleTopic topic.
@SpringBootApplication
public class SpringBootKafkaApplication {

    @Autowired
    private SampleProducer producer;

    public static void main(String[] args) {
        SpringApplication.run(SpringBootKafkaApplication.class, args);
    }

    @PostConstruct
    public void postConstruct() {
        producer.sendString("1,Jack,jack123@gmail.com");
        producer.sendString("2,Daniel,daniel1234@gmail.com");
    }
}
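The two payloads above follow a simple comma-separated id,name,email layout. As an aside, a small helper could build these payloads instead of hand-writing the strings; the UserPayload class below is a hypothetical sketch, not part of the original project:

```java
// Hypothetical helper (not in the original project) that builds the
// comma-separated "id,name,email" payloads sent by postConstruct above.
public class UserPayload {

    static String format(int id, String name, String email) {
        return id + "," + name + "," + email;
    }

    public static void main(String[] args) {
        // Produces the same string as the first sendString call above.
        System.out.println(format(1, "Jack", "jack123@gmail.com")); // prints 1,Jack,jack123@gmail.com
    }
}
```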
Start the application; since we haven’t yet created a Kafka consumer, we can verify the inserted data by getting onto the broker container and executing the consumer command to view all the messages on the topic:
(ADD winpty TO THE START IF ON WINDOWS)
[1] docker exec -it broker bash
[2] kafka-console-consumer --bootstrap-server broker:9092 --topic sampleTopic --from-beginning
5. Spring Boot Kafka Consumer
First and foremost, we must place an application.yml file with the proper consumer configuration in the resources folder. Within a consumer group, each partition is assigned to at most one consumer, and every Kafka consumer must belong to a group, so we assign a group id. Additionally, the initial offset is set to earliest, so when no committed offset exists, messages are consumed from the start of the topic’s partitions.
spring:
  kafka:
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
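If your broker does not run on the default localhost:9092, or you prefer explicit deserializers, the file can be extended. The values below are assumptions for a local Docker setup, not part of the original configuration:

```yaml
spring:
  kafka:
    # Assumed address of the local Docker broker; adjust to your setup.
    bootstrap-servers: localhost:9092
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
      # Explicit String deserializers, matching the String messages we produce.
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.apache.kafka.common.serialization.StringDeserializer
```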
To create our consumer class, we utilise the @KafkaListener annotation with a parameter specifying the topic name. Add this file and start the project; you should see a log entry in the console for each String message our producer sent previously.
@Log4j2
@Service
public class SampleConsumer {

    @KafkaListener(topics = "sampleTopic")
    public void listenAll(String message) {
        log.info("Processing message : {}", message);
    }
}
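Since the producer sent comma-separated id,name,email strings, the listener could go one step further and split each message into fields before processing it. The MessageParser helper below is a hedged sketch of that step, not part of the original post:

```java
// Hypothetical helper (not in the original post) that splits the
// "id,name,email" payloads consumed by SampleConsumer into fields.
public class MessageParser {

    static String[] parse(String message) {
        // Limit of 3 keeps any extra commas inside the last (email) field.
        return message.split(",", 3);
    }

    public static void main(String[] args) {
        String[] fields = parse("1,Jack,jack123@gmail.com");
        System.out.println(fields[0] + " / " + fields[1] + " / " + fields[2]); // prints 1 / Jack / jack123@gmail.com
    }
}
```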
6. Summary
This short guide has looked at how to create a Spring Boot Kafka application. With the help of Java and Spring, creating Apache Kafka topics is a breeze, whether you set them up on startup with a bean definition or tweak advanced configurations with the AdminClient class. KafkaTemplate makes sending messages to a topic effortless, and consuming those same messages is even simpler with Spring’s listener annotations for defining consumer classes.
Daniel Barczak
Daniel Barczak is a software developer with a solid 9-year track record in the industry. Outside the office, Daniel is passionate about home automation. He dedicates his free time to tinkering with the latest smart home technologies and engaging in DIY projects that enhance and automate the functionality of living spaces, reflecting his enthusiasm and passion for smart home solutions.