1. Introduction
Are you struggling to test new microservice features against Kafka messages? Well, your troubles are over! Running an Apache Kafka cluster inside a Docker container is surprisingly painless. Let us show you how it’s done – follow this post’s steps for all your local testing needs as we walk through running Kafka in Docker.
2. Install Docker
Docker is a container engine written in Go. It is a platform for running applications in isolated, reproducible environments, using Linux kernel features to create containers on top of the operating system.
Getting started with Docker is simple; navigate to the Docker website and download Docker Desktop for your operating system. Here we will be using Docker Desktop for Windows.
Once Docker is downloaded and installed, you can adjust the client’s settings: right-click the Docker icon in the system tray and select Preferences –> Resources.
Here we can allocate the portion of system resources the Docker client may use. The defaults are sufficient for this tutorial, but depending on how many images you intend to run, they may need to increase significantly.
Once the desired settings are selected, click Apply & Restart and wait for Docker to restart.
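Before moving on, it is worth confirming that both the Docker client and the daemon are up. A minimal check from the terminal might look like this (version numbers will of course differ on your machine):

```shell
# Print the client version; fails if Docker is not installed.
docker --version

# Query the daemon; fails if the Docker engine is not running.
docker info --format '{{.ServerVersion}}'
```

If the second command errors out, Docker Desktop has not finished starting yet.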
3. Apache Kafka Docker Image
After Docker is configured, create a docker-compose.yml file using the configuration below.
In a commercial environment, that file usually lives in the project folder and contains all the external services required for the specific project.
version: '3'
networks:
  kafka-net:
    name: kafka-net
    driver: bridge
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.2.2
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    networks:
      - kafka-net
  broker:
    image: confluentinc/cp-kafka:7.2.2
    container_name: broker
    networks:
      - kafka-net
    ports:
      - "9092:9092"
    depends_on:
      - zookeeper
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_INTERNAL:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092,PLAINTEXT_INTERNAL://broker:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
In this docker-compose file, we have defined two services: the ZooKeeper server and the Kafka broker. ZooKeeper coordinates the Apache Kafka cluster, maintaining metadata such as broker membership and topic configuration.
Once the file is created, we can start both services with the docker-compose command, run from the directory containing the file.
docker-compose up -d
This pulls the required images from Docker Hub and starts both containers in the background. The initial startup may take a few minutes while the images download.
After some time, the Kafka service should be running in its Docker container. You can verify that by executing the command below in the terminal.
docker ps
4. Connect and Create Kafka Topic
Once everything is running, it’s time to test the Kafka cluster. Run the command below to access the Docker container via bash, where broker is the name we gave the Apache Kafka container in the docker-compose file.
docker exec -it broker bash
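If you prefer not to open an interactive shell at all, docker exec can run a single command in the container directly. As a sketch, assuming the container name broker from the compose file above, this lists the topics the broker currently knows about:

```shell
# Run a one-off command inside the broker container, no interactive shell needed.
docker exec broker kafka-topics --bootstrap-server broker:9092 --list
```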
We can run various commands inside the container to interact with the Kafka broker. Let’s run a simple command to create a Kafka topic with default settings.
kafka-topics --bootstrap-server broker:9092 --create --topic test
This should create an Apache Kafka topic named test, and we can verify that by executing the command below.
kafka-topics --bootstrap-server broker:9092 --describe --topic test
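The --create command also accepts explicit sizing flags. As a sketch, a topic with three partitions (test-orders is a hypothetical name) could be created as follows; the replication factor must stay at 1 because our cluster has a single broker:

```shell
# Create a topic with explicit partition and replication settings.
kafka-topics --bootstrap-server broker:9092 --create \
  --topic test-orders --partitions 3 --replication-factor 1
```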
4.1. Insert Data Into Kafka Using Docker
Aside from creating topics, we can insert messages directly into Kafka from inside the container. Start the console producer, type each message at the prompt, and press Ctrl-C when done.
kafka-console-producer --bootstrap-server broker:9092 --topic test
>{"id":1,"order":1,"name":"phone","status":"PURCHASED","price":"80.00"}
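Typing messages one by one gets tedious. Since the console producer reads from standard input, you can also pipe in a file of newline-delimited messages (messages.json here is a hypothetical file with one JSON message per line):

```shell
# Pipe a file of newline-delimited messages into the test topic.
kafka-console-producer --bootstrap-server broker:9092 --topic test < messages.json
```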
As a result, we can use a Kafka consumer to check the inserted message. This is only scratching the surface of what’s possible; the Kafka command-line interface has a vast array of commands at your disposal.
kafka-console-consumer --bootstrap-server broker:9092 --topic test --from-beginning
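By default the console consumer runs until interrupted. For a quick scripted check, the --max-messages flag makes it exit after reading a fixed number of records:

```shell
# Read one message from the beginning of the topic, then exit automatically.
kafka-console-consumer --bootstrap-server broker:9092 --topic test \
  --from-beginning --max-messages 1
```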
5. Summary
This post has looked at how to run Apache Kafka in Docker. It’s incredible how easy running an Apache Kafka container is nowadays. With just a few steps, we could spin up our own little data-streaming dream world – thanks to the mighty docker-compose.yml file! To top it off, by creating a topic and exchanging messages inside the container, we checked that everything was ready for use.