Deploying a Kafka Node App with Docker: Step-by-Step Guide
Deploying applications with Docker has become a popular approach thanks to its simplicity and efficiency. In this guide, we will walk through deploying a Kafka Node app with Docker, step by step: standing up Kafka and Zookeeper in containers and connecting a Node.js application to them.
What is Kafka?
Apache Kafka is an open-source distributed event streaming platform capable of handling trillions of events a day. It is designed to be fast, scalable, and durable. Kafka is commonly used for building real-time data pipelines and streaming applications. It allows you to publish and subscribe to streams of records, similar to how a message queue operates.
Why Use Docker?
Docker is an open-source platform that automates the deployment, scaling, and management of applications in containers. Containers are lightweight, portable, and provide an isolated environment for your application, ensuring that it runs consistently regardless of where it is deployed.
Benefits of Using Docker for Kafka Node Apps
- Isolation: Each container runs independently, making it easier to manage dependencies.
- Portability: Docker containers can be run on any system that supports Docker, making it easy to move applications between environments.
- Scalability: Docker makes it easy to scale applications horizontally by adding more container instances.
Prerequisites
Before we begin the deployment process, ensure that you have the following installed:
- Docker
- Node.js and npm
- Basic knowledge of Docker and Kafka
- A text editor or IDE of your choice
Step 1: Setting Up Kafka with Docker
1.1 Create a Docker Network
First, we need to create a Docker network to allow our Kafka and Zookeeper containers to communicate with each other:
docker network create kafka-network
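You can confirm the network exists before continuing (an optional sanity check):
docker network ls --filter name=kafka-network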
1.2 Start Zookeeper
Kafka requires Zookeeper, so we need to start it first. Use the following command to run a Zookeeper container:
docker run -d --name zookeeper \
--network kafka-network \
-e ZOOKEEPER_CLIENT_PORT=2181 \
wurstmeister/zookeeper
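It's worth checking that Zookeeper started cleanly before bringing up Kafka, since the broker will not start without it:
docker ps --filter name=zookeeper
docker logs zookeeper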
1.3 Start Kafka
Now, let's run the Kafka container. We will connect it to the Zookeeper container over the kafka-network we just created, and declare INSIDE as the listener Kafka should use for inter-broker traffic:
docker run -d --name kafka \
--network kafka-network \
-e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
-e KAFKA_ADVERTISED_LISTENERS=INSIDE://kafka:9092,OUTSIDE://localhost:9094 \
-e KAFKA_LISTENERS=INSIDE://0.0.0.0:9092,OUTSIDE://0.0.0.0:9094 \
-e KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT \
-e KAFKA_INTER_BROKER_LISTENER_NAME=INSIDE \
-e KAFKA_PORT=9092 \
wurstmeister/kafka
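The producer and consumer we build below use a topic called test-topic. Depending on the broker configuration, Kafka may auto-create topics on first use, but if you prefer to create it explicitly you can run the topic tool inside the Kafka container, roughly like this (the exact script name and location can vary between image versions):
docker exec -it kafka \
kafka-topics.sh --create \
--topic test-topic \
--partitions 1 \
--replication-factor 1 \
--bootstrap-server kafka:9092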
Important Note
Make sure to replace localhost in the KAFKA_ADVERTISED_LISTENERS environment variable if you're deploying on a server or in a different environment.
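For example, if the broker should also be reachable from outside the Docker host, the OUTSIDE listener would advertise the host's address rather than localhost (my-server.example.com below is a placeholder), and the port would need to be published with -p 9094:9094 when starting the container. Only this environment variable changes; the rest of the command stays the same:
-e KAFKA_ADVERTISED_LISTENERS=INSIDE://kafka:9092,OUTSIDE://my-server.example.com:9094 \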
Step 2: Create a Node.js Application
Now, let's create a simple Node.js application that will produce and consume messages from Kafka.
2.1 Initialize the Node.js App
Create a new directory for your application and initialize a Node.js project:
mkdir kafka-node-app
cd kafka-node-app
npm init -y
2.2 Install Kafka Client Library
Next, install the kafkajs library, which is a modern Kafka client for Node.js:
npm install kafkajs
2.3 Create the Producer
Create a file named producer.js and add the following code:
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-producer',
  brokers: ['kafka:9092']
});

const producer = kafka.producer();

const run = async () => {
  await producer.connect();
  console.log('Connected to Kafka');

  // Produce a message
  await producer.send({
    topic: 'test-topic',
    messages: [
      { value: 'Hello Kafka!' }
    ],
  });

  console.log('Message sent');
  await producer.disconnect();
};

run().catch(console.error);
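Note that the kafka:9092 address only resolves from containers attached to kafka-network. If you want to run the producer directly on your host machine during development, you would point it at the OUTSIDE listener instead, assuming you also published that port (e.g. -p 9094:9094) when starting the Kafka container:
const kafka = new Kafka({
  clientId: 'my-producer',
  // OUTSIDE listener, for clients running outside the Docker network
  brokers: ['localhost:9094']
});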
2.4 Create the Consumer
Now create a file named consumer.js and add the following code:
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-consumer',
  brokers: ['kafka:9092']
});

const consumer = kafka.consumer({ groupId: 'test-group' });

const run = async () => {
  await consumer.connect();
  console.log('Connected to Kafka');

  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`Received message: ${message.value.toString()}`);
    },
  });
};

run().catch(console.error);
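The consumer runs until its process is stopped. Optionally, you can disconnect cleanly on shutdown so the consumer group rebalances promptly; a small addition you could append to consumer.js:
// Optional: leave the consumer group cleanly when the container stops
const shutdown = async () => {
  await consumer.disconnect();
  process.exit(0);
};

process.on('SIGINT', shutdown);
process.on('SIGTERM', shutdown);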
Step 3: Dockerize the Node.js Application
3.1 Create Dockerfile
Next, we need to create a Dockerfile in the root of your Node.js application:
# Use the official Node.js image as a base
FROM node:14
# Set the working directory
WORKDIR /usr/src/app
# Copy package.json and install dependencies
COPY package*.json ./
RUN npm install
# Copy the rest of the application files
COPY . .
# Command to run the producer
CMD ["node", "producer.js"]
3.2 Build the Docker Image
Now, build the Docker image for your Node.js application:
docker build -t kafka-node-app .
3.3 Run the Node.js App Container
Finally, run the Node.js app container and connect it to the Kafka network:
docker run --network kafka-network kafka-node-app
Step 4: Testing the Application
Once your Node.js producer container runs, it sends a single message to Kafka and exits. You can then run the consumer in another terminal to see the message being consumed; because the consumer subscribes with fromBeginning: true, it will pick up the message even though it was sent before the consumer started.
Run the Consumer
Open another terminal window and execute the following command to start the consumer:
docker run --network kafka-network kafka-node-app node consumer.js
You should see the message "Received message: Hello Kafka!" printed in the terminal.
Troubleshooting Common Issues
- Network Issues: Make sure that both the Node.js app and Kafka are running in the same Docker network (see the diagnostic commands below).
- Permission Errors: Check Docker permissions if you encounter issues running containers.
- Environment Variables: Ensure that you have set the environment variables correctly, especially the Kafka brokers in your Node.js application.
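A few standard Docker commands cover most of these checks:
# Confirm the containers are attached to the same network
docker network inspect kafka-network

# Look for listener or startup errors in the broker
docker logs kafka

# Check Zookeeper as well if the broker cannot connect to it
docker logs zookeeper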
Conclusion
In this guide, we've walked through the process of deploying a Kafka Node app using Docker. We covered the setup of Kafka and Zookeeper containers, created a simple Node.js application, and successfully connected the two. Docker's capabilities help streamline the deployment process, making it easier to manage and scale your applications.
By following this guide, you should now have a functioning Kafka setup and a basic understanding of how to integrate it with a Node.js application using Docker. Feel free to extend this setup by implementing more advanced features or scaling it further based on your needs. Happy coding!