Kafka Security with Kerberos on Kubernetes

Parveen Bhandari | 14 March 2023

Introduction to Apache Kafka

Apache Kafka is an open-source stream processing platform written in Java and Scala, initially developed at LinkedIn and later donated to the Apache Software Foundation. Kafka is a scalable, fault-tolerant publish-subscribe messaging system that helps us build distributed applications. Because it is fast, scalable, and fault tolerant, Kafka is used in messaging scenarios where systems such as JMS, RabbitMQ, and AMQP may not be considered. Its high throughput and reliability also make it useful for tracking service calls or IoT sensor data.

This article gives an overview of Kafka security with ACLs and Apache Kerberos. The data flowing through Kafka is often consumed for functions such as -

  • Data analysis
  • Reporting
  • Data science crunching
  • Compliance Auditing
  • Backups

How to implement Kafka Security with Kerberos?

Assuming that a Kerberos setup is already installed and will be used for authentication, we first create the principals and export their keytabs with the following commands -
sudo /usr/sbin/kadmin.local -q 'addprinc -randkey kafka/{hostname}@{REALM}'
sudo /usr/sbin/kadmin.local -q "ktadd -k /etc/security/keytabs/{keytabname}.keytab kafka/{hostname}@{REALM}"
Now we configure Kafka for Kerberos by adding a JAAS file, which we will name kafka_server_jaas.conf -
KafkaServer {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true storeKey=true
    keyTab="/etc/security/keytabs/{keytabname}.keytab"
    principal="kafka/{hostname}@{REALM}";
};
// Zookeeper client authentication
Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true storeKey=true
    keyTab="/etc/security/keytabs/{keytabname}.keytab"
    principal="kafka/{hostname}@{REALM}";
};
Now we will pass the krb5.conf location to each of the Kafka brokers as -
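For example, the location can be passed as JVM options when starting each broker (the paths below are assumptions; adjust them to your layout):

```shell
# Assumed paths for krb5.conf and the broker JAAS file; adjust to your setup.
export KAFKA_OPTS="-Djava.security.krb5.conf=/etc/kafka/krb5.conf \
  -Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
```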
We have to make sure that the keytabs configured in the JAAS file are readable by the operating-system user who starts the Kafka broker. Now configure the SASL ports in server.properties as -
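A sketch of the relevant server.properties entries (hostname and port are placeholders):

```properties
listeners=SASL_PLAINTEXT://{hostname}:9092
advertised.listeners=SASL_PLAINTEXT://{hostname}:9092
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=GSSAPI
sasl.enabled.mechanisms=GSSAPI
```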
The principal name of the Kafka brokers must match the service name configured in server.properties. Given the principal created above, we can set it as -
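Since the principals above take the form kafka/{hostname}@{REALM}, the service name is the primary, kafka:

```properties
sasl.kerberos.service.name=kafka
```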
Now configure the Kafka clients. The client configuration using the keytab is as follows -
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
    useKeyTab=true \
    storeKey=true \
    keyTab="/etc/security/keytabs/kafka_client.keytab" \
    principal="kafka-client@{REALM}";
Similarly, the keytab configured in the client JAAS configuration must be readable by the user who starts the Kafka client. Now pass the krb5.conf location to each client as -
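As on the broker side, this can be done with a JVM option (the path is an assumption; adjust it to your setup):

```shell
# Assumed path for the client's krb5.conf; adjust to your setup.
export KAFKA_OPTS="-Djava.security.krb5.conf=/etc/kafka/krb5.conf"
```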
Finally, configure the following property in producer.properties or consumer.properties -
security.protocol=SASL_PLAINTEXT (or SASL_SSL)
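In practice, a few related SASL properties usually accompany it; a sketch, assuming the kafka service name used above:

```properties
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
```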

How to secure Apache Kafka with Access Control List (ACL)?

Kafka ACLs are defined in the general format "Principal P is [Allowed/Denied] Operation O From Host H On Resource R". In a secure cluster, both client requests and inter-broker operations require authorization. Now, in server.properties, enable the default authorizer by -
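For the ZooKeeper-backed deployments this article uses, the default authorizer was SimpleAclAuthorizer (newer Kafka releases use kafka.security.authorizer.AclAuthorizer instead):

```properties
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
```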
Now we will set up the broker principal as a superuser, giving the brokers the access required to perform inter-broker operations.
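Assuming the broker principals use the primary kafka as above, this is a single server.properties entry:

```properties
super.users=User:kafka
```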
The TLS user name, by default, is the full distinguished name of the client certificate. It can be customized in server.properties as follows -
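For example (the DN and mapping rule below are illustrative assumptions; `ssl.principal.mapping.rules` is available from Kafka 2.2 onward):

```properties
# Default SSL user name is the certificate's full DN, e.g.:
#   CN=host1.example.com,OU=Engineering,O=Example,L=City,ST=State,C=US
# Illustrative rule extracting only the CN as the principal:
ssl.principal.mapping.rules=RULE:^CN=(.*?),OU=.*$/$1/,DEFAULT
```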
Now, in order to add User:Bob as a producer of Test-topic, we can execute the following -
kafka-acls --authorizer-properties zookeeper.connect=localhost:2181 \
  --add --allow-principal User:Bob \
  --producer --topic Test-topic
In order to give access to the newly created topic, we can authorize it as -
export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
kafka-acls --authorizer-properties zookeeper.connect=kafka.example.com:2181 \
  --add --allow-principal User:kafkaclient \
  --producer --topic securing-kafka
kafka-acls --authorizer-properties zookeeper.connect=kafka.example.com:2181 \
  --add --allow-principal User:kafkaclient \
  --consumer --topic securing-kafka --group securing-kafka-group
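The resulting ACLs can then be verified against the same ZooKeeper endpoint (this requires a running cluster):

```shell
kafka-acls --authorizer-properties zookeeper.connect=kafka.example.com:2181 \
  --list --topic securing-kafka
```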

Holistic Approach to Kafka Security

Kafka streams data from one system to another in real time, acting as a middle layer that decouples real-time data pipelines. It can also feed fast-lane systems such as Apache Spark, Apache Storm, and other CEP systems.