Authentication using SASL/Kerberos

  1. Prerequisites
    1. Kerberos
      If your organization is already using a Kerberos server (for example, by using Active Directory), there is no need to install a new server just for Kafka. Otherwise you will need to install one; your Linux vendor likely has packages for Kerberos and a short guide on how to install and configure it (Ubuntu, Redhat). Note that if you are using Oracle Java, you will need to download JCE policy files for your Java version and copy them to $JAVA_HOME/jre/lib/security.
    2. Create Kerberos Principals
      If you are using the organization's Kerberos or Active Directory server, ask your Kerberos administrator for a principal for each Kafka broker in your cluster and for every operating system user that will access Kafka with Kerberos authentication (via clients and tools).
      If you have installed your own Kerberos, you will need to create these principals yourself using the following commands:
      sudo /usr/sbin/kadmin.local -q 'addprinc -randkey kafka/{hostname}@{REALM}'
      sudo /usr/sbin/kadmin.local -q "ktadd -k /etc/security/keytabs/{keytabname}.keytab kafka/{hostname}@{REALM}"
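      A quick way to check that a keytab and principal work (the hostname, realm and keytab name are the same placeholders as above) is to authenticate with them directly:
      kinit -kt /etc/security/keytabs/{keytabname}.keytab kafka/{hostname}@{REALM}
      klist    # a ticket for the kafka/{hostname} principal should appear in the cache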
    3. Make sure all hosts are reachable by hostname - it is a Kerberos requirement that all your hosts can be resolved with their FQDNs. A quick sanity check is shown below.
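      For example, on each host you can confirm that its FQDN resolves:
      hostname -f
      getent hosts $(hostname -f)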
  2. Configuring Kafka Brokers
      1. Add a suitably modified JAAS file similar to the one below to each Kafka broker's config directory, let's call it kafka_server_jaas.conf for this example (note that each broker should have its own keytab):

        KafkaServer {
            com.sun.security.auth.module.Krb5LoginModule required
            useKeyTab=true
            storeKey=true
            keyTab="/etc/security/keytabs/kafka_server.keytab"
            principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
        };
         
        // Zookeeper client authentication
        Client {
            com.sun.security.auth.module.Krb5LoginModule required
            useKeyTab=true
            storeKey=true
            keyTab="/etc/security/keytabs/kafka_server.keytab"
            principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
        };

        The KafkaServer section in the JAAS file tells the broker which principal to use and the location of the keytab where this principal is stored. It allows the broker to login using the keytab specified in this section. See notes for more details on Zookeeper SASL configuration.

      2. Pass the JAAS and optionally the krb5 file locations as JVM parameters to each Kafka broker (see here for more details):
            -Djava.security.krb5.conf=/etc/kafka/krb5.conf
            -Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf
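        Assuming the standard launch scripts are used (bin/kafka-run-class.sh picks up the KAFKA_OPTS environment variable), one way to pass these flags is, for example:
            export KAFKA_OPTS="-Djava.security.krb5.conf=/etc/kafka/krb5.conf \
                -Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
            bin/kafka-server-start.sh config/server.properties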
      3. Make sure the keytabs configured in the JAAS file are readable by the operating system user who is starting the Kafka broker.
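        For example, if the broker runs as a (hypothetical) kafka operating system user, the keytab permissions could be tightened like this:
            chown kafka:kafka /etc/security/keytabs/kafka_server.keytab
            chmod 400 /etc/security/keytabs/kafka_server.keytab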
      4. Configure SASL port and SASL mechanisms in server.properties as described here. For example:
            listeners=SASL_PLAINTEXT://host.name:port
            security.inter.broker.protocol=SASL_PLAINTEXT
            sasl.mechanism.inter.broker.protocol=GSSAPI
            sasl.enabled.mechanisms=GSSAPI
      5. We must also configure the service name in server.properties, which should match the principal name of the Kafka brokers. In the above example, the principal is "kafka/kafka1.hostname.com@EXAMPLE.COM", so:

        sasl.kerberos.service.name=kafka
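        Putting the broker-side settings of this example together, a minimal server.properties fragment might look like the following sketch (the host name and port 9092 are placeholders):
            listeners=SASL_PLAINTEXT://kafka1.hostname.com:9092
            security.inter.broker.protocol=SASL_PLAINTEXT
            sasl.mechanism.inter.broker.protocol=GSSAPI
            sasl.enabled.mechanisms=GSSAPI
            sasl.kerberos.service.name=kafka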
  3. Configuring Kafka Clients

    To configure SASL authentication on the clients:

    1. Clients (producers, consumers, Connect workers, etc.) will authenticate to the cluster with their own principal (usually with the same name as the user running the client), so obtain or create these principals as needed. Then configure the JAAS configuration property for each client. Different clients within a JVM may run as different users by specifying different principals. The property sasl.jaas.config in producer.properties or consumer.properties describes how clients such as producers and consumers connect to the Kafka brokers. The following is an example configuration for a client using a keytab (recommended for long-running processes):

      sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
          useKeyTab=true \
          storeKey=true \
          keyTab="/etc/security/keytabs/kafka_client.keytab" \
          principal="kafka-client-1@EXAMPLE.COM";

      For command-line utilities like kafka-console-consumer or kafka-console-producer, kinit can be used along with "useTicketCache=true" as in:

      sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
          useTicketCache=true;
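      For example, a ticket can be obtained up front (the principal name below is just a placeholder) and verified before starting the tool:
          kinit kafka-client-1@EXAMPLE.COM
          klist    # the ticket should now be visible in the cache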

      JAAS configuration for clients may alternatively be specified as a JVM parameter similar to brokers as described here. Clients use the login section named KafkaClient. This option allows only one user for all client connections from a JVM.
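      As a sketch, such a JAAS file (the name kafka_client_jaas.conf below is just an example) would contain a KafkaClient login section mirroring the broker's KafkaServer section, and be passed via -Djava.security.auth.login.config:

          KafkaClient {
              com.sun.security.auth.module.Krb5LoginModule required
              useKeyTab=true
              storeKey=true
              keyTab="/etc/security/keytabs/kafka_client.keytab"
              principal="kafka-client-1@EXAMPLE.COM";
          };

          -Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf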

    2. Make sure the keytabs configured in the JAAS configuration are readable by the operating system user who is starting the Kafka client.
    3. Optionally pass the krb5 file locations as JVM parameters to each client JVM (see here for more details):
          -Djava.security.krb5.conf=/etc/kafka/krb5.conf
    4. Configure the following properties in producer.properties or consumer.properties:
      security.protocol=SASL_PLAINTEXT (or SASL_SSL)
      sasl.mechanism=GSSAPI
      sasl.kerberos.service.name=kafka
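      For example, assuming producer.properties and consumer.properties contain the settings above (the broker host, port and topic name below are placeholders), the console tools can be pointed at them like this:
          bin/kafka-console-producer.sh --broker-list kafka1.hostname.com:9092 --topic test \
              --producer.config config/producer.properties
          bin/kafka-console-consumer.sh --bootstrap-server kafka1.hostname.com:9092 --topic test \
              --from-beginning --consumer.config config/consumer.properties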
