This article assumes Kafka has already been configured with SASL/PLAINTEXT. If you have not set that up yet, see https://datamining.blog.csdn.net/article/details/90264636 first and adjust accordingly.
Configure the Kafka server.properties file
super.users specifies the super users, which are not subject to ACL checks:
listeners=SASL_PLAINTEXT://ip:port
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
super.users=User:admin
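A related broker setting worth knowing about is allow.everyone.if.no.acl.found, which controls what happens when a resource has no ACLs at all: with the default of false, only the super.users listed above can access such a resource, while true opens resources without ACLs to everyone. The line below is an optional addition, not part of the original configuration:

# Optional: behavior for resources that have no ACLs (false is the default)
allow.everyone.if.no.acl.found=false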
The users are the ones defined in the kafka_server_jaas.conf file set up for Kafka; for how to configure it, see https://datamining.blog.csdn.net/article/details/90264636
The configuration is:
# user_zsh, user_dataflow, user_crawler, user_taskcenter are all users
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="zsh"
    password="niubi"
    user_zsh="niubi"
    user_dataflow="dataflow"
    user_crawler="crawler"
    user_taskcenter="taskcenter";
};

Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="kafka"
    password="kafkapwd";
};
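For the JAAS file above to take effect, the broker JVM must be pointed at it when Kafka starts. A minimal sketch, assuming the file is stored at /opt/kafka/config/kafka_server_jaas.conf (the paths below are assumptions; substitute your own):

# Assumed paths; adjust to where your kafka_server_jaas.conf and server.properties actually live
export KAFKA_OPTS="-Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf"
kafka-server-start.sh -daemon /opt/kafka/config/server.properties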
- Deny all users read and write access to a specified topic (super users excepted)
[root@ecs-0002 logs]# kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --deny-principal User:* --operation Read --operation Write --topic test_topic_3
Adding ACLs for resource `Topic:test_topic_3`:
        User:* has Deny permission for operations: Read from hosts: *
        User:* has Deny permission for operations: Write from hosts: *

Current ACLs for resource `Topic:test_topic_3`:
        User:* has Deny permission for operations: Read from hosts: *
        User:* has Deny permission for operations: Write from hosts: *
- Grant the crawler user read and write access to the specified topic; users without an explicit grant cannot access it
[root@ecs-0004 config]# kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:crawler --operation Read --operation Write --topic test_topic_3
Adding ACLs for resource `Topic:test_topic_3`:
        User:crawler has Allow permission for operations: Write from hosts: *
        User:crawler has Allow permission for operations: Read from hosts: *

Current ACLs for resource `Topic:test_topic_3`:
        User:crawler has Allow permission for operations: Write from hosts: *
        User:crawler has Allow permission for operations: Read from hosts: *
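To verify the grant from the client side, the crawler user needs a SASL/PLAIN client configuration whose username and password match the user_crawler entry in kafka_server_jaas.conf. A minimal sketch, assuming a file named client.properties and a broker address of ip:port (both placeholders, matching the listeners placeholder above):

# client.properties (assumed file name)
security.protocol=SASL_PLAINTEXT
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="crawler" password="crawler";

# Produce as crawler; the Write grant above is enough for the console producer
kafka-console-producer.sh --broker-list ip:port --topic test_topic_3 --producer.config client.properties

Consuming with a consumer group additionally requires Read permission on the group resource; see the --consumer convenience option at the end of this article.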
- Grant the crawler user read permission
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:crawler --operation Read --topic test_topic_3
- Grant the dataflow user write permission
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:dataflow --operation Write --topic test_topic_3
- Remove all ACLs from a topic; once no ACLs are set, all users can operate on it (note: this behavior requires allow.everyone.if.no.acl.found=true; with the default of false, only super users can)
[root@ecs-0004 config]# kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --remove --topic test_topic_3
Are you sure you want to delete all ACLs for resource `Topic:test_topic_3`? (y/n)
y
Current ACLs for resource `Topic:test_topic_3`:
- Remove specific ACLs from a topic
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --remove --allow-principal User:crawler --allow-principal User:Alice --allow-host 198.168.2.0 --allow-host 198.168.2.1 --operation Read --operation Write --topic test_topic_3
- Allow operations only from a specified host
Note: --allow-host and --deny-host must be used together with --allow-principal or --deny-principal, respectively.
Use --allow-host with --allow-principal to allow a specific user to operate from a specific host.
Likewise, use --deny-host with --deny-principal.
[root@ecs-0004 config]# kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:crawler --allow-host 192.168.2.221 --operation Read --topic test_topic_3
Adding ACLs for resource `Topic:test_topic_3`:
        User:crawler has Allow permission for operations: Read from hosts: 192.168.2.221

Current ACLs for resource `Topic:test_topic_3`:
        User:crawler has Allow permission for operations: Read from hosts: 192.168.2.222
        User:crawler has Allow permission for operations: Read from hosts: 192.168.2.221
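As a complement to the allow-host example above, a deny rule scoped to a host follows the same pairing rule: --deny-host goes with --deny-principal. A minimal sketch (the host 192.168.2.223 is a made-up example, not from the original article):

kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --deny-principal User:crawler --deny-host 192.168.2.223 --operation Read --topic test_topic_3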
- View the ACL list
[root@ecs-0004 config]# kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --list --topic test_topic_3
Current ACLs for resource `Topic:test_topic_3`:
        User:crawler has Allow permission for operations: Read from hosts: *
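To list the ACLs of every resource rather than a single topic, --list can also be run without --topic:

kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --list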
- Add producer and consumer ACLs
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:crawler --producer --topic test_topic_3
kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:crawler --consumer --topic test_topic_3 --group group-1
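--producer and --consumer are convenience options: --producer adds Write, Describe, and Create ACLs on the topic, while --consumer adds Read and Describe on the topic plus Read on the given consumer group. With the --consumer ACL above in place, a consumption test for the crawler user might look like the following sketch (ip:port is a placeholder, and client.properties is the assumed client config sketched earlier in this article):

kafka-console-consumer.sh --bootstrap-server ip:port --topic test_topic_3 --group group-1 --consumer.config client.properties --from-beginning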