Updated: 2023-12-22 GMT+08:00

Kafka Component Fails to Start Due to Account Lockout

Issue Background and Symptom

On a newly installed cluster, Kafka fails to start. The startup logs report an authentication failure, which causes the start operation to fail.

/home/omm/kerberos/bin/kinit -k -t ${BIGDATA_HOME}/etc/2_15_Broker/kafka.keytab kafka/hadoop.hadoop.com -c ${BIGDATA_HOME}/etc/2_15_Broker/11846 failed.
export keytab file for kafka/hadoop.hadoop.com failed.export and check keytab file failed, errMsg=]}] for Broker #192.168.1.92@192-168-1-92.
[2015-07-11 02:34:33] RoleInstance started failure for ROLE[name: Broker].
[2015-07-11 02:34:34] Failed to complete the instances start operation. Current operation entities: [Broker #192.168.1.92@192-168-1-92], Failure entites : [Broker #192.168.1.92@192-168-1-92].Operation Failed.Failed to complete the instances start operation. Current operation entities: [Broker#192.168.1.92@192-168-1-92], Failure entites: [Broker #192.168.1.92@192-168-1-92].
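To confirm that the startup failure is an authentication problem rather than a Kafka configuration issue, you can run the failing kinit command manually on the Broker node. This is a minimal sketch; the keytab path is taken from the error log above, and the "2_15_Broker" instance directory may differ in your environment.

# Run on the Broker node as user omm; the keytab path follows the startup log and may vary per instance.
/home/omm/kerberos/bin/kinit -k -t ${BIGDATA_HOME}/etc/2_15_Broker/kafka.keytab kafka/hadoop.hadoop.com
# If the kafka account is locked, kinit fails with an error such as "Clients credentials have been revoked".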

Cause Analysis

Check the Kerberos log "/var/log/Bigdata/kerberos/krb5kdc.log". It shows that an IP address outside the cluster repeatedly attempted to authenticate as the Kafka user and failed, which eventually locked the Kafka account.
Jul 11 02:49:16 192-168-1-91 krb5kdc[1863](info): AS_REQ (2 etypes {18 17}) 192.168.1.93: NEEDED_PREAUTH: kafka/hadoop.hadoop.com@HADOOP.COM for krbtgt/HADOOP.COM@HADOOP.COM, Additional pre-authentication required
Jul 11 02:49:16 192-168-1-91 krb5kdc[1863](info): preauth (encrypted_timestamp) verify failure: Decrypt integrity check failed
Jul 11 02:49:16 192-168-1-91 krb5kdc[1863](info): AS_REQ (2 etypes {18 17}) 192.168.1.93: PREAUTH_FAILED: kafka/hadoop.hadoop.com@HADOOP.COM for krbtgt/HADOOP.COM@HADOOP.COM, Decrypt integrity check failed
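To locate the host that is triggering the failed authentications, you can count the PREAUTH_FAILED entries for the kafka principal in the KDC log by source IP address. This is a sketch that assumes the log format shown above, where the source address is the only dotted IPv4 address on the line.

# Count failed pre-authentication attempts per source IP for the kafka principal.
grep "PREAUTH_FAILED" /var/log/Bigdata/kerberos/krb5kdc.log | grep "kafka/hadoop.hadoop.com" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' | sort | uniq -c | sort -rn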

Solution

Log in to the node outside the cluster (192.168.1.93 in the example in Cause Analysis) and stop it from authenticating as the Kafka user. Wait for 5 minutes; the account is then unlocked automatically.
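After the external node has stopped its authentication attempts and the lockout period has expired, you can verify on the Broker node that the kafka principal can obtain a ticket again before starting the Kafka service. This is a sketch under the same assumption as above that the keytab path may differ per instance.

# After waiting about 5 minutes, check that the kafka account is unlocked.
/home/omm/kerberos/bin/kinit -k -t ${BIGDATA_HOME}/etc/2_15_Broker/kafka.keytab kafka/hadoop.hadoop.com
/home/omm/kerberos/bin/klist    # should list a valid krbtgt/HADOOP.COM@HADOOP.COM ticket
# Then start the Kafka service again from the cluster management page.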