[Kerberos] org.apache.hadoop.security.AccessControlException: Client cannot authenticate via: [TOKEN, KERBEROS]

Background

After installing the CM (Cloudera Manager) big data platform on a SUSE operating system and enabling Kerberos on the cluster, HDFS commands failed with the following error:

hdfs dfs -ls /
19/05/29 18:06:15 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
ls: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hadoop001/172.17.239.230"; destination host is: "hadoop001":8020;

Environment

SUSE Linux Enterprise Server 12 Service Pack 1 (SLES 12 SP5)

Reproducing the problem

  1. Authenticate first
kinit -kt hdfs.keytab hdfs

## Check the ticket
klist

[Screenshot: klist output, showing the ticket cache location]

  2. Turn on debug logging and re-run the command
export HADOOP_ROOT_LOGGER=DEBUG,console
export HADOOP_OPTS="-Dsun.security.krb5.debug=true -Djavax.net.debug=ssl"
hdfs dfs -ls /

[Screenshot: debug output, showing the cache path the client probes]

Root cause

Look closely: the klist output contains a line Ticket cache: DIR pointing at /run/user/0/krb5cc/tkt.

The debug output of the HDFS command, however, contains KinitOptions cache name is, pointing at /tmp/krb5cc_0.

By default, the HDFS client looks for the Kerberos credential cache under /tmp. On SUSE, the Kerberos cache is not kept under /tmp, so the HDFS client concludes that you have not performed Kerberos authentication, and the command fails.
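Before changing any system-wide configuration, you can confirm the diagnosis by pointing both the MIT Kerberos tools and the Hadoop client at the same cache file through the KRB5CCNAME environment variable. This is a per-session sketch, not the permanent fix; the /tmp/krb5cc_<uid> path follows the convention the JDK's Kerberos code probes by default:

```shell
# Point MIT Kerberos tools and the Hadoop client's JVM at the same
# FILE-type credential cache. /tmp/krb5cc_<uid> is the location the
# JDK's Kerberos code looks for when no other cache is configured.
CCACHE="/tmp/krb5cc_$(id -u)"
export KRB5CCNAME="FILE:${CCACHE}"
echo "Using credential cache: ${KRB5CCNAME}"

# Then re-authenticate against this cache and retry (needs a valid keytab):
#   kinit -kt hdfs.keytab hdfs
#   hdfs dfs -ls /
```

If the command succeeds with this variable set, the cache location is indeed the problem, and the krb5.conf change below makes the fix permanent.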

Solution

After adding the following parameter to /etc/krb5.conf, kinit works as before and HDFS commands succeed as well:

default_ccache_name = FILE:/tmp/krb5cc_%{uid}
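In context, the parameter goes into the [libdefaults] section of /etc/krb5.conf. A minimal sketch (the realm name is a hypothetical placeholder, not taken from my cluster):

```
[libdefaults]
    default_realm = EXAMPLE.COM                     # hypothetical realm
    default_ccache_name = FILE:/tmp/krb5cc_%{uid}   # cache where the JDK expects it
```

The %{uid} expansion makes each user get their own cache file, e.g. /tmp/krb5cc_0 for root.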

[Screenshot: klist output after the change, showing the cache under /tmp]

  1. Destroy the existing credentials
kdestroy
  2. Authenticate again
kinit -kt hdfs.keytab hdfs
  3. List HDFS
hdfs dfs -ls /

Re-run the command, and the problem is solved!

There are other solutions online as well, though none of them matched my situation. For reference:

Method 1:

Comment out default_ccache_name in krb5.conf, then run kdestroy and kinit again; for some people this resolves the problem.

Method 2:

Complete the list of encryption types in /etc/krb5.conf:
https://www.cnblogs.com/tommyjiang/p/15008787.html
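For reference, "completing the encryption types" typically means settings along these lines in the [libdefaults] section; the enctype list here is illustrative and must match what your KDC actually supports:

```
[libdefaults]
    default_tkt_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
    default_tgs_enctypes = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
    permitted_enctypes   = aes256-cts-hmac-sha1-96 aes128-cts-hmac-sha1-96
```

Note that AES-256 may additionally require the JCE unlimited-strength policy on older JDKs.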

Method 3:

A problem in application code:
https://blog.csdn.net/ifenggege/article/details/111243297

