java - Unable to create a Hadoop file from Java code using a Kerberos ticket

Our Hadoop cluster uses Kerberos, so we normally have to run kinit first and can then use commands such as "hadoop fs -ls /". Now I am trying to log in with JAAS and GSS-API and create a file on the cluster from Java code, but it fails.
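(For reference: as far as I understand, the usual programmatic equivalent of kinit in a Hadoop client is a keytab login through org.apache.hadoop.security.UserGroupInformation rather than a hand-rolled JAAS/GSS-API flow. A minimal sketch using the same principal and keytab as below; the class name and the /tmp/ugi-test path are placeholders of mine:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class UgiKeytabLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Mark the cluster as kerberized so the client attempts SASL/GSSAPI.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Keytab login; UGI caches the credentials and reuses them on every RPC.
        UserGroupInformation.loginUserFromKeytab(
                "hdfs/172.16.0.239@H236", "/etc/hadoop/conf/hdfs.keytab");
        // After login, plain FileSystem calls are authenticated automatically.
        FileSystem fs = FileSystem.get(conf);
        fs.create(new Path("/tmp/ugi-test"), true).close();
    }
}

What I actually tried is the JAAS/GSS-API route. Here is my code:)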

import java.security.PrivilegedAction;
import javax.security.auth.Subject;
import javax.security.auth.login.LoginContext;
import javax.security.auth.login.LoginException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.ietf.jgss.*;

public class Client {

    public static void main(String[] args) throws LoginException {
        System.setProperty("sun.security.krb5.debug", "false");
        System.setProperty("java.security.krb5.realm", "H236");
        System.setProperty("java.security.krb5.kdc", "172.16.0.236");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");
        System.setProperty("java.security.auth.login.config",
                "/etc/hadoop/conf/jaas.conf");

        // JAAS login against the "Client" entry in jaas.conf.
        LoginContext lc = new LoginContext("Client");
        lc.login();
        System.out.println("Authentication succeeded!");

        final Subject subject = lc.getSubject();
        Subject.doAs(subject, new PrivilegedAction<byte[]>() {
            public byte[] run() {
                Configuration conf = new Configuration();
                try {
                    // Build GSS-API credentials for the Kerberos v5 mechanism.
                    Oid krb5Mechanism = new Oid("1.2.840.113554.1.2.2");
                    GSSManager manager = GSSManager.getInstance();
                    GSSName clientName = manager.createName("hdfs/172.16.0.239@H236",
                            GSSName.NT_USER_NAME);
                    GSSCredential clientCreds = manager.createCredential(clientName,
                            GSSCredential.DEFAULT_LIFETIME,
                            krb5Mechanism,
                            GSSCredential.INITIATE_ONLY);
                    GSSName serverName = manager.createName("hdfs@172.16.0.239",
                            GSSName.NT_HOSTBASED_SERVICE);
                    GSSContext context = manager.createContext(serverName,
                            krb5Mechanism,
                            clientCreds,
                            GSSContext.DEFAULT_LIFETIME);
                    context.requestMutualAuth(true);
                    context.requestConf(false);
                    context.requestInteg(true);
                    System.out.println(clientCreds.getName().toString());
                    System.out.println(clientCreds.getRemainingLifetime());
                    byte[] outToken = context.initSecContext(new byte[0], 0, 0);

                    // Create a file on the Hadoop cluster.
                    FileSystem fs = FileSystem.get(conf);
                    Path f = new Path("hdfs:///hdfs/123");
                    FSDataOutputStream s = fs.create(f, true);
                    System.out.println("done\n");
                    for (int i = 0; i < 100; ++i)
                        s.writeChars("test");
                    s.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                return null;
            }
        }); // end of doAs
    } // end of main
}
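One thing I am not sure about: as far as I understand, FileSystem.get(conf) authenticates through Hadoop's own UserGroupInformation/SASL layer, not through the GSSContext built above, so the plain new Configuration() may be missing the Kerberos settings when the cluster's core-site.xml is not on the classpath. A minimal sketch of the wiring I mean (the fs.defaultFS address is taken from the error output further down; the helper name is hypothetical):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class SecureConf {
    // Hypothetical helper: builds a Configuration that tells the Hadoop
    // client to use Kerberos instead of the default "simple" authentication.
    static Configuration secureConf() {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        // NameNode address as reported in the error below; normally this
        // comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://172.16.0.236:8020");
        UserGroupInformation.setConfiguration(conf);
        return conf;
    }
}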

jaas.conf looks like this:

Client {
    com.sun.security.auth.module.Krb5LoginModule required
    debug=true
    storeKey=true
    doNotPrompt=true
    useKeyTab=true
    keyTab="/etc/hadoop/conf/hdfs.keytab"
    principal="hdfs/172.16.0.239@H236";
};
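To sanity-check that this JAAS entry really yields a ticket-granting ticket, one can inspect the Subject right after lc.login(); a minimal sketch against the same "Client" entry (the class name is a placeholder of mine):

import javax.security.auth.Subject;
import javax.security.auth.kerberos.KerberosTicket;
import javax.security.auth.login.LoginContext;

public class CheckTgt {
    public static void main(String[] args) throws Exception {
        System.setProperty("java.security.auth.login.config",
                "/etc/hadoop/conf/jaas.conf");
        LoginContext lc = new LoginContext("Client");
        lc.login();
        Subject subject = lc.getSubject();
        // A successful Krb5LoginModule login should leave a krbtgt/... ticket
        // among the Subject's private credentials.
        for (KerberosTicket t : subject.getPrivateCredentials(KerberosTicket.class)) {
            System.out.println(t.getServer() + " valid until " + t.getEndTime());
        }
    }
}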

My login user is root. Before running this code with "hadoop jar ./client.jar", I ran kdestroy to delete the Kerberos cache, and then I got the following error:

Authentication succeeded!
ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]
WARN retry.RetryInvocationHandler: Exception while invoking class org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create. Not retrying because the invoked method is not idempotent, and unable to determine whether it was invoked
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))]; Host Details : local host is: "XP236/172.16.0.236"; destination host is: "172.16.0.236":8020;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)

I don't know how to make this work. Can anyone help me? Many thanks.
