Authentication issues between Kerberos and Hadoop components (continuously updated)

  1. ICMP Port Unreachable
Caused by: javax.security.auth.login.LoginException: ICMP Port Unreachable
	at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:808)
	at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755)
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682)
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680)
	at javax.security.auth.login.LoginContext.login(LoginContext.java:587)
	at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:58)
	at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:109)
	at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:55)
	at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:89)
	at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86)
	... 8 more
Caused by: java.net.PortUnreachableException: ICMP Port Unreachable
	at java.net.DualStackPlainDatagramSocketImpl.socketReceiveOrPeekData(Native Method)
	at java.net.DualStackPlainDatagramSocketImpl.receive0(DualStackPlainDatagramSocketImpl.java:124)
	at java.net.AbstractPlainDatagramSocketImpl.receive(AbstractPlainDatagramSocketImpl.java:143)
	at java.net.DatagramSocket.receive(DatagramSocket.java:812)
	at sun.security.krb5.internal.UDPClient.receive(NetClient.java:206)
	at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:411)
	at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:364)
	at java.security.AccessController.doPrivileged(Native Method)
	at sun.security.krb5.KdcComm.send(KdcComm.java:348)
	at sun.security.krb5.KdcComm.sendIfPossible(KdcComm.java:253)
	at sun.security.krb5.KdcComm.send(KdcComm.java:229)
	at sun.security.krb5.KdcComm.send(KdcComm.java:200)
	at sun.security.krb5.KrbAsReqBuilder.send(KrbAsReqBuilder.java:316)
	at sun.security.krb5.KrbAsReqBuilder.action(KrbAsReqBuilder.java:361)
	at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:776)
	... 25 more

Solution: This exception was thrown while connecting to Kafka, during construction of the Kafka consumer. It turned out that the hostname-to-IP mappings in the development machine's hosts file did not match the actual cluster addresses; correcting the mappings resolved it.
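
For context: the Kerberos client resolves the KDC's hostname through the hosts file and contacts it over UDP port 88 by default, so a name that resolves to the wrong machine produces exactly this ICMP port-unreachable reply. A minimal sketch of the kind of entry that has to be right, with a hypothetical address and hostname:

# /etc/hosts (C:\Windows\System32\drivers\etc\hosts on Windows)
# The KDC/broker hostname must resolve to the machine actually running those services.
192.168.30.100   master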

  2. Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Could not retrieve JobResult.
	at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:623)
	at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:117)
	at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
	at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.scala:654)
	at com.clb.flink.consumer.OggKafkaToHdfs$.main(OggKafkaToHdfs.scala:409)
	at com.clb.flink.consumer.OggKafkaToHdfs.main(OggKafkaToHdfs.scala)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
	at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$internalSubmitJob$2(Dispatcher.java:333)
	at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:822)
	at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:797)
	at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
20:28:25.979 [flink-akka.actor.default-dispatcher-2] INFO  org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
	... 6 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
	at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:83)
	at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:375)
	at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
	... 7 more
Caused by: org.apache.flink.util.FlinkRuntimeException: Failed to create checkpoint storage at checkpoint coordinator side.
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:255)
	at org.apache.flink.runtime.executiongraph.ExecutionGraph.enableCheckpointing(ExecutionGraph.java:594)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:340)
	at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:106)
	at org.apache.flink.runtime.scheduler.LegacyScheduler.createExecutionGraph(LegacyScheduler.java:207)
	at org.apache.flink.runtime.scheduler.LegacyScheduler.createAndRestoreExecutionGraph(LegacyScheduler.java:184)
	at org.apache.flink.runtime.scheduler.LegacyScheduler.<init>(LegacyScheduler.java:176)
	at org.apache.flink.runtime.scheduler.LegacySchedulerFactory.createInstance(LegacySchedulerFactory.java:70)
	at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:275)
	at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:265)
	at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:98)
	at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:40)
	at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
	... 10 more
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "laptop-XH/192.168.30.1"; destination host is: "master":8020; 
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
	at org.apache.hadoop.ipc.Client.call(Client.java:1472)
	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
	at com.sun.proxy.$Proxy24.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy25.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1817)
	at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.mkdirs(HadoopFileSystem.java:170)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(FsCheckpointStorage.java:90)
	at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(FsCheckpointStorage.java:61)
	at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(FsStateBackend.java:490)
	at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:253)
	... 22 more
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:747)
	at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
	at org.apache.hadoop.ipc.Client.call(Client.java:1438)
	... 46 more

Solution: This exception generally means Kerberos authentication against HDFS failed, which then breaks every subsequent operation. Authenticating to HDFS requires a new Configuration() object, and by default the JVM populates it from the core-site.xml, hdfs-site.xml, etc. found on the classpath. If those files are missing from the classpath, or are stale or corrupt, the Kerberos login against HDFS fails with this kind of error. Replacing them with the cluster's correct configuration files resolves it.
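
A minimal login sketch along these lines, loading the cluster's configuration files explicitly instead of trusting the classpath (the file paths and principal below are hypothetical):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsKerberosLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Load the cluster's real config files explicitly instead of relying on the classpath.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml")); // hypothetical path
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml")); // hypothetical path

        // core-site.xml must set hadoop.security.authentication=kerberos for this to take effect.
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "hdfs/master@HADOOP.COM",            // hypothetical principal
                "/etc/kerberos/keytab/hdfs.keytab"); // hypothetical keytab path

        // Any HDFS access after a successful login runs as the authenticated principal.
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/")));
    }
}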

  3. Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner authentication information from the user
org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:717) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.flink.streaming.connectors.kafka.internal.Kafka09PartitionDiscoverer.initializeConnections(Kafka09PartitionDiscoverer.java:58) ~[flink-connector-kafka-0.9_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.streaming.connectors.kafka.internals.AbstractPartitionDiscoverer.open(AbstractPartitionDiscoverer.java:94) ~[flink-connector-kafka-base_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.open(FlinkKafkaConsumerBase.java:505) ~[flink-connector-kafka-base_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36) ~[flink-core-1.9.0.jar:1.9.0]
	at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102) ~[flink-streaming-java_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:529) ~[flink-streaming-java_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:393) ~[flink-streaming-java_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705) ~[flink-runtime_2.11-1.9.0.jar:1.9.0]
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530) ~[flink-runtime_2.11-1.9.0.jar:1.9.0]
	at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_191]
Caused by: org.apache.kafka.common.KafkaException: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner  authentication information from the user
	at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:94) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:93) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:51) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:84) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:657) ~[kafka-clients-0.10.2.1.jar:na]
	... 12 common frames omitted
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. not available to garner  authentication information from the user
	at com.sun.security.auth.module.Krb5LoginModule.promptForPass(Krb5LoginModule.java:940) ~[na:1.8.0_191]
	at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:760) ~[na:1.8.0_191]
	at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617) ~[na:1.8.0_191]
	at sun.reflect.GeneratedMethodAccessor40.invoke(Unknown Source) ~[na:na]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_191]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680) ~[na:1.8.0_191]
	at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680) ~[na:1.8.0_191]
	at javax.security.auth.login.LoginContext.login(LoginContext.java:587) ~[na:1.8.0_191]
	at org.apache.kafka.common.security.authenticator.AbstractLogin.login(AbstractLogin.java:58) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.security.kerberos.KerberosLogin.login(KerberosLogin.java:109) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.security.authenticator.LoginManager.<init>(LoginManager.java:55) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.security.authenticator.LoginManager.acquireLoginManager(LoginManager.java:89) ~[kafka-clients-0.10.2.1.jar:na]
	at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:86) ~[kafka-clients-0.10.2.1.jar:na]
	... 16 common frames omitted

Solution: Kerberos authentication for Kafka is somewhat more involved than for HDFS. First, point the JVM at a JAAS configuration file:
System.setProperty("java.security.auth.login.config", "jaas.conf")
When building the Kafka client properties you also need a few security settings: security.protocol, sasl.mechanism, and sasl.kerberos.service.name. The error here was caused by a broken jaas.conf; its contents differ between environments. Below is a jaas.conf that works for me, for reference, followed by a consumer-side sketch.

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="/etc/kerberos/keytab/kafka.keytab"
    principal="kafka/master@HADOOP.COM";
};
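
And a minimal consumer-side sketch that wires the three properties mentioned above to that jaas.conf (the broker address, group id, and topic are hypothetical):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SecureKafkaConsumer {
    public static void main(String[] args) {
        // Point the JVM at the JAAS file before any Kafka client is constructed.
        System.setProperty("java.security.auth.login.config", "jaas.conf");

        Properties props = new Properties();
        props.put("bootstrap.servers", "master:9092"); // hypothetical broker address
        props.put("group.id", "demo-group");           // hypothetical group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // The Kerberos-related settings described above.
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "GSSAPI");
        props.put("sasl.kerberos.service.name", "kafka");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic")); // hypothetical topic
        }
    }
}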