hadoop-auth NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism

Hadoop 3.2.1
HBase 2.3.7
Spark 3.2.0
In the pom, hadoop-auth comes in at version 2.10.0 by default as a transitive dependency of HBase, while hadoop-common 3.2.1 calls a setRuleMechanism method that the older hadoop-auth does not have, which produces the NoSuchMethodError below. Updating the imported hadoop-auth version to 3.2.1 to match Hadoop fixes it (see the pom sketch below).
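As a concrete example, the override can be applied with a dependencyManagement entry that pins hadoop-auth to the same version as hadoop-common. A minimal sketch, assuming a standard Maven pom (the org.apache.hadoop:hadoop-auth coordinates are the usual ones; adjust the version to your cluster's Hadoop):

<dependencyManagement>
  <dependencies>
    <!-- Force hadoop-auth to match hadoop-common 3.2.1, overriding the
         2.10.0 version pulled in transitively by the HBase artifacts -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>3.2.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>

An alternative with the same effect is to add an <exclusion> for org.apache.hadoop:hadoop-auth on the HBase dependencies and declare hadoop-auth 3.2.1 directly.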

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism(Ljava/lang/String;)V
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:84)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:315)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:300)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:575)
	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.<init>(User.java:300)
	at org.apache.hadoop.hbase.security.User.getCurrent(User.java:173)
	at org.apache.hadoop.hbase.security.UserProvider.getCurrent(UserProvider.java:187)
	at org.apache.hadoop.hbase.AuthUtil.loginClient(AuthUtil.java:107)
	at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:128)
	at org.apache.hadoop.hbase.client.HBaseAdmin.available(HBaseAdmin.java:2408)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.locationtech.geomesa.hbase.utils.HBaseVersions$.$anonfun$_available$3(HBaseVersions.scala:167)
	at org.locationtech.geomesa.hbase.utils.HBaseVersions$.$anonfun$_available$3$adapted(HBaseVersions.scala:166)
	at org.locationtech.geomesa.hbase.utils.HBaseVersions$.checkAvailable(HBaseVersions.scala:89)
	at org.locationtech.geomesa.hbase.data.HBaseConnectionPool$.org$locationtech$geomesa$hbase$data$HBaseConnectionPool$$doCreateConnection(HBaseConnectionPool.scala:135)
	at org.locationtech.geomesa.hbase.data.HBaseConnectionPool$.createConnection(HBaseConnectionPool.scala:128)
	at org.locationtech.geomesa.hbase.data.HBaseConnectionPool$$anon$2.load(HBaseConnectionPool.scala:66)
	at org.locationtech.geomesa.hbase.data.HBaseConnectionPool$$anon$2.load(HBaseConnectionPool.scala:64)
	at com.github.benmanes.caffeine.cache.UnboundedLocalCache$UnboundedLocalLoadingCache.lambda$new$0(UnboundedLocalCache.java:922)
	at com.github.benmanes.caffeine.cache.UnboundedLocalCache.lambda$computeIfAbsent$2(UnboundedLocalCache.java:235)
	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
	at com.github.benmanes.caffeine.cache.UnboundedLocalCache.computeIfAbsent(UnboundedLocalCache.java:231)
	at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:113)
	at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:65)
	at org.locationtech.geomesa.hbase.data.HBaseConnectionPool$.getConnection(HBaseConnectionPool.scala:106)
	at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.createDataStore(HBaseDataStoreFactory.scala:42)
	at org.locationtech.geomesa.hbase.data.HBaseDataStoreFactory.createDataStore(HBaseDataStoreFactory.scala:33)
	at org.geotools.data.DataAccessFinder.getDataStore(DataAccessFinder.java:119)
	at org.geotools.data.DataStoreFinder.getDataStore(DataStoreFinder.java:69)
	at orbita.ai.geomesa.SparkHbaseTest$.main(SparkHbaseTest.scala:30)
	at orbita.ai.geomesa.SparkHbaseTest.main(SparkHbaseTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
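After updating the pom, the resolved version can be checked before re-running the job, for example with:

mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-auth

The output should show hadoop-auth 3.2.1 rather than the transitive 2.10.0 from HBase; if 2.10.0 still appears, something else (such as a parent pom) is managing the version.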