Flink + Hive exceptions and solutions

[ERROR] Could not execute SQL statement. Reason:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.flink.table.catalog.hive.factories.HiveCatalogFactoryOptions

Cause: a missing dependency.
Solution: add the hive-exec-3.1.0.jar dependency.
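
For the SQL client, adding a dependency means dropping the jar into Flink's lib directory and restarting. A minimal sketch, assuming Flink 1.13 with Hive 3.1.0 and that $FLINK_HOME and $HIVE_HOME point at the respective installs (the exact jar paths are illustrative):

    # The Hive catalog factory lives in the Hive connector jar;
    # hive-exec supplies the Hive classes it needs at runtime
    cp flink-connector-hive_2.11-1.13.0.jar $FLINK_HOME/lib/
    cp $HIVE_HOME/lib/hive-exec-3.1.0.jar $FLINK_HOME/lib/
    # Restart the cluster and the SQL client so the new jars are loaded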

[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream

Cause: the Hadoop dependency is missing, or the Hadoop environment variable is not set.
Solution: export HADOOP_CLASSPATH=`hadoop classpath`
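
Note that `hadoop classpath` is command substitution: the variable must hold the output of the hadoop classpath command, not the literal string. A minimal sketch of a working session (assuming $FLINK_HOME is set and hadoop is on the PATH):

    export HADOOP_CLASSPATH=`hadoop classpath`
    # Sanity check: this should print a long list of Hadoop jars and directories
    echo $HADOOP_CLASSPATH
    # Start the SQL client in the same shell so it inherits the variable
    $FLINK_HOME/bin/sql-client.sh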

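The next failure shows up when registering a Hive catalog from the SQL client, e.g. with a statement like the following (the catalog name and conf dir here are illustrative assumptions):

    CREATE CATALOG myhive WITH (
      'type' = 'hive',
      'hive-conf-dir' = '/etc/hive/conf'
    );

The statement fails with the stack trace below (only the tail of the error message is shown):
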
type'='hive'
        at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:264) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1241) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1104) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
        ... 11 more
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1358) ~[hadoop-common-3.1.1.3.1.4.0-315.jar:?]
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1339) ~[hadoop-common-3.1.1.3.1.4.0-315.jar:?]
        at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:518) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
        at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:536) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:430) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5135) ~[hive-exec-3.1.0.jar:3.1.0]
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5103) ~[hive-exec-3.1.0.jar:3.1.0]
        at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:255) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:180) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:76) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:261) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1241) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1104) ~[flink-table_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
        at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]

Cause:
The guava.jar versions under Hive's lib and Hadoop's lib are inconsistent.
Inspection shows:
Hive's lib directory contains guava-19.0.jar,
while Hadoop's share/hadoop/common/lib contains guava-27.0.jar.
Solution:
Standardize on the higher version: delete guava-19.0.jar from Hive's lib, then copy the guava-27.0 jar over from Hadoop.
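
A minimal sketch of the swap (assuming $HIVE_HOME and $HADOOP_HOME are set; on some Hadoop 3.x distributions the file is named guava-27.0-jre.jar):

    # Confirm the version mismatch first
    ls $HIVE_HOME/lib/guava-*.jar
    ls $HADOOP_HOME/share/hadoop/common/lib/guava-*.jar
    # Remove Hive's old guava and copy in Hadoop's newer one
    rm $HIVE_HOME/lib/guava-19.0.jar
    cp $HADOOP_HOME/share/hadoop/common/lib/guava-27.0*.jar $HIVE_HOME/lib/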
See also: https://blog.csdn.net/zhyajshhz/article/details/113249073
