[ERROR] Could not execute SQL statement. Reason:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.flink.table.catalog.hive.factories.HiveCatalogFactoryOptions
Cause: a required dependency is missing.
Fix: add the hive-exec-3.1.0.jar dependency.
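One way to add the dependency is to drop the jar into Flink's lib directory. A minimal sketch; `$HIVE_HOME` and `$FLINK_HOME` are placeholder paths for your own install layout:

```shell
# Copy the hive-exec jar into Flink's classpath (paths are assumptions).
cp "$HIVE_HOME/lib/hive-exec-3.1.0.jar" "$FLINK_HOME/lib/"
# Restart the Flink cluster / SQL client so the new jar is picked up.
```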
[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
Cause: the Hadoop dependencies are missing, or the Hadoop classpath environment variable is not set.
Fix: export HADOOP_CLASSPATH=`hadoop classpath`
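The variable must be set in the shell that launches the SQL client, so Flink can locate the Hadoop classes at startup. A sketch, assuming the `hadoop` CLI is on your PATH:

```shell
# Populate HADOOP_CLASSPATH from the hadoop CLI before starting the client.
export HADOOP_CLASSPATH=`hadoop classpath`
# Then start the SQL client from the same shell, e.g.:
# $FLINK_HOME/bin/sql-client.sh
```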
The following error occurs when creating a Hive catalog (a catalog with 'type'='hive'):
at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:264) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1241) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1104) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
... 11 more
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1358) ~[hadoop-common-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1339) ~[hadoop-common-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:518) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:536) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:430) ~[hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5135) ~[hive-exec-3.1.0.jar:3.1.0]
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5103) ~[hive-exec-3.1.0.jar:3.1.0]
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:255) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:180) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:76) ~[flink-connector-hive_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:261) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1241) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1104) ~[flink-table_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.0.jar:1.13.0]
Cause:
The guava.jar versions under Hive's lib directory and Hadoop's lib directory are inconsistent.
Inspection shows:
Hive's lib directory contains guava-19.0.jar
Hadoop's share/hadoop/common/lib directory contains guava-27.0.jar
Fix:
Standardize on the higher version:
delete guava-19.0.jar from Hive's lib directory, then copy guava-27.0.jar over from the Hadoop directory.
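The swap described above can be done with two commands. A sketch; `$HIVE_HOME` and `$HADOOP_HOME` are placeholders for your own install paths:

```shell
# Remove the older guava from Hive's lib (version mismatch causes NoSuchMethodError).
rm "$HIVE_HOME/lib/guava-19.0.jar"
# Copy the newer guava that Hadoop ships with into Hive's lib.
cp "$HADOOP_HOME/share/hadoop/common/lib/guava-27.0.jar" "$HIVE_HOME/lib/"
```

Restart the affected services afterwards so the new jar takes effect.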
See also: https://blog.csdn.net/zhyajshhz/article/details/113249073