Flink integrated with Hive: launching the Flink SQL client fails:
./sql-client.sh embedded
The error is as follows:
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:215)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:972)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:225)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211)
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164)
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:396)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:684)
at java.util.HashMap.forEach(HashMap.java:1288)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:681)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:265)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:677)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:565)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:187)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:961)
... 3 more
Problem
Flink SQL fails when connecting to Hive and Hudi with java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V. Research shows this is caused by a Guava version conflict:
Hive 3.1.2 bundles Guava 19.0, Hadoop ships Guava 27.0-jre, and Flink itself bundles several Guava versions, so the copies on the classpath conflict with each other.
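The failing overload, checkArgument(boolean, String, Object), was only added to Guava in version 20.0, so the error means a Guava 19 copy is winning on the classpath. To see which jars actually bundle a copy of Guava's Preconditions class (shaded or not), a small shell helper can scan a lib directory. The function name is mine, and the directories you point it at are up to you:

```shell
# find_guava_jars DIR -- print each jar under DIR that contains a copy of
# Guava's Preconditions class. Relocated (shaded) copies keep the same
# path suffix under the relocation prefix, so grepping for the suffix
# catches them too.
find_guava_jars() {
  for jar in "$1"/*.jar; do
    [ -e "$jar" ] || continue   # skip if the glob matched nothing
    if unzip -l "$jar" 2>/dev/null | grep -q 'google/common/base/Preconditions\.class'; then
      echo "$jar"
    fi
  done
}
```

For example, `find_guava_jars "$FLINK_HOME/lib"`, and likewise for $HIVE_HOME/lib and Hadoop's lib directories.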
Solution
It took a long search to find that the conflict comes from the Guava classes inside the hive-exec jar:
hive-exec shades in Guava 19.0;
flink-hive-connector in turn shades in hive-exec;
and hudi-flink-bundle_2.12-0.9.0.jar also shades in hive-exec.
So the fix is to remove the shaded Guava from these jars.
Method 1:
Recompile the Hive source with the Guava dependency removed, install the resulting hive-exec-3.1.2.jar
into the local Maven repository, then recompile the Flink source.
Put the resulting flink-sql-connector-hive-3.1.2_2.12-1.12.2.jar into Flink's lib directory, restart the Flink cluster, and run the Flink SQL job again; the problem is resolved. Note that no other jar in Flink's lib directory may contain a different Guava version.
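As a rough command outline of Method 1 (the source paths are placeholders, and the Maven module names are assumptions — hive-exec is built by Hive's "ql" module, and the connector by Flink's flink-sql-connector-hive-3.1.2 module in the 1.12 tree; verify both against your checkouts before running):

```shell
# 1. Rebuild hive-exec after removing the shaded Guava from its pom,
#    and install it into the local Maven repository:
cd /path/to/hive-3.1.2-src
mvn clean install -DskipTests -pl ql -am

# 2. Rebuild the Flink SQL Hive connector against that local hive-exec:
cd /path/to/flink-1.12.2-src
mvn clean install -DskipTests -pl flink-connectors/flink-sql-connector-hive-3.1.2 -am

# 3. Deploy the rebuilt connector jar and restart the cluster:
cp flink-connectors/flink-sql-connector-hive-3.1.2/target/flink-sql-connector-hive-3.1.2_2.12-1.12.2.jar "$FLINK_HOME/lib/"
"$FLINK_HOME/bin/stop-cluster.sh" && "$FLINK_HOME/bin/start-cluster.sh"
```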
Method 2:
Comment out the hive-exec <include> tag in the shade-plugin configuration of flink-hive-connector and of hudi-flink-bundle_2.12-0.9.0.jar, then rebuild those jars.
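The piece to comment out lives in each module's pom.xml, inside the maven-shade-plugin configuration; the fragment below is only a sketch of its shape (the other entries shown are illustrative, not the modules' actual artifact lists):

```xml
<!-- Inside maven-shade-plugin's <configuration>. Commenting out the
     include keeps hive-exec (and the Guava it bundles) out of the
     shaded jar. -->
<artifactSet>
  <includes>
    <!-- <include>org.apache.hive:hive-exec</include> -->
    <include>org.apache.hive:hive-metastore</include>
  </includes>
</artifactSet>
```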
Method 3 (recommended):
1. Remove all guava-*.jar from $HADOOP_HOME/share/hadoop/common/lib and $HIVE_HOME/lib.
2. Put guava-27.0-jre.jar into both $HADOOP_HOME/share/hadoop/common/lib and $HIVE_HOME/lib.
3. Put guava-27.0-jre.jar into $FLINK_HOME/lib, and rename it to a_guava-27.0-jre.jar.
Note that the rename in step 3 is of great importance. Step 2 may not be necessary (you can try skipping it).
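The three steps above can be sketched as a shell function (the function name is mine; the "a_" prefix presumably makes the jar sort, and therefore load, ahead of the conflicting shaded copies in $FLINK_HOME/lib):

```shell
# apply_guava_fix GUAVA_JAR HADOOP_LIB HIVE_LIB FLINK_LIB
apply_guava_fix() {
  guava="$1"; hadoop_lib="$2"; hive_lib="$3"; flink_lib="$4"
  # Step 1: drop every bundled Guava from the Hadoop and Hive lib dirs.
  rm -f "$hadoop_lib"/guava-*.jar "$hive_lib"/guava-*.jar
  # Step 2: install the single Guava 27 jar in both places.
  cp "$guava" "$hadoop_lib/" && cp "$guava" "$hive_lib/"
  # Step 3: install it in Flink's lib dir under an "a_"-prefixed name.
  cp "$guava" "$flink_lib/a_$(basename "$guava")"
}
```

Invoked as, e.g., `apply_guava_fix /path/to/guava-27.0-jre.jar "$HADOOP_HOME/share/hadoop/common/lib" "$HIVE_HOME/lib" "$FLINK_HOME/lib"`, then restart the Flink cluster.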
I solved it with the third method!