Flink 1.12.1 with HiveCatalog (Hive 3.1.2) throws java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String
java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;ZLjava/lang/String;Ljava/lang/String;Ljava/lang/Class;)Lorg/apache/hadoop/io/retry/RetryPolicy;
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:318) ~[hadoop-hdfs-2.2.0.jar:?]
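A NoSuchMethodError like this usually means an older Hadoop jar is shadowing the Hadoop 3.x classes that Hive 3.1.2 expects; the stack trace above even names hadoop-hdfs-2.2.0.jar as the source of the failing call. As a first check, you can list the Hadoop jars Flink picks up from its lib directory; a minimal sketch, assuming Flink is installed under $FLINK_HOME (adjust the path to your install):

```shell
# List the Hadoop jars on Flink's own classpath. Any hadoop-*-2.x jar
# found here predates the RetryUtils.getDefaultRetryPolicy overload
# that Hive 3.1.2 calls, and should be removed or upgraded.
find "$FLINK_HOME/lib" -name 'hadoop-*.jar'
```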
Because Hive depends on Hadoop, the Hadoop dependency must be added to the Flink project:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.1.3</version>
    <scope>provided</scope>
</dependency>
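Note that hadoop-client alone does not pull in Flink's Hive support. If the project does not already have them, the Hive connector and hive-exec dependencies are also needed; a sketch of the additional Maven coordinates, assuming a Scala 2.11 build of Flink 1.12.1 (adjust the suffix and versions to your setup):

```xml
<!-- Flink's Hive connector; the _2.11 Scala suffix is an assumption -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-hive_2.11</artifactId>
    <version>1.12.1</version>
    <scope>provided</scope>
</dependency>
<!-- Hive itself, matching the Hive 3.1.2 metastore in use -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>3.1.2</version>
    <scope>provided</scope>
</dependency>
```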
If importing the dependency still does not resolve the error, configure the HADOOP_CLASSPATH environment variable on Windows or Linux.
Linux
Configure the environment variable:
export HADOOP_CLASSPATH=`hadoop classpath`
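The export above only lasts for the current shell session. To make it survive new sessions, you can append it to your shell profile; a minimal sketch, assuming bash and that the hadoop command is on the PATH:

```shell
# Persist HADOOP_CLASSPATH for future shells (bash assumed);
# `hadoop classpath` prints every jar and conf directory Hadoop uses.
echo 'export HADOOP_CLASSPATH=`hadoop classpath`' >> ~/.bashrc
```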
Windows
Configure the environment variable:
Run hadoop classpath in a command prompt to print the Windows HADOOP_CLASSPATH, and copy the output.
Create an environment variable named HADOOP_CLASSPATH and paste the copied command output as its value.