Compiling Hive 3.1.3 from Source
1. Download and extract the source; don't modify anything yet, and run one build first (it compiles successfully as-is).
Add the following mirrors to Maven's settings.xml:
<mirror>
    <id>alimaven</id>
    <name>aliyun maven</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
    <mirrorOf>central</mirrorOf>
</mirror>
<!-- mirror ids must be unique in settings.xml -->
<mirror>
    <id>aliyunmaven-spring</id>
    <mirrorOf>*</mirrorOf>
    <name>spring-plugin</name>
    <url>https://maven.aliyun.com/repository/spring-plugin</url>
</mirror>
<mirror>
    <id>aliyunmaven</id>
    <mirrorOf>*</mirrorOf>
    <name>aliyun public repository</name>
    <url>https://maven.aliyun.com/repository/public</url>
</mirror>
<mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>*,!cloudera</mirrorOf>
    <name>Nexus aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
Note that with mirrorOf set to *, Maven routes requests to the first matching mirror in the file, which is why the errors below show artifacts being resolved against the spring-plugin repository.
Run the build: mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true
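The download/extract/first-build sequence can be sketched as follows (the archive URL is an assumption based on the Apache dist layout; verify it before relying on it):

```shell
# Fetch and unpack the Hive 3.1.3 source, then run the first, unmodified build.
HIVE_VERSION=3.1.3
SRC_TGZ="apache-hive-${HIVE_VERSION}-src.tar.gz"

cd /opt
# Assumed Apache archive layout for source releases:
curl -fLO "https://archive.apache.org/dist/hive/hive-${HIVE_VERSION}/${SRC_TGZ}"
tar -xzf "$SRC_TGZ"
cd "apache-hive-${HIVE_VERSION}-src"

# First pass on the untouched source; this also warms up the local repository:
mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true
```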
2. Update the dependency versions in the root pom.xml:
<spark.version>3.0.0</spark.version>
<scala.binary.version>2.12</scala.binary.version>
<scala.version>2.12.10</scala.version>
<guava.version>27.0-jre</guava.version>
3. Patch the source files that need follow-up changes once guava and Spark have been bumped.
Reference: https://github.com/gitlbo/hive/commits/3.1.2
4. After a successful build, a target directory is generated under /opt/apache-hive-3.1.3-src/packaging; it contains the packaged distribution.
Problems encountered
1. Failed to execute goal on project hive-upgrade-acid: Could not resolve dependencies for project org.apache.hive:hive-upgrade-acid:jar:3.1.3: Could not find artifact org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in aliyun-maven (http://maven.aliyun.com/nexus/content/groups/public) -> [Help 1]
Fix: add the aliyun mirrors from step 1 to settings.xml; the spring-plugin repository is the one that carries the pentaho-aggdesigner-algorithm artifact.
2. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: The following artifacts could not be resolved: org.apache.logging.log4j:log4j-slf4j-impl:jar:2.17.1, org.apache.hadoop:hadoop-client:jar:3.1.0, commons-codec:commons-codec:jar:1.15: Could not find artifact org.apache.logging.log4j:log4j-slf4j-impl:jar:2.17.1 in aliyunmaven (https://maven.aliyun.com/repository/spring-plugin) -> [Help 1]
https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-slf4j-impl/2.17.1
Download the jar manually from the page above and put it into the local repository.
3. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: The following artifacts could not be resolved: org.apache.hadoop:hadoop-client:jar:3.1.0, commons-codec:commons-codec:jar:1.15: Failure to find org.apache.hadoop:hadoop-client:jar:3.1.0 in https://maven.aliyun.com/repository/spring-plugin was cached in the local repository, resolution will not be reattempted until the update interval of aliyunmaven has elapsed or updates are forced -> [Help 1]
https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client/3.1.0
Same fix: download the jar manually, or force a re-check of cached failures with mvn -U.
4. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: Failure to find commons-codec:commons-codec:jar:1.15 in https://maven.aliyun.com/repository/spring-plugin was cached in the local repository, resolution will not be reattempted until the update interval of aliyunmaven has elapsed or updates are forced -> [Help 1]
https://mvnrepository.com/artifact/commons-codec/commons-codec/1.15
Same fix: download the jar manually and place it in the local repository.
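Problems 2-4 share one remedy. As a sketch, the script below prints curl commands that place each missing artifact where Maven expects it (assumes bash, curl, and the repo1.maven.org Central mirror; review the output and pipe it to sh, or fetch the jars from the mvnrepository pages by hand):

```shell
# Print download commands that drop each missing artifact into the local
# repository path Maven expects (~/.m2/repository/<group>/<artifact>/<version>).
gav_to_path() {
  # "groupId:artifactId:version" -> repo-relative directory
  local group="${1%%:*}" rest="${1#*:}"
  local artifact="${rest%%:*}" version="${rest#*:}"
  echo "${group//.//}/${artifact}/${version}"
}

for gav in \
  org.apache.logging.log4j:log4j-slf4j-impl:2.17.1 \
  org.apache.hadoop:hadoop-client:3.1.0 \
  commons-codec:commons-codec:1.15
do
  dir="$(gav_to_path "$gav")"
  rest="${gav#*:}"
  jar="${rest%%:*}-${gav##*:}.jar"
  echo "mkdir -p \"\$HOME/.m2/repository/${dir}\""
  echo "curl -fLo \"\$HOME/.m2/repository/${dir}/${jar}\" \\"
  echo "  \"https://repo1.maven.org/maven2/${dir}/${jar}\""
done
```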
5. [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-shims-common: Compilation failure: Compilation failure:
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[23,32] package org.apache.hadoop.mapred does not exist
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[24,32] package org.apache.hadoop.mapred does not exist
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[39,10] cannot find symbol
[ERROR] symbol: class RecordReader
[ERROR] location: interface org.apache.hadoop.hive.io.HiveIOExceptionHandler
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/thrift/client/TUGIAssumingTransport.java:[25,34] cannot find symbol
[ERROR] symbol: class UserGroupInformation
[ERROR] location: package org.apache.hadoop.security
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/thrift/client/TUGIAssumingTransport.java:[39,14] cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hive-shims-common
This error goes away if you first run one build without modifying the source (as in step 1).
6. [ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
https://mvnrepository.com/artifact/org.apache.httpcomponents/httpcore/4.4.13
The cached jar is corrupted; re-download it and replace the broken copy in the local repository.
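"error in opening zip file" points at a corrupted jar in the local repository. Rather than hunting them one by one, a small scan (assuming unzip is installed) can delete every broken jar so Maven re-downloads them on the next build:

```shell
# Scan a Maven repository directory for corrupted jars and delete them;
# Maven fetches fresh copies on the next build. Pass the repo root,
# e.g. ~/.m2/repository.
scan_corrupt_jars() {
  find "$1" -name '*.jar' -print0 |
  while IFS= read -r -d '' jar; do
    if ! unzip -tq "$jar" >/dev/null 2>&1; then
      echo "corrupt: $jar"
      rm -f "$jar"   # removed so the next build re-downloads it
    fi
  done
}

scan_corrupt_jars "${HOME}/.m2/repository"
```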
7. [ERROR] Failed to execute goal on project hive-spark-client: Could not resolve dependencies for project org.apache.hive:hive-spark-client:jar:3.1.3: Could not transfer artifact org.apache.spark:spark-core_2.12:jar:3.0.3 from/to alimaven (http://maven.aliyun.com/nexus/content/groups/public/): GET request of: org/apache/spark/spark-core_2.12/3.0.3/spark-core_2.12-3.0.3.jar from alimaven failed: Tag mismatch! -> [Help 1]
https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12/3.0.3
"Tag mismatch!" also indicates a corrupted download: delete the cached spark-core artifact (or download it manually from the page above) and retry.
8. [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-druid-handler: Compilation failure
[ERROR] /opt/apache-hive-3.1.3-src/druid-handler/src/java/org/apache/hadoop/hive/druid/security/KerberosHttpClient.java:[63,56] cannot access com.google.common.util.concurrent.internal.InternalFutureFailureAccess
[ERROR] class file for com.google.common.util.concurrent.internal.InternalFutureFailureAccess not found
[ERROR]
[ERROR] -> [Help 1]
<properties>
    <hive.path.to.root>..</hive.path.to.root>
    <druid.metamx.util.version>1.3.2</druid.metamx.util.version>
    <druid.guava.version>16.0.1</druid.guava.version>
</properties>
Fix: in the druid-handler module's pom.xml, change the guava version to the one defined in the root pom (the properties above show the module pinning guava 16.0.1, which predates InternalFutureFailureAccess).
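One way to express that fix is to keep the module's property but point it at the root pom's value (a sketch of the intent; the property names come from the snippet above):

<properties>
    <hive.path.to.root>..</hive.path.to.root>
    <druid.metamx.util.version>1.3.2</druid.metamx.util.version>
    <!-- inherit the root pom's guava (27.0-jre) instead of pinning 16.0.1 -->
    <druid.guava.version>${guava.version}</druid.guava.version>
</properties>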