Hive 3.1.3 on Spark 3.0.3: source compilation

Source compilation


1. Download and extract the source, without modifying anything yet, and run one build first (it compiles successfully as-is); see the sketch below.
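A sketch of that first pass, assuming the source tarball comes from the Apache archive and /opt is the working directory (matching the paths in the error logs below):

    # fetch and unpack the Hive 3.1.3 source release
    cd /opt
    wget https://archive.apache.org/dist/hive/hive-3.1.3/apache-hive-3.1.3-src.tar.gz
    tar -zxvf apache-hive-3.1.3-src.tar.gz
    cd apache-hive-3.1.3-src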

Edit the Maven settings.xml to add a few mirrors:

    <mirror>
      <id>alimaven</id>
      <name>aliyun maven</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>

    <!-- mirror ids must be unique; the original snippet reused "aliyunmaven"
         for two entries, which can make Maven ignore one of them -->
    <mirror>
      <id>aliyun-spring-plugin</id>
      <mirrorOf>*</mirrorOf>
      <name>spring-plugin</name>
      <url>https://maven.aliyun.com/repository/spring-plugin</url>
    </mirror>

    <mirror>
      <id>aliyun-public</id>
      <mirrorOf>*</mirrorOf>
      <name>Aliyun public repository</name>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>

    <mirror>
      <id>nexus-aliyun</id>
      <mirrorOf>*,!cloudera</mirrorOf>
      <name>Nexus aliyun</name>
      <url>http://maven.aliyun.com/nexus/content/groups/public</url>
    </mirror>

Run the build: mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true

2. Update the dependency versions in the top-level pom.xml:

<spark.version>3.0.3</spark.version>
<scala.binary.version>2.12</scala.binary.version>
<scala.version>2.12.10</scala.version>
<guava.version>27.0-jre</guava.version>

3. Patch the source files in the stock release that need to change once guava and spark have been bumped.

Reference: https://github.com/gitlbo/hive/commits/3.1.2

4. When the build finishes, a target directory appears under /opt/apache-hive-3.1.3-src/packaging; it contains the compiled packages.
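For example (the bundle name assumes the standard -Pdist naming; check your actual output):

    ls /opt/apache-hive-3.1.3-src/packaging/target/
    # the distributable tarball is apache-hive-3.1.3-bin.tar.gz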

Problems encountered

1. Failed to execute goal on project hive-upgrade-acid: Could not resolve dependencies for project org.apache.hive:hive-upgrade-acid:jar:3.1.3: Could not find artifact org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in aliyun-maven (http://maven.aliyun.com/nexus/content/groups/public) -> [Help 1]

Fix: add the extra Aliyun mirrors already shown in the settings.xml snippet above; with the spring-plugin and public repositories in place, this artifact resolves.

2. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: The following artifacts could not be resolved: org.apache.logging.log4j:log4j-slf4j-impl:jar:2.17.1, org.apache.hadoop:hadoop-client:jar:3.1.0, commons-codec:commons-codec:jar:1.15: Could not find artifact org.apache.logging.log4j:log4j-slf4j-impl:jar:2.17.1 in aliyunmaven (https://maven.aliyun.com/repository/spring-plugin) -> [Help 1]

https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-slf4j-impl/2.17.1
Download the jar manually and install it into the local repository.
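A minimal sketch of the manual fix, assuming the jar is fetched from Maven Central (which hosts the artifact listed on the mvnrepository page above) into the current directory:

    # download the missing artifact directly from Maven Central
    wget https://repo1.maven.org/maven2/org/apache/logging/log4j/log4j-slf4j-impl/2.17.1/log4j-slf4j-impl-2.17.1.jar
    # register it in the local repository so the build can resolve it
    mvn install:install-file -Dfile=log4j-slf4j-impl-2.17.1.jar \
      -DgroupId=org.apache.logging.log4j -DartifactId=log4j-slf4j-impl \
      -Dversion=2.17.1 -Dpackaging=jar

The same pattern works for any artifact the mirrors fail to deliver, including the hadoop-client and commons-codec jars in the next two errors.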

3. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: The following artifacts could not be resolved: org.apache.hadoop:hadoop-client:jar:3.1.0, commons-codec:commons-codec:jar:1.15: Failure to find org.apache.hadoop:hadoop-client:jar:3.1.0 in https://maven.aliyun.com/repository/spring-plugin was cached in the local repository, resolution will not be reattempted until the update interval of aliyunmaven has elapsed or updates are forced -> [Help 1]

https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client/3.1.0
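The "was cached in the local repository" variant means Maven recorded the earlier failure and will not retry until the update interval elapses. A sketch of clearing the cached markers and forcing re-resolution, assuming the default ~/.m2 layout:

    # delete Maven's cached "resolution failed" markers
    find ~/.m2/repository -name "*.lastUpdated" -delete
    # -U forces Maven to re-check the remote repositories
    mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true -U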

4. [ERROR] Failed to execute goal on project hive-shims-common: Could not resolve dependencies for project org.apache.hive.shims:hive-shims-common:jar:3.1.3: Failure to find commons-codec:commons-codec:jar:1.15 in https://maven.aliyun.com/repository/spring-plugin was cached in the local repository, resolution will not be reattempted until the update interval of aliyunmaven has elapsed or updates are forced -> [Help 1]

https://mvnrepository.com/artifact/commons-codec/commons-codec/1.15

Same cached-failure situation as problem 3: either install the jar manually as in problem 2, or clear the cache markers and force an update.

5.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-shims-common: Compilation failure: Compilation failure:
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[23,32] package org.apache.hadoop.mapred does not exist
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[24,32] package org.apache.hadoop.mapred does not exist
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/io/HiveIOExceptionHandler.java:[39,10] cannot find symbol
[ERROR] symbol: class RecordReader
[ERROR] location: interface org.apache.hadoop.hive.io.HiveIOExceptionHandler
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/thrift/client/TUGIAssumingTransport.java:[25,34] cannot find symbol
[ERROR] symbol: class UserGroupInformation
[ERROR] location: package org.apache.hadoop.security
[ERROR] /opt/apache-hive-3.1.3-src/shims/common/src/main/java/org/apache/hadoop/hive/thrift/client/TUGIAssumingTransport.java:[39,14] cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hive-shims-common

Compile once without modifying the source and this error no longer appears; the clean first pass populates the local repository before any patched module is built.
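In other words, the flow is two passes of the same build command, with the edits from steps 2 and 3 applied in between (a sketch; the patch step stands for whatever pom and source changes you make):

    # pass 1: pristine source, fetches and caches every dependency
    mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true
    # ...apply the pom edits (step 2) and the source patches (step 3)...
    # pass 2: rebuild with the modifications in place
    mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true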

6. [ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
[ERROR] error reading /root/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13/httpcore-4.4.13.jar; error in opening zip file
https://mvnrepository.com/artifact/org.apache.httpcomponents/httpcore/4.4.13
Re-download the jar from the link above and replace the corrupt copy.
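The cached jar is corrupt (a truncated or damaged download). One way to recover, using the local-repository path straight from the error message, is to delete it and let the next build fetch a fresh copy:

    # remove the corrupt artifact from the local repository
    rm -rf ~/.m2/repository/org/apache/httpcomponents/httpcore/4.4.13
    # re-run the build; -U makes Maven download it again
    mvn clean package -Pdist -DskipTests -Dmaven.javadoc.skip=true -U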
7. [ERROR] Failed to execute goal on project hive-spark-client: Could not resolve dependencies for project org.apache.hive:hive-spark-client:jar:3.1.3: Could not transfer artifact org.apache.spark:spark-core_2.12:jar:3.0.3 from/to alimaven (http://maven.aliyun.com/nexus/content/groups/public/): GET request of: org/apache/spark/spark-core_2.12/3.0.3/spark-core_2.12-3.0.3.jar from alimaven failed: Tag mismatch! -> [Help 1]

https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.12/3.0.3

"Tag mismatch!" means the transfer produced a corrupt jar; delete it from the local repository and re-download it, exactly as in problem 6.

8. [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-druid-handler: Compilation failure
[ERROR] /opt/apache-hive-3.1.3-src/druid-handler/src/java/org/apache/hadoop/hive/druid/security/KerberosHttpClient.java:[63,56] cannot access com.google.common.util.concurrent.internal.InternalFutureFailureAccess
[ERROR] class file for com.google.common.util.concurrent.internal.InternalFutureFailureAccess not found
[ERROR]
[ERROR] -> [Help 1]

<properties>
  <hive.path.to.root>..</hive.path.to.root>
  <druid.metamx.util.version>1.3.2</druid.metamx.util.version>
  <!-- raise this to the top-level pom's guava version (27.0-jre); 16.0.1 predates
       the failureaccess classes that guava 27 code compiles against -->
  <druid.guava.version>16.0.1</druid.guava.version>
</properties>

Fix: in the druid-handler module's pom.xml, change the guava version property to the one used by the top-level pom (27.0-jre, per step 2).
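A one-line sketch of that edit (the old and new values follow the snippets above; adjust if your pom differs):

    # point druid-handler at the same guava version as the root pom
    sed -i 's|<druid.guava.version>16.0.1</druid.guava.version>|<druid.guava.version>27.0-jre</druid.guava.version>|' \
      /opt/apache-hive-3.1.3-src/druid-handler/pom.xml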
