Setting Up Hive on Spark: Fixing Hive Source Compilation Errors (Spark 2.1.0, Hadoop 2.7.2)


Setting up Hive on Spark requires building Spark from source without Hive support and then deploying that build.
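
Concretely, the without-Hive build looks roughly like the sketch below. The profile list (yarn, hadoop-2.7, hadoop-provided, parquet-provided) follows the Hive on Spark getting-started guide and is an assumption you should check against your Spark 2.1.0 checkout; the key point is simply that the -Phive profile is not enabled, so the resulting distribution contains no Hive classes.

  # Build a Spark 2.1.0 distribution without Hive support (note: no -Phive profile)
  $ cd spark-2.1.0
  $ ./dev/make-distribution.sh --name hadoop2-without-hive --tgz \
      -Pyarn -Phadoop-2.7 -Phadoop-provided -Pparquet-provided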

Once the build succeeds, deploy the Spark cluster; cluster deployment itself is not covered here.

I then downloaded the Hive 2.1.1 binary package from the official Hive site and installed it, but after starting Hive it threw a class-incompatibility exception. Some searching showed this was a version conflict, which is what forced me to compile Hive from source myself.
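
For context, pointing Hive at the Spark engine only takes a few properties. The property names below come from the Hive on Spark documentation, while the master URL and Spark path are placeholders for my environment, not prescribed values.

  # Switch Hive's execution engine to Spark (values are examples only)
  hive> set hive.execution.engine=spark;
  hive> set spark.master=yarn;
  hive> set spark.home=/path/to/spark-2.1.0-bin-hadoop2-without-hive;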


My first attempt used the Hive source downloaded from archive.apache.org, and the build failed with the errors below (Spark 2.x removed the JavaSparkListener class):

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project spark-client: Compilation failure: Compilation failure:

[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[46,24] cannot find symbol
[ERROR] symbol:   class JavaSparkListener
[ERROR] location: package org.apache.spark
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[444,40] cannot find symbol
[ERROR] symbol:   class JavaSparkListener
[ERROR] location: class org.apache.hive.spark.client.RemoteDriver
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[46,24] cannot find symbol
[ERROR] symbol:   class JavaSparkListener
[ERROR] location: package org.apache.spark
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[444,40] cannot find symbol
[ERROR] symbol:   class JavaSparkListener
[ERROR] location: class org.apache.hive.spark.client.RemoteDriver
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:[63,38] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleReadMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:[64,35] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleReadMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:[65,35] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleReadMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleReadMetrics.java:[66,35] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleReadMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:[50,39] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleWriteMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/ShuffleWriteMetrics.java:[51,36] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.ShuffleWriteMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java:[102,34] cannot find symbol
[ERROR] symbol:   method isDefined()
[ERROR] location: class org.apache.spark.executor.InputMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java:[106,40] cannot find symbol
[ERROR] symbol:   method isDefined()
[ERROR] location: class org.apache.spark.executor.ShuffleReadMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/Metrics.java:[110,41] cannot find symbol
[ERROR] symbol:   method isDefined()
[ERROR] location: class org.apache.spark.executor.ShuffleWriteMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java:[48,55] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.InputMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/metrics/InputMetrics.java:[49,29] cannot find symbol
[ERROR] symbol:   method get()
[ERROR] location: class org.apache.spark.executor.InputMetrics
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[158,14] method addSparkListener in class org.apache.spark.SparkContext cannot be applied to given types;
[ERROR] required: org.apache.spark.scheduler.SparkListenerInterface
[ERROR] found: org.apache.hive.spark.client.RemoteDriver.ClientListener
[ERROR] reason: actual argument org.apache.hive.spark.client.RemoteDriver.ClientListener cannot be converted to org.apache.spark.scheduler.SparkListenerInterface by method invocation conversion
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[448,5] method does not override or implement a method from a supertype
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[457,5] method does not override or implement a method from a supertype
[ERROR] /home/daxin/installData/source/apache-hive-2.1.1-src/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[475,5] method does not override or implement a method from a supertype
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <goals> -rf :spark-client



The Spark upgrade had made these interfaces and classes incompatible. Looking at the Hive source on GitHub, I found the problem had already been fixed upstream (as far as I can tell, the updated spark-client code no longer relies on the removed JavaSparkListener and targets the Spark 2.x listener API instead), so I cloned the repository from GitHub and built it locally.

For the build procedure itself, just follow the official Hive documentation, reproduced below:

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

  $ git clone https://git-wip-us.apache.org/repos/asf/hive.git
  $ cd hive
  $ mvn clean package -Pdist
  $ cd packaging/target/apache-hive-{version}-SNAPSHOT-bin/apache-hive-{version}-SNAPSHOT-bin
  $ ls
  LICENSE
  NOTICE
  README.txt
  RELEASE_NOTES.txt
  bin/ (all the shell scripts)
  lib/ (required jar files)
  conf/ (configuration files)
  examples/ (sample input and query files)
  hcatalog / (hcatalog installation)
  scripts / (upgrade scripts for hive-metastore)
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------


The build output ends up in:

packaging/target/apache-hive-{version}-SNAPSHOT-bin/apache-hive-{version}-SNAPSHOT-bin

For example, my output directory was:
/home/daxin/installData/source/hive-2.1.1/packaging/target/apache-hive-2.2.0-SNAPSHOT-bin/apache-hive-2.2.0-SNAPSHOT-bin
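
To actually use the rebuilt Hive, one simple option is to copy that directory somewhere convenient and point HIVE_HOME at it; the destination path below is purely illustrative.

  # Deploy the rebuilt Hive (destination path is just an example)
  $ cp -r packaging/target/apache-hive-2.2.0-SNAPSHOT-bin/apache-hive-2.2.0-SNAPSHOT-bin /opt/hive-2.2.0-SNAPSHOT
  $ export HIVE_HOME=/opt/hive-2.2.0-SNAPSHOT
  $ export PATH=$HIVE_HOME/bin:$PATH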

Note: if the tests cause trouble during the build, you can skip the Maven tests altogether:

mvn clean package -Pdist -Dmaven.test.skip=true
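
If you only want to skip running the tests while still compiling them, -DskipTests works too; -Dmaven.test.skip=true additionally skips compiling the test sources.

mvn clean package -Pdist -DskipTests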



If you want to pin specific Spark and Hadoop versions, just change the dependency versions in the parent pom.xml; the JDK and Scala versions can be changed the same way.
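
As a sketch of a command-line alternative: if the parent pom.xml exposes these versions as Maven properties (I would expect names like spark.version and hadoop.version, but verify them in the <properties> section first), they can be overridden without editing the file at all.

  # Override dependency versions on the command line (property names are assumptions; check the pom first)
  $ mvn clean package -Pdist -DskipTests -Dspark.version=2.1.0 -Dhadoop.version=2.7.2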



Lesson learned: next time, download the source from GitHub in the first place; the code there is updated much faster, and it saves you detours like this one.


