Hive on Spark: execution fails because Hive cannot load Spark's jars

When running a query in Hive with Spark as the execution engine, the job fails because the Spark client cannot be created. The root cause is that Hive cannot load Spark's jars. The fix is to set SPARK_HOME in hive-env.sh and add the Spark jars to Hive's auxiliary jar path so that Hive can find and use the Spark dependencies.
The problem
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 8ffa76f2-907e-4092-987d-6c89279f3c5b)'
FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 8ffa76f2-907e-4092-987d-6c89279f3c5b

To find the real cause, run Hive with DEBUG-level logging on the console:

hive --hiveconf hive.root.logger=DEBUG,console

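The DEBUG output scrolls by very quickly, so it can be handy to capture it to a file while still watching the console. A small convenience variant (my addition, not part of the original troubleshooting session), writing to a hypothetical /tmp/hive-debug.log:

# Mirror stdout and stderr to both the console and a log file for later searching
hive --hiveconf hive.root.logger=DEBUG,console 2>&1 | tee /tmp/hive-debug.log
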
This surfaced the underlying error:

2020-07-03T05:42:51,624 ERROR [1c9bb3c6-d486-4968-8b94-c35812f50e75 main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 1afe33ae-3b62-468c-8d2f-f04269421ed7)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create Spark client for Spark session 1afe33ae-3b62-468c-8d2f-f04269421ed7
 at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.getHiveException(SparkSessionImpl.java:221)
 at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:92)
 at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
 at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
 at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
 at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
 at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
 at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
 at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
 at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
 at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
 at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
 at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
 at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
 at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
 at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
 at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAcce

The conclusion from the stack trace: Hive cannot load Spark's jars.

Add the following to hive-env.sh:

# Point Hive at the Spark installation (built without Hive) and collect every
# jar under $SPARK_HOME/jars into a colon-separated list
export SPARK_HOME=/opt/module/spark-2.4.5-bin-without-hive
export SPARK_JARS=""
for jar in `ls $SPARK_HOME/jars`; do
    export SPARK_JARS=$SPARK_JARS:$SPARK_HOME/jars/$jar
done
# Append the Spark jars (the list already begins with ':') after the existing LZO jar
export HIVE_AUX_JARS_PATH=/opt/module/hadoop-3.1.3/share/hadoop/common/hadoop-lzo-0.4.21-SNAPSHOT.jar$SPARK_JARS
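With these lines in place, restart the Hive CLI and run any query that actually launches a Spark job to confirm the Spark client can now be created. A minimal smoke test, assuming a table named student exists in your database (replace it with any real table):

# count(*) forces a real Spark stage, so a successful run confirms the fix
hive -e "set hive.execution.engine=spark; select count(*) from student;"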