Scenario:
Deploying Hive on Spark while building a Hadoop data warehouse.
Error:
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 272d523b-d472-4d93-8610-7ef9ac7e07a2)'
FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 272d523b-d472-4d93-8610-7ef9ac7e07a2
Troubleshooting:
First, check the Hive logs. The log location is configured in hive-log4j2.properties:
vim $HIVE_HOME/conf/hive-log4j2.properties
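If you only need the log file itself, the directory is set by the `property.hive.log.dir` key in that file (stock Hive resolves it to `/tmp/<user>/hive.log` by default). A minimal sketch for extracting it, assuming the stock property name:

```shell
#!/bin/bash
# Sketch: read the configured log directory out of hive-log4j2.properties.
# Assumes the stock property name; the resolved default is /tmp/<user>/hive.log.
hive_log_dir() {
    # Print everything after "property.hive.log.dir =" on its line
    sed -n 's/^property\.hive\.log\.dir[[:space:]]*=[[:space:]]*//p' "$1"
}

conf="${HIVE_HOME:-}/conf/hive-log4j2.properties"
if [ -f "$conf" ]; then
    # The printed value may still contain ${sys:...} placeholders
    # that Log4j resolves at runtime.
    echo "Hive log dir: $(hive_log_dir "$conf")"
fi
```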
The log shows "connection refused" on port 8032 of hadoop103 (the YARN ResourceManager port). Test it with nc:
nc -zv hadoop103 8032
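If nc isn't installed on a node, bash's built-in /dev/tcp pseudo-device can do the same reachability check. A sketch (the hadoop103/8032 pair is this cluster's ResourceManager address; adjust for yours):

```shell
#!/bin/bash
# Port reachability check without nc, using bash's /dev/tcp pseudo-device.
check_port() {
    local host=$1 port=$2
    # Attempt a TCP connect; timeout guards against hanging on filtered ports
    if timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
        echo "open"
    else
        echo "closed"
    fi
}

check_port hadoop103 8032   # ResourceManager address from this setup
```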
The port is unreachable; it looks like YARN on hadoop103 did not come up when the cluster was started. Start it:
sbin/start-yarn.sh
Check with jps on every node (using a small xcall helper script).
The xcall script:
#!/bin/bash
# Run the given command on every node in the cluster
for i in hadoop102 hadoop103 hadoop100
do
    echo "--------- $i ----------"
    ssh "$i" "$*"
done
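With the script in place, `xcall jps` prints the running JVM daemons per node. A sketch that goes one step further and flags missing YARN daemons explicitly (the host list mirrors the xcall script; `check_daemon` is a hypothetical helper, and which node hosts the ResourceManager is an assumption based on the error above):

```shell
#!/bin/bash
# Sketch: flag whether a given daemon appears in a node's jps output.
check_daemon() {
    # $1 = jps output, $2 = daemon class name; prints "ok" or "missing"
    if echo "$1" | grep -q "$2"; then echo "ok"; else echo "missing"; fi
}

# NodeManagers should run on every worker node
for host in hadoop102 hadoop103 hadoop100; do
    out=$(ssh -o BatchMode=yes -o ConnectTimeout=2 "$host" jps 2>/dev/null || true)
    echo "$host NodeManager: $(check_daemon "$out" NodeManager)"
done

# The ResourceManager runs on hadoop103 in this setup
rm_out=$(ssh -o BatchMode=yes -o ConnectTimeout=2 hadoop103 jps 2>/dev/null || true)
echo "hadoop103 ResourceManager: $(check_daemon "$rm_out" ResourceManager)"
```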
Retry:
Retrying still fails, so check the logs again:
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection
Asked ChatGPT about it. Given this error, the Spark client is probably timing out while connecting to the Hive session, so try raising the connection timeout parameters.
Edit hive-site.xml (both values are in milliseconds):
<property>
    <name>hive.spark.client.connect.timeout</name>
    <value>10000</value>
</property>
<property>
    <name>hive.spark.client.server.connect.timeout</name>
    <value>90000</value>
</property>
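After restarting Hive, the new values can be confirmed from the CLI; `set <property>` with no value prints the current setting. A config check to run against the live install:

```shell
# Confirm the raised timeouts are in effect (run against a fresh Hive session)
hive -e "set hive.spark.client.connect.timeout; set hive.spark.client.server.connect.timeout;"
```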
Save and retry: the job runs successfully and the problem is resolved.