1.
Problem:
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create Spark client for Spark session 7b1581d9-5e43-46d0-9eb1-ae7b96dc9f77)'
FAILED: Execution Error, return code 30041 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark client for Spark session 7b1581d9-5e43-46d0-9eb1-ae7b96dc9f77
Cause: Hive failed to connect to Spark before the connection timed out.
Solution:
Add the following property to hive-site.xml to prevent the timeout error when the connection takes too long:
<!-- Timeout for the connection between Hive and Spark -->
<property>
    <name>hive.spark.client.connect.timeout</name>
    <value>100000ms</value>
</property>
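If the same error persists on a slow cluster, the server-side handshake timeout can be raised in the same way. This is a minimal sketch: hive.spark.client.server.connect.timeout is a standard Hive setting, but the 300000ms value here is only an assumption to be tuned for your environment.

<!-- Assumed companion setting: how long the Spark driver side waits for the Hive client handshake; value is an example, not a recommendation -->
<property>
    <name>hive.spark.client.server.connect.timeout</name>
    <value>300000ms</value>
</property>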
2.
After fixing the error above, the following problem appeared:
Failed to monitor Job[-1] with exception 'java.lang.IllegalStateException(Connection to remote Spark driver was lost)' Last known state = SENT
Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. RPC channel is closed.
Cause: the spark-history directory is missing on HDFS.
Solution:
Create this directory on HDFS:
hadoop fs -mkdir /spark-history
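This path needs to match the Spark event-log location configured for Hive on Spark. Below is a minimal sketch of the relevant spark-defaults.conf entries, assuming /spark-history is the event-log directory; the NameNode host hadoop102 and port 8020 are placeholders for your cluster:

# Write Spark event logs to the HDFS directory created above
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://hadoop102:8020/spark-history
# Only needed if you also run the Spark history server (assumed setup)
spark.history.fs.logDirectory    hdfs://hadoop102:8020/spark-history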
Successfully resolved.
