HQL error: Error: Java heap space / Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

When an HQL statement fails with "Error: Java heap space", the first thing to try is raising HADOOP_HEAPSIZE in hive-env.sh. If that has no effect, increase the task memory settings in Hadoop's mapred-site.xml and restart the affected services.
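A minimal sketch of the first fix, assuming a stock hive-env.sh; the 1024 MB value is illustrative, not a recommendation, and should be sized to your node:

```shell
# hive-env.sh -- heap size (in MB) for the JVMs Hive launches.
# 1024 is an illustrative value; raise it further if the error persists.
export HADOOP_HEAPSIZE=1024
```

Note this only grows the Hive-side JVM heap; as the log below shows, the OutOfMemoryError here is thrown inside a MapReduce task, so the mapred-site.xml route is usually also needed.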

1. The error

Starting Job = job_1547323088343_0010, Tracking URL = http://hadoop01:8088/proxy/application_1547323088343_0010/
Kill Command = /home/hadoop/install/hadoop-2.5.0-cdh5.3.6//bin/hadoop job  -kill job_1547323088343_0010
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2019-01-15 16:46:21,433 Stage-1 map = 0%,  reduce = 0%
2019-01-15 16:46:44,308 Stage-1 map = 100%,  reduce = 100%
Ended Job = job_1547323088343_0010 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1547323088343_0010_m_000000 (and more) from job job_1547323088343_0010

Task with the most failures(4): 
-----
Task ID:
  task_1547323088343_0010_m_000000

URL:
  http://hadoop01:8088/taskdetails.jsp?jobid=job_1547323088343_0010&tipid=task_1547323088343_0010_m_000000
-----
Diagnostic Messages for this Task:
Error: Java heap space

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
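The diagnostic message shows the heap exhaustion happens inside the map task itself, so the task containers need more memory. A hedged sketch for Hadoop's mapred-site.xml, assuming Hadoop 2.x property names; the sizes are illustrative and must stay within your YARN container limits (-Xmx should be roughly 80% of the container size):

```xml
<!-- mapred-site.xml: per-task container sizes and JVM heap (illustrative values). -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>2048</value>    <!-- YARN container size for each map task -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx1638m</value>    <!-- map task JVM heap, below the container limit -->
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>4096</value>    <!-- YARN container size for each reduce task -->
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx3276m</value>    <!-- reduce task JVM heap, below the container limit -->
</property>
```

After editing mapred-site.xml, restart the affected services. As a quicker experiment, the same properties can be overridden per session in the Hive CLI before running the query, e.g. `set mapreduce.map.memory.mb=2048;`.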