Troubleshooting a Hive INSERT stuck in ACCEPTED state and then failing with "return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask"

hive (mydb)> insert into custs(id,name) values(1,'LiLei');
Query ID = grid_20190423164217_fc3f0961-0a15-430c-9fce-78999c909f02
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1556008617802_0001, Tracking URL = http://master:8088/proxy/application_1556008617802_0001/
Kill Command = /home/grid/hadoop-3.2.0/bin/mapred job  -kill job_1556008617802_0001

Checking the Hadoop web UI shows the application sitting in the ACCEPTED state and never starting.
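
Besides the ResourceManager web UI at http://master:8088, the application state can also be checked from the command line; a minimal sketch using the standard yarn CLI:

yarn application -list -appStates ACCEPTED                  # applications still waiting for resources
yarn application -status application_1556008617802_0001     # details for this particular job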

Cause: YARN does not have enough memory to run the job.

Fix: modify Hadoop's yarn-site.xml configuration file.

Before:

<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>2000</value>
</property>

After:

<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>3000</value>
</property>
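
Once the change has been pushed to every NodeManager, YARN has to be restarted for the new limit to take effect; a sketch using the standard sbin scripts of the /home/grid/hadoop-3.2.0 install shown above:

/home/grid/hadoop-3.2.0/sbin/stop-yarn.sh
/home/grid/hadoop-3.2.0/sbin/start-yarn.sh
yarn node -list -showDetails     # each node should now report 3000 MB of configured memory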

After restarting Hadoop (for example with the commands above), run the Hive INSERT again. This time the job gets past the ACCEPTED state, but fails with a new error:

hive (mydb)> insert into custs(id,name) values(1,'II');
Query ID = grid_20190427162936_4d0cdc7e-43bb-49b6-b356-f6cc02282d38
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1556353582599_0001, Tracking URL = http://master:8088/proxy/application_1556353582599_0001/
Kill Command = /home/grid/hadoop-3.2.0/bin/mapred job  -kill job_1556353582599_0001
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2019-04-27 16:29:55,250 Stage-1 map = 0%,  reduce = 0%
Ended Job = job_1556353582599_0001 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

MapReduce Jobs Launched: 
Stage-Stage-1:  HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

That is only the surface error; the application logs on the Hadoop web UI give the more detailed message below:

Application application_1556367262246_0001 failed 2 times due to AM Container for appattempt_1556367262246_0001_000002 exited with exitCode: 1
Failing this attempt.Diagnostics: [2019-04-27 20:32:23.974]Exception from container-launch.
Container id: container_1556367262246_0001_02_000001
Exit code: 1
[2019-04-27 20:32:23.979]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
[2019-04-27 20:32:23.981]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

For more detailed output, check the application tracking page: http://master:8088/cluster/app/application_1556367262246_0001 Then click on links to logs of each attempt.
. Failing the application. 

 

Cause:

On a freshly installed Hadoop, HDFS tasks run fine, but as soon as a MapReduce task is launched it fails with "Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster", even though the package that contains this class, hadoop-mapreduce-client-app-x.x.x.jar, is clearly on Hadoop's classpath. Some research shows that the classpath used by MapReduce applications must also be declared in mapred-site.xml.
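
As a quick check, the jar really is visible on Hadoop's own classpath; a sketch of how to confirm this from the shell:

hadoop classpath | tr ':' '\n' | grep mapreduce     # the mapreduce lib directories are listed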

Fix:

Add the classpath that MapReduce applications need to mapred-site.xml:

<property>
   <name>mapreduce.application.classpath</name>
   <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
</property>

$HADOOP_MAPRED_HOME must be written out as the absolute path of the Hadoop installation; mine is /home/grid/hadoop-3.2.0, so the final configuration is:

<property>
   <name>mapreduce.application.classpath</name>
   <value>/home/grid/hadoop-3.2.0/share/hadoop/mapreduce/*,/home/grid/hadoop-3.2.0/share/hadoop/mapreduce/lib/*</value>
</property>
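
As a sanity check, the directories referenced by those wildcards should contain the jar that provides MRAppMaster (hadoop-mapreduce-client-app-3.2.0.jar on this 3.2.0 install):

ls /home/grid/hadoop-3.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-app-*.jar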

Once the change is made, copy mapred-site.xml to the same location on every other node in the cluster (a sketch of the copy commands follows below), then restart Hadoop.
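
A sketch of distributing the file and restarting, assuming two hypothetical worker nodes named slave1 and slave2 (substitute your own hostnames):

scp /home/grid/hadoop-3.2.0/etc/hadoop/mapred-site.xml grid@slave1:/home/grid/hadoop-3.2.0/etc/hadoop/
scp /home/grid/hadoop-3.2.0/etc/hadoop/mapred-site.xml grid@slave2:/home/grid/hadoop-3.2.0/etc/hadoop/
/home/grid/hadoop-3.2.0/sbin/stop-all.sh && /home/grid/hadoop-3.2.0/sbin/start-all.sh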

Run the INSERT in Hive again; this time it succeeds.
