HDP integration with Apache Spark: "bad substitution" error when submitting a job

Cluster environment:

HDP 3.1.5.0-152

Apache Spark 2.4.7

Error:

2022-02-18 06:31:00,428 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - Submitting application application_1644979649961_0061 to ResourceManager
2022-02-18 06:31:00,464 INFO  [main] impl.YarnClientImpl (YarnClientImpl.java:submitApplication(306)) - Submitted application application_1644979649961_0061
2022-02-18 06:31:01,466 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - Application report for application_1644979649961_0061 (state: ACCEPTED)
2022-02-18 06:31:01,469 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - 
	 client token: N/A
	 diagnostics: AM container is launched, waiting for AM container to Register with RM
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1645165860441
	 final status: UNDEFINED
	 tracking URL: http://awnx1-cdata-tnode01:8088/proxy/application_1644979649961_0061/
	 user: root
2022-02-18 06:31:02,470 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - Application report for application_1644979649961_0061 (state: ACCEPTED)
2022-02-18 06:31:03,471 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - Application report for application_1644979649961_0061 (state: FAILED)
2022-02-18 06:31:03,472 INFO  [main] yarn.Client (Logging.scala:logInfo(54)) - 
	 client token: N/A
	 diagnostics: Application application_1644979649961_0061 failed 1 times (global limit =2; local limit is =1) due to AM Container for appattempt_1644979649961_0061_000001 exited with  exitCode: 1
Failing this attempt.Diagnostics: [2022-02-18 06:31:02.612]Exception from container-launch.
Container id: container_e02_1644979649961_0061_01_000001
Exit code: 1

[2022-02-18 06:31:02.613]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
/data/hadoop/yarn/local/usercache/root/appcache/application_1644979649961_0061/container_e02_1644979649961_0061_01_000001/launch_container.sh: line 38: $PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/3.1.5.0-152/hadoop/*:/usr/hdp/3.1.5.0-152/hadoop/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:$PWD/__spark_conf__/__hadoop_conf__: bad substitution
[2022-02-18 06:31:02.613]Container exited with a non-zero exit code 1. Error file: prelaunch.err.

This error occurs because, when the job is submitted with the Apache build of Spark, the hdp.version variable is never defined, so the literal ${hdp.version} is left in the classpath that YARN writes into launch_container.sh; bash cannot expand a variable name containing a dot and fails with "bad substitution". The fix is to define the hdp.version variable in the MapReduce2 configuration on the HDP (Ambari) management page.
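
A minimal sketch of what that looks like, assuming the variable is added as a custom property under MapReduce2 > Configs (for example in the Custom mapred-site section) and using the cluster version listed above:

    hdp.version=3.1.5.0-152

After saving the change and restarting the affected services, ${hdp.version} in mapreduce.application.classpath expands to a real path, so the launch script no longer contains the unresolved placeholder. With a standalone Apache Spark build, another commonly used option is to pass the value to the driver and ApplicationMaster JVMs as a system property in SPARK_HOME/conf/spark-defaults.conf (also a sketch, same version string):

    spark.driver.extraJavaOptions    -Dhdp.version=3.1.5.0-152
    spark.yarn.am.extraJavaOptions   -Dhdp.version=3.1.5.0-152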

Reference: "spark 异常 __spark_conf__/__hadoop_conf__: bad substitution", 编程学问网
