spark-shell startup error: Yarn application has already ended! It might have been killed or unable to launch...


The first half of this post is adapted from https://www.cnblogs.com/tibit/p/7337045.html; the second half is original.

spark-shell does not support yarn-cluster mode, so start it in yarn-client mode:

spark-shell --master=yarn --deploy-mode=client

The startup log reported the errors below (the original screenshot of the log is not reproduced here).

The first message, "Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME", is only a warning. The official explanation is:

To make Spark runtime jars accessible from YARN side, you can specify spark.yarn.archive or spark.yarn.jars. For details please refer to Spark Properties. If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark will create a zip file with all jars under $SPARK_HOME/jars and upload it to the distributed cache.

In short: if neither spark.yarn.jars nor spark.yarn.archive is configured, Spark zips everything under $SPARK_HOME/jars and uploads the archive to the distributed cache for the worker nodes. Packaging and distribution are handled automatically, so leaving these two settings unset is harmless.
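If you want to silence the warning and avoid re-uploading the jars on every submission, you can pre-stage the archive yourself and point spark.yarn.archive at it. A minimal sketch for spark-defaults.conf; the HDFS path is a hypothetical example, adjust it to your cluster:

```
# Hypothetical: archive of $SPARK_HOME/jars pre-uploaded to HDFS, e.g.
#   zip -j spark-libs.zip $SPARK_HOME/jars/*
#   hdfs dfs -put spark-libs.zip /spark/
spark.yarn.archive  hdfs:///spark/spark-libs.zip
```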

 

"Yarn application has already ended! It might have been killed or unable to launch application master", however, is a real exception. Open the YARN ResourceManager web UI (mine is at http://192.168.128.130:8088) and inspect the failed application.

The key detail (highlighted in the red box of the screenshot) is that 2.2 GB of virtual memory was actually in use, exceeding the 2.1 GB limit. In other words, the container was killed for exceeding its virtual memory limit, and since all the work runs inside containers, the application could not proceed.
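The 2.1 GB ceiling is no accident: it is the default yarn.nodemanager.vmem-pmem-ratio (2.1) multiplied by the container's physical memory allocation (1 GB here). A small sketch of the check the NodeManager performs, using the values from this case:

```python
# YARN's virtual-memory check: a container is killed when its virtual
# memory usage exceeds pmem_allocation * vmem_pmem_ratio.
def vmem_limit_gb(pmem_gb: float, ratio: float = 2.1) -> float:
    """Virtual memory ceiling for a container, in GB."""
    return pmem_gb * ratio

def container_killed(vmem_used_gb: float, pmem_gb: float, ratio: float = 2.1) -> bool:
    """True if the NodeManager would kill the container."""
    return vmem_used_gb > vmem_limit_gb(pmem_gb, ratio)

print(vmem_limit_gb(1.0))             # 2.1 GB ceiling with the defaults
print(container_killed(2.2, 1.0))     # True  -> this case: 2.2 > 2.1, killed
print(container_killed(2.2, 1.0, 4))  # False -> survives after raising the ratio to 4
```

This also shows why either fix below works: disabling the check removes the comparison entirely, while raising the ratio moves the ceiling above actual usage.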

Solution

Add the following to yarn-site.xml (either one of the two properties alone is enough; configuring both is also fine):

<!-- Added to fix the spark-shell yarn-client startup failure; spark-submit presumably hits the same problem. Either one of the two properties below is sufficient; configuring both also works. -->
<!-- Whether the virtual memory check is enforced; if actual virtual memory exceeds the limit, Spark in client mode may fail with "Yarn application has already ended! It might have been killed or unable to l" -->
<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
    <description>Whether virtual memory limits will be enforced for containers</description>
</property>
<!-- Ratio of virtual memory to physical memory; the default is 2.1, and with the default 1 GB of physical memory the virtual memory limit is 2.1 GB -->
<property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>4</value>
    <description>Ratio between virtual memory to physical memory when setting memory limits for containers</description>
</property>
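A quick way to confirm the properties actually landed in the file YARN reads is to parse it; a minimal sketch using only Python's standard library (the inline sample stands in for your real $HADOOP_CONF_DIR/yarn-site.xml):

```python
# Parse a Hadoop-style yarn-site.xml and report the vmem-related settings.
# Replace SAMPLE with the contents of your real $HADOOP_CONF_DIR/yarn-site.xml.
import xml.etree.ElementTree as ET

SAMPLE = """<configuration>
  <property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
  </property>
  <property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>4</value>
  </property>
</configuration>"""

def yarn_properties(xml_text: str) -> dict:
    """Return {name: value} for every <property> in a Hadoop-style config."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value") for p in root.iter("property")}

props = yarn_properties(SAMPLE)
print(props["yarn.nodemanager.vmem-check-enabled"])  # false
print(props["yarn.nodemanager.vmem-pmem-ratio"])     # 4
```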

 

After the change, restart Hadoop and launch spark-shell again.
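NodeManagers only pick up yarn-site.xml changes after a restart. On a standard Hadoop 2.x install, the usual sequence is (script names assume the stock sbin layout):

```
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/start-yarn.sh
```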

---------------------------------------------------Original content below------------------------------------------------------------

I installed Spark 2.3 on the YARN master of an old Spark 1.6 cluster. Local mode started fine, but launching Spark 2.3 on YARN failed with the same Spark-side error as above; the difference was in YARN's error message:

Application application_1522048616169_0024 failed 2 times due to AM Container for appattempt_1522048616169_0024_000002 exited with exitCode: 1
For more detailed output, check application tracking page:http://slave1:8088/proxy/application_1522048616169_0024/Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1522048616169_0024_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

Clearly this message is nowhere near as informative, so dig one level deeper into the container log: HADOOP_HOME/logs/userlogs/application_1522048616169_0028/container_1522048616169_0028_01_000001/stderr

Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/network/util/ByteUnit : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.spark.deploy.history.config$.<init>(config.scala:44)
        at org.apache.spark.deploy.history.config$.<clinit>(config.scala)
        at org.apache.spark.SparkConf$.<init>(SparkConf.scala:635)
        at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
        at org.apache.spark.SparkConf.set(SparkConf.scala:94)
        at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:76)
        at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:75)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
        at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
        at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
        at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
        at org.apache.spark.SparkConf.<init>(SparkConf.scala:57)
        at org.apache.spark.deploy.yarn.ApplicationMaster.<init>(ApplicationMaster.scala:62)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:823)
        at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:854)
        at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)

So the configured JDK is unsupported: the old cluster configuration still pointed at JDK 7, but Spark 2.3 requires JDK 8 (class-file major version 52). Therefore, edit yarn-env.sh:

#export JAVA_HOME=/usr/java/jdk1.7.0_55

export JAVA_HOME=/r2/jwb/java/jdk1.8.0_161
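The error "Unsupported major.minor version 52.0" decodes directly: 52 is the class-file major version emitted by JDK 8, so a Java 7 JVM (which supports up to 51) cannot load Spark 2.3's classes. A small sketch of how that version is read from the header of a .class file (the sample bytes here are constructed, not taken from a real class):

```python
# Java class files start with magic 0xCAFEBABE, then a 2-byte minor and
# a 2-byte major version, all big-endian. Major 52 means "compiled for Java 8".
import struct

def class_major_version(data: bytes) -> int:
    """Parse the major version from Java class-file header bytes."""
    magic, minor, major = struct.unpack(">IHH", data[:8])
    assert magic == 0xCAFEBABE, "not a class file"
    return major

# Simulated header of a class compiled by JDK 8 (major version 52)
header = struct.pack(">IHH", 0xCAFEBABE, 0, 52)
print(class_major_version(header))  # 52 -> requires Java 8+
```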

YARN had not been restarted yet, though, and the same error kept appearing... To be continued...
