hduser@xxx:/data1/hadoop/spark/bin$ ./spark-shell --master yarn --deploy-mode client
2020-04-13 10:04:25 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2020-04-13 10:10:10 ERROR YarnClientSchedulerBackend:70 - The YARN application has already ended! It might have been killed or the Application Master may have failed to start. Check the YARN application logs for more details.
2020-04-13 10:10:10 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Application application_1586511331938_0025 failed 2 times due to AM Container for appattempt_1586511331938_0025_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://xxx:8088/proxy/application_1586511331938_0025/ Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1586511331938_0025_02_000159
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:94)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:63)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:178)
at org.apache.spark.SparkContext.&lt;init&gt;(SparkContext.scala:501)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
at $line3.$read$$iw$$iw.&lt;init&gt;(&lt;console&gt;:15)
at $line3.$read$$iw.&lt;init&gt;(&lt;console&gt;:43)
at $line3.$read.&lt;init&gt;(&lt;console&gt;:45)
at $line3.$read$.&lt;init&gt;(&lt;console&gt;:49)