Errors on the first start:
starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
failed to launch org.apache.spark.deploy.master.Master:
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
localhost: ... 7 more
localhost: full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
Checking the master log:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Checking the worker log:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Solution:
At first I followed this blog post: https://blog.csdn.net/qq_41212491/article/details/87716710
and added the following to spark-env.sh:
export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)
That still failed, this time with:
starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: /soft/spark/conf/spark-env.sh: line 73: /bin/hadoop: No such file or directory
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
localhost: ... 7 more
localhost: full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
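The `/bin/hadoop: No such file or directory` line is the clue: spark-env.sh is sourced by the non-interactive shell that the start scripts open over ssh on each node, and in that shell `HADOOP_HOME` from the login profile is not set, so `${HADOOP_HOME}/bin/hadoop` expands to just `/bin/hadoop`. A minimal sketch of the expansion:

```shell
# When HADOOP_HOME is unset (as in a non-interactive ssh shell that does not
# source the login profile), the command substitution in spark-env.sh expands
# to a path that does not exist:
unset HADOOP_HOME
echo "${HADOOP_HOME}/bin/hadoop"   # prints: /bin/hadoop
```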
So instead of relying on the environment variable, I switched to the full path:
export SPARK_DIST_CLASSPATH=$(/soft/hadoop/bin/hadoop classpath)
Startup succeeded:
starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
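Hard-coding the path works; another option (a sketch, with `/soft/hadoop` assumed from the logs above) is to export `HADOOP_HOME` inside spark-env.sh itself, so the variable is defined even in the non-interactive shell that sources the file:

```shell
# spark-env.sh — sketch; /soft/hadoop is the Hadoop install path assumed
# from the paths in the logs above
export HADOOP_HOME=/soft/hadoop
export SPARK_DIST_CLASSPATH=$("${HADOOP_HOME}/bin/hadoop" classpath)
```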