When running ./start-all.sh, I hit the following error (it cost me several hours):
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-2.4.7-bin-without-hadoop/logs/spark-centos7-gao-org.apache.spark.deploy.master.Master-1-master.out
failed to launch: nice -n 0 /usr/local/spark-2.4.7-bin-without-hadoop/bin/spark-class org.apache.spark.deploy.master.Master --host master --port 7077 --webui-port 8080
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:650)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:632)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 7 more
full log in /usr/local/spark-2.4.7-bin-without-hadoop/logs/spark-centos7-gao-org.apache.spark.deploy.master.Master-1-master.out
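The root cause: the spark-2.4.7-bin-without-hadoop build deliberately ships without the Hadoop jars and their dependencies, and slf4j is one of the libraries it expects Hadoop to provide, which is why the launcher dies on org.slf4j.Logger. The `hadoop classpath` command prints exactly those jars. The sketch below uses a simulated classpath string (jar names and versions are illustrative, not from the original post); on a real install, run `$HADOOP_HOME/bin/hadoop classpath` instead:

```shell
# Simulated `hadoop classpath` output -- on a real cluster run
# $HADOOP_HOME/bin/hadoop classpath. Jar names here are illustrative.
cp="/usr/local/hadoop-2.7.7/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/usr/local/hadoop-2.7.7/share/hadoop/common/lib/commons-logging-1.1.3.jar"

# Split the colon-separated classpath into lines and count slf4j entries;
# a non-zero count means Hadoop supplies the missing org.slf4j.Logger class.
echo "$cp" | tr ':' '\n' | grep -c slf4j
```

If that count is zero against your real Hadoop install, `SPARK_DIST_CLASSPATH` will not fix the error and the Hadoop install itself is suspect.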
Solution
When configuring spark-env.sh, be sure to add this line:
export SPARK_DIST_CLASSPATH=$(/data/tools/hadoop-2.8.5/bin/hadoop classpath)
My earlier configuration, which kept producing this error, was:
export JAVA_HOME=/home/centos7-gao/Software/java/jdk1.8.0_281
export HADOOP_HOME=/usr/local/hadoop-2.7.7
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_MASTER_HOST=master
export SPARK_MASTER_PORT=7077
After adding it:
export JAVA_HOME=/home/centos7-gao/Software/java/jdk1.8.0_281
export HADOOP_HOME=/usr/local/hadoop-2.7.7
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_MASTER_HOST=master
export SPARK_MASTER_PORT=7077
export SPARK_DIST_CLASSPATH=$(/data/tools/hadoop-2.8.5/bin/hadoop classpath)
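One thing worth tightening (my suggestion, not part of the original fix): the config above sets HADOOP_HOME to hadoop-2.7.7 but derives the classpath from a different install, hadoop-2.8.5. Deriving the classpath from HADOOP_HOME keeps the two from drifting apart. Also, spark-env.sh is only re-read when the daemons start, so restart them afterwards (the Spark path is taken from the log above):

```shell
# In spark-env.sh: reuse HADOOP_HOME so the classpath always matches the
# Hadoop install Spark is configured against (pick the one you actually run).
export HADOOP_HOME=/usr/local/hadoop-2.7.7
export SPARK_DIST_CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath)

# Then restart the Spark daemons so the new spark-env.sh takes effect:
/usr/local/spark-2.4.7-bin-without-hadoop/sbin/stop-all.sh
/usr/local/spark-2.4.7-bin-without-hadoop/sbin/start-all.sh
```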
I Googled many other solutions, but none of them worked. For example:
https://blog.csdn.net/one111a/article/details/98597381