Contents
1. Permission denied warning when nodes access the logs directory
2. failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker...
3. ERROR Worker: Failed to create work directory /soft/spark/work
Three problems came up the first time I started the cluster in standalone mode.
1. Permission denied warning when nodes access the logs directory
I changed the permissions on the logs directory under the spark directory to the maximum, 777. At first I had no idea why this happened; the same thing occurred when installing Kafka, where I also changed logs to 777 without knowing why. Later I found the cause: the files were owned by root. Changing ownership of the files to my own user also solves the problem.
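Either fix can be applied up front. A minimal sketch, assuming Spark lives at /soft/spark, the cluster user is superahua (both taken from the logs below), and the account has sudo:
# Option 1: open the logs directory wide (what I did first)
sudo chmod -R 777 /soft/spark/logs
# Option 2: hand the whole install back to the cluster user (the cleaner fix)
sudo chown -R superahua:superahua /soft/spark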
2. failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker...
The error output was:
[superahua@b1 /soft/spark/sbin]$./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
b2: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b2.out
b5: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b5.out
b4: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b4.out
b3: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b3.out
b5: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b5: JAVA_HOME is not set
b5: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b5.out
b4: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b4: JAVA_HOME is not set
b4: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b4.out
b2: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b2: JAVA_HOME is not set
b2: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b2.out
b3: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b3: JAVA_HOME is not set
b3: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b3.out
This reminded me that Hadoop fails in a similar way when JAVA_HOME is not set in hadoop-env.sh, which I had fixed there by adding this:
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.
export JAVA_HOME=/soft/jdk
# The java implementation to use.
export JAVA_HOME=${JAVA_HOME}
So I went into spark/conf/ and did the same:
cp spark-env.sh.template spark-env.sh
vi spark-env.sh
and added the following at the end of the file:
export JAVA_HOME=/soft/jdk
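Each worker reads its own copy of conf/spark-env.sh, so unless /soft is a shared mount, the edited file has to reach every node, not just the master. A minimal sketch, assuming the worker hostnames b2-b5 from the logs above and passwordless SSH already set up between nodes:
# push the edited config to every worker
for host in b2 b3 b4 b5; do
  scp /soft/spark/conf/spark-env.sh $host:/soft/spark/conf/
done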
Starting the cluster again hit error 3 below.
3. ERROR Worker: Failed to create work directory /soft/spark/work
starting org.apache.spark.deploy.master.Master, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
b2: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b2.out
b5: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b5.out
b4: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b4.out
b3: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b3.out
b2: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b5: failed to launch: nice -n 0 /soft/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://b1:7077
b5: 19/11/15 11:15:53 INFO Worker: Spark home: /soft/spark
b5: 19/11/15 11:15:53 ERROR Worker: Failed to create work directory /soft/spark/work
b5: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b5.out
b2: 19/11/15 11:15:53 INFO Worker: Spark home: /soft/spark
b2: 19/11/15 11:15:53 ERROR Worker: Failed to create work directory /soft/spark/work
b2: full log in /soft/spark/logs/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b2.out
I manually created the work folder under /soft/spark/ on the worker nodes.
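This can be done in one pass from the master. A minimal sketch, again assuming the hostnames b2-b5 and passwordless SSH; if mkdir itself fails with Permission denied, /soft/spark is probably still owned by root on that node, and the chown fix from problem 1 applies there too:
# create the work directory on every worker
for host in b2 b3 b4 b5; do
  ssh $host "mkdir -p /soft/spark/work"
done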
With all three problems fixed, restarting the cluster succeeded.