The Spark configuration file spark-env.sh is configured as follows:
export JAVA_HOME=/usr/local/jdk1.7.0_79
export SCALA_HOME=/home/hadoop/scala-2.10.1
export HADOOP_HOME=/home/hadoop/hadoop-2.5.0
export SPARK_MASTER_IP=Master1
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.5.0/etc/hadoop
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=18080
SPARK_WORKER_WEBUI_PORT=18081
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=Master1:2181,Master2:2181,Slave1:2181,Slave2:2181,Slave3:2181 -Dspark.deploy.zookeeper.dir=/spark"
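
With this ZooKeeper-based HA setup, start-all.sh only launches a master on the node where it is run, so the standby master on Master2 has to be started on that node itself. A minimal sketch of bringing it up and checking the ZooKeeper election data, assuming the Spark install path seen in the log below and a zkCli.sh on the PATH (both are assumptions, adjust to your layout):

# On Master2: start the standby master by hand (start-all.sh does not do this)
/home/SparkR/spark-1.6.1-bin-hadoop2.6/sbin/start-master.sh

# Verify ZooKeeper holds the election state under the configured dir
# (zkCli.sh location varies by ZooKeeper install; this invocation is an assumption)
zkCli.sh -server Master1:2181 ls /spark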
After starting the cluster, I found that the Spark master on Master2 did not start. The log shows the following:
[SparkR@Master2 logs]$ more spark-SparkR-org.apache.spark.deploy.master.Master-1-Master2.out
Spark Command: /home/SparkR/jdk1.7.0_79/bin/java -cp /home/SparkR/spark-1.6.1-bin-hadoop2.6/conf/:/
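
Since the log excerpt above is cut off before any error appears, the actual failure is not visible. A generic way (not from the original post) to confirm whether a Master JVM is running on Master2 at all, and if so whether it came up as STANDBY, is to check the process list and probe the web UI port configured above:

# On Master2: check whether a Master process is running
jps | grep Master

# Probe the master web UI port set in spark-env.sh (18080);
# the standalone master UI reports Status: ALIVE or STANDBY
curl -s http://Master2:18080 | grep -i status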