1. Download Spark (pre-built for Hadoop 2.6 and later)
wget http://archive.apache.org/dist/spark/spark-1.6.0/spark-1.6.0-bin-hadoop2.6.tgz
tar zxvf spark-1.6.0-bin-hadoop2.6.tgz
cd spark-1.6.0-bin-hadoop2.6
2. Set environment variables
cd conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
Append the master's IP address at the end of the file:
export SPARK_MASTER_IP=192.168.1.6
export SPARK_LOCAL_IP=192.168.1.6
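Beyond the two IP settings, spark-env.sh also accepts other standalone-mode options; the values below are illustrative, not required for this setup:

```shell
# Optional spark-env.sh settings (example values, tune for your machine):
export SPARK_MASTER_PORT=7077      # port the master listens on (7077 is the default)
export SPARK_WORKER_MEMORY=2g      # total memory a worker may give to executors
export SPARK_WORKER_CORES=2        # number of cores a worker offers
```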
3. Start the master and a slave
sbin/start-master.sh
sbin/start-slave.sh spark://192.168.1.6:7077
4. Verify that everything started correctly
# jps
5637 Jps
4413 Master
4797 Worker
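If you want to script this check (for example in a deployment pipeline), a small sketch that inspects `jps`-style output could look like this; the function name is illustrative, not part of Spark:

```shell
# check_spark_daemons: read `jps`-style output on stdin and verify that
# both the standalone Master and a Worker process are present.
check_spark_daemons() {
  out="$(cat)"
  echo "$out" | grep -q "Master" || { echo "Master not running"; return 1; }
  echo "$out" | grep -q "Worker" || { echo "Worker not running"; return 1; }
  echo "both daemons running"
}

# Typical use on the master host:
#   jps | check_spark_daemons
```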
Open the Spark web UI: http://192.168.1.6:8080/
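Once the web UI shows a registered worker, you can confirm the cluster actually accepts jobs by submitting the SparkPi example that ships with the distribution. A minimal sketch, assuming the install path below; adjust SPARK_HOME to wherever you unpacked the tarball:

```shell
#!/bin/sh
# Smoke test (a sketch): submit the bundled SparkPi example to the
# standalone master. SPARK_HOME here is an assumed install path.
SPARK_HOME="${SPARK_HOME:-$HOME/spark-1.6.0-bin-hadoop2.6}"
MASTER_URL="spark://192.168.1.6:7077"

if [ -x "$SPARK_HOME/bin/spark-submit" ]; then
  "$SPARK_HOME/bin/spark-submit" \
    --master "$MASTER_URL" \
    --class org.apache.spark.examples.SparkPi \
    "$SPARK_HOME"/lib/spark-examples-*.jar 10
else
  echo "spark-submit not found under $SPARK_HOME"
fi
```

The completed application should also appear on the master's 8080 page after the run finishes.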