Spark version: 2.1.2
Download: https://archive.apache.org/dist/spark/spark-2.1.2/
Role assignment is as follows:
hadoop101 | hadoop102 | hadoop103 | hadoop104 |
master    |           |           |           |
worker    | worker    | worker    | worker    |
1. Extract the downloaded archive (renaming the extracted directory to spark here so it matches the paths used below):
tar -zxvf spark-2.1.2-bin-hadoop2.7.tgz
mv spark-2.1.2-bin-hadoop2.7 spark
2. Enter the extracted directory to edit the configuration:
cd spark/conf
3. Create spark-env.sh from its template:
cp spark-env.sh.template spark-env.sh
4. Edit spark-env.sh:
vim spark-env.sh
Add the following settings:
# JDK
export JAVA_HOME=/home/hadoop/software/jdk1.8.0_201
# Master host
export SPARK_MASTER_IP=hadoop101
# Master port
export SPARK_MASTER_PORT=7077
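Note: since Spark 2.0 the variable documented in spark-env.sh.template is SPARK_MASTER_HOST; SPARK_MASTER_IP still works in 2.1.2 but the Master logs a deprecation warning on startup. An equivalent setting would be:

```shell
# Preferred in Spark 2.x instead of SPARK_MASTER_IP
export SPARK_MASTER_HOST=hadoop101
```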
5. Create a slaves file under conf listing the worker nodes:
touch slaves
vim slaves
Add:
hadoop101
hadoop102
hadoop103
hadoop104
6. The configuration is now complete; distribute the configured spark directory to the other machines:
scp -r spark hadoop@hadoop102:/home/hadoop/software/
scp -r spark hadoop@hadoop103:/home/hadoop/software/
scp -r spark hadoop@hadoop104:/home/hadoop/software/
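The three scp commands above can also be written as a loop over the worker hosts. A sketch, with the hostnames and paths taken from this setup and the commands echoed as a dry run (remove the echo to actually copy):

```shell
# Dry run: print the distribution command for each remote host.
# Hostnames and target path assumed from this cluster layout.
for host in hadoop102 hadoop103 hadoop104; do
  echo scp -r spark "hadoop@${host}:/home/hadoop/software/"
done
```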
7. Start the cluster.
On node hadoop101 run:
sbin/start-all.sh
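After start-all.sh completes, each node should be running the expected daemons: Master on hadoop101, and a Worker on all four nodes. You can confirm this with jps, and the Master web UI is available on port 8080 of hadoop101 by default. A small sketch of such a check (the sample jps output below is illustrative, not captured from a real node):

```shell
# Verify that the expected Spark daemons appear in jps output.
# Usage: check_daemons "<jps output>" daemon_name...
check_daemons() {
  local jps_out="$1"; shift
  for d in "$@"; do
    # grep -w matches the daemon name as a whole word
    echo "$jps_out" | grep -qw "$d" || { echo "missing: $d"; return 1; }
  done
  echo "all daemons running"
}

# On hadoop101 we expect both Master and Worker (illustrative jps output):
check_daemons "$(printf '1234 Master\n2345 Worker\n')" Master Worker
```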