1. Prepare three machines:
192.168.193.129 hadoop01
192.168.193.130 hadoop02
192.168.193.131 hadoop03
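For the hostnames to resolve, these mappings are typically added to /etc/hosts on every node. A minimal sketch (run as root on each of the three machines):

```shell
# Append the cluster hostname-to-IP mappings to /etc/hosts on every node
cat >> /etc/hosts <<'EOF'
192.168.193.129 hadoop01
192.168.193.130 hadoop02
192.168.193.131 hadoop03
EOF
```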
2. Download the package from: https://archive.apache.org/dist/spark/spark-2.4.4/
3. Extract it under the /opt/module directory.
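Steps 2 and 3 can be done from the command line; a sketch using the prebuilt Hadoop 2.7 tarball from that archive page:

```shell
# Download Spark 2.4.4 prebuilt for Hadoop 2.7
wget https://archive.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz
# Extract into /opt/module (creates /opt/module/spark-2.4.4-bin-hadoop2.7)
tar -zxvf spark-2.4.4-bin-hadoop2.7.tgz -C /opt/module
```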
4. Change into the conf directory: cd spark-2.4.4-bin-hadoop2.7/conf
Copy the template files: cp spark-env.sh.template spark-env.sh
cp slaves.template slaves
vi spark-env.sh
Add the following lines:
export JAVA_HOME=/opt/module/jdk1.8.0_221
export HADOOP_HOME=/opt/module/hadoop-2.7.4
export SPARK_MASTER_IP=192.168.193.129
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/opt/module/hadoop-2.7.4/etc/hadoop
export SPARK_DIST_CLASSPATH=$(/opt/module/hadoop-2.7.4/bin/hadoop classpath)
vi slaves
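The slaves file lists one worker hostname per line. Assuming hadoop01 acts as the master and the other two machines as workers (a reasonable reading of the machine list in step 1), it would contain:

```
hadoop02
hadoop03
```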