I. Environment preparation
CentOS 7.0
JDK 1.8
Install JDK 1.8.
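A sketch of this step on CentOS 7, assuming the stock OpenJDK yum package and its usual install path (verify the real path on your machine with `readlink -f "$(which java)"`):

```shell
# Install OpenJDK 1.8 (run as root; skip this line if you install the
# Oracle JDK from a tarball instead):
#   yum install -y java-1.8.0-openjdk-devel

# Point JAVA_HOME at the JVM and put its bin directory on PATH; the path
# below is the usual yum install location on CentOS 7 (an assumption).
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH=$JAVA_HOME/bin:$PATH
echo "$JAVA_HOME"   # prints /usr/lib/jvm/java-1.8.0-openjdk
```

Add the two `export` lines to `/etc/profile` (alongside the Spark entries below) so they survive re-login.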
II. Spark installation
1. Download and extract Spark
Download address:
tar xvf spark-3.1.1-bin-hadoop3.2.tgz
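The same extraction can be rehearsed locally on a stand-in archive, which makes the tar flags easy to verify before touching the real tarball (the `demo` and `software` paths below are made up for illustration):

```shell
# Build a tiny stand-in archive with the same top-level layout as the
# Spark tarball.
mkdir -p demo/spark-3.1.1-bin-hadoop3.2/bin
touch demo/spark-3.1.1-bin-hadoop3.2/bin/spark-shell
tar czf spark-demo.tgz -C demo spark-3.1.1-bin-hadoop3.2

# Extract into a target directory: -C switches there before unpacking,
# so you can unpack straight into /software on the real machine.
mkdir -p software
tar xvf spark-demo.tgz -C software
```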
2. Configuration files
conf/spark-env.sh:
The tmp directory does not need to be created in advance.
export SPARK_LOCAL_HOSTNAME=datanode1    # set to this machine's hostname
export SPARK_TMPDIR=/home/tmp
export SPARK_LOCAL_DIRS=/home/tmp
export SPARK_PID_DIR=/home/tmp
/etc/profile:
export SPARK_HOME=/software/spark-3.1.1-bin-hadoop3.2
export PATH=$PATH:$SPARK_HOME/bin
# re-read the file in the current shell
source /etc/profile
conf/spark-defaults.conf:
spark.master spark://namenode:7077
spark.eventLog.enabled true
spark.eventLog.dir hdfs://namenode:8021/spark
spark.serializer org.apache.spark.serializer.KryoSerializer
spark.driver.memory 1g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
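Two copy-paste pitfalls in this fragment: the `spark.executor.extraJavaOptions` entry must stay on a single line, and its quotes must be plain ASCII. Writing the file from a heredoc sidesteps both (a sketch; adjust hosts and ports to your cluster):

```shell
# Write the spark-defaults.conf fragment in one shot; the quoted 'EOF'
# heredoc keeps the extraJavaOptions entry intact on one line.
cat > spark-defaults.conf <<'EOF'
spark.master                     spark://namenode:7077
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://namenode:8021/spark
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.driver.memory              1g
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
EOF

grep -c '^spark\.' spark-defaults.conf   # prints 6 (one per entry)
```

Note that the directory named by `spark.eventLog.dir` must already exist in HDFS before the first job logs to it (e.g. `hdfs dfs -mkdir -p /spark`).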
conf/workers (add on the master node):
namenode
datanode1
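sbin/start-all.sh (next step) launches a worker on every host listed in conf/workers over SSH, so the master needs passwordless login to each of them. The key-generation half can be exercised locally; the copy step (shown as a comment) must run against your real hosts:

```shell
# Generate an RSA key pair with an empty passphrase. It is written to a
# local file here for demonstration; on the master you would normally
# accept the default ~/.ssh/id_rsa path.
ssh-keygen -t rsa -N "" -q -f ./spark_demo_key

# Push the public key to each host in conf/workers, e.g.:
#   ssh-copy-id datanode1
ls spark_demo_key spark_demo_key.pub
```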
3. Start the services
sbin/start-all.sh
4. Test (the IP is the master node's IP)
From a browser on Windows, open http://172.20.10.7:8080/
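A command-line smoke test complements the web UI check: submit the SparkPi example that ships with the distribution to the master (shown as a transcript since it needs the running cluster; the jar name is the stock one for the Scala 2.12 build of Spark 3.1.1):

```shell
# Run the bundled SparkPi example against the standalone master; a line
# like "Pi is roughly 3.14..." in the driver output confirms the cluster
# accepts and completes jobs.
#   spark-submit \
#     --master spark://namenode:7077 \
#     --class org.apache.spark.examples.SparkPi \
#     "$SPARK_HOME"/examples/jars/spark-examples_2.12-3.1.1.jar 100
```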