1. Install Scala
1.1 Download and extract the package
cd /usr/local
wget https://downloads.lightbend.com/scala/2.12.2/scala-2.12.2.tgz
tar -zxvf scala-2.12.2.tgz
mv scala-2.12.2 scala
1.2 Configure environment variables
vi /etc/profile
# Add the following settings
export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin
# Apply the changes
source /etc/profile
# Verify (if the version string is printed, Scala is installed correctly)
scala -version
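Beyond checking the version string, you can open the REPL and evaluate an expression as a quick sanity check (the res0 output below is illustrative):
scala
scala> List(1, 2, 3).map(_ * 2).sum
res0: Int = 12
scala> :quit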
2. Download and extract the Spark package
cd /usr/local
wget http://mirror.bit.edu.cn/apache/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz
tar -zxvf spark-2.4.4-bin-hadoop2.7.tgz
mv spark-2.4.4-bin-hadoop2.7 spark
3. Configure environment variables
sudo vi /etc/profile
# Edit the environment variables
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin
# Apply the changes
source /etc/profile
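Before moving on to the config files, it is worth confirming the new variables took effect; a quick check, assuming the paths above:
echo $SPARK_HOME
spark-submit --version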
4. Configure the Spark config files
cd /usr/local/spark/conf
# Rename the template
mv spark-env.sh.template spark-env.sh
# Edit spark-env.sh
vi spark-env.sh
# Add the following settings
export JAVA_HOME=/usr/local/jdk1.8.0_231
export HADOOP_HOME=/usr/local/hadoop-2.7.1
export HADOOP_CONF_DIR=/usr/local/hadoop-2.7.1/etc/hadoop
# Master node hostname (the master in this cluster is xt1)
export SPARK_MASTER_IP=xt1
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
# Rename the template
mv slaves.template slaves
# Edit slaves and add the IPs or hostnames of the worker nodes
vi slaves
xt2
xt3
# Copy the profile, Scala, and Spark directories to the other two nodes
scp /etc/profile hadoop@xt2:/etc
scp /etc/profile hadoop@xt3:/etc
scp -r scala hadoop@xt2:/usr/local
scp -r scala hadoop@xt3:/usr/local
scp -r spark hadoop@xt2:/usr/local
scp -r spark hadoop@xt3:/usr/local
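Note that both the scp commands above and start-all.sh rely on SSH from the master to the workers. If the hadoop user cannot yet log in to xt2 and xt3 without a password, a minimal sketch using standard OpenSSH tooling (not part of the original setup steps):
ssh-keygen -t rsa
ssh-copy-id hadoop@xt2
ssh-copy-id hadoop@xt3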
5. Start the cluster
cd /usr/local/spark/sbin
./start-all.sh
# Verify: jps should show a Master process on the master node and a Worker process on each worker node
jps
# Open the web UI (replace the IP with your master node's address)
http://192.168.123.100:8080/
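Seeing the Master and Worker processes only shows the daemons are up. To confirm the cluster actually accepts applications, a minimal smoke test with spark-shell, assuming the master runs on xt1 with the default port 7077:
spark-shell --master spark://xt1:7077
scala> sc.parallelize(1 to 1000).map(_ * 2).reduce(_ + _)
res0: Int = 1001000
scala> :quit
While the shell is running, the application should also appear on the web UI at port 8080.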
6. Troubleshooting
- If you hit a permissions error such as:
Permission denied
it simply means the current user has no write permission on the target directory, so grant write permission on it, e.g.:
sudo chmod 777 /usr/local
- If start-all.sh reports errors like:
chown: changing ownership of ‘/usr/local/spark-2.4.4/logs’: Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-xt1.out
/usr/local/spark-2.4.4/sbin/spark-daemon.sh: line 128: /usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-xt1.out: Permission denied
failed to launch: nice -n 0 /usr/local/spark-2.4.4/bin/spark-class org.apache.spark.deploy.master.Master --host xt1 --port 7077 --webui-port 8080
tail: cannot open ‘/usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-xt1.out’ for reading: No such file or directory
full log in /usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-xt1.out
xt3: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-xt3.out
xt2: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-2.4.4/logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-xt2.out
Running this single command resolved it for me:
sudo chown -R $(whoami) /usr/local/*
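After fixing the ownership, restart the daemons (assuming the renamed /usr/local/spark path from step 2):
/usr/local/spark/sbin/stop-all.sh
/usr/local/spark/sbin/start-all.sh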
Feel free to leave a comment if you run into any problems!