Spark Installation
http://spark.apache.org/downloads.html
Note that Spark 2.x is pre-built with Scala 2.11, except version 2.4.2, which is pre-built with Scala 2.12. Spark 3.0+ is pre-built with Scala 2.12.
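Instead of downloading through a browser, the release can be fetched directly on the server. A minimal sketch, assuming the standard archive.apache.org mirror layout (adjust the version numbers to the release you need):

```shell
# Build the download URL from the version numbers.
# URL layout assumed from the archive.apache.org mirror convention.
SPARK_VER=3.0.1
HADOOP_VER=2.7
PKG="spark-${SPARK_VER}-bin-hadoop${HADOOP_VER}.tgz"
URL="https://archive.apache.org/dist/spark/spark-${SPARK_VER}/${PKG}"
echo "$URL"
# then fetch it on the target node, e.g.: wget "$URL"
```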
Upload the Spark installation package to the server and unpack it:
[root@node2 app]# tar zxvf spark-3.0.1-bin-hadoop2.7.tgz
[root@node2 app]# cd spark-3.0.1-bin-hadoop2.7/
[root@node2 spark-3.0.1-bin-hadoop2.7]# cd conf/
[root@node2 conf]# mv slaves.template slaves
[root@node2 conf]# mv spark-env.sh.template spark-env.sh
[root@node2 conf]# vi spark-env.sh
spark-env.sh
export JAVA_HOME=/root/app/jdk1.8.0_11
export SCALA_HOME=/root/app/scala-2.11.8   # NOTE: Spark 3.0.x is built with Scala 2.12 (see note above); a 2.11 install here can cause compatibility problems
export SPARK_MASTER_IP=node1
export SPARK_WORKER_MEMORY=1g
export HADOOP_CONF_DIR=/root/app/hadoop-2.7.5/etc/hadoop
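Before starting the cluster, it is worth verifying that every directory referenced in spark-env.sh actually exists on each node. A minimal sketch (the helper `check_dirs` is a hypothetical name introduced here for illustration):

```shell
# Sanity-check directories listed in spark-env.sh before starting Spark.
# Pass the paths from the config above, e.g.:
#   check_dirs "$JAVA_HOME" "$SCALA_HOME" "$HADOOP_CONF_DIR"
check_dirs() {
  for d in "$@"; do
    [ -d "$d" ] && echo "ok: $d" || echo "missing: $d"
  done
}

# Demo with one existing and one nonexistent path:
check_dirs /tmp /no/such/dir
```

A "missing:" line means the corresponding export in spark-env.sh points at a path that does not exist on this node, which would surface later as a confusing daemon startup failure.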
slaves (one worker hostname per line; node2 and node3 match the workers started in the log below)
node2
node3
Troubleshooting
1. Check whether the installed Scala version matches what this Spark build requires
2. Check the daemon logs to find and resolve the actual error
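Step 1 above can be sketched as a quick parse of the `scala -version` banner (the banner text below is an illustrative sample; run `scala -version 2>&1` on your node to get the real one):

```shell
# Extract the major.minor Scala version from a "scala -version" banner.
# Sample banner used here so the parsing is reproducible:
banner='Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL'
ver=$(echo "$banner" | grep -oE '[0-9]+\.[0-9]+' | head -1)
echo "$ver"
```

If this prints 2.11 while the Spark build expects 2.12 (as Spark 3.0+ does), that mismatch is the first thing to fix.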
[root@node1 sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /root/app/spark-3.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-node1.out
failed to launch: nice -n 0 /root/app/spark-3.0.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.master.Master --host node1 --port 7077 --webui-port 8080
Spark Command: /root/app/jdk1.8.0_11/bin/java -cp /root/app/spark-3.0.1-bin-hadoop2.7/conf/:/root/app/spark-3.0.1-bin-hadoop2.7/jars/*:/root/app/hadoop-2.7.5/etc/hadoop/ -Xmx1g org.apache.spark.deploy.master.Master --host node1 --port 7077 --webui-port 8080
========================================
full log in /root/app/spark-3.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-node1.out
node2: starting org.apache.spark.deploy.worker.Worker, logging to /root/app/spark-3.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-node2.out
node3: starting org.apache.spark.deploy.worker.Worker, logging to /root/app/spark-3.0.1-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-node3.out
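Step 2 of the troubleshooting list means opening the "full log in ..." file printed above and looking for ERROR lines. A minimal sketch, using a throwaway sample log so the command is reproducible (the real path is the one shown in the start-all.sh output; the sample lines here are illustrative, not taken from an actual run):

```shell
# Scan a Spark daemon log for error lines.
log=/tmp/spark-master-demo.out
cat > "$log" <<'EOF'
21/01/01 10:00:00 INFO Master: Starting Spark master at spark://node1:7077
21/01/01 10:00:01 ERROR Master: java.net.BindException: Address already in use
EOF

# Show the error lines, then count them.
grep 'ERROR' "$log"
grep -c 'ERROR' "$log"
```

On the real cluster, substitute the log path reported by start-all.sh, e.g. the spark-root-org.apache.spark.deploy.master.Master-1-node1.out file under the Spark logs directory.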