Download and install Scala
wget https://downloads.lightbend.com/scala/2.12.8/scala-2.12.8.tgz
tar -zxvf scala-2.12.8.tgz -C /root
mv scala-2.12.8 scala
Configure /etc/profile (add SCALA_HOME and its bin directory to PATH)
Run source /etc/profile to apply the changes
Test: scala -version
Launch the REPL: scala
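The /etc/profile additions for Scala might look like the following sketch (the install path is assumed from the mv step above):

```shell
# Append to /etc/profile (assumes Scala was unpacked and renamed to /root/scala)
export SCALA_HOME=/root/scala
export PATH=$PATH:$SCALA_HOME/bin
```

After editing, run source /etc/profile; scala -version should then print the 2.12.8 banner.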
Download Spark (the closer.lua link is a mirror-chooser page; use the archive for a direct download)
wget https://archive.apache.org/dist/spark/spark-2.4.2/spark-2.4.2-bin-hadoop2.7.tgz
tar -zxvf spark-2.4.2-bin-hadoop2.7.tgz -C /root
mv spark-2.4.2-bin-hadoop2.7 spark
Add the Spark path to /etc/profile and source it to take effect
Reference configuration:
https://www.cnblogs.com/qingyunzong/p/8903714.html#_label4_0
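The Spark entries in /etc/profile might look like this sketch (paths assumed from the mv step above):

```shell
# Append to /etc/profile (assumes Spark was unpacked and renamed to /root/spark)
export SPARK_HOME=/root/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
```

Adding sbin to PATH is optional; it makes start-all.sh reachable, but can shadow Hadoop's script of the same name, so running it with an explicit path is safer.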
Enter spark/sbin
./start-all.sh (run Spark's copy explicitly; Hadoop ships a start-all.sh with the same name)
If the two nodes fail to start, see the cluster Spark configuration below
cd /root/spark/conf
cp spark-env.sh.template spark-env.sh
Add the following configuration:
export SCALA_HOME=/root/scala
export JAVA_HOME=/root/java/jdk1.8.0_121
export HADOOP_HOME=/root/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_HOME=/root/spark
export SPARK_MASTER_IP=192.168.78.129
cp slaves.template slaves
vi slaves
Add the worker host: 192.168.78.129
Alternatively, start the two daemons (Master and Worker) individually
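Starting the daemons one at a time might look like this; the master URL assumes the SPARK_MASTER_IP set above and Spark's default port 7077:

```shell
# Run from /root/spark/sbin
./start-master.sh                              # starts the Master daemon
./start-slave.sh spark://192.168.78.129:7077   # starts a Worker and registers it with the Master
```

In Spark 2.x the worker script is named start-slave.sh; the Master's web UI on port 8080 should then list the registered Worker.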
Check the started processes with jps (expect Master and Worker)
Run spark-shell to verify it starts successfully
Try some basic commands
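A minimal smoke test of basic commands, sketched as a here-document piped into spark-shell (this example is not from the original notes; local[2] runs without the standalone cluster):

```shell
# Feed a couple of basic RDD operations into spark-shell
spark-shell --master local[2] <<'EOF'
val nums = sc.parallelize(1 to 100)        // distribute 1..100 as an RDD
println(nums.reduce(_ + _))                // sum of 1..100
println(nums.filter(_ % 2 == 0).count())   // count of even numbers
EOF
```

To test the standalone cluster instead, replace local[2] with spark://192.168.78.129:7077.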