I. Install Scala (required by Spark)
1. Download and extract Scala
Download link: http://www.scala-lang.org/
[hadoop@h3 software]$ tar -zxvf scala-2.12.6.tgz -C /opt/modules/
2. Configure environment variables
[hadoop@h3 scala-2.12.6]$ sudo vim /etc/profile
Add:
export SCALA_HOME=/opt/modules/scala-2.12.6
export PATH=$PATH:$SCALA_HOME/bin
[hadoop@h3 scala-2.12.6]$ source /etc/profile
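The two export lines above simply append Scala's bin directory to the shell's search path. A runnable sketch of the effect (the install path is the one assumed above; nothing here actually invokes Scala):

```shell
# Reproduce the two /etc/profile lines and show the resulting PATH entry.
# /opt/modules/scala-2.12.6 is the install path assumed earlier.
export SCALA_HOME=/opt/modules/scala-2.12.6
export PATH=$PATH:$SCALA_HOME/bin
# The last PATH entry is now $SCALA_HOME/bin:
echo "$PATH" | tr ':' '\n' | tail -n 1
```

Note that `source /etc/profile` only affects the current shell; new login shells pick the variables up automatically.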
3. Verify Scala
scala -version
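If `scala -version` reports "command not found", the usual cause is that /etc/profile was not re-sourced in the current shell. A small check, sketched in portable shell (it does not require Scala to run and does not start the REPL):

```shell
# Report whether `scala` resolves on PATH. `command -v` is POSIX;
# the message is only a hint, not a hard error.
if command -v scala >/dev/null 2>&1; then
    status=found
else
    status=missing
fi
echo "scala on PATH: $status"
```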
II. Install Spark
1. Download and extract
Download link: http://spark.apache.org/downloads.html
[hadoop@h3 software]$ tar -xzf spark-2.1.0-bin-hadoop2.7.tgz -C /opt/modules/
2. Spark environment configuration
Configure environment variables:
[hadoop@h3 modules]$ sudo vim /etc/profile
Add:
export SPARK_HOME=/opt/modules/spark-2.1.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
[hadoop@h3 modules]$ source /etc/profile
Configure the spark-env.sh file
[hadoop@h3 spark-2.1.0-bin-hadoop2.7]$ cd conf
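spark-env.sh is normally created by copying the bundled spark-env.sh.template in $SPARK_HOME/conf; Spark sources it before launching its daemons. A minimal sketch of its contents follows. Every value is a placeholder assumption to adapt to your own machine; only SCALA_HOME and the host name h3 come from this guide:

```shell
# $SPARK_HOME/conf/spark-env.sh -- minimal sketch, not settings from a
# real cluster. Point JAVA_HOME and HADOOP_CONF_DIR at your own installs.
export JAVA_HOME=/path/to/your/jdk
export SCALA_HOME=/opt/modules/scala-2.12.6
export HADOOP_CONF_DIR=/path/to/hadoop/etc/hadoop
export SPARK_MASTER_HOST=h3        # standalone master host (Spark 2.x name)
export SPARK_WORKER_MEMORY=1g      # memory each worker may use
```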