1. Download Spark
Unpack it:
mkdir /opt/spark
tar zxvf spark-2.2.0-bin-hadoop2.7.tgz
mv spark-2.2.0-bin-hadoop2.7 /opt/spark/
Append the following to the end of /etc/profile, then run "source /etc/profile" (or log in again) for the change to take effect:
export SPARK_HOME=/opt/spark/spark-2.2.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
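Before editing /etc/profile, you can try the two export lines in the current shell and confirm the path is correct; a minimal sketch (the path matches the mv step above):

```shell
# Set the variables in the current shell only, to confirm the paths are right.
export SPARK_HOME=/opt/spark/spark-2.2.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin

# Check that the Spark bin directory is now on PATH.
case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) echo "PATH ok" ;;
  *)                     echo "PATH missing" ;;
esac
```

If this prints "PATH ok", the same two lines can be appended to /etc/profile as above.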
2. Download Scala
mkdir /opt/scala
tar zxvf scala-2.10.6.tgz -C /opt/scala
Append the following to the end of /etc/profile:
export SCALA_HOME=/opt/scala/scala-2.10.6
export PATH=$PATH:$SCALA_HOME/bin
Verify that Scala is installed:
scala -version
Scala code runner version 2.10.6 -- Copyright 2002-2013, LAMP/EPFL
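If you want to check the version from a script rather than by eye, note that scala -version writes its banner to standard error (so capture it with 2>&1); a small sketch that extracts the version number, using the line shown above as sample input:

```shell
# Sample line as printed by scala -version (capture with 2>&1 in real use).
line='Scala code runner version 2.10.6 -- Copyright 2002-2013, LAMP/EPFL'

# Pull out just the numeric version.
ver=$(printf '%s\n' "$line" | sed -n 's/.*version \([0-9.]*\).*/\1/p')
echo "$ver"
```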
3. Install the JDK (skip this step if a JDK is already installed)
mkdir /usr/java
tar zxvf jdk-8u111-linux-x64.tar.gz
mv jdk1.8.0_111 /usr/java
Edit /etc/profile and append:
export JAVA_HOME=/usr/java/jdk1.8.0_111
export PATH=$PATH:$JAVA_HOME/bin
Verify that Java is installed:
java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
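java -version likewise prints to standard error, which matters when checking the version from a script; a minimal sketch that extracts the quoted version string, using the first output line above as sample input:

```shell
# First line of `java -version` output (it goes to stderr, hence 2>&1 in real use).
line='java version "1.8.0_111"'

# Extract the version string between the double quotes.
ver=$(printf '%s\n' "$line" | sed 's/.*"\(.*\)".*/\1/')
echo "$ver"
```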
4. Run spark-shell
# spark-shell