Spark 1.6.0 Standalone Installation and Configuration
I. Constraints
Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
In short: Spark 1.6 pairs with Scala 2.10.
II. Installing Dependencies
1. JDK 1.8+ (install beforehand)
2. Hadoop 2.6.0+ (install beforehand)
3. Scala 2.10.x
4. spark-1.6.x-bin-hadoop2.6 (pre-built for Hadoop 2.6):
http://d3kbcqa49mib13.cloudfront.net/spark-1.6.3-bin-hadoop2.6.tgz
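The download step can be sketched as follows. `SPARK_URL` is the mirror link from the text; the actual `wget` is left commented out because this mirror may no longer serve the file, so verify the link before running it.

```shell
# Mirror link taken from the text above; adjust if you want a different 1.6.x build.
SPARK_URL=http://d3kbcqa49mib13.cloudfront.net/spark-1.6.3-bin-hadoop2.6.tgz
SPARK_TGZ=$(basename "$SPARK_URL")
echo "would fetch $SPARK_TGZ"
# On the target machine:
# wget "$SPARK_URL" && tar -xzvf "$SPARK_TGZ"
```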
III. Configuring SSH
- Configure ssh localhost
Make sure openssh-server is installed:
yum -y install openssh-server
- Password-less login setup
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
If a key pair has already been generated, only the `cat` command is needed. Verify with `ssh localhost`.
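The two steps above can be combined into an idempotent sketch (assumptions: RSA key at the default path with an empty passphrase; the `grep` guard avoids appending the same key twice):

```shell
# Ensure the .ssh directory exists with the permissions sshd expects.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# Generate a key only if one does not exist yet (empty passphrase).
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Append the public key only if it is not already authorized.
grep -qF "$(cat ~/.ssh/id_rsa.pub)" ~/.ssh/authorized_keys 2>/dev/null \
  || cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

After this, `ssh localhost` should log in without prompting for a password.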
IV. Installing Scala
- Extract the Scala package to a directory of your choice:
cd /opt/scala
tar -xzvf scala-2.10.6.tgz
- Edit environment variables:
vim /etc/profile
export SCALA_HOME=/opt/scala/scala-2.10.6
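A sketch of the complete `/etc/profile` additions, assuming Scala was extracted to `/opt/scala/scala-2.10.6` as in the tar step above:

```shell
# Assumed install path, matching the extraction step above.
export SCALA_HOME=/opt/scala/scala-2.10.6
# Put the scala/scalac binaries on the PATH.
export PATH=$PATH:$SCALA_HOME/bin
echo "$SCALA_HOME"
```

After saving, reload the profile with `source /etc/profile` and check the result with `scala -version`.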