Installing and Configuring Spark 1.6.1 on 64-bit Ubuntu Linux
Ubuntu machine:
-> uname -a
Linux xxx 4.1.5-x86_64 #7 SMP Mon Aug 24 13:46:31 EDT 2015 x86_64 x86_64 x86_64 GNU/Linux
-> getconf LONG_BIT
64
1. Install the JDK
http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
jdk-8u77-linux-x64.tar.gz
-> tar -zxf jdk-8u77-linux-x64.tar.gz
-> cd jdk1.8.0_77
Configuration:
-> vi ~/.bashrc
export JAVA_HOME=xxx/jdk1.8.0_77
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
-> source ~/.bashrc
Verify that the JDK is installed correctly:
-> java -version
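As a sanity check, the exports above can be traced in an ordinary shell. This is a minimal sketch: the JDK path below is a placeholder, not necessarily where you extracted the archive.

```shell
# Sketch of how the ~/.bashrc lines compose; /opt/jdk1.8.0_77 is a
# placeholder path -- substitute your actual extraction directory.
JAVA_HOME=/opt/jdk1.8.0_77
JRE_HOME=${JAVA_HOME}/jre
CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
PATH=${JAVA_HOME}/bin:$PATH
export JAVA_HOME JRE_HOME CLASSPATH PATH

# The JDK's bin directory should now be the first PATH entry,
# so `java` resolves to the freshly installed binary.
echo "CLASSPATH=$CLASSPATH"
echo "first PATH entry: ${PATH%%:*}"
```

Because the new entry is prepended, it shadows any distribution-packaged `java` already on the PATH.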
2. Install Scala
http://www.scala-lang.org/
-> tar -zxf scala-2.12.0-M3.tgz
-> vi ~/.bashrc
export SCALA_HOME=xxx/scala-2.12.0-M3/bin
export PATH=$JAVA_HOME/bin:$SCALA_HOME:$HADOOP_HOME/bin:$SPARK_HOME/bin:$PATH
-> source ~/.bashrc
Test whether the installation succeeded:
-> scala
Welcome to Scala 2.12.0-M3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77).
Type in expressions for evaluation. Or try :help.
scala>
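If the REPL does not start, a quick way to see whether the PATH picked up the new entries is a small helper like the one below; `check_tool` is a hypothetical name for this sketch, not part of Scala or any standard tool.

```shell
# Hypothetical helper: report where a command resolves on PATH, if anywhere.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "$1 found: $(command -v "$1")"
    else
        echo "$1 missing from PATH" >&2
        return 1
    fi
}

check_tool sh             # always present on a POSIX system
check_tool scala || true  # should resolve once this step is done
```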
3. Install Hadoop
hadoop-2.6.0.tar.gz:
http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.6.0/
-> tar -zxf hadoop-2.6.0.tar.gz
-> vi ~/.bashrc
export HADOOP_HOME=xx/hadoop-2.6.0
export LD_LIBRARY_PATH=$HADOOP_HOME/lib
export PATH=$JAVA_HOME/bin:$SCALA_HOME:$HADOOP_HOME/bin:$SPARK_HOME/bin:$PATH
-> source ~/.bashrc
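Since LD_LIBRARY_PATH points at $HADOOP_HOME/lib, it is worth checking whether libhadoop.so actually lives there; this is a sketch with a placeholder path, and the missing-file branch predicts the warning addressed in step 4 below.

```shell
# Placeholder path; substitute your real Hadoop directory.
HADOOP_HOME=/opt/hadoop-2.6.0
if [ -e "$HADOOP_HOME/lib/libhadoop.so" ]; then
    msg="native libhadoop.so present"
else
    msg="native libhadoop.so missing: expect the 'unable to load native-hadoop library' warning"
fi
echo "$msg"
```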
4. Install Spark
http://spark.apache.org/downloads.html
-> tar -zxf spark-1.6.1-bin-hadoop2.6.tgz
-> vi ~/.bashrc
export SPARK_HOME=xx/spark/spark-1.6.1-bin-hadoop2.6
export PATH=$JAVA_HOME/bin:$SCALA_HOME:$HADOOP_HOME/bin:$SPARK_HOME/bin:$PATH
-> source ~/.bashrc
-> cd spark-1.6.1-bin-hadoop2.6
-> tree -L 2    # show the directory tree two levels deep
-> ./bin/pyspark
★ Error: "unable to load native-hadoop library for your platform"
Fix: replace the Hadoop native library:
http://dl.bintray.com/sequenceiq/sequenceiq-bin/hadoop-native-64-2.6.0.tar
-> tar -xf hadoop-native-64-2.6.0.tar -C xx/hadoop/lib
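The extraction can be rehearsed on a throwaway archive to confirm the tar flags do what is intended; every path below is a temporary stand-in, not the real Hadoop tree.

```shell
# Build a stub archive that mimics hadoop-native-64-2.6.0.tar, then extract
# it into a stand-in hadoop/lib directory, the same shape as the fix above:
# tar -xf <archive> -C <hadoop>/lib
workdir=$(mktemp -d)
mkdir -p "$workdir/src" "$workdir/hadoop/lib"
echo stub > "$workdir/src/libhadoop.so"
tar -cf "$workdir/hadoop-native-64-2.6.0.tar" -C "$workdir/src" libhadoop.so

tar -xf "$workdir/hadoop-native-64-2.6.0.tar" -C "$workdir/hadoop/lib"
ls "$workdir/hadoop/lib"
```

`-C` changes tar's working directory before extracting, so the library lands directly in `lib/` rather than in a nested folder.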
★ Error: "java.net.UnknownHostException: xx: xx: unknown error"
Fix: the hosts file is misconfigured:
-> vi /etc/hosts
Add a line such as:
127.0.0.1 xx
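Whether the hosts entry is needed can be checked by resolving the machine's own hostname, which is the lookup that throws UnknownHostException. A sketch, assuming a glibc-based Linux where `getent` is available:

```shell
# Resolve this machine's hostname the way the JVM would; getent consults
# /etc/hosts as well as DNS.
host=$(hostname)
if getent hosts "$host" >/dev/null 2>&1; then
    msg="$host resolves"
else
    msg="$host does not resolve; add '127.0.0.1 $host' to /etc/hosts"
fi
echo "$msg"
```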
5. Test Spark with Scala
-> bin/spark-shell