Spark development on Ubuntu (Local mode)
1. Download Scala, Spark, and the JDK, and extract them under /opt/
Scala download: http://www.scala-lang.org/    extraction path: /opt/scala
Spark download: http://spark.apache.org/downloads.html    extraction path: /opt/spark-hadoop
JDK download: http://www.oracle.com/technetwork/java/javase/downloads/index.html    extraction path: /opt/jdk
2. Configure environment variables in /etc/profile
sudo gedit /etc/profile
Add the following settings:
# Set JDK environment variables
export JAVA_HOME=/opt/jdk
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:$PATH
# Set Scala environment variables
export SCALA_HOME=/opt/scala
export PATH=${SCALA_HOME}/bin:$PATH
# Set Spark environment variables
export SPARK_HOME=/opt/spark-hadoop/
# PythonPath: add Spark's pySpark module to the Python environment
export PYTHONPATH=/opt/spa
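After saving /etc/profile, the new variables only take effect in a shell that has sourced the file. A minimal sketch of applying and checking the settings (the paths below assume the extraction locations from step 1; adjust them to match your actual directories):

```shell
# Re-read /etc/profile in the current shell so the exports take effect.
# (In a real session you would run: source /etc/profile)
export JAVA_HOME=/opt/jdk
export SCALA_HOME=/opt/scala
export SPARK_HOME=/opt/spark-hadoop
export PATH=${JAVA_HOME}/bin:${SCALA_HOME}/bin:${SPARK_HOME}/bin:$PATH

# Confirm the variables are visible to child processes.
echo "JAVA_HOME=$JAVA_HOME"
echo "SPARK_HOME=$SPARK_HOME"
```

Once the binaries are actually installed at these paths, `java -version`, `scala -version`, and `spark-shell` should all resolve from the updated PATH.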