Ubuntu 14.04
Download the Software
[If the web connection is refused, get the files via http://blog.csdn.net/houxiaoqin/article/details/54096175]:
Anaconda2-4.2.0-Linux-x86_64.sh 【http://continuum.io/downloads#all】
jdk-8u111-linux-x64.tar.gz 【Oracle's Java SE downloads page: http://www.oracle.com/technetwork/java/javase/downloads/】
spark-1.5.2-bin-hadoop2.6.tgz 【http://spark.apache.org/downloads.html】
1. Installing Java 8
cd Downloads
ls
sudo mkdir -p /usr/lib/jvm
sudo mv jdk-8u111-linux-x64.tar.gz /usr/lib/jvm
cd /usr/lib/jvm
sudo tar xzvf jdk-8u111-linux-x64.tar.gz
sudo ln -s jdk1.8.0_111 java-8
ls
Set the environment variables:
vi ~/.bashrc
Append at the end of the file:
export JAVA_HOME=/usr/lib/jvm/java-8
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
Save and quit with :wq.
source ~/.bashrc (to make the changes take effect)
java -version
【Note: if :$PATH is left off, misspelled, or written in lowercase, the new value overwrites the original PATH outright; the system can no longer find its standard directories and basic commands such as ls stop working.】
【If you do overwrite the original PATH by accident, run export PATH=/usr/bin:/bin on the command line, then open vi ~/.bashrc to find and fix the mistake.】
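To illustrate the warning, a hypothetical broken line next to its corrected form:
export PATH=${JAVA_HOME}/bin          # WRONG: replaces the whole PATH; ls, vi, etc. disappear
export PATH=${JAVA_HOME}/bin:$PATH    # correct: prepends to the existing PATH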
Update the default JDK
# update-alternatives --install /usr/bin/java java /usr/lib/jvm/java-8/bin/java 300
# update-alternatives --install /usr/bin/javac javac /usr/lib/jvm/java-8/bin/javac 300
# update-alternatives --config java
# update-alternatives --config javac
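Verify that the switch took effect; both commands should report version 1.8.0_111:
java -version
javac -version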
2. Installing Anaconda with Python 2.7
Command: bash Anaconda2-4.2.0-Linux-x86_64.sh
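After re-opening the shell (or sourcing ~/.bashrc), a quick sanity check, assuming you accepted the installer's default location ~/anaconda2:
which python      # should point into ~/anaconda2/bin
python --version  # should report Python 2.7.x
conda --version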
3. Installing Spark
cd Downloads
ls
mkdir -p /home/<username>/spark
tar -xf spark-1.5.2-bin-hadoop2.6.tgz
sudo mv spark-1.5.2-bin-hadoop2.6 /home/<username>/spark
cd /home/<username>/spark/spark-1.5.2-bin-hadoop2.6
./bin/pyspark
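The shell starts with a SparkContext already bound to sc, so a one-line sanity check at the >>> prompt is enough (the expected answer is the sum 0+1+...+99):
>>> sc.parallelize(range(100)).sum()
4950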
4. Enabling Jupyter Notebook
Command: jupyter notebook
To connect Jupyter to Spark:
Command: PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS="notebook" $SPARK_HOME/bin/pyspark --master local[*]
where $SPARK_HOME is an environment variable set to the Spark home directory (here /home/<username>/spark/spark-1.5.2-bin-hadoop2.6).
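In a notebook started this way, sc is predefined just as in the plain pyspark shell. A minimal cell to confirm the kernel is talking to Spark, assuming the notebook was launched from the Spark home directory so that its README.md is in the working directory:
lines = sc.textFile("README.md")
lines.filter(lambda l: "Spark" in l).count()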
5. Virtualizing the environment with Docker
$ sudo apt-get update
$ sudo apt-get install curl
$ curl -fsSL https://get.docker.com/ | sh
If you get dpkg: error processing package oracle-java8-installer (--configure), the archive /var/cache/oracle-jdk8-installer/jdk-8u111-linux-x64.tar.gz is most likely incomplete (check with ls -lht). Replace it with the intact copy downloaded earlier, then rerun the install script:
sudo mv /usr/lib/jvm/jdk-8u111-linux-x64.tar.gz /var/cache/oracle-jdk8-installer/
curl -fsSL https://get.docker.com/ | sh
sudo docker version
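A standard smoke test to confirm the daemon can pull and run containers:
sudo docker run hello-world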