Scala Installation
Unpack the Scala tarball to /usr/local/scala and add it to the environment variables (SCALA_HOME, and its bin directory on PATH).
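The unpack-and-configure step can be sketched as below; the tarball name, the `scala_env` helper, and the choice of /etc/profile are assumptions for illustration, not from the original notes.

```shell
# Sketch of the Scala install step (tarball name scala-2.10.x.tgz is hypothetical).
# scala_env prints the export lines to append to a login profile.
scala_env() {
  cat <<'EOF'
export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin
EOF
}

# Typical usage (as root):
#   tar -zxf scala-2.10.x.tgz -C /usr/local
#   mv /usr/local/scala-2.10.x /usr/local/scala
#   scala_env >> /etc/profile && source /etc/profile
#   scala -version   # confirm the install
scala_env
```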
Spark Installation
Edit the Spark environment file and point it at the local Java, Scala, and Hadoop installs:
cd /usr/local/spark/conf
cp spark-env.sh.template spark-env.sh
vi spark-env.sh
export JAVA_HOME=/usr/local/java
export SCALA_HOME=/usr/local/scala
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
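A quick sanity check for the four paths referenced in spark-env.sh; the `check_paths` helper is illustrative, not part of the original steps.

```shell
# Verify that every directory spark-env.sh points at actually exists on this node.
check_paths() {
  for d in /usr/local/java /usr/local/scala /usr/local/hadoop /usr/local/hadoop/etc/hadoop; do
    if [ -d "$d" ]; then echo "OK: $d"; else echo "MISSING: $d"; fi
  done
}
check_paths
```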
Sync the configuration to the other nodes
scp -r /usr/local/spark/conf root@sparkproject2:/usr/local/spark
scp -r /usr/local/spark/conf root@sparkproject3:/usr/local/spark
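The two scp commands generalize to a loop as the cluster grows. A dry-run sketch (the `sync_cmds` helper is illustrative; drop the leading echo to actually copy):

```shell
# Print the scp command that would sync the Spark conf dir to each worker.
sync_cmds() {
  for h in sparkproject2 sparkproject3; do
    echo scp -r /usr/local/spark/conf "root@${h}:/usr/local/spark"
  done
}
sync_cmds
```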
Test the installation
Submit the bundled Pi example to YARN in client mode:
/usr/local/spark/bin/spark-submit \
--class org.apache.spark.examples.JavaSparkPi \
--master yarn-client \
--num-executors 1 \
--driver-memory 512m \
--executor-memory 512m \
--executor-cores 1 \
/usr/local/spark/lib/spark-examples-1.5.1-hadoop2.4.0.jar
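On success the driver prints a line of the form `Pi is roughly 3.14...` to stdout (in yarn-client mode the driver runs locally, so the result appears in your terminal). A small helper to pull that line out of saved output; the function name and log filename are illustrative assumptions:

```shell
# Extract the Pi estimate from saved spark-submit output.
extract_pi() { grep -o 'Pi is roughly [0-9.]*' "$1"; }

# Usage sketch:
#   /usr/local/spark/bin/spark-submit ... 2>&1 | tee submit.log
#   extract_pi submit.log
```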