Step 1:
First set up Spark in HA (high-availability standalone) mode; YARN mode only requires a few changes on top of that setup.
http://blog.csdn.net/ymf827311945/article/details/73822832
Step 2:
On node11, edit the login profile:
vi ~/.bash_profile
Add the following line:
export HADOOP_INSTALL=/opt/apps/hadoop/hadoop-2.6.0
Then reload the profile so the current shell picks up the change:
source ~/.bash_profile
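As a sketch, the append, reload, and a quick verification can be done in one shell session (the path matches the Hadoop layout used in this guide):

```shell
# Append the Hadoop install path to the login profile (path from this guide).
echo 'export HADOOP_INSTALL=/opt/apps/hadoop/hadoop-2.6.0' >> ~/.bash_profile

# Reload so the current shell picks up the new variable.
source ~/.bash_profile

# Verify the variable is set.
echo "$HADOOP_INSTALL"
```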
Step 3:
On node11, edit the Spark environment file:
vi /opt/apps/spark/spark-1.6.0-bin-hadoop2.6/conf/spark-env.sh
Add the following lines:
export HADOOP_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
export YARN_CONF_DIR=$HADOOP_INSTALL/etc/hadoop
export SPARK_HOME=/opt/apps/spark/spark-1.6.0-bin-hadoop2.6
export SPARK_JAR=/opt/apps/spark/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar
export PATH=$SPARK_HOME/bin:$PATH
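With HADOOP_CONF_DIR and YARN_CONF_DIR set, jobs can be submitted to YARN. A minimal sketch using the SparkPi example bundled with Spark 1.6 (the jar path follows the 1.6 distribution layout from this guide; the executor sizing values are illustrative, not from the original notes):

```shell
# Submit the bundled SparkPi example to YARN in cluster mode.
# --num-executors / --executor-memory are illustrative; tune for your cluster.
$SPARK_HOME/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 2 \
  --executor-memory 1g \
  $SPARK_HOME/lib/spark-examples-1.6.0-hadoop2.6.0.jar 100
```

In cluster mode the driver runs inside the YARN ApplicationMaster, so the SparkPi result appears in the YARN application logs rather than on the submitting terminal.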
Step 4:
On node11, distribute the modified files to the other nodes with scp:
scp
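The original notes leave the scp command unfinished. As a hedged sketch, the two files changed above would be pushed to each remaining node roughly like this (node12 and node13 are placeholder hostnames, not taken from the original; substitute your actual cluster nodes):

```shell
# Hypothetical sketch: copy the edited files to the other cluster nodes.
# node12/node13 are placeholder hostnames; adjust to your cluster.
for host in node12 node13; do
  scp ~/.bash_profile "$host":~/
  scp /opt/apps/spark/spark-1.6.0-bin-hadoop2.6/conf/spark-env.sh \
      "$host":/opt/apps/spark/spark-1.6.0-bin-hadoop2.6/conf/
done
```

After copying, run `source ~/.bash_profile` on each node so the new variables take effect there as well.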