1. Upload the Spark tarball, extract it, and rename the directory
[hadoop@host151 bigdata]$ tar xvf spark-2.1.1-bin-hadoop2.6.tar.gz
[hadoop@host151 bigdata]$ mv spark-2.1.1-bin-hadoop2.6 spark
2. Create spark-defaults.conf from the template and add the Kerberos and JAVA_HOME settings below
[hadoop@host151 conf]$ cp spark-defaults.conf.template spark-defaults.conf
spark.yarn.principal hadoop/host151@NBDP.COM
spark.yarn.keytab /home/keydir/hadoop/hadoop.keytab
spark.executorEnv.JAVA_HOME /opt/jdk1.8.0_131
spark.yarn.appMasterEnv.JAVA_HOME /opt/jdk1.8.0_131
3. Copy hive-site.xml from Hive's conf/ directory into Spark's conf/
[hadoop@host151 conf]$ cp hive-site.xml /home/hadoop/bigdata/spark/conf
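Step 3 works because spark-sql reads the Hive metastore location from this file. For reference, the key property it needs looks like the fragment below; the host and port here are assumptions (9083 is the Hive metastore's default Thrift port), so use whatever your Hive installation actually configures:

```xml
<!-- Fragment of hive-site.xml: where spark-sql finds the Hive metastore.
     host151:9083 is an illustrative value, not taken from this setup. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://host151:9083</value>
</property>
```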
4. Edit spark-env.sh and add:
export JAVA_HOME=/opt/jdk1.8.0_131
export HADOOP_HOME=/home/hadoop/bigdata/hadoop
export HADOOP_CONF_DIR=/home/hadoop/bigdata/hadoop/etc/hadoop
export SPARK_LOCAL_DIRS=/home/hadoop/bigdata/datas/spark/tmp
export SPARK_PID_DIR=/home/hadoop/bigdata/datas/spark/pid
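Spark does not create the scratch and PID directories referenced above, so make them before starting anything. A minimal sketch, assuming the same paths as in spark-env.sh:

```shell
# Create the directories referenced by SPARK_LOCAL_DIRS and SPARK_PID_DIR
# (paths taken from the spark-env.sh settings above).
mkdir -p /home/hadoop/bigdata/datas/spark/tmp
mkdir -p /home/hadoop/bigdata/datas/spark/pid
```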
5. Test
[hadoop@host151 bin]$ ./spark-sql --master yarn-client --principal hadoop/host151@NBDP.COM --keytab /home/keydir/hadoop/hadoop.keytab
Alternatively, launch spark-sql with no flags; spark-defaults.conf already supplies the default principal and keytab:
[hadoop@host151 bin]$ ./spark-sql