1. Edit spark-env.sh
export HADOOP_HOME=/usr/hdp/current/hadoop-client
export HADOOP_CONF_DIR=/usr/hdp/current/hadoop-client/conf
These two variables exist so that Spark can find the HDFS and YARN configuration files from the Hadoop client install.
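As a quick sanity check, you can verify that the configuration files Spark needs are actually present under that directory. A minimal sketch, assuming the HDP client path used above (adjust for your distribution):

```shell
# Default to the HDP client path from the step above.
HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/usr/hdp/current/hadoop-client/conf}

# Spark reads HDFS and YARN settings from these three files.
missing=0
for f in core-site.xml hdfs-site.xml yarn-site.xml; do
  if [ -e "$HADOOP_CONF_DIR/$f" ]; then
    echo "found:   $f"
  else
    echo "missing: $f"
    missing=$((missing + 1))
  fi
done
echo "missing config files: $missing"
```

If any file is reported missing, Spark on YARN will not work until the Hadoop client configs are deployed to that host.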
2. Edit spark-defaults.conf
spark.history.fs.logDirectory hdfs:///spark2-history/
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18081
spark.eventLog.enabled true
spark.eventLog.dir hdfs:///spark2-history/
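Note that spark.eventLog.dir and spark.history.fs.logDirectory must point at the same location: running applications write event logs to the former, and the history server reads them from the latter. The scheme-only URI hdfs:/// resolves against fs.defaultFS from core-site.xml; if you prefer to be explicit, you can spell out the NameNode authority (the hostname and port here are placeholders):

```
spark.eventLog.dir              hdfs://namenode.example.com:8020/spark2-history/
spark.history.fs.logDirectory   hdfs://namenode.example.com:8020/spark2-history/
```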
3. Create the event-log directory on HDFS
hadoop fs -mkdir -p /spark2-history
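Every user that submits Spark applications must be able to write into this directory, and the user running the history server must be able to read it. A permissive sketch (the spark:hadoop owner/group is an assumption for a typical HDP install; substitute your own service user and group):

```shell
perms_done=0
if command -v hadoop >/dev/null 2>&1; then
  # -p: no error if the directory already exists.
  hadoop fs -mkdir -p /spark2-history

  # Assumed owner/group for an HDP install; adjust as needed.
  hadoop fs -chown spark:hadoop /spark2-history

  # 1777: world-writable with the sticky bit, so any user can write
  # its own event logs but cannot delete other users' files.
  hadoop fs -chmod 1777 /spark2-history
  perms_done=1
else
  echo "hadoop CLI not found on this host; run these commands on a cluster node"
  perms_done=1
fi
```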
4. Start the service
# Start the Spark History Server (the script lives in Spark's sbin directory)
./start-history-server.sh
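Once the server is up, its UI should answer on the port configured above (18081). A quick probe, assuming you are on the same host as the history server:

```shell
# Prints the HTTP status code of the History Server UI,
# or 000 if the server is not reachable.
status=$(curl -s -o /dev/null -w "%{http_code}" --max-time 5 http://localhost:18081/ || true)
echo "history server HTTP status: $status"
```

A 200 response means the UI is serving; anything else usually means the server failed to start (check the logs under Spark's logs directory) or the port differs from spark.history.ui.port.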