Quick Installation of Spark 2 and Livy 0.3
1. Install Spark 2.4.3
1) Upload spark-2.4.3-bin-hadoop2.7.tgz to the /opt/software directory, then extract it to /opt/module
[user01@node1 software]$ tar -zxvf spark-2.4.3-bin-hadoop2.7.tgz -C /opt/module/
2) Rename /opt/module/spark-2.4.3-bin-hadoop2.7 to spark
[user01@node1 module]$ mv spark-2.4.3-bin-hadoop2.7/ spark
3) Rename /opt/module/spark/conf/spark-defaults.conf.template to spark-defaults.conf
[user01@node1 conf]$ mv spark-defaults.conf.template spark-defaults.conf
4) Create the /spark_directory log path on the Hadoop cluster (in HDFS) ahead of time
[user01@node1 spark]$ hadoop fs -mkdir /spark_directory
5) Configure the Spark event log path in spark-defaults.conf
[user01@node1 conf]$ vim spark-defaults.conf
# Add the following settings
spark.eventLog.enabled true
spark.eventLog.dir hdfs://node1:9000/spark_directory
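The vim edit in step 5 can also be scripted. A minimal sketch that appends the same two settings with a heredoc; the temporary directory below is a stand-in for /opt/module/spark/conf so the snippet can be tried anywhere:

```shell
# Append the event-log settings without an interactive editor.
# CONF_DIR is a throwaway stand-in for /opt/module/spark/conf.
CONF_DIR=$(mktemp -d)
cat >> "$CONF_DIR/spark-defaults.conf" <<'EOF'
spark.eventLog.enabled  true
spark.eventLog.dir      hdfs://node1:9000/spark_directory
EOF
# Confirm both settings landed in the file
grep -c '^spark\.eventLog' "$CONF_DIR/spark-defaults.conf"   # prints 2
```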
6) Rename /opt/module/spark/conf/spark-env.sh.template to spark-env.sh
[user01@node1 conf]$ mv spark-env.sh.template spark-env.sh
7) In /opt/module/spark/conf/spark-env.sh, configure the YARN configuration directory and the history server parameters (once event logs exist, the history server started with sbin/start-history-server.sh serves them on the port set by spark.history.ui.port, 18080 here)
[user01@node1 conf]$ vim spark-env.sh
# Add the following parameters
YARN_CONF_DIR=/opt/module/hadoop-2.7.2/etc/hadoop
export SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080
-Dspark.history.retainedApplications=30
-Dspark.history.fs.logDirectory=hdfs://node1:9000/spark_directory"
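The quoted value above deliberately spans three lines: the shell keeps the newlines inside the quotes, and when the variable is later expanded unquoted (as Spark's launch scripts conventionally do), word-splitting turns each -D flag into its own argument. A quick local sketch of that behavior, no Spark needed:

```shell
# Reproduce the multi-line quoted value from spark-env.sh.
SPARK_HISTORY_OPTS="-Dspark.history.ui.port=18080
-Dspark.history.retainedApplications=30
-Dspark.history.fs.logDirectory=hdfs://node1:9000/spark_directory"
# Unquoted on purpose: the embedded newlines act as word separators,
# so the three -D options become three separate arguments.
set -- $SPARK_HISTORY_OPTS
echo "$#"   # prints 3
```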
8) Copy the datanucleus-*.jar packages from Hive's /opt/module/hive/lib to Spark's /opt/module/spark/jars directory
[user01@node1 lib]$ cp /opt/module/hive/lib/datanucleus-*.jar /opt/module/spark/jars/
9) Copy the hive-site.xml file from Hive's /opt/module/hive/conf to Spark's /opt/module/spark/conf directory
[user01@node1 conf]$ cp /opt/module/hive/conf/hive-site.xml /opt/module/spark/conf/
10) Test the environment
[user01@node1 spark]$ bin/spark-shell
scala> spark.sql("show databases").show
2. Install Livy 0.3
1) Upload livy-server-0.3.0.zip to /opt/software on node1, then extract it to /opt/module
[user01@node1 software]$ unzip livy-server-0.3.0.zip -d /opt/module/
2) Rename /opt/module/livy-server-0.3.0 to livy
[user01@node1 module]$ mv livy-server-0.3.0/ livy
3) Edit /opt/module/livy/conf/livy.conf to configure the Livy and Spark parameters
livy.server.host = node1
livy.spark.master = yarn
livy.spark.deployMode = client
livy.repl.enableHiveContext = true
livy.server.port = 8999
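livy.conf uses simple "key = value" lines, so a setting can be read back for a quick sanity check. A sketch, with the heredoc standing in for the real /opt/module/livy/conf/livy.conf:

```shell
# Extract livy.server.port from livy.conf-style "key = value" lines.
# The heredoc stands in for /opt/module/livy/conf/livy.conf.
conf=$(cat <<'EOF'
livy.server.host = node1
livy.spark.master = yarn
livy.spark.deployMode = client
livy.repl.enableHiveContext = true
livy.server.port = 8999
EOF
)
# Split each line on " = " and print the value for the requested key
port=$(printf '%s\n' "$conf" | awk -F' *= *' '$1 == "livy.server.port" {print $2}')
echo "$port"   # prints 8999
```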
4) Configure the required environment variables
[user01@node1 conf]$ sudo vim /etc/profile
#SPARK_HOME
export SPARK_HOME=/opt/module/spark
export PATH=$PATH:$SPARK_HOME/bin
[user01@node1 conf]$ source /etc/profile
5) From the /opt/module/livy/ directory, start the Livy service
[user01@node1 livy]$ bin/livy-server start
6) Open the web UI at http://192.168.xx.xx:8999/ui; if the page loads, the installation succeeded.
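With the UI up, work is normally driven through Livy's REST API rather than the browser. A sketch of creating an interactive session: the curl lines are commented out because they assume the live server from step 5, while the sample response below them is parsed locally (field names follow the Livy 0.3 /sessions API):

```shell
# Request body for POST /sessions: an interactive Spark (Scala) session
payload='{"kind":"spark"}'
# Live call against the server started in step 5 (commented out here):
# curl -s -X POST -H 'Content-Type: application/json' \
#      -d "$payload" http://node1:8999/sessions
# A freshly created session is returned in "starting" state, e.g.:
response='{"id":0,"state":"starting","kind":"spark","log":[]}'
state=$(echo "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["state"])')
echo "$state"   # prints starting
# Poll GET /sessions/0 until "state" becomes "idle", then submit code with
# POST /sessions/0/statements and a body like {"code":"sc.version"}.
```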