Prerequisites
1. MySQL environment
Kafka-Eagle depends on MySQL, which it uses to store the data shown in its dashboards. If MySQL is not yet installed on the cluster, install it first.
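The JDBC URL configured later in system-config.properties points at a database named ke on hadoop100; a hedged sketch for creating it up front (the database name, host, and root/123456 credentials are taken from that config and may differ in your setup):

```shell
# Hedged sketch: create the database EFAK will connect to, matching the
# jdbc:mysql://hadoop100:3306/ke URL and credentials configured later.
# EFAK creates its own tables inside this database on first start.
mysql -uroot -p123456 -e "CREATE DATABASE IF NOT EXISTS ke DEFAULT CHARACTER SET utf8mb4;"
```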
2. Kafka environment
(1) Shut down the Kafka cluster (stop Kafka first, then ZooKeeper).
(2) Edit /opt/module/kafka/bin/kafka-server-start.sh
[user@hadoop100 kafka_2.12-3.0.0]$ vim bin/kafka-server-start.sh
...
# Comment out the following lines
# if [ "x$KAFKA_HEAP_OPTS" = "x" ]; then
# export KAFKA_HEAP_OPTS="-Xmx1G -Xms1G"
# fi
# Add the following lines
if [ "x$KAFKA_HEAP_OPTS" = "x" ]; then
export KAFKA_HEAP_OPTS="-server -Xms2G -Xmx2G -XX:PermSize=128m -XX:+UseG1GC -XX:MaxGCPauseMillis=200 -XX:ParallelGCThreads=8 -XX:ConcGCThreads=5 -XX:InitiatingHeapOccupancyPercent=70"
export JMX_PORT="9999"
#export KAFKA_HEAP_OPTS="-Xmx1G -Xms1G"
fi
...
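The `if [ "x$KAFKA_HEAP_OPTS" = "x" ]` guard above applies the JVM options only when the variable is not already set in the environment. A self-contained illustration of that shell idiom (option values shortened for the example):

```shell
# The same empty-check guard used in kafka-server-start.sh: export defaults
# only when KAFKA_HEAP_OPTS is unset or empty.
unset KAFKA_HEAP_OPTS
if [ "x$KAFKA_HEAP_OPTS" = "x" ]; then
  export KAFKA_HEAP_OPTS="-server -Xms2G -Xmx2G"
fi
echo "$KAFKA_HEAP_OPTS"
```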
Note: after editing, distribute the file to the other nodes before restarting Kafka (here a custom shell distribution script, xsync, is used).
[user@hadoop100 kafka_2.12-3.0.0]$ xsync bin/kafka-server-start.sh
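xsync is a custom script, not a standard tool; for readers without it, an rsync loop achieves the same distribution (the hadoop101/hadoop102 hostnames are placeholders for your other broker nodes):

```shell
# Hedged equivalent of the custom xsync script: copy the edited start script
# to every other broker node. Replace the host list with your actual nodes.
for host in hadoop101 hadoop102; do
  rsync -av /opt/module/kafka/bin/kafka-server-start.sh "${host}:/opt/module/kafka/bin/"
done
```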
Kafka-Eagle installation
Download the package from the official site: https://www.kafka-eagle.org/
1. Upload the downloaded archive kafka-eagle-bin-2.0.8.tar.gz to the /opt/software directory on the cluster.
2. Extract kafka-eagle-bin-2.0.8.tar.gz and enter the resulting directory.
[user@hadoop100 software]$ tar -zxvf kafka-eagle-bin-2.0.8.tar.gz
[user@hadoop100 software]$ cd kafka-eagle-bin-2.0.8
3. Extract efak-web-2.0.8-bin.tar.gz into /opt/module.
[user@hadoop100 kafka-eagle-bin-2.0.8]$ tar -zxvf efak-web-2.0.8-bin.tar.gz -C /opt/module/
4. Edit the configuration file /opt/module/efak-web-2.0.8/conf/system-config.properties.
[user@hadoop100 kafka-eagle-bin-2.0.8]$ vim /opt/module/efak-web-2.0.8/conf/system-config.properties
######################################
# multi zookeeper & kafka cluster list
# Settings prefixed with 'kafka.eagle.' will be deprecated, use 'efak.' instead
######################################
efak.zk.cluster.alias=cluster1
cluster1.zk.list=hadoop102:2181,hadoop103:2181,hadoop104:2181/kafka
...
...
...
######################################
# kafka offset storage
######################################
# store offsets in Kafka
cluster1.efak.offset.storage=kafka
...
...
...
######################################
# kafka sqlite jdbc driver address
######################################
# MySQL connection settings
efak.driver=com.mysql.jdbc.Driver
efak.url=jdbc:mysql://hadoop100:3306/ke?useUnicode=true&characterEncoding=UTF-8&zeroDateTimeBehavior=convertToNull
efak.username=root
efak.password=123456
...
...
...
5. Add environment variables.
[user@hadoop100 kafka-eagle-bin-2.0.8]$ sudo vim /etc/profile.d/user_env.sh
# kafkaEFAK
export KE_HOME=/opt/module/efak-web-2.0.8
export PATH=$PATH:$KE_HOME/bin
Note: run source /etc/profile so the new variables take effect.
[user@hadoop100 kafka-eagle-bin-2.0.8]$ source /etc/profile
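A quick sanity check that the variables took effect (ke.sh ships in $KE_HOME/bin, so it should now resolve via PATH):

```shell
# After sourcing /etc/profile, KE_HOME should point at the install directory
# and ke.sh should be found on the PATH.
echo "$KE_HOME"     # expect /opt/module/efak-web-2.0.8
command -v ke.sh
```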
6. Start
(1) Note: ZooKeeper and Kafka must be running before you start EFAK.
(2) Start EFAK:
[user@hadoop100 efak-web-2.0.8]$ bin/ke.sh start
Log in to the web UI to view the monitoring data (the default account printed by ke.sh start on a successful launch is admin / 123456):
http://192.168.88.100:8048
(3) Stop EFAK:
[user@hadoop100 efak-web-2.0.8]$ bin/ke.sh stop
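Besides start and stop, EFAK's control script can report whether the service is running; a hedged example (ke.sh in EFAK 2.x also supports restart):

```shell
# Check whether the EFAK web service is currently running.
bin/ke.sh status
```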