For historical reasons, all of the company's logs are collected by Logtail and stored in Alibaba Cloud Log Service (SLS).
First, check the official Alibaba Cloud Log Service (SLS) documentation:
https://help.aliyun.com/document_detail/123446.html#title-4jc-gkl-7nq
The document above covers the Flume configuration for SLS.
Installing the SLS Flume plugin for Alibaba Cloud Log Service:
https://github.com/aliyun/aliyun-log-flume
Open the link above and follow the steps in its documentation.
[hadoop@hadoop004 software]$ wget http://apache.tt.co.kr/flume/1.9.0/apache-flume-1.9.0-bin.tar.gz
[hadoop@hadoop004 software]$ tar -zxf apache-flume-1.9.0-bin.tar.gz -C ../app/
[hadoop@hadoop004 software]$ git clone https://github.com/aliyun/aliyun-log-flume.git
[hadoop@hadoop004 software]$ cd aliyun-log-flume/
[hadoop@hadoop004 aliyun-log-flume]$ mvn clean compile assembly:single -DskipTests
A long wait...
Although I downloaded apache-flume-1.9.0 above, I stubbornly stuck with apache-flume-1.6.0-cdh5.7.0-bin. Let's get to it!
[hadoop@hadoop004 aliyun-log-flume]$ cd target/
[hadoop@hadoop004 target]$ ls
aliyun-log-flume-1.1.jar archive-tmp classes generated-sources maven-status
[hadoop@hadoop004 target]$ cp aliyun-log-flume-1.1.jar /data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/lib/
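To confirm the plugin jar actually landed on Flume's classpath (same path as used above):
[hadoop@hadoop004 target]$ ls /data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/lib/ | grep aliyun-log-flume
aliyun-log-flume-1.1.jar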
[hadoop@hadoop004 bin]$ cd -
/data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/conf/conffile
[hadoop@hadoop004 conffile]$ vim sls-flume.conf
sls-flume-kafka.sources = sls-source
sls-flume-kafka.channels = sls-memory-channel
sls-flume-kafka.sinks = kafka-sink
sls-flume-kafka.sources.sls-source.type = com.aliyun.loghub.flume.source.LoghubSource
sls-flume-kafka.sources.sls-source.endpoint = cn-shenzhen.log.aliyuncs.com
sls-flume-kafka.sources.sls-source.project = <Your Loghub project>
sls-flume-kafka.sources.sls-source.logstore = <Your Loghub logstore>
sls-flume-kafka.sources.sls-source.accessKeyId = <Your Access Key Id>
sls-flume-kafka.sources.sls-source.accessKey = <Your Access Key>
sls-flume-kafka.sources.sls-source.deserializer = JSON
sls-flume-kafka.channels.sls-memory-channel.type = memory
sls-flume-kafka.sinks.kafka-sink.type = logger
sls-flume-kafka.sources.sls-source.channels = sls-memory-channel
sls-flume-kafka.sinks.kafka-sink.channel = sls-memory-channel
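Two notes on this config. First, the sink is named kafka-sink but its type is logger for now, so events simply print to the console while we verify the source works; the real Kafka sink comes later. Second, the memory channel runs with Flume's defaults here; for anything beyond a smoke test you would likely size it explicitly. A minimal sketch using Flume's standard memory-channel properties (values below are illustrative):
sls-flume-kafka.channels.sls-memory-channel.capacity = 10000
sls-flume-kafka.channels.sls-memory-channel.transactionCapacity = 1000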
Start Flume. Note that --name must match the agent name used in the config file, sls-flume-kafka in this case:
[hadoop@hadoop004 conffile]$ cd -
/data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/bin
[hadoop@hadoop004 bin]$ ./flume-ng agent --name sls-flume-kafka --conf /data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/conf/conffile --conf-file /data/aaron/app/apache-flume-1.6.0-cdh5.7.0-bin/conf/conffile/sls-flume.conf -Dflume.root.logger=INFO,console
Success!
After it had been running for a while, the throughput spiked; after waiting a bit longer it went back to normal, haha.
Presumably, as soon as Flume started it pulled the whole backlog from Alibaba Cloud Log Service over in one burst, and things settled down once it caught up.
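If that startup burst is ever unwanted, the plugin exposes source options controlling where consumption begins. A hedged sketch (the consumerPosition parameter name and its end value are from my reading of the aliyun-log-flume README, so verify there before relying on this):
# Assumption: start from the tail of the logstore instead of replaying the backlog
sls-flume-kafka.sources.sls-source.consumerPosition = end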
The next step is to ship these logs into Kafka. Stay tuned!!!
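As a teaser, here is a minimal sketch of what swapping the logger sink for the Kafka sink built into Flume 1.6 might look like; the topic and broker values are placeholders, and this is untested here:
# Replaces the logger sink defined above; fill in the placeholders
sls-flume-kafka.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
sls-flume-kafka.sinks.kafka-sink.topic = <Your Kafka topic>
sls-flume-kafka.sinks.kafka-sink.brokerList = <Your Kafka broker list, host:port>
sls-flume-kafka.sinks.kafka-sink.requiredAcks = 1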