Technology selection requirements:
netcat-memory-loggerAndHdfs:
Configuration file: netcat-memory-loggerAndHdfs.conf:
a1.sources = r1
a1.channels = c1
a1.sinks = k1 k2
a1.sources.r1.type = netcat
a1.sources.r1.bind = 192.168.137.252
a1.sources.r1.port = 6666
a1.channels.c1.type = memory
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://ruozehadoop000:9000/data/flume/page_views/%Y%m%d%H%M
a1.sinks.k1.hdfs.filePrefix = page-views
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.writeFormat = Text
a1.sinks.k1.hdfs.batchSize = 10
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 1
a1.sinks.k1.hdfs.roundUnit = minute
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k2.type = logger
# A failover sink group (sink processor) can be configured to improve data reliability:
#a1.sinkgroups = g1
#a1.sinkgroups.g1.sinks = k1 k2
#a1.sinkgroups.g1.processor.type = failover
#a1.sinkgroups.g1.processor.priority.k1 = 5
#a1.sinkgroups.g1.processor.priority.k2 = 10
#a1.sinkgroups.g1.processor.maxpenalty = 10000
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c1
a1.sources.r1.channels = c1
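A quick sketch of how the `hdfs.round` settings above bucket events: with `roundValue = 1` and `roundUnit = minute`, the `%Y%m%d%H%M` escapes in `hdfs.path` are rounded down to the start of the minute, so every event arriving within the same minute lands in one directory (`useLocalTimeStamp = true` supplies the timestamp when the event header carries none). The rounding can be reproduced with plain `date` arithmetic (GNU `date` assumed):

```shell
# Floor the current time to the minute, the way the HDFS sink
# resolves %Y%m%d%H%M with roundValue=1 / roundUnit=minute.
ts=$(date +%s)                                      # current epoch seconds
bucket=$(date -d "@$((ts / 60 * 60))" +%Y%m%d%H%M)  # floor to the minute
echo "hdfs://ruozehadoop000:9000/data/flume/page_views/${bucket}"
```

All events in the same minute resolve to the same `${bucket}`, which is what keeps the sink from creating one directory per event.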
Startup command:
flume-ng agent \
--name a1 \
--conf $FLUME_HOME/conf \
--conf-file /home/hadoop/script/flume/netcat-memory-loggerAndHdfs.conf \
-Dflume.root.logger=INFO,console \
-Dflume.monitoring.type=http \
-Dflume.monitoring.port=34343
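A hedged smoke test, assuming the agent above is already running on 192.168.137.252: send one line to the netcat source, then read the counters that `-Dflume.monitoring.type=http` exposes as JSON on port 34343. The `|| true` guards keep the script going if the host is unreachable.

```shell
# Send one test event to the netcat source (best-effort probe).
echo "hello flume" | nc -w 1 192.168.137.252 6666 || true
# Read the agent's metrics from the HTTP monitoring endpoint.
curl -s --max-time 2 http://192.168.137.252:34343/metrics || true
```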
Partial operation results:
ncAndExec-file-logger:
Configuration file: ncAndExec-file-logger.conf:
a1.sources = r1 r2
a1.channels = c1
a1.sinks = k1
a1.sources.r1.type = netcat
a1.sources.r1.bind = 192.168.137.252
a1.sources.r1.port = 6666
a1.sources.r2.type = exec
a1.sources.r2.command = tail -F /home/hadoop/data/data1.log
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /home/hadoop/data/flume/checkpoint
a1.channels.c1.dataDirs = /home/hadoop/data/flume/data
a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
a1.sources.r2.channels = c1
a1.sources.r1.channels = c1
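A sketch of what the exec source above does: run a command (`tail -F` in the config) and turn each line it emits into one Flume event. A temp file stands in for `/home/hadoop/data/data1.log` so the sketch runs anywhere; the real source keeps `tail -F` running and continuously forwards newly appended lines.

```shell
# Simulate the exec source: lines appended to the file become events.
log=$(mktemp)
printf 'event 1\nevent 2\n' >> "$log"
events=$(tail -n 2 "$log")   # the exec source would forward these lines
echo "$events"
rm -f "$log"
```

Note that the exec source gives no delivery guarantee: if the agent dies, lines written while `tail -F` is down are lost, which is one reason this config pairs it with a durable file channel rather than a memory channel.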
Startup command:
flume-ng agent \
--name a1 \
--conf $FLUME_HOME/conf \
--conf-file /home/hadoop/script/flume/ncAndExec-file-logger.conf \
-Dflume.root.logger=INFO,console \
-Dflume.monitoring.type=http \
-Dflume.monitoring.port=34343
Partial operation results:
Follow-up question: why should a Flume log-collection system be designed in layers (multiple tiers)?
Answer: http://f.dataguru.cn/spark-896270-1-1.html
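The layered design the link discusses is typically wired together with Avro: edge agents on each web server forward events to a smaller aggregation tier, which alone talks to HDFS. A minimal sketch of the two-tier wiring (the hostname `collector-host` and port `4545` are hypothetical, and channel definitions are elided):

```
# Tier 1 (edge agent a1): forward local events to a collector over Avro
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = collector-host
a1.sinks.k1.port = 4545

# Tier 2 (collector agent a2): receive Avro events, then sink to HDFS
a2.sources.r1.type = avro
a2.sources.r1.bind = 0.0.0.0
a2.sources.r1.port = 4545
```

Layering keeps the number of HDFS clients small, lets the collector tier buffer traffic spikes, and allows edge agents to fail over between collectors.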