**Flume stores the collected data in the /home/flumedata directory.**

Edit the configuration file: vim filechannel.conf
# Name the components of agent a1
# Name the Source
a1.sources = s1
# Name the Channel
a1.channels = c1
# Name the Sink
a1.sinks = k1
# Configure the Source type
a1.sources.s1.type = netcat
# Bind address (0.0.0.0 listens on all interfaces)
a1.sources.s1.bind = 0.0.0.0
# Listening port
a1.sources.s1.port = 8090
# Configure the Channel type
a1.channels.c1.type = file
# Directory where the File Channel persists event data
a1.channels.c1.dataDirs = /home/flumedata
# Configure the Sink type
a1.sinks.k1.type = logger
# Bind the Source to the Channel
a1.sources.s1.channels = c1
# Bind the Sink to the Channel
a1.sinks.k1.channel = c1
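Besides the data directory, the File Channel keeps a checkpoint of its in-memory queue (by default under ~/.flume/file-channel/checkpoint). If you want everything in one place, properties like the following can be added; the paths and values below are illustrative, not part of the original setup:

```
# Optional File Channel tuning (illustrative values)
# Where the channel writes its checkpoint (default: ~/.flume/file-channel/checkpoint)
a1.channels.c1.checkpointDir = /home/flumedata/checkpoint
# Maximum number of events the channel can hold (default: 1000000)
a1.channels.c1.capacity = 1000000
# Maximum number of events per transaction (default: 10000)
a1.channels.c1.transactionCapacity = 10000
```

Because the File Channel persists events to disk, buffered events survive an agent restart, unlike the Memory Channel; after sending data you should see log files appear under /home/flumedata.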
Run the agent (-n names the agent, -c points at the conf directory, -f specifies the config file, and -Dflume.root.logger prints the logs to the console):
[root@hadoop01 data]# …/bin/flume-ng agent -n a1 -c …/conf -f filechannel.conf -Dflume.root.logger=INFO,console
Send data with netcat:
[root@hadoop01 home]# nc hadoop01 8090
hello
OK
ok
OK
OK
OK
HI666
OK
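As the session shows, the netcat source reads newline-terminated lines and acknowledges each one with "OK". A minimal, self-contained Python sketch of that exchange (the server here is a stand-in for the source, not Flume itself, and the port is chosen by the OS rather than the 8090 used above):

```python
import socket
import threading

# Stand-in for Flume's netcat source: read newline-terminated lines
# and answer "OK\n" for each, mimicking the nc session above.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
port = srv.getsockname()[1]
srv.listen(1)

def fake_source():
    conn, _ = srv.accept()
    for _line in conn.makefile("rb"):   # one event per line
        conn.sendall(b"OK\n")           # acknowledge each event
    conn.close()

threading.Thread(target=fake_source, daemon=True).start()

# Client side: what `nc hadoop01 8090` does in the session above.
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"hello\n")
ack = cli.makefile("rb").readline().decode().strip()
print(ack)  # prints "OK"
cli.close()
srv.close()
```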
View the agent's console output:
Info: Including Hadoop libraries found via (/home/software/hadoop-2.7.1/bin/hadoop) for HDFS access
Info: Including Hive libraries found via () for Hive access
+ exec /home/presoftware/jdk1.8.0_65/bin/java -Xmx20m -Dflume.root.logger=INFO,console -cp '/home/software/apache-flume-1.7.0-bin/conf:/home/software/apache-flume-1.7.0-bin/lib/*:/home/software/hadoop-2.7.1/etc/hadoop:/home/software/hadoop-2.7.1/share/hadoop/common/lib/*:/home/software/hadoop-2.7.1/share/hadoop/common/*:/home/software/hadoop-2.7.1/share/hadoop/hdfs:/home/software/hadoop-2.7.1/share/hadoop/hdfs/lib/*:/home/software/hadoop-2.7.1/share/hadoop/hdfs/*:/home/software/hadoop-2.7.1/share/hadoop/yarn/lib/*:/home/software/hadoop-2.7.1/share/hadoop/yarn/*:/home/software/hadoop-2.7.1/share/hadoop/mapreduce/lib/*:/home/software/hadoop-2.7.1/share/hadoop/mapreduce/*:/home/software/hadoop-2.7.1/contrib/capacity-scheduler/*.jar:/lib/*' -Djava.library.path=:/home/software/hadoop-2.7.1/lib/native org.apache.flume.node.Application -n a1 -f filechannel.conf
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/software/apache-flume-1.7.0-bin/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/software/apache-flume-1.7.0-bin/lib/Flume.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/software/hadoop-2.7.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2020-12-02 00:28:01,350 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2020-12-02 00:28:01,361 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:filechannel.conf
2020-12-02 00:28:01,375 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2020-12-02 00:28:01,376 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-12-02 00:28:01,376 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-12-02 00:28:01,398 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2020-12-02 00:28:01,399 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2020-12-02 00:28:01,413 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type file
2020-12-02 00:28:01,442 (