Writing to Flume from Java: Flume is not writing the logs to HDFS

I configured Flume to write my apache2 access logs to HDFS... and judging from Flume's own logs, all of the configuration is correct, but I don't know why it still isn't writing to HDFS. Here is my Flume configuration file:

# agent and components of the agent
search.sources = so
search.sinks = si
search.channels = sc

# Configure a channel that buffers events in memory:
search.channels.sc.type = memory
search.channels.sc.capacity = 20000
search.channels.sc.transactionCapacity = 100

# Configure the source:
search.sources.so.channels = sc
search.sources.so.type = exec
search.sources.so.command = tail -F /var/log/apache2/access.log

# Describe the sink:
search.sinks.si.channel = sc
search.sinks.si.type = hdfs
search.sinks.si.hdfs.path = hdfs://localhost:9000/flumelogs/
search.sinks.si.hdfs.writeFormat = Text
search.sinks.si.hdfs.fileType = DataStream
search.sinks.si.hdfs.rollSize = 0
search.sinks.si.hdfs.rollCount = 10000
search.sinks.si.hdfs.batchSize = 1000
search.sinks.si.rollInterval = 1
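One relationship worth double-checking in a config like this is the channel's `transactionCapacity` versus the sink's `hdfs.batchSize`: the HDFS sink takes up to `hdfs.batchSize` events from the channel in a single transaction, so it cannot commit a batch larger than the channel's `transactionCapacity`. The script below is only a sketch of that check (not Flume itself); the two relevant lines are copied into a here-doc so it is self-contained.

```shell
# Hypothetical sanity check: compare the channel's transactionCapacity
# with the sink's hdfs.batchSize, inlined from the config above.
cat > /tmp/flume-conf-check <<'EOF'
search.channels.sc.transactionCapacity = 100
search.sinks.si.hdfs.batchSize=1000
EOF

# Extract the numeric values, stripping any spaces around '='.
txcap=$(awk -F= '/transactionCapacity/ {gsub(/ /, "", $2); print $2}' /tmp/flume-conf-check)
batch=$(awk -F= '/hdfs.batchSize/ {gsub(/ /, "", $2); print $2}' /tmp/flume-conf-check)

if [ "$batch" -gt "$txcap" ]; then
    echo "hdfs.batchSize ($batch) exceeds transactionCapacity ($txcap)"
fi
# → hdfs.batchSize (1000) exceeds transactionCapacity (100)
```

With the values shown here, every sink transaction would ask the channel for more events than one transaction is allowed to hold, which is the kind of mismatch that can silently stall delivery.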

Here is my Flume log:

14/12/18 17:47:56 INFO node.AbstractConfigurationProvider: Creating channels
14/12/18 17:47:56 INFO channel.DefaultChannelFactory: Creating instance of channel sc type memory
14/12/18 17:47:56 INFO node.AbstractConfigurationProvider: Created channel sc
14/12/18 17:47:56 INFO source.DefaultSourceFactory: Creating instance of source so, type exec
14/12/18 17:47:56 INFO sink.DefaultSinkFactory: Creating instance of sink: si, type: hdfs
14/12/18 17:47:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/12/18 17:47:56 INFO hdfs.HDFSEventSink: Hadoop Security enabled: false
14/12/18 17:47:56 INFO node.AbstractConfigurationProvider: Channel sc connected to [so, si]
14/12/18 17:47:56 INFO node.Application: Starting new configuration:{ sourceRunners:{so=EventDrivenSourceRunner: { source:org.apache.flume.source.ExecSource{name:so,state:IDLE} }} sinkRunners:{si=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@3de76481 counterGroup:{ name:null counters:{} } }} channels:{sc=org.apache.flume.channel.MemoryChannel{name: sc}} }
14/12/18 17:47:56 INFO node.Application: Starting Channel sc
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: CHANNEL, name: sc: Successfully registered new MBean.
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Component type: CHANNEL, name: sc started
14/12/18 17:47:56 INFO node.Application: Starting Sink si
14/12/18 17:47:56 INFO node.Application: Starting Source so
14/12/18 17:47:56 INFO source.ExecSource: Exec source starting with command:tail -F /var/log/apache2/access.log
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SINK, name: si: Successfully registered new MBean.
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: si started
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SOURCE, name: so: Successfully registered new MBean.
14/12/18 17:47:56 INFO instrumentation.MonitoredCounterGroup: Component type: SOURCE, name: so started

And here is the command I used to start Flume:

flume-ng agent -n search -c conf -f ../conf/flume-conf-search
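When an agent starts cleanly like this but nothing appears in HDFS, rerunning it with console logging at DEBUG level usually surfaces the underlying error (failed channel transactions, HDFS connection problems, and so on). The `-Dflume.root.logger` property is a standard `flume-ng` option; the paths match the command above:

```shell
# Same command as above, but with full DEBUG output on the console
# so channel/sink errors become visible.
flume-ng agent -n search -c conf -f ../conf/flume-conf-search \
    -Dflume.root.logger=DEBUG,console
```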

And I created the path in HDFS:

hadoop fs -mkdir hdfs://localhost:9000/flumelogs
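It may also be worth confirming that the directory really exists on the NameNode the sink points at, and that the user running the Flume agent can write to it. These commands assume the same local single-node HDFS on port 9000 as the config above:

```shell
# List the target directory along with its owner and permissions.
hadoop fs -ls hdfs://localhost:9000/flumelogs

# If the agent runs as a different OS user than the directory's owner,
# open the directory up (or chown it to the agent's user) while debugging.
hadoop fs -chmod 777 hdfs://localhost:9000/flumelogs
```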

But I don't know why it isn't writing to HDFS. I can see the apache2 access logs arriving, but Flume is not sending them to the HDFS /flumelogs directory. Please help!
