Header information appears in files generated when Flume-ng collects data to HDFS

Original post: http://blog.163.com/hejian511@126/blog/static/1285610782013914104137649/

The Flume version is Flume-ng 1.4, configured as shown below. The files generated on HDFS always begin with header information like:

SEQ!org.apache.hadoop.io.LongWritable"org.apache.hadoop.io.BytesWritable??H謺NSA???y

The configuration file is as follows:
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.fileHeader = true
#a1.sources.r1.deserializer.outputCharset=UTF-8
a1.sources.r1.spoolDir = /opt/personal/file/access

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://node143:9000/access/events
a1.sinks.k1.hdfs.filePrefix = access
a1.sinks.k1.hdfs.fileSuffix=.log
#a1.sinks.k1.hdfs.writeFormat = Text
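# Note: hdfs.round below only takes effect when hdfs.path contains
# time-based escape sequences (e.g. %H%M); this path has none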
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 1000

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Checking the Flume documentation shows that in HdfsEventSink, hdfs.fileType defaults to SequenceFile, and the "SEQ..." prefix above is exactly the header of a Hadoop SequenceFile. Changing hdfs.fileType to DataStream makes the sink write the collected files to HDFS as-is.
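A minimal sketch of the corrected sink section, assuming the same agent and sink names as in the config above (hdfs.writeFormat only applies to SequenceFile records, so it can stay commented out):

# Describe the sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://node143:9000/access/events
a1.sinks.k1.hdfs.filePrefix = access
a1.sinks.k1.hdfs.fileSuffix = .log
# DataStream writes the raw event body instead of a SequenceFile, so no SEQ header is added
a1.sinks.k1.hdfs.fileType = DataStream

After restarting the agent, running hdfs dfs -cat on a newly generated .log file should show the original text rather than the SEQ header.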