Case 06 - Single Data Source, Multiple Outputs (Replicating Selector)

Case requirements:
Use Flume-1 to monitor file changes. Flume-1 passes the appended content to Flume-2,
and Flume-2 stores it on HDFS. At the same time, Flume-1 passes the same content to
Flume-3, and Flume-3 writes it to the local file system.
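
A minimal sketch of the data flow (hostnames, ports, and paths are taken from the configurations below):

	hive.log --> Flume-1 (a1, exec source, replicating selector)
	               |--> avro hadoop102:4141 --> Flume-2 (a2) --> HDFS /flume2/...
	               |--> avro hadoop102:4142 --> Flume-3 (a3) --> local dir /root/app/datas/flume3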

Implementation steps:

1. Preparation
	Create a group1 folder under the /root/app/flume/job directory:
		mkdir group1
		cd group1/
	Create a flume3 folder under the /root/app/datas/ directory:
		mkdir /root/app/datas/flume3
2. Create flume-file-flume.conf
	(Configure one source that reads the log file, plus two channels and two sinks
		that feed flume-flume-hdfs and flume-flume-dir respectively. Because the
		source uses a replicating selector, every event is copied into both
		channels, so each sink receives a full copy of the data.)
	touch flume-file-flume.conf
	vim flume-file-flume.conf
	
		# Name the components on this agent
		a1.sources = r1
		a1.sinks = k1 k2
		a1.channels = c1 c2
		# Replicate the data flow to all channels
		a1.sources.r1.selector.type = replicating

		# Describe/configure the source
		a1.sources.r1.type = exec
		a1.sources.r1.command = tail -F /opt/module/hive/logs/hive.log
		a1.sources.r1.shell = /bin/bash -c

		# Describe the sink
		a1.sinks.k1.type = avro
		a1.sinks.k1.hostname = hadoop102 
		a1.sinks.k1.port = 4141

		a1.sinks.k2.type = avro
		a1.sinks.k2.hostname = hadoop102
		a1.sinks.k2.port = 4142

		# Describe the channel
		a1.channels.c1.type = memory
		a1.channels.c1.capacity = 1000
		a1.channels.c1.transactionCapacity = 100

		a1.channels.c2.type = memory
		a1.channels.c2.capacity = 1000
		a1.channels.c2.transactionCapacity = 100

		# Bind the source and sink to the channel
		a1.sources.r1.channels = c1 c2
		a1.sinks.k1.channel = c1
		a1.sinks.k2.channel = c2
3. Create flume-flume-hdfs.conf
	(Configure a source that receives the upstream Flume's output and a sink that writes to HDFS.)
	touch flume-flume-hdfs.conf
	vim flume-flume-hdfs.conf
	
		# Name the components on this agent
		a2.sources = r1
		a2.sinks = k1
		a2.channels = c1

		# Describe/configure the source
		a2.sources.r1.type = avro
		a2.sources.r1.bind = hadoop102
		a2.sources.r1.port = 4141

		# Describe the sink
		a2.sinks.k1.type = hdfs
		a2.sinks.k1.hdfs.path = hdfs://hadoop102:9000/flume2/%Y%m%d/%H
		# Prefix for uploaded files
		a2.sinks.k1.hdfs.filePrefix = flume2-
		# Whether to roll directories based on time
		a2.sinks.k1.hdfs.round = true
		# Number of time units before creating a new directory
		a2.sinks.k1.hdfs.roundValue = 1
		# The time unit used for rounding
		a2.sinks.k1.hdfs.roundUnit = hour
		# Use the local timestamp (needed here because the exec source does not
		# add a timestamp header, which the %Y%m%d/%H escapes in hdfs.path require)
		a2.sinks.k1.hdfs.useLocalTimeStamp = true
		# Number of events to accumulate before flushing to HDFS
		a2.sinks.k1.hdfs.batchSize = 100
		# File type; compressed formats are also supported
		a2.sinks.k1.hdfs.fileType = DataStream
		# Roll a new file every 600 seconds
		a2.sinks.k1.hdfs.rollInterval = 600
		# Roll the file at roughly 128 MB
		a2.sinks.k1.hdfs.rollSize = 134217700
		# Do not roll files based on the number of events
		a2.sinks.k1.hdfs.rollCount = 0
		# Minimum block replication
		a2.sinks.k1.hdfs.minBlockReplicas = 1

		# Describe the channel
		a2.channels.c1.type = memory
		a2.channels.c1.capacity = 1000
		a2.channels.c1.transactionCapacity = 100

		# Bind the source and sink to the channel
		a2.sources.r1.channels = c1
		a2.sinks.k1.channel = c1
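
	Note: the NameNode address and port in hdfs.path above (hadoop102:9000)
	must match fs.defaultFS in the cluster's core-site.xml. A quick way to
	confirm it on a host with the Hadoop client installed:
		hdfs getconf -confKey fs.defaultFS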
4. Create flume-flume-dir.conf
	(Configure a source that receives the upstream Flume's output and a sink that writes to a local directory.)
	touch flume-flume-dir.conf
	vim flume-flume-dir.conf
	
		# Name the components on this agent
		a3.sources = r1
		a3.sinks = k1
		a3.channels = c2

		# Describe/configure the source
		a3.sources.r1.type = avro
		a3.sources.r1.bind = hadoop102
		a3.sources.r1.port = 4142

		# Describe the sink
		a3.sinks.k1.type = file_roll
		a3.sinks.k1.sink.directory = /root/app/datas/flume3

		# Describe the channel
		a3.channels.c2.type = memory
		a3.channels.c2.capacity = 1000
		a3.channels.c2.transactionCapacity = 100

		# Bind the source and sink to the channel
		a3.sources.r1.channels = c2
		a3.sinks.k1.channel = c2
		
	Note: the output local directory must already exist;
		if it does not, Flume will not create it.
5. Run the configuration files
	(Start the agents in this order: flume-flume-dir, flume-flume-hdfs,
		flume-file-flume. The downstream agents a3 and a2 must be started first,
		because a1's avro sinks need their avro sources to be listening on
		ports 4142 and 4141.)
	bin/flume-ng agent --conf conf/ --name a3 \
		--conf-file job/group1/flume-flume-dir.conf
	bin/flume-ng agent --conf conf/ --name a2 \
		--conf-file job/group1/flume-flume-hdfs.conf
	bin/flume-ng agent --conf conf/ --name a1 \
		--conf-file job/group1/flume-file-flume.conf
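
	During debugging it can help to run each agent in the foreground with
	console logging; a sketch (the -Dflume.root.logger option is standard
	flume-ng usage):
		bin/flume-ng agent --conf conf/ --name a1 \
			--conf-file job/group1/flume-file-flume.conf \
			-Dflume.root.logger=INFO,console
	Once a2 and a3 are running, you can confirm that their avro sources are
	listening before starting a1:
		netstat -nltp | grep -E '4141|4142'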
6. Start Hadoop and Hive
	sbin/start-dfs.sh	(node01)
	sbin/start-yarn.sh	(node02)
	bin/hive		(node01)
7. Check the data on HDFS (for example, with the command below)
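	A minimal check, assuming at least one flush to HDFS has happened (the
	path layout comes from hdfs.path in flume-flume-hdfs.conf):
		hdfs dfs -ls -R /flume2/$(date +%Y%m%d)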
8. Check the data in the /root/app/datas/flume3 directory, for example using "ll" (see below)
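	For example:
		ll /root/app/datas/flume3
	Note that the file_roll sink rolls a new file every 30 seconds by default,
	even when no new events arrive; if that produces many small files, the
	a3.sinks.k1.sink.rollInterval property of the file_roll sink can be raised.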