Integrating Flume with Kafka

Flume receives randomly generated data from a Java program and forwards it to Kafka; the Java program connects to Flume over HTTP.

Copy the configuration file from the Flume installation directory to any path, e.g. /home/hadoop/flume.conf,
then open it for editing: vi /home/hadoop/flume.conf

# Define a memory channel called cnl on the agent
agent.channels.cnl.type = memory
agent.channels.cnl.capacity = 1000
agent.channels.cnl.transactionCapacity = 100

# Define an HTTP source called src and tell it to bind
# to 192.168.48.101:44556. Connect it to channel cnl.
agent.sources.src.channels = cnl
# Flume receives data over HTTP
agent.sources.src.type = http
# Host IP address
agent.sources.src.bind = 192.168.48.101
# Port to receive data on
agent.sources.src.port = 44556

# Define a Kafka sink that receives all events
# from the other end of the same channel.
agent.sinks.kafka-sink.channel = cnl
# Write events out to Kafka
agent.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
agent.sinks.kafka-sink.serializer.class = kafka.serializer.StringEncoder
# Kafka cluster
agent.sinks.kafka-sink.kafka.bootstrap.servers = 192.168.48.101:9092,192.168.48.102:9092,192.168.48.103:9092,192.168.48.104:9092
# Kafka topic
agent.sinks.kafka-sink.kafka.topic = kafkatest

# Finally, now that we've defined all of our components, tell
# the agent which ones we want to activate.
# (cnl, src and kafka-sink are user-defined names)
agent.channels = cnl
agent.sources = src
agent.sinks = kafka-sink

Start Flume with (if the environment variables are configured, there is no need to cd into the installation directory first):
flume-ng agent --conf /home/hadoop --conf-file /home/hadoop/flume.conf --name agent -Dflume.root.logger=INFO,console
Remember to adjust the paths for your own machine.
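The Java producer mentioned at the top can be sketched with nothing but the JDK. The class name, the header key, and the random body below are illustrative, not from the original post; what matters is the payload shape, a JSON array of events with "headers" and "body", which is the format Flume's default HTTPSource handler (JSONHandler) accepts:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Random;

public class FlumeHttpClient {

    // Build the JSON array that Flume's default HTTPSource handler
    // (JSONHandler) expects: a list of events, each with headers and a body.
    static String buildPayload(String body) {
        return "[{\"headers\": {\"source\": \"random-generator\"}, "
                + "\"body\": \"" + body + "\"}]";
    }

    // POST one batch of events to the Flume HTTP source and
    // return the HTTP status code (200 on success).
    static int send(String url, String payload) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical random data, standing in for the original generator.
        String payload = buildPayload("random-" + new Random().nextInt(1000));
        System.out.println(payload);
        // Only attempt the network call when a URL is passed, e.g.
        //   java FlumeHttpClient http://192.168.48.101:44556
        if (args.length > 0) {
            System.out.println("HTTP " + send(args[0], payload));
        }
    }
}
```

Once the agent is running and this client is pointed at the source's bind address and port, the events should appear on the kafkatest topic.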
