1) Start the stack: cd into the bin directory and run
./confluent start
Starting zookeeper
zookeeper is [UP]
Starting kafka
kafka is [UP]
Starting schema-registry
schema-registry is [UP]
Starting kafka-rest
kafka-rest is [UP]
Starting connect
connect is [UP]
2) List the bundled connectors
Command: ./confluent list connectors
![confluent list connectors output](https://img-blog.csdnimg.cn/20190802151622774.png)
3) In the bin directory, set up a local file as the source
Edit the config: vim ../etc/kafka/connect-file-source.properties
Change it to:
name=file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test
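The same properties file can be written without opening an editor, which is handy for scripting the setup. A minimal sketch using a heredoc (a /tmp scratch path is used here; in a real install the target is ../etc/kafka/connect-file-source.properties):

```shell
# Write the source-connector properties non-interactively.
# /tmp/connect-demo is a scratch location for illustration only.
mkdir -p /tmp/connect-demo
cat > /tmp/connect-demo/connect-file-source.properties <<'EOF'
name=file-source
connector.class=FileStreamSource
tasks.max=1
file=test.txt
topic=connect-test
EOF
# Sanity check: five key=value lines were written.
grep -c '=' /tmp/connect-demo/connect-file-source.properties
```

Note that `file=test.txt` is a relative path, which is why the tutorial creates the data file from the bin directory in the next step.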
4) Create some test data; in the bin directory run:
$ for i in {1..3}; do echo "log line $i"; done > test.txt
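Before loading the connector it is worth confirming the file contains what you expect, since these exact lines should reappear on the topic in step 8. A quick check (written to /tmp here so it does not touch your bin directory):

```shell
# Recreate the test data in a scratch location and inspect it.
for i in 1 2 3; do echo "log line $i"; done > /tmp/test.txt
wc -l < /tmp/test.txt    # should report 3 lines
cat /tmp/test.txt
```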
5) Load the source connector; in the bin directory run
Command: ./confluent load file-source   # to unload it later, run ./confluent unload file-source
6) List the connectors that were just loaded
Command: ./confluent status connectors
Result:
[
"file-source"
]
7) Check a specific connector's status
Command: ./confluent status file-source
Result:
{
  "name": "file-source",
  "connector": {
    "state": "RUNNING",
    "worker_id": "192.168.10.1:8083"
  },
  "tasks": [
    {
      "state": "RUNNING",
      "id": 0,
      "worker_id": "192.168.10.1:8083"
    }
  ]
}
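Because the status output is plain JSON, it is easy to script health checks against it. A sketch that extracts the connector state (the heredoc below is a saved sample of the response, standing in for a live `./confluent status file-source` call):

```shell
# Save a sample status response (in practice, capture the command's output).
cat > /tmp/file-source-status.json <<'EOF'
{"name":"file-source","connector":{"state":"RUNNING","worker_id":"192.168.10.1:8083"},"tasks":[{"state":"RUNNING","id":0,"worker_id":"192.168.10.1:8083"}]}
EOF
# Pull out the top-level connector state with a small JSON parse.
python3 -c 'import json,sys; print(json.load(sys.stdin)["connector"]["state"])' \
  < /tmp/file-source-status.json
# RUNNING
```

A state other than RUNNING (e.g. FAILED) is the first thing to check when messages do not show up on the topic.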
8) View the messages on the Kafka topic
Command: ./kafka-avro-console-consumer --bootstrap-server localhost:9092 --topic connect-test --from-beginning
Result:
"log line 1"
"log line 2"
"log line 3"
9) Use the file sink to write to a file; edit the sink config
Command: vim ../etc/kafka/connect-file-sink.properties
Change it to:
name=file-sink
connector.class=FileStreamSink
tasks.max=1
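The sink config mirrors the source config. A non-interactive sketch of a complete file, written the same way as the source example above (the `file` and `topics` values follow the stock Confluent sample and are assumptions here; note the sink key is `topics`, plural, unlike the source's `topic`):

```shell
# Write a complete sink-connector properties file (scratch path for
# illustration; the real target is ../etc/kafka/connect-file-sink.properties).
mkdir -p /tmp/connect-demo
cat > /tmp/connect-demo/connect-file-sink.properties <<'EOF'
name=file-sink
connector.class=FileStreamSink
tasks.max=1
file=test.sink.txt
topics=connect-test
EOF
grep -c '=' /tmp/connect-demo/connect-file-sink.properties   # 5 key=value lines
```

The sink is then loaded the same way as the source: ./confluent load file-sink.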