Kafka Connect connector for reading CSV files into Kafka.

Introduction

Installation through the Confluent Hub Client
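Assuming the connector is published on Confluent Hub under the `jcustenborder/kafka-connect-spooldir` coordinates (the coordinates and version tag here are an assumption), it can be installed with the Confluent Hub client:

```shell
confluent-hub install jcustenborder/kafka-connect-spooldir:latest
```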

This Kafka Connect connector provides the capability to watch a directory for files and read the data as new files are written to the input directory. Each record in an input file is converted based on the user-supplied schema.

The CSVRecordProcessor supports reading CSV or TSV files. It can convert a CSV file on the fly to the strongly typed Kafka Connect data types, and it currently supports all of the schema types and logical types available in Kafka Connect. If you couple this with the Avro converter and Schema Registry by Confluent, you can process CSV, JSON, or TSV files into strongly typed Avro data in real time.

The SpoolDirCsvSourceConnector will monitor the directory specified in input.path for files and read them as a CSV converting each of the records to the strongly typed equivalent specified in key.schema and value.schema.
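A sketch of what such a configuration might look like, using the `input.path`, `key.schema`, and `value.schema` settings named above (the connector name, topic, paths, and field names here are hypothetical, and the auxiliary properties are assumptions based on common spool-dir connector settings):

```properties
name=csv-spooldir-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
tasks.max=1
topic=csv-data
# Directory to watch for new CSV files (hypothetical path)
input.path=/data/input
input.file.pattern=^.*\.csv$
# Where files are moved after successful or failed processing
finished.path=/data/finished
error.path=/data/error
halt.on.error=false
csv.first.row.as.header=true
# Schemas are supplied as JSON; an illustrative two-field record
key.schema={"name":"com.example.Key","type":"STRUCT","isOptional":false,"fieldSchemas":{"id":{"type":"INT64","isOptional":false}}}
value.schema={"name":"com.example.Value","type":"STRUCT","isOptional":false,"fieldSchemas":{"id":{"type":"INT64","isOptional":false},"name":{"type":"STRING","isOptional":true}}}
```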

This connector is used to stream JSON files from a directory while converting the data based on the schema supplied in the configuration.

This connector is used to read a file line by line and write the data to Kafka.

This connector is used to stream Extended Log File Format files from a directory while converting the data to a strongly typed schema.

Development

Building the source

mvn clean package
