Configuring SSL Authentication for the Kafka Connector Consumer

Developing the SinkConnector and SinkConnectorTask is omitted here.
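For orientation, a minimal skeleton of such a connector/task pair might look like the sketch below. This is illustrative only, not the post's actual JDBCSinkConnector: the class names are invented, and it assumes the `org.apache.kafka:connect-api` dependency is on the classpath.

```java
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.Task;
import org.apache.kafka.connect.sink.SinkConnector;
import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

// Hypothetical skeleton; the real JDBCSinkConnector is not shown in this post.
public class ExampleSinkConnector extends SinkConnector {
    private Map<String, String> configProps;

    @Override
    public void start(Map<String, String> props) {
        this.configProps = props;          // keep the connector config for the tasks
    }

    @Override
    public Class<? extends Task> taskClass() {
        return ExampleSinkTask.class;
    }

    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        // Each task gets the same config; tasks.max bounds how many are created.
        return Collections.nCopies(maxTasks, configProps);
    }

    @Override
    public void stop() { }

    @Override
    public ConfigDef config() {
        return new ConfigDef();            // declare connector config keys here
    }

    @Override
    public String version() {
        return "0.0.1";
    }

    public static class ExampleSinkTask extends SinkTask {
        @Override
        public void start(Map<String, String> props) { }

        @Override
        public void put(Collection<SinkRecord> records) {
            // Write the records to the target system (e.g. a JDBC batch insert).
        }

        @Override
        public void stop() { }

        @Override
        public String version() {
            return "0.0.1";
        }
    }
}
```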

Configure connect-standalone-consumer.properties

# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
bootstrap.servers=hostname:19093

security.protocol=SSL
ssl.truststore.location=/home/xxx/kafka_ssl_key/client.truststore.jks
ssl.truststore.password=123456
ssl.keystore.location=/home/xxx/kafka_ssl_key/server.keystore.jks
ssl.keystore.password=123456
ssl.key.password=123456
# These are defaults. This file just demonstrates how to override some settings.


# The converters specify the format of data in Kafka and how to translate it into Connect data. Every Connect user will
# need to configure these based on the format they want their data in when loaded from or stored into Kafka
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=com.xxxx.common.kafka.connector.AlreadyBytesConverter
# Converter-specific settings can be passed in by prefixing the Converter's setting with the converter we want to apply
# it to
key.converter.schemas.enable=false
value.converter.schemas.enable=false

offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000

# Set to a list of filesystem paths separated by commas (,) to enable class loading isolation for plugins
# (connectors, converters, transformations). The list should consist of top level directories that include
# any combination of:
# a) directories immediately containing jars with plugins and their dependencies
# b) uber-jars with plugins and their dependencies
# c) directories immediately containing the package directory structure of classes of plugins and their dependencies
# Note: symlinks will be followed to discover dependencies or plugins.
# Examples:
# plugin.path=/usr/local/share/java,/usr/local/share/kafka/plugins,/opt/connectors,
#plugin.path=


consumer.bootstrap.servers=hostname:19093
consumer.security.protocol=SSL
consumer.ssl.truststore.location=/home/xxx/kafka_ssl_key/client.truststore.jks
consumer.ssl.truststore.password=123456
consumer.ssl.keystore.location=/home/xxxx/kafka_ssl_key/server.keystore.jks
consumer.ssl.keystore.password=123456
consumer.ssl.key.password=123456
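Note that the SSL settings appear twice in this file: once at the top level and once with a `consumer.` prefix. The duplication matters because a sink connector reads from Kafka through its own consumer, and the standalone worker builds that consumer's config from the worker's bootstrap servers plus every `consumer.`-prefixed key with the prefix stripped, rather than inheriting the top-level security settings. A rough stdlib-only sketch of that idea (a simplification, not Connect's actual implementation):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class ConsumerPrefixDemo {
    // Mimics how a sink task's consumer config can be derived from the worker
    // file: seed with the worker's bootstrap servers, then apply every
    // "consumer."-prefixed key with the prefix stripped. Top-level SSL keys
    // are NOT copied over, which is why the file repeats them under the prefix.
    public static Map<String, String> sinkConsumerConfig(Properties worker) {
        Map<String, String> cfg = new HashMap<>();
        cfg.put("bootstrap.servers", worker.getProperty("bootstrap.servers"));
        for (String name : worker.stringPropertyNames()) {
            if (name.startsWith("consumer.")) {
                cfg.put(name.substring("consumer.".length()),
                        worker.getProperty(name));
            }
        }
        return cfg;
    }

    public static void main(String[] args) {
        Properties worker = new Properties();
        worker.setProperty("bootstrap.servers", "hostname:19093");
        worker.setProperty("ssl.keystore.password", "123456");   // top level only
        worker.setProperty("consumer.security.protocol", "SSL");
        worker.setProperty("consumer.ssl.keystore.password", "123456");
        System.out.println(sinkConsumerConfig(worker));
    }
}
```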

Configure the Connector

name=xxxx-event
connector.class=com.xxx.xxxxxx.common.kafka.connector.JDBCSinkConnector
#topics=cdh-hive-audit-logs-topic
topics=my-topic
tasks.max=3
# format to use for the date to append at the end of the index name, optional
# if empty or null, no suffix will be used
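The `AlreadyBytesConverter` referenced in the worker config above is a custom class whose source is not shown in this post. Judging by its name, it likely passes the raw message bytes through unchanged, much like Kafka's built-in `org.apache.kafka.connect.converters.ByteArrayConverter`. A sketch of that assumed behavior (class name invented, connect-api on the classpath):

```java
import java.util.Map;

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.storage.Converter;

// Assumed behavior: hand the raw Kafka message bytes to the sink task as-is.
public class PassthroughBytesConverter implements Converter {
    @Override
    public void configure(Map<String, ?> configs, boolean isKey) { }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        return (byte[]) value;             // value is already a byte[] (or null)
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        return new SchemaAndValue(Schema.OPTIONAL_BYTES_SCHEMA, value);
    }
}
```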



Run the command

./connect-standalone.sh ../conf/connect-consumer.properties ../conf/xxxxx-event.properties
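A common failure mode with this setup is an SSL handshake or keystore error at worker startup. Before launching, the JKS files and passwords from the config can be sanity-checked with a small stdlib snippet (the path and password shown are the placeholders from the config above):

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.KeyStore;

public class KeystoreCheck {
    // Loads a JKS file with the given password; throws if the path is wrong
    // or the password does not match, mirroring what the worker would hit.
    public static int check(String path, char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        try (InputStream in = new FileInputStream(path)) {
            ks.load(in, password);
        }
        return ks.size();                  // number of key/certificate entries
    }

    public static void main(String[] args) throws Exception {
        // Usage: java KeystoreCheck <keystore.jks> <password>
        // e.g. java KeystoreCheck /home/xxx/kafka_ssl_key/client.truststore.jks 123456
        if (args.length == 2) {
            System.out.println("entries: " + check(args[0], args[1].toCharArray()));
        }
    }
}
```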