Connecting to Kafka from Java

This article walks through connecting to Kafka from Java: adding the Maven dependencies, configuring the consumer properties, and a working example. It also lists errors you may hit along the way, such as TimeoutException and SerializationException, together with their fixes, to help you resolve connection and serialization problems in a Kafka consumer.

1. Maven dependencies (pom.xml):

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.xxx.test</groupId>
  <artifactId>xxx</artifactId>
  <version>1.0-SNAPSHOT</version>
  <inceptionYear>2008</inceptionYear>

  <properties>
    <scala.version>2.11.12</scala.version>
    <kafka.version>0.10.2.0</kafka.version>
  </properties>

  <repositories>
    <repository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </repository>
  </repositories>

  <pluginRepositories>
    <pluginRepository>
      <id>scala-tools.org</id>
      <name>Scala-Tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
  </pluginRepositories>

  <dependencies>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.12</artifactId>
      <version>${kafka.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka-clients</artifactId>
      <version>${kafka.version}</version>
    </dependency>
  </dependencies>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
          <args>
            <arg>-target:jvm-1.8</arg>
          </args>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-eclipse-plugin</artifactId>
        <configuration>
          <downloadSources>true</downloadSources>
          <buildcommands>
            <buildcommand>ch.epfl.lamp.sdt.core.scalabuilder</buildcommand>
          </buildcommands>
          <additionalProjectnatures>
            <projectnature>ch.epfl.lamp.sdt.core.scalanature</projectnature>
          </additionalProjectnatures>
          <classpathContainers>
            <classpathContainer>org.eclipse.jdt.launching.JRE_CONTAINER</classpathContainer>
            <classpathContainer>ch.epfl.lamp.sdt.launching.SCALA_CONTAINER</classpathContainer>
          </classpathContainers>
        </configuration>
      </plugin>
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <appendAssemblyId>false</appendAssemblyId>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>assembly</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <reporting>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
        </configuration>
      </plugin>
    </plugins>
  </reporting>
</project>

2. Java code:

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.util.Collections;
import java.util.Properties;

public class MyConsumer {

    private final static String TOPIC = "test_topic";
    private final static String KAFKA_SERVER_URL = "10.31.7.200";
    private final static String KAFKA_SERVER_PORT = "9092";
    public static final String HOST_NAME = KAFKA_SERVER_URL;

    public static KafkaConsumer<String, String> getConsumer() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, KAFKA_SERVER_URL + ":" + KAFKA_SERVER_PORT);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "DemoConsumer");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "30000");
        // Workaround discussed in section 3 for the TimeoutException.
        props.put("host.name", HOST_NAME);
        // The key/value deserializers must implement
        // org.apache.kafka.common.serialization.Deserializer and match what the producer wrote.
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        // Settings that trigger the errors listed in section 3:
        // props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.IntegerDeserializer");
        // props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        // props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
        return new KafkaConsumer<>(props);
    }

    public static void main(String[] args) {
        KafkaConsumer<String, String> consumer = getConsumer();
        consumer.subscribe(Collections.singletonList(TOPIC));
        // A single poll waits up to one second; the first poll may return nothing
        // while the group is still being assigned partitions.
        ConsumerRecords<String, String> records = consumer.poll(1000);
        for (ConsumerRecord<String, String> record : records) {
            System.out.println("Received message: (" + record.key() + ", " + record.value() + ") at offset " + record.offset());
        }
        consumer.close();
    }
}
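
The example above polls only once; a real consumer usually keeps polling and closes the consumer when it is done. Below is a minimal sketch of such a loop, reusing getConsumer() from the class above; the class name MyConsumerLoop is just illustrative, and the loop runs until the process is stopped.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.util.Collections;

public class MyConsumerLoop {
    public static void main(String[] args) {
        KafkaConsumer<String, String> consumer = MyConsumer.getConsumer();
        try {
            consumer.subscribe(Collections.singletonList("test_topic"));
            while (true) {
                // Wait up to one second for new records on each poll.
                ConsumerRecords<String, String> records = consumer.poll(1000);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("offset " + record.offset() + ": " + record.key() + " -> " + record.value());
                }
            }
        } finally {
            // Leave the consumer group cleanly and release network resources.
            consumer.close();
        }
    }
}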

3. Possible errors:

You need to set props.put("host.name", ip); as in the code above (a sketch follows below), otherwise the consumer may fail with:

org.apache.kafka.common.errors.TimeoutException
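
For reference, here is a minimal sketch of only the connection-related properties, using the example broker address from the code above. Note that host.name is not a standard consumer configuration key; it is simply the workaround property set in this article, and the broker must actually be reachable at the bootstrap address from the client machine. The class name ConnectionPropsSketch is illustrative.

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class ConnectionPropsSketch {
    public static Properties connectionProps() {
        Properties props = new Properties();
        // The broker address from the example above; replace with your own broker.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "10.31.7.200:9092");
        // Workaround used in this article for the TimeoutException; not a standard consumer config.
        props.put("host.name", "10.31.7.200");
        return props;
    }
}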

For the next error, check the key/value deserializer classes: they must implement org.apache.kafka.common.serialization.Deserializer and they must match the types the producer actually wrote. Here an IntegerDeserializer was applied to data that is not a 4-byte integer:

Exception in thread "main" org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition event_topic-0 at offset 161529729
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by IntegerDeserializer is not 4
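
A minimal sketch of the point: the consumer's deserializers have to mirror the serializers the producer used. The class name and the assumption that the producer wrote String keys and values are illustrative.

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;

public class MatchingDeserializerSketch {
    // Assuming the producer used StringSerializer for both key and value,
    // the consumer must use StringDeserializer for both. Configuring
    // IntegerDeserializer here instead is what produces
    // "Size of data received by IntegerDeserializer is not 4".
    public static Properties serdeProps() {
        Properties props = new Properties();
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }
}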

1. The value.deserializer setting is required; without it the consumer cannot even be constructed:

Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "value.deserializer" which has no default value.
    at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:436)
    at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:56)
    at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:63)
    at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:426)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
    at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
    at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
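
Both key.deserializer and value.deserializer must be present before new KafkaConsumer<>(props) is called. A minimal sketch of a configuration that passes this check, using class references so typos in the class name are caught at compile time (the class name RequiredSerdeSketch is illustrative):

import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class RequiredSerdeSketch {
    public static KafkaConsumer<String, String> build() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "10.31.7.200:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "DemoConsumer");
        // Omitting either of these two lines reproduces the
        // "Missing required configuration ... which has no default value." error.
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        return new KafkaConsumer<>(props);
    }
}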

2. org.apache.kafka.connect.json.JsonConverter is a Kafka Connect converter, not a subclass of org.apache.kafka.common.serialization.Deserializer, so using it as the key/value deserializer fails:

Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:717)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
    at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
    at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.connect.json.JsonConverter is not an instance of org.apache.kafka.common.serialization.Deserializer
    at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:205)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:645)
    ... 4 more
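
If the values in the topic are JSON, one simple approach is to consume them as plain strings with StringDeserializer and parse the JSON afterwards with whatever JSON library the project already uses, rather than pointing value.deserializer at JsonConverter. A minimal sketch (the class name JsonValueConsumerSketch is illustrative, and the parsing step is left as a comment to avoid assuming a particular library):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class JsonValueConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "10.31.7.200:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "DemoConsumer");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        try {
            consumer.subscribe(Collections.singletonList("test_topic"));
            ConsumerRecords<String, String> records = consumer.poll(1000);
            for (ConsumerRecord<String, String> record : records) {
                String json = record.value();
                // Parse 'json' with the JSON library of your choice (Jackson, Gson, ...).
                System.out.println("raw JSON value: " + json);
            }
        } finally {
            consumer.close();
        }
    }
}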

The next error is similar: org.apache.kafka.common.serialization.StringSerializer is the producer-side Serializer, not a Deserializer, so the fix is to configure org.apache.kafka.common.serialization.StringDeserializer instead:

Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:717)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:597)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:579)
    at com.fumi.test.MyConsumer2.get(MyConsumer2.java:46)
    at com.fumi.test.MyConsumer2.main(MyConsumer2.java:56)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.serialization.StringSerializer is not an instance of org.apache.kafka.common.serialization.Deserializer
    at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:205)
    at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:637)
    ... 4 more
