java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V


A beginner Flink program threw this exception; I am writing it up here to share with everyone.

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
org.apache.flink.runtime.client.JobExecutionException: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
    at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:623)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:123)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1511)
    at Streaming.ReadFromKafka.main(ReadFromKafka.java:41)
Caused by: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerCallBridge.assignPartitions(KafkaConsumerCallBridge.java:42)
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.reassignPartitions(KafkaConsumerThread.java:405)
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.run(KafkaConsumerThread.java:243)

If you are hitting this error, chances are you are also writing a beginner Flink program that reads from or writes to Kafka. There is very little information about it online; after a fair amount of searching I did find something, and I hope it saves you the same detour.

 

[Heads-up]: this one is a major beginner-level pitfall.

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.9.0.1</version>
</dependency>

The kafka-clients version must be exactly this one: it keeps the 0.9-era consumer API that the Flink 0.9 connector calls into on the classpath.
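
If you are unsure which kafka-clients jar actually wins Maven's dependency resolution, one quick way to check from inside a program is to ask the JVM where it loaded KafkaConsumer from. A minimal sketch (the class name KafkaClientVersionCheck is just for illustration):

import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaClientVersionCheck {
    public static void main(String[] args) {
        // Prints the jar that KafkaConsumer was loaded from, e.g.
        // .../org/apache/kafka/kafka-clients/0.9.0.1/kafka-clients-0.9.0.1.jar
        System.out.println(
                KafkaConsumer.class.getProtectionDomain().getCodeSource().getLocation());
    }
}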

If a newer kafka-clients such as 1.0.0 ends up on the classpath instead, the job fails again with the error below; the likely reason is that 1.0.0 simply no longer supports this method:

org.apache.flink.runtime.client.JobExecutionException: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
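
More precisely, this is a binary incompatibility: in kafka-clients 0.9.x the method is assign(List<TopicPartition>), while from 0.10.x onwards the signature changed to assign(Collection<TopicPartition>). Code compiled against the 0.9 client bakes the descriptor assign(Ljava/util/List;)V into the call site, so at runtime the JVM cannot find it in a newer jar. A minimal sketch of the call in question (hypothetical class AssignSignatureDemo, compiled against kafka-clients 0.9.0.1):

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

import java.util.Collections;
import java.util.Properties;

public class AssignSignatureDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "192.168.2.41:9092");
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Compiled against 0.9.0.1 this call site resolves to assign(Ljava/util/List;)V;
            // run it with kafka-clients 1.0.0 on the classpath and the JVM throws
            // NoSuchMethodError, because only assign(Ljava/util/Collection;)V exists there.
            consumer.assign(Collections.singletonList(new TopicPartition("flink-demo", 0)));
        }
    }
}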

For completeness, the project also declares flink-clients at the matching Flink version (this dependency is not the culprit):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>1.6.0</version>
</dependency>

 

The full program is as follows:

package Streaming;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;

import java.util.Properties;

/**
 * Created with IntelliJ IDEA.
 * User: @ziyu freedomziyua@gmail.com
 * Date: 2018-09-10
 * Time: 11:25
 * Description: kafka.Streaming.ReadFromKafka
 */
public class ReadFromKafka {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka connection settings: broker address and consumer group
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "192.168.2.41:9092");
        properties.setProperty("group.id", "test");

        // Consume the "flink-demo" topic with the 0.9 connector,
        // deserializing each record as a plain string
        DataStream<String> stream = env
                .addSource(new FlinkKafkaConsumer09<>("flink-demo", new SimpleStringSchema(), properties));

        // Prefix every message and print it to stdout
        stream.map(new MapFunction<String, String>() {
            private static final long serialVersionUID = -6867736771747690202L;

            @Override
            public String map(String value) throws Exception {
                return "Stream Value: " + value;
            }
        }).print();

        try {
            env.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

To run it, once your environment is set up you just need to pull in the Flink and Kafka connector dependencies:

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.6.0</flink.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.9.0.1</version>
    </dependency>
    <!-- Flink Connector Kafka | exclude Kafka implementation to use MapR -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
</dependencies>

[Running]

1. Create the flink-demo topic in Kafka.

2. Start a Kafka console producer and consumer and confirm that messages flow between them (a minimal Java test producer is sketched after this list).

3. If both of those work, start Flink.

4. Run the program locally and watch the output: each message should arrive prefixed with "Stream Value: ".
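
For step 2, besides the console tools, a small Java test producer can push a few messages into flink-demo. A minimal sketch, assuming the same broker address and the kafka-clients 0.9.0.1 API from the POM above (the class name WriteToKafka is illustrative):

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class WriteToKafka {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "192.168.2.41:9092");
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                // Each record lands on the "flink-demo" topic that the Flink job reads;
                // the running job should then print "Stream Value: message-<i>"
                producer.send(new ProducerRecord<>("flink-demo", "message-" + i));
            }
        }
    }
}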

 

That is one of the thornier problems I ran into while first learning Flink; I hope it helps you avoid the detour.
