2021.6.24 Setting up Kafka on macOS and connecting to it from Flink

Setting up Kafka

Related links:
https://blog.csdn.net/u010046908/article/details/62229015
https://www.cnblogs.com/BlueSkyyj/p/11425998.html
The second link omits the step of starting ZooKeeper.

My own summary

  1. brew install kafka
  2. Configuration file locations:
/usr/local/etc/kafka/server.properties

/usr/local/etc/kafka/zookeeper.properties
  3. Go to the install directory (the one containing bin) and start ZooKeeper:
./bin/zookeeper-server-start /usr/local/etc/kafka/zookeeper.properties
  4. Start Kafka:
kafka-server-start /usr/local/etc/kafka/server.properties
  5. Create a topic, send messages, receive messages:
# Create a topic
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test
# List the topics that have been created
kafka-topics --list --zookeeper localhost:2181
# Terminal 1: send messages
kafka-console-producer --broker-list localhost:9092 --topic test
# Terminal 2: receive messages
kafka-console-consumer --bootstrap-server localhost:9092 --topic test --from-beginning
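Once both processes are up, a quick sanity check is a plain TCP connect against the default ports (2181 for ZooKeeper, 9092 for Kafka). A minimal stdlib-only sketch — the class and method names here are my own, not part of Kafka:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// After starting ZooKeeper (port 2181) and Kafka (port 9092), each port
// should accept TCP connections. The ports are the defaults from the
// config files above; this helper class is an illustrative sketch.
public class PortCheck {
    static boolean isListening(String host, int port) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), 500); // 500 ms timeout
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("zookeeper up: " + isListening("localhost", 2181));
        System.out.println("kafka up:     " + isListening("localhost", 9092));
    }
}
```

If either line prints `false`, fix the corresponding startup step before moving on to Flink.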

Connecting to Kafka from Flink

Reference link: https://blog.csdn.net/u014468095/article/details/103143275
pom configuration:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>UserBehaviorAnalysis</artifactId>
    <packaging>pom</packaging>
    <version>1.0-SNAPSHOT</version>
    <modules>
        <module>HotItemsAnalysis</module>
    </modules>

    <properties>
        <flink.version>1.10.1</flink.version>
        <scala.binary.version>2.12</scala.binary.version>
        <kafka.version>2.8.0</kafka.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka-0.11_2.12</artifactId>
            <version>1.10.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <encoding>UTF-8</encoding>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

The key dependency is the universal Kafka connector — it is the one that provides the plain FlinkKafkaConsumer class used below:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>

The Java test code:

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.junit.Test;

import java.util.Properties;

public class Test2 {
    @Test
    public void myTest() throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Kafka consumer settings: broker address and consumer group id
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "consume_id");
        // Read the "test" topic, decoding each message as a String
        DataStream<String> input = env.addSource(
                new FlinkKafkaConsumer<String>("test", new SimpleStringSchema(), props));
        input.print();
        env.execute("Flink Streaming Java API Skeleton");
    }
}
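For context on what SimpleStringSchema is doing in that source: Kafka hands Flink the raw bytes of each record, and the schema decodes them into a String (UTF-8 by default). A stdlib-only sketch of that step — the class and method names are illustrative, not Flink's:

```java
import java.nio.charset.StandardCharsets;

// The essence of SimpleStringSchema's deserialize step: raw Kafka
// message bytes in, UTF-8 decoded Java String out.
public class StringDeserializationSketch {
    static String deserialize(byte[] message) {
        return new String(message, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] raw = "hello kafka".getBytes(StandardCharsets.UTF_8);
        System.out.println(deserialize(raw));
    }
}
```

This is also why the console producer from the Kafka section pairs directly with this Flink job: both sides agree that a message is just a UTF-8 string.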

Note the `new FlinkKafkaConsumer` — this is a huge pitfall: use the class name without a version suffix such as 010 or 011. That mistake cost me two hours.
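If you are unsure which connector classes your dependencies actually put on the classpath, a reflection check makes the difference visible. The two Kafka consumer class names below are the real Flink ones (universal vs. 0.11-specific); the helper class itself is just an illustrative sketch:

```java
// Checks whether a class is loadable from the current classpath, which is
// a quick way to see which Flink Kafka connector jars are really present.
public class ConnectorClassCheck {
    static boolean onClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Universal connector class (the one this post ends up using):
        System.out.println(onClasspath(
                "org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer"));
        // 0.11-specific connector class (the "numbered" one to avoid here):
        System.out.println(onClasspath(
                "org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011"));
    }
}
```

In a project with only the universal connector on the classpath, the first line prints true and the second false.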

A related exception, NoSuchMethodError: org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread, is usually caused by incompatible versions. One reported fix is to replace flink-connector-kafka-0.11_2.11 with version 1.10.0 and upgrade the other related packages.

References:
- java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign — https://blog.csdn.net/Sakura555/article/details/100568356
- Flink in practice (8): NoSuchMethodError: org.apache.kafka.clients.producer.KafkaProducer.close — https://blog.csdn.net/congcong68/article/details/127331030
- NoSuchMethodError: org.apache.flink.api.common.state.OperatorStateStore.getSerializableListState — https://blog.csdn.net/u013303975/article/details/128715937