Flink: a name that felt at once familiar and strange. After a frenzied round of Baidu and Google searching I got a rough idea of its background, and, not wanting to dig any deeper for now, took a "let's see whether this mule can actually run" attitude and grabbed a build from the official site. Feeling adventurous, I picked flink-1.11.2-bin-scala_2.11 (I can't write Scala, but writing the jobs in Java should work just fine ( ̄(工) ̄)).
Download page: https://flink.apache.org/downloads.html#apache-flink-1112
A quiet aside: Bilibili has tutorial videos; I watched the Shangguigu Flink (Scala) series. I never found a Java one T_T — pointers from the experts welcome.
I dropped the tarball onto CentOS 7, extracted it, and ran start-cluster.sh (this CentOS box is my little test plot, with Java 8 already planted).
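The setup itself is just a few commands; a sketch, assuming the tarball sits in the current directory and Java 8 is on the PATH:

```shell
# unpack the binary distribution (filename as downloaded above)
tar -xzf flink-1.11.2-bin-scala_2.11.tgz
cd flink-1.11.2

# start a local standalone cluster
./bin/start-cluster.sh

# sanity check: jps should now list a StandaloneSessionClusterEntrypoint
# (the JobManager) and a TaskManagerRunner (the TaskManager)
jps
```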
I opened the browser to the dashboard at http://10.10.10.84:8081/#/overview — short-fused as I am, seeing a page actually come up calmed me right down.
Since Flink was being this cooperative, time to give it a real job: real-time processing — subscribe to a Kafka topic and print the messages to the console.
After countless rounds of debugging (Google, Baidu...), I arrived at a working version: package the jar, upload it, and submit it to run.
Checked the logs — the data was flowing through. Delicious.
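For completeness, the package-submit-verify loop looked roughly like this (the jar name follows from the pom below; the log file pattern and paths may differ on your install):

```shell
# build the fat jar (the shade plugin in the pom sets the main class)
mvn clean package

# submit it to the running cluster, from the Flink install directory
./bin/flink run /path/to/kafkaflink-0.0.1-SNAPSHOT.jar

# push a few test messages into the topic from any Kafka host
# (kafka-console-producer.sh ships with the Kafka distribution):
#   kafka-console-producer.sh --broker-list 10.10.10.110:6667 --topic myTopic

# print() output from the job lands in the task manager's .out file
tail -f log/flink-*-taskexecutor-*.out
```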
Heh, not a bad start; now I'm finally in the mood to dig in properly ╮( ̄▽  ̄)╭
A diligent porter pastes the code right in — and diligent I am.
KafkaFlinkApplication.java
package flink.kafka.exchange.kafkaflink;

import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaFlinkApplication {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.put("bootstrap.servers", "10.10.10.110:6667,10.10.10.120:6667,10.10.10.119:6667");
        props.put("group.id", "flink-group");
        // key deserializer
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // value deserializer
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // start consuming from the latest offsets
        props.put("auto.offset.reset", "latest");

        DataStreamSource<String> dataStreamSource = env.addSource(new FlinkKafkaConsumer<>(
                // Kafka topic
                "myTopic",
                // deserialize each record as a plain string
                new SimpleStringSchema(),
                props)).setParallelism(1);

        // print everything read from Kafka (on a cluster this goes to the
        // task manager's .out log, not the client console)
        dataStreamSource.print();

        // run the job
        env.execute("Flink-kafka");
    }
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>flink.kafka.exchange</groupId>
    <artifactId>kafkaflink</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>KafkaFlink</name>
    <description>Processing Kafka data with Flink</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <!-- flink-connector-kafka (Scala suffix matching the scala_2.11 distribution above) -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_2.11</artifactId>
            <version>1.11.2</version>
        </dependency>
        <!-- flink-streaming-java -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_2.11</artifactId>
            <version>1.11.2</version>
        </dependency>
        <!-- flink-java -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>1.11.2</version>
        </dependency>
        <!-- flink-clients -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-clients_2.11</artifactId>
            <version>1.11.2</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.0.0</version>
                <executions>
                    <!-- Run shade goal on package phase -->
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>flink.kafka.exchange.kafkaflink.KafkaFlinkApplication</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>