Flink
张俊杰zjj
An older coder. Currently a Java + Python developer; in my spare time I study weight loss, fitness, and wellness, and may go on to become a fitness blogger.
Flink side output stream (SideOutput) demo
Introduction: the side output stream (SideOutput), simply put, lets you split one stream into two. Code: import com.atguigu.apitest.SensorReading; import org.apache.flink.api.common.state.{ValueState, ValueStateDescriptor}; import org.apache.flink.streaming.api.functions.ProcessFunction; import org.apache.flink.streaming.api… (preview truncated) [Original post · 2021-07-14 18:49:22 · 227 views · 0 comments]
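The preview cuts off before the code. Here is a minimal side-output sketch, not the post's exact code: it assumes Flink 1.10 with the Scala API, a sensor.txt file path, and an arbitrary 30.0-degree threshold as the split condition.

```scala
import org.apache.flink.streaming.api.functions.ProcessFunction
import org.apache.flink.streaming.api.scala._
import org.apache.flink.util.Collector

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object SideOutputSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // parse each line of sensor.txt into a SensorReading (path and format are assumptions)
    val dataStream: DataStream[SensorReading] = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    // tag that identifies the side output stream (low-temperature readings)
    val lowTempTag = new OutputTag[SensorReading]("low-temp")

    // main output keeps warm readings, side output collects cold ones
    val highTempStream = dataStream.process(new ProcessFunction[SensorReading, SensorReading] {
      override def processElement(value: SensorReading,
                                  ctx: ProcessFunction[SensorReading, SensorReading]#Context,
                                  out: Collector[SensorReading]): Unit = {
        if (value.temperature >= 30.0) out.collect(value)   // main stream
        else ctx.output(lowTempTag, value)                  // side output stream
      }
    })

    highTempStream.print("high")
    highTempStream.getSideOutput(lowTempTag).print("low")

    env.execute("side output demo")
  }
}
```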
Demo: defining time attributes in Flink
The example comes from the 尚硅谷 tutorial. Overview: time-based operations (such as window operations in the Table API and SQL) require the relevant time semantics and the origin of the time data to be defined. Flink's Table API therefore provides a logical time field that a table program can use to indicate time and to access the corresponding timestamps. A time attribute can be part of every table schema; once defined, it can be referenced as a field and used in time-based operations. A time attribute behaves like a regular timestamp: it can be accessed and used in calculations. sensor.txt: sensor_1,1547718199,35.8; sensor_6,15477182… (preview truncated) [Original post · 2021-07-13 16:16:12 · 235 views · 0 comments]
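As a rough illustration of the idea (not the post's code), here is a minimal sketch of declaring an event-time attribute while converting a stream to a table. It assumes Flink 1.10 with the old planner; the file path and the field name 'ts are illustrative.

```scala
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.types.Row

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object TimeAttributeSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    // use event time for time-based operations
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    val dataStream = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })
      // timestamps in sensor.txt are seconds and ascending, so promote them to event time
      .assignAscendingTimestamps(_.timestamp * 1000L)

    val tableEnv = StreamTableEnvironment.create(env)

    // 'ts.rowtime appends an event-time attribute; 'pt.proctime would declare processing time
    val sensorTable = tableEnv.fromDataStream(dataStream, 'id, 'temperature, 'ts.rowtime)
    sensorTable.printSchema()

    // materialize the table so the job has a sink; the rowtime shows up as a timestamp column
    sensorTable.toAppendStream[Row].print()

    env.execute("time attribute demo")
  }
}
```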
Writing data to a specific Kafka topic from Flink
sensor.txt: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Code: import org.apache.flink.streaming.api.scala._; import org.apache.fl… (preview truncated) [Original post · 2021-07-11 17:39:24 · 947 views · 1 comment]
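The preview code is truncated. A minimal sketch of a Kafka sink follows, assuming Flink 1.10, the flink-connector-kafka-0.11 connector, a broker at localhost:9092, and a topic named sinkTest; broker, topic, and path are assumptions, not taken from the post.

```scala
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011

object KafkaSinkSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // read the sensor file and re-serialize each record as a plain string
    val dataStream: DataStream[String] = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        s"${f(0).trim},${f(1).trim},${f(2).trim}"
      })

    // write every record to the given Kafka topic using the string schema
    dataStream.addSink(new FlinkKafkaProducer011[String](
      "localhost:9092", "sinkTest", new SimpleStringSchema()))

    env.execute("kafka sink demo")
  }
}
```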
Flink Table API update modes
Update modes: in stream processing, handling tables is not as simple as the traditional definition suggests. For streaming queries, you must declare how conversion is performed between the (dynamic) table and the external connector. The kind of message exchanged with the external system is specified by the update mode. The Flink Table API has three update modes. Append mode: the (dynamic) table and the external connector exchange only insert messages. Retract mode: the table and the external connector exchange add and retract (… preview truncated) [Original post · 2021-07-11 17:05:49 · 934 views · 0 comments]
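The preview stops mid-list; per the Flink documentation the third mode is upsert mode, where changes are encoded as upsert and delete messages against a declared key. As an illustration of where append and retract semantics show up in code (not the post's code), here is a minimal sketch assuming Flink 1.10 with the old planner; the fields and file path are illustrative.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object UpdateModeSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    val dataStream = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    val sensorTable = tableEnv.fromDataStream(dataStream, 'id, 'temperature)

    // a plain projection only ever emits inserts, so append mode is enough
    sensorTable.select('id, 'temperature)
      .toAppendStream[(String, Double)].print("append")

    // a grouped aggregation updates earlier results, so it needs retract mode:
    // each record is (true, row) for an add or (false, row) for a retraction
    sensorTable.groupBy('id).select('id, 'id.count as 'cnt)
      .toRetractStream[(String, Long)].print("retract")

    env.execute("update mode demo")
  }
}
```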
Converting between Table and DataStream in Flink
sensor.txt: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Conversion code: import org.apache.flink.streaming.api.scala._; import org.apache.… (preview truncated) [Original post · 2021-07-11 16:32:14 · 2323 views · 0 comments]
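Since the preview is truncated, here is a minimal sketch of the two-way conversion, assuming Flink 1.10 with the old planner; the field renaming to 'temp is just for illustration.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.Table
import org.apache.flink.table.api.scala._

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object TableStreamConversionSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    val dataStream: DataStream[SensorReading] = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    // DataStream -> Table, picking and renaming fields with the expression DSL
    val sensorTable: Table = tableEnv.fromDataStream(dataStream, 'id, 'temperature as 'temp)

    // Table -> DataStream; a simple projection can use an append stream
    val resultStream: DataStream[(String, Double)] =
      sensorTable.select('id, 'temp).toAppendStream[(String, Double)]
    resultStream.print()

    env.execute("table <-> datastream demo")
  }
}
```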
Writing computed results to a file with the Flink Table API and Flink SQL
sensor.txt: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Table API code: import org.apache.flink.streaming.api.scala._; import org.apa… (preview truncated) [Original post · 2021-07-11 16:25:45 · 1030 views · 1 comment]
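A minimal sketch of a file sink via the connector descriptors, not the post's code. It assumes Flink 1.10 with the old planner and the flink-csv format on the classpath (on some setups the legacy OldCsv descriptor is needed instead); the output path out.txt is an assumption. Note that a file sink only accepts append-only results, so the example writes a plain projection rather than an aggregation.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.DataTypes
import org.apache.flink.table.api.scala._
import org.apache.flink.table.descriptors.{Csv, FileSystem, Schema}

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object FileSinkSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    val dataStream = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    val sensorTable = tableEnv.fromDataStream(dataStream, 'id, 'temperature)

    // register a CSV file sink; the path is an assumption for illustration
    tableEnv.connect(new FileSystem().path("out.txt"))
      .withFormat(new Csv())
      .withSchema(new Schema()
        .field("id", DataTypes.STRING())
        .field("temperature", DataTypes.DOUBLE()))
      .createTemporaryTable("outputTable")

    // only an append-only (non-updating) result can go to a file sink
    sensorTable.select('id, 'temperature).insertInto("outputTable")

    env.execute("file sink demo")
  }
}
```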
Demo: aggregations with the Table API and Flink SQL in Flink
sensor.txt: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Table API code: import org.apache.flink.streaming.api.scala._; import org.apa… (preview truncated) [Original post · 2021-07-11 15:50:28 · 777 views · 7 comments]
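As the preview cuts the code off, here is a minimal sketch of the same kind of aggregation in both the Table API and Flink SQL, assuming Flink 1.10 with the old planner; the per-sensor count and average temperature are illustrative choices.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.types.Row

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object AggSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    val dataStream = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    val sensorTable = tableEnv.fromDataStream(dataStream, 'id, 'temperature)
    tableEnv.createTemporaryView("sensor", sensorTable)

    // Table API: count readings and average the temperature per sensor
    val aggTable = sensorTable
      .groupBy('id)
      .select('id, 'id.count as 'cnt, 'temperature.avg as 'avgTemp)

    // Flink SQL: the same aggregation as a query string
    val sqlAggTable = tableEnv.sqlQuery(
      "select id, count(id) as cnt, avg(temperature) as avgTemp from sensor group by id")

    // aggregations update earlier results, so print them as retract streams
    aggTable.toRetractStream[Row].print("table api")
    sqlAggTable.toRetractStream[Row].print("sql")

    env.execute("aggregation demo")
  }
}
```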
Demo: reading file data in Flink, creating a table, and querying it with Flink SQL and the Table API
sensor.txt: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Table API code: import org.apache.flink.streaming.api.scala._; import org.apa… (preview truncated) [Original post · 2021-07-11 15:38:46 · 1017 views · 1 comment]
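A minimal sketch of the read-then-query flow, not the post's exact code: it assumes Flink 1.10 with the old planner, reads sensor.txt into a stream, registers it as a table, and runs the same filter once with the Table API and once with SQL. The filter on sensor_1 is an illustrative choice.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._

case class SensorReading(id: String, timestamp: Long, temperature: Double)

object FileQuerySketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    // read the raw file and map each line to the case class
    val dataStream = env.readTextFile("sensor.txt")
      .map(line => {
        val f = line.split(",")
        SensorReading(f(0).trim, f(1).trim.toLong, f(2).trim.toDouble)
      })

    // turn the stream into a table and register it for SQL
    val sensorTable = tableEnv.fromDataStream(dataStream, 'id, 'temperature)
    tableEnv.createTemporaryView("sensor", sensorTable)

    // Table API query: project two columns and filter on the sensor id
    val apiResult = sensorTable
      .select('id, 'temperature)
      .filter('id === "sensor_1")

    // the equivalent Flink SQL query
    val sqlResult = tableEnv.sqlQuery(
      "select id, temperature from sensor where id = 'sensor_1'")

    apiResult.toAppendStream[(String, Double)].print("table api")
    sqlResult.toAppendStream[(String, Double)].print("sql")

    env.execute("table query demo")
  }
}
```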
Demo: reading data from Kafka with Flink SQL and turning it into a table
Defining the table schema in Flink and reading Kafka data with Flink SQL; this does not involve the DataStream-style source and sink concepts. import org.apache.flink.streaming.api.scala._; import org.apache.flink.table.api.scala._; import org.apache.flink.table.api.{DataTypes, Table}; import org.apache.flink.table.descriptors._ /** Read the Kafka data and… (preview truncated) [Original post · 2021-07-11 14:32:51 · 1156 views · 0 comments]
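The preview suggests the post uses the connector descriptors. Here is a minimal sketch along those lines, not the post's code: it assumes Flink 1.10 with the old planner, the flink-connector-kafka-0.11 and flink-csv dependencies, and local broker/ZooKeeper addresses; the topic and field names are assumptions.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.DataTypes
import org.apache.flink.table.api.scala._
import org.apache.flink.table.descriptors.{Csv, Kafka, Schema}
import org.apache.flink.types.Row

object KafkaTableSourceSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    // describe the Kafka topic and how to parse its records into table columns
    tableEnv.connect(new Kafka()
        .version("0.11")
        .topic("sensor")
        .property("bootstrap.servers", "localhost:9092")
        .property("zookeeper.connect", "localhost:2181"))
      .withFormat(new Csv())
      .withSchema(new Schema()
        .field("id", DataTypes.STRING())
        .field("ts", DataTypes.BIGINT())
        .field("temperature", DataTypes.DOUBLE()))
      .createTemporaryTable("kafkaInputTable")

    // the registered table can now be queried like any other table
    val result = tableEnv.sqlQuery("select id, temperature from kafkaInputTable")
    result.toAppendStream[Row].print()

    env.execute("kafka table source demo")
  }
}
```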
Demo: reading file data with Flink SQL and turning it into a table
sensor.txt contents: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Defining the table schema in Flink: import org.apache.flink.streaming.api.scala._; import org… (preview truncated) [Original post · 2021-07-11 13:58:07 · 1191 views · 0 comments]
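A minimal sketch of registering the file as a table with the connector descriptors, not the post's code. It assumes Flink 1.10 with the old planner and a CSV format descriptor (flink-csv on the classpath; some setups need the legacy OldCsv descriptor instead); the field name ts is an assumption.

```scala
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.{DataTypes, Table}
import org.apache.flink.table.api.scala._
import org.apache.flink.table.descriptors.{Csv, FileSystem, Schema}
import org.apache.flink.types.Row

object FileTableSourceSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    val tableEnv = StreamTableEnvironment.create(env)

    // register sensor.txt as a table by describing its location, format and schema
    tableEnv.connect(new FileSystem().path("sensor.txt"))
      .withFormat(new Csv())
      .withSchema(new Schema()
        .field("id", DataTypes.STRING())
        .field("ts", DataTypes.BIGINT())
        .field("temperature", DataTypes.DOUBLE()))
      .createTemporaryTable("inputTable")

    // look the registered table up and print its rows
    val inputTable: Table = tableEnv.from("inputTable")
    inputTable.toAppendStream[Row].print()

    env.execute("file table source demo")
  }
}
```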
Introduction to the Table API and Flink SQL in Flink, with a getting-started demo
Contents of the sensor.txt file: sensor_1,1547718199,35.8; sensor_6,1547718201,15.4; sensor_7,1547718202,6.7; sensor_10,1547718205,38.1; sensor_1,1547718207,37.2; sensor_1,1547718212,33.5; sensor_1,1547718215,38.1. Table API code: import org.apache.flink.streaming.api.scala._; import o… (preview truncated) [Original post · 2021-07-11 13:02:24 · 408 views · 0 comments]
Concepts of Flink SQL and the Flink Table API
From the 尚硅谷 tutorial: the Table API is a relational API shared by stream processing and batch processing; Table API queries can run on streaming or batch input without any modification. The Table API is a superset of the SQL language and is designed specifically for Apache Flink; it is a language-integrated API for Scala and Java. Unlike regular SQL, where queries are specified as strings, Table API queries are defined in an embedded style in Java or Scala, with IDE support such as auto-completion and syntax checking. [Original post · 2021-07-11 11:57:53 · 277 views · 0 comments]
Reading data from Kafka in Flink
Maven dependency: <dependency> <groupId>org.apache.flink</groupId> <artifactId>flink-connector-kafka-0.11_2.11</artifactId> <version>1.10.0</version> </dependency>. Code: import org.apache.flink.api.common.ser… (preview truncated) [Original post · 2021-07-05 13:04:29 · 945 views · 1 comment]
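With that dependency on the classpath, a minimal consumer looks roughly like the sketch below (not the post's exact code); the broker address, group id, and topic name are assumptions.

```scala
import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011

object KafkaSourceSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // consumer configuration; addresses and group id are assumptions for illustration
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "localhost:9092")
    properties.setProperty("group.id", "consumer-group")

    // read the topic as a stream of strings and print every record
    val kafkaStream: DataStream[String] = env.addSource(
      new FlinkKafkaConsumer011[String]("sensor", new SimpleStringSchema(), properties))
    kafkaStream.print()

    env.execute("kafka source demo")
  }
}
```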
Demo: reading a file's contents in Flink and printing them
import org.apache.flink.streaming.api.scala._; object SourceTest2 { def main(args: Array[String]): Unit = { /* create the execution environment */ val env = StreamExecutionEnvironment.getExecutionEnvironment; env.setParallelism(1); val value: DataStream[String] = env.… (preview truncated) [Original post · 2021-07-03 13:01:09 · 742 views · 0 comments]
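The preview cuts off right after `env.`; a completed version probably looks like the sketch below, with the file path being an assumption.

```scala
import org.apache.flink.streaming.api.scala._

object SourceTest2 {
  def main(args: Array[String]): Unit = {
    // create the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // read the file line by line and print each line
    val value: DataStream[String] = env.readTextFile("sensor.txt")
    value.print()

    env.execute("file source demo")
  }
}
```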
Reading data from a collection in Flink and printing it
import org.apache.flink.streaming.api.scala._; /* case class for the input data */ case class SensorReading2(id: String, timestamp: Long, temperature: Double); object SourceTest2 { def main(args: Array[String]): Unit = { /* create the execution environment */ val env = StreamExecutionEnvironment.ge… (preview truncated) [Original post · 2021-07-03 12:56:13 · 915 views · 0 comments]
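A completed version of that truncated snippet probably looks like the following sketch; the sample values are taken from the sensor.txt data shown in the other posts.

```scala
import org.apache.flink.streaming.api.scala._

// case class for the input data
case class SensorReading2(id: String, timestamp: Long, temperature: Double)

object SourceTest2 {
  def main(args: Array[String]): Unit = {
    // create the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // build a stream from an in-memory collection and print it
    val stream = env.fromCollection(List(
      SensorReading2("sensor_1", 1547718199, 35.8),
      SensorReading2("sensor_6", 1547718201, 15.4),
      SensorReading2("sensor_7", 1547718202, 6.7),
      SensorReading2("sensor_10", 1547718205, 38.1)))
    stream.print()

    env.execute("collection source demo")
  }
}
```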
Flink streaming word count
import org.apache.flink.api.java.utils.ParameterTool; import org.apache.flink.streaming.api.scala._; /* streaming word count */ object StreamWordCount { def main(args: Array[String]): Unit = { /* create the stream execution environment */ val env: StreamExecutionEnvironment = StreamExecu… (preview truncated) [Original post · 2021-06-05 12:40:17 · 297 views · 0 comments]
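A completed streaming word count probably looks like the sketch below; reading the host and port from the command line and counting words from a socket text stream is the usual pattern, though the exact source in the post is cut off.

```scala
import org.apache.flink.api.java.utils.ParameterTool
import org.apache.flink.streaming.api.scala._

// streaming word count
object StreamWordCount {
  def main(args: Array[String]): Unit = {
    // create the stream execution environment
    val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

    // host and port come from the command line, e.g. --host localhost --port 7777
    val params = ParameterTool.fromArgs(args)
    val host = params.get("host")
    val port = params.getInt("port")

    // read a socket text stream, split into words, and keep a running count per word
    val wordCounts = env.socketTextStream(host, port)
      .flatMap(_.split(" "))
      .filter(_.nonEmpty)
      .map((_, 1))
      .keyBy(0)
      .sum(1)

    wordCounts.print()
    env.execute("stream word count")
  }
}
```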
Flink getting-started word count
Text file contents: hello world / hello flink / hello scala / how are you / fine thank you / and you. Scala code: import org.apache.flink.api.scala._; /* batch word count */ object WordCount { def main(args: Array[String]): Unit = { /* create a batch execution environment; note that this batch environment is different from the stream execution environment */ … (preview truncated) [Original post · 2021-06-05 11:03:33 · 175 views · 0 comments]
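A completed batch word count over that text file probably looks like the sketch below; the file path hello.txt is an assumption.

```scala
import org.apache.flink.api.scala._

// batch word count
object WordCount {
  def main(args: Array[String]): Unit = {
    // create a batch execution environment (different from the stream environment)
    val env = ExecutionEnvironment.getExecutionEnvironment

    // read the text file; the path is an assumption for illustration
    val inputDataSet: DataSet[String] = env.readTextFile("hello.txt")

    // split lines into words, emit (word, 1), group by the word and sum the counts
    val wordCounts = inputDataSet
      .flatMap(_.split(" "))
      .map((_, 1))
      .groupBy(0)
      .sum(1)

    // print() on a DataSet triggers execution, so no env.execute() is needed here
    wordCounts.print()
  }
}
```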