Flink
TracyGao01
Stay focused
Flink SQL: configuring when window aggregations fire
Flink 1.11: by default, a Flink SQL window aggregation is computed only when the window closes. To compute and refresh results continuously in real time, add the following configuration:

```java
TableConfig config = bbTableEnv.getConfig();
config.getConfiguration().setBoolean("table.exec.emit.early-fire.enabled", true);
config.getConfiguration().setString("table.exec.emit.earl…
```

Original post: 2021-10-24 18:10:08
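The snippet above is cut off mid-option; for reference, a minimal sketch of the pair of early-fire options that exist (as experimental options) in Flink 1.11, assuming an existing StreamTableEnvironment named `bbTableEnv` as in the post:

```scala
// Both options are experimental in Flink 1.11 and are read from the
// TableConfig before the job is translated.
val conf = bbTableEnv.getConfig.getConfiguration
// Allow windows to emit intermediate (early-fired) results before they close.
conf.setBoolean("table.exec.emit.early-fire.enabled", true)
// How often intermediate results are emitted; "1s" fires once per second.
conf.setString("table.exec.emit.early-fire.delay", "1s")
```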
Apache Flink: Savepoint Operations
Flink version: 1.11.1. The following operations assume Flink running on Hadoop YARN. Covered: viewing job information, taking a savepoint, cancelling a job while generating a savepoint, and restarting a job from a savepoint.

View job information:

```shell
flink list -m yarn-cluster -yid application_1629715128130_0031
```

Take a savepoint:

```shell
flink savepoint 6eb006988b9ab8ca1feb180c04091d98 hdfs://hadoop102.dev:8020/flink…
```

Original post: 2021-09-02 19:30:52
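The entry's table of contents lists four operations; a sketch of the corresponding Flink 1.11 CLI calls on YARN (the savepoint directory and the resumed jar are placeholders, the job and application IDs are taken from the entry):

```shell
# List running jobs in the YARN session
flink list -m yarn-cluster -yid application_1629715128130_0031

# Trigger a savepoint for a running job and write it to HDFS
flink savepoint 6eb006988b9ab8ca1feb180c04091d98 \
  hdfs://hadoop102.dev:8020/flink/savepoints -yid application_1629715128130_0031

# Cancel the job, generating a savepoint first (-s = target directory)
flink cancel -s hdfs://hadoop102.dev:8020/flink/savepoints \
  6eb006988b9ab8ca1feb180c04091d98 -yid application_1629715128130_0031

# Restart the job from the savepoint (-s = savepoint path, placeholder below)
flink run -s hdfs://hadoop102.dev:8020/flink/savepoints/savepoint-xxxx \
  -m yarn-cluster myJob.jar
```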
Installing the Flink service on Ambari
Reference: https://github.com/abajwa-hw/ambari-flink-service

Set the Hadoop version:

```shell
VERSION=`hdp-select status hadoop-client | sed 's/hadoop-client - \([0-9]\.[0-9]\).*/\1/'`
```

Download the ambari-flink-service service:

```shell
sudo git clone https://github.com/abajwa-hw/ambari-flink-service.git …
```

Original post: 2021-07-05 21:10:07
Flink Interval Join: LEFT JOIN
Flink: 1.11

Problem: an Interval Join written as a left outer join does not take effect; the Web UI keeps showing the join type as Inner Join.

Original SQL:

```scala
var sqlQuery =
  """
    |insert into for_shop_nt_order_detail
    |select
    |  f.id,a.creat_date as create_date,shop_id,a.state,a.name,sale_amount,c…
```

Original post: 2021-05-26 15:09:43
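For context, Flink SQL treats a join as an interval join only when the join condition bounds the time attributes of both inputs; a minimal left interval join sketch (tables, columns, and the four-hour bound are illustrative, not from the post):

```sql
-- Assumes orders.order_time and shipments.ship_time are event-time attributes.
SELECT o.id, o.create_date, s.ship_time
FROM orders o
LEFT JOIN shipments s
  ON o.id = s.order_id
 AND s.ship_time BETWEEN o.order_time AND o.order_time + INTERVAL '4' HOUR;
```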
Flink-1.11:Error: Could not find or load main class org.apache.flink.api.scala.FlinkShell
Flink 1.11 with Scala 2.12: starting flink-shell fails with

Error: Could not find or load main class org.apache.flink.api.scala.FlinkShell

Fix: modify the POM file flink-dist/pom.xml, then recompile and repackage with mvn clean install -D… Covered: the error, the fix, and a test.

Original post: 2021-02-04 18:44:40
Flink 1.11.1: supporting Interval Join on CDC Debezium sources in Table SQL
Flink: 1.11.1

Goal: JSON data produced in Flink SQL CDC mode cannot be used in an Interval Join, because Interval Join only supports append-only tables. The output format of the Debezium component in CDC mode therefore needs to be modified to make Table Interval Join work.

Implementation: add a new format, named 'insert-debezium-json', to adapt it for Interval Join. Two new class files are needed: DebeziumJsonDeserizat…

Original post: 2020-12-09 17:33:11
Flink 1.11.1: making the Table SQL Kafka connector support upsert writes
Flink version: 1.11.1

Goal: when using Flink Table SQL, make the Kafka connector sink support upsert writes. Covered: the test code, Flink's built-in Kafka connector, defining a new upsert-capable Kafka connector sink, compiling and packaging, and replacing the jar and testing.

Test code:

```scala
val fsSettings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().bui…
```

Original post: 2020-11-25 19:39:55
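Worth noting for readers on newer versions: from Flink 1.12 onward this workaround is unnecessary, because an `upsert-kafka` connector ships with Flink. A sketch of its DDL (topic, servers, and schema are placeholders):

```sql
CREATE TABLE sale_order_upsert (
  id STRING,
  cnt BIGINT,
  PRIMARY KEY (id) NOT ENFORCED  -- the upsert key; written as the Kafka record key
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'sale_order',
  'properties.bootstrap.servers' = 'kafka:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);
```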
Flink 1.11.1: customizing the debezium-json format for Flink CDC
Background: Flink 1.11 adds CDC support, including Debezium and Canal. This entry modifies the output of the debezium-json format.

Default (changelog) output format:
1. Insert: (true,1,2,3)
2. Update: (false,1,2,3) followed by (true,1,2,4…
3. Delete: (false,1,2,3)…

Original post: 2020-10-09 21:41:52
Flink 1.11.1: multi-table joins in Flink SQL
Read three topics from Kafka, join them, and write the result to ClickHouse. Code example:

```scala
val createTable =
  """
    |CREATE TABLE nt_sale_order (
    |  id VARCHAR,
    |  write_date BIGINT,
    |  create_uid INT,
    |  name VARCHAR,
    |  op VARCHAR
    |) WITH (
    |  'connector'…
```

Original post: 2020-10-09 15:46:33
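The index truncates the example before the join itself; a sketch of what the three-way join into a ClickHouse sink could look like (all table and column names beyond `nt_sale_order` are illustrative):

```sql
INSERT INTO clickhouse_sink
SELECT o.id,
       o.name,
       d.sale_amount,
       s.shop_name
FROM nt_sale_order o
JOIN nt_sale_order_detail d ON o.id = d.order_id
JOIN nt_shop s ON d.shop_id = s.id;
```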
Flink 1.11.1: a Flink SQL kafka-connector example
Example:

```scala
val createTable =
  """
    |CREATE TABLE nt_sale_order (
    |  id VARCHAR,
    |  write_date BIGINT,
    |  create_uid INT,
    |  name VARCHAR,
    |  op VARCHAR
    |) WITH (
    |  'connector' = 'kafka',
    |  'topic' = 'shopfo…
```

Original post: 2020-10-09 15:38:33
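For reference, a Kafka source DDL in Flink 1.11 typically also declares the bootstrap servers, consumer group, startup mode, and format; a completed sketch (the topic and server values are placeholders, since the entry's actual topic name is truncated in the index):

```sql
CREATE TABLE nt_sale_order (
  id VARCHAR,
  write_date BIGINT,
  create_uid INT,
  name VARCHAR,
  op VARCHAR
) WITH (
  'connector' = 'kafka',
  'topic' = 'my_topic',                    -- placeholder
  'properties.bootstrap.servers' = 'kafka:9092',
  'properties.group.id' = 'flink-consumer',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);
```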
Flink 1.11.1: adding ClickHouse support to the jdbc-connector
Problem solved: Flink's JDBC connector cannot write to ClickHouse out of the box; a ClickhouseJDBCDialect has to be implemented and registered.

Code changes: 1. Modify flink-release-1.11.1/flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/JdbcDialect…

Also covered: compiling, using the connector, and replacing the existing flink-connector-jdbc jar.

Original post: 2020-10-09 15:14:56
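Once such a dialect is compiled in, the sink is declared through the standard JDBC connector DDL; a sketch, assuming the patched jar is on the classpath (URL, table name, and credentials are placeholders; the driver class shown is the legacy ClickHouse JDBC driver):

```sql
CREATE TABLE clickhouse_sink (
  id STRING,
  cnt BIGINT
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:clickhouse://clickhouse-host:8123/default',
  'table-name' = 'action_log',
  'driver' = 'ru.yandex.clickhouse.ClickHouseDriver',
  'username' = 'default',
  'password' = ''
);
```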
Flink: a custom RetractStreamTableSink that writes data to Phoenix
Goal: write the continuously updated result of a two-table left join into Phoenix. Sample data (a retract stream; the leading boolean marks add/retract records):

(true,12,2,3,2020-06-18T00:00,2020-06-18T00:00)
(true,12,2,5,2020-06-18T00:00,2020-06-18T00:00)
(true,12,2,2,2020-06-18T00:00,2020-06-18T00:00)
(true,12,2,4,2020-06-18T00:00,2020-06-18T00:00)
(tru…

Covered: the goal, the sink definition, using the custom sink, and testing.

Original post: 2020-06-18 20:34:27
Flink 1.9.1: JDBCUpsertTableSink writing data to Phoenix
Goal: have Flink insert and update data in Phoenix.

Implementation: JDBCUpsertTableSink currently ships dialects for the following RDBMSs only:
- Derby
- MySQL
- PostgreSQL

A write dialect for Apache Phoenix therefore has to be defined, following:
https://github.com/apache/flink/blob/release-1.9.1/flink-connectors/flink-jdbc/src/main/java/org/apache/flink/api/java/io/jdbc/dia…

Original post: 2020-06-12 10:58:20
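The statement such a dialect has to generate differs from the ANSI pattern: Phoenix has no separate INSERT/UPDATE, only an atomic UPSERT. A sketch of the SQL a Phoenix dialect's upsert method would emit (table and columns are illustrative):

```sql
-- Phoenix performs insert-or-update in a single statement keyed on the primary key.
UPSERT INTO "ACTION_LOG" ("ID", "CNT") VALUES ('1', 1);
```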
Flink 1.9.1 exception: JDBCUpsertTableSink fails on a PostgreSQL connection
Exception:

Caused by: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO "action_log"("id", "cnt") VALUES ('1', 1) ON CONFLICT ("id" DO UPDATE SET "id"=EXCLUDED."id", "cnt"=EXCLUDED."cnt" was aborted: ERROR: syntax error at or near "DO"

As the error text shows, the generated ON CONFLICT clause is missing the closing parenthesis after the conflict-key column list ("id" DO UPDATE instead of ("id") DO UPDATE), so PostgreSQL rejects the statement. Covered: the exception, the cause, the fix, and a test.

Original post: 2020-06-11 18:52:06
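For comparison, the syntactically valid form of the statement that PostgreSQL expects, with the conflict target properly parenthesized:

```sql
INSERT INTO "action_log" ("id", "cnt")
VALUES ('1', 1)
ON CONFLICT ("id")               -- the closing parenthesis the sink failed to emit
DO UPDATE SET "cnt" = EXCLUDED."cnt";
```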
Flink写入Phoenix报错:Schema with given name already exists schemaName=SYSTEM
Phoenix: 5.0
Flink: 1.9

Phoenix setting:

```xml
<property>
  <name>phoenix.schema.isNamespaceMappingEnabled</name>
  <value>true</value>
</property>
```

Flink writes through a JDBC sink…

Original post: 2019-12-31 17:11:25
Flink code example (2): writing data to Phoenix over JDBC
Flink: Blink branch 1.5.1
https://github.com/apache/flink/tree/blink

Maven dependency:

```xml
<dependency>
  <groupId>com.alibaba.blink</groupId>
  <artifactId>flink-jdbc</artifac…
```

Original post: 2019-05-23 17:24:40
Flink code example (1): a Kafka consumer demo
Flink: Blink branch 1.5.1
https://github.com/apache/flink/tree/blink

Maven dependency:

```xml
<dependency>
  <groupId>com.alibaba.blink</groupId>
  <artifactId>flink-scala_2.11</a…
```

Original post: 2019-05-23 16:59:38