Flink SQL: writing topN aggregation results to Kafka (Flink 1.11.0)

First, a quick plug for my friend 鸡哥's blog: https://me.csdn.net/weixin_47482194

His posts are really solid and have been featured on the CSDN homepage several times.

 

Now to the main topic. Once you start working with Flink SQL, Kafka is pretty much unavoidable, and when writing to Kafka, have you ever hit a problem like this?

 Exception in thread "main" org.apache.flink.table.api.TableException: AppendStreamTableSink requires that Table has only insert changes.

Uh, you have got to be kidding... even a statement as basic as select count(*) from table isn't supported????

The official explanation is that this comes from Flink's internal retract mechanism: before changelog support is available end to end, an append-only message queue like Kafka cannot support retract/upsert writes.
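For example, a minimal reproduction (this assumes a StreamTableEnvironment tEnv with the kafka_table / kafka_table2 tables from the code further down already registered): a plain grouped aggregation inserted straight into the Kafka sink is rejected with an error like the one above.

        // a grouped aggregation emits update/retract messages, so the append-only
        // Kafka sink refuses it at planning time
        tEnv.executeSql(
                "INSERT INTO kafka_table2 " +
                "SELECT ts, user_id, behavior, COUNT(*) AS row_num " +
                "FROM kafka_table " +
                "GROUP BY ts, user_id, behavior");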

Fortunately, a Table can be converted to a DataStream, which is what the code below does (in my case, a per-group topN):

If you feel that having to connect to Kafka is too much hassle, you can generate the data with a source directly instead of reading it from Kafka, as shown below.
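A possible stand-in (only a sketch, using the datagen connector bundled with Flink 1.11; the table name gen_table and the generator settings are my own choices, and ts is produced as a BIGINT here, so cast it to STRING before writing to kafka_table2) would be a constant defined next to KAFKA_SQL:

    // hypothetical generated source to replace kafka_table for local testing;
    // swap "FROM kafka_table" for "FROM gen_table" in the queries if you use it
    private static final String DATAGEN_SQL = "CREATE TABLE gen_table (" +
            " category_id STRING," +
            " user_id STRING ," +
            " item_id STRING ," +
            " behavior STRING ," +
            " ts BIGINT ," +
            " row_ts AS TO_TIMESTAMP(FROM_UNIXTIME(ts, 'yyyy-MM-dd HH:mm:ss'))," +
            " WATERMARK FOR row_ts AS row_ts - INTERVAL '5' SECOND " +
            ") WITH (" +
            " 'connector' = 'datagen'," +
            " 'rows-per-second' = '5'," +
            " 'fields.category_id.length' = '2'," +
            " 'fields.user_id.length' = '4'," +
            " 'fields.item_id.length' = '4'," +
            " 'fields.behavior.length' = '3'," +
            " 'fields.ts.min' = '1595000000'," +
            " 'fields.ts.max' = '1596000000'" +
            ")";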

public class FlinkTopN2Doris {


    private static final String KAFKA_SQL = "CREATE TABLE kafka_table (" +
            " category_id STRING," +
            " user_id STRING ," +
            " item_id STRING ," +
            " behavior STRING ," +
            " ts STRING ," +
//            " proctime as PROCTIME() ," +
            " row_ts AS TO_TIMESTAMP(FROM_UNIXTIME(cast(ts AS BIGINT), 'yyyy-MM-dd HH:mm:ss'))," +
            " WATERMARK FOR row_ts AS row_ts - INTERVAL '5' SECOND " +
            ") WITH (" +
            " 'connector' = 'kafka'," +
            " 'topic' = 'flink_test'," +
            " 'properties.bootstrap.servers' = '192.168.12.188:9092'," +
            " 'properties.group.id' = 'test1'," +
            " 'format' = 'json'," +
            " 'scan.startup.mode' = 'earliest-offset'" +
            ")";

    private static final String SINK_KAFKA_SQL = "CREATE TABLE kafka_table2 (" +
            " ts STRING," +
            " user_id STRING ," +
            " behavior STRING ," +
            "row_num BIGINT " +
            ") WITH (" +
            " 'connector' = 'kafka'," +
            " 'topic' = 'flink_test2'," +
            " 'properties.bootstrap.servers' = '192.168.12.188:9092'," +
            " 'properties.group.id' = 'test1'," +
            " 'format' = 'json'," +
            " 'scan.startup.mode' = 'earliest-offset'" +
            ")";

    private static final String PRINT_SQL = "create table sink_print (" +
            "  p_count BIGINT ," +
            "  b STRING " +
            ") with ('connector' = 'print' )";


    private static final String PRINT_SQL2 = "create table sink_print2 (" +
            "  a STRING," +
            "  b STRING," +
            "  c STRING," +
            "  d BIGINT " +
            ") with ('connector' = 'print' )";

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment bsEnv = StreamExecutionEnvironment.getExecutionEnvironment();
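        // --- Sketch only: a possible continuation of main(), following the approach described
        // --- above: run the per-group topN, convert the updating result to a retract stream,
        // --- keep only the accumulate messages, and insert them into the Kafka sink.
        // --- Adjust the query to your own ranking logic. Assumed imports: EnvironmentSettings,
        // --- StreamTableEnvironment, Table, DataStream, Row,
        // --- org.apache.flink.api.common.typeinfo.Types, RowTypeInfo.
        EnvironmentSettings bsSettings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(bsEnv, bsSettings);

        // register the Kafka source and sink tables defined above
        tEnv.executeSql(KAFKA_SQL);
        tEnv.executeSql(SINK_KAFKA_SQL);

        // example per-group topN: keep the 3 most recent rows per category_id
        Table topN = tEnv.sqlQuery(
                "SELECT ts, user_id, behavior, row_num FROM (" +
                "  SELECT category_id, ts, user_id, behavior," +
                "    ROW_NUMBER() OVER (PARTITION BY category_id ORDER BY ts DESC) AS row_num" +
                "  FROM kafka_table" +
                ") WHERE row_num <= 3");

        // the topN result keeps updating, so the append-only Kafka sink rejects it directly;
        // convert it to a retract stream and keep only the accumulate (f0 == true) messages
        DataStream<Row> appendOnly = tEnv
                .toRetractStream(topN, Row.class)
                .filter(t -> t.f0)
                .map(t -> t.f1)
                .returns(new RowTypeInfo(Types.STRING, Types.STRING, Types.STRING, Types.LONG));

        // register the now insert-only stream as a view and write it to the Kafka sink;
        // in 1.11 executeSql() submits the job itself, so no extra bsEnv.execute() is needed
        tEnv.createTemporaryView("top_n_append", appendOnly);
        tEnv.executeSql("INSERT INTO kafka_table2 SELECT * FROM top_n_append");
    }
}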
