Flink Operator Summary

API Documentation

Map

DataStream → DataStream: takes one element and produces one element (a one-to-one mapping).

DataStream<Integer> dataStream = //...
dataStream.map(new MapFunction<Integer, Integer>() {
    @Override
    public Integer map(Integer value) throws Exception {
        return 2 * value;
    }
});

FlatMap

DataStream → DataStream: takes one element and produces zero, one, or more elements.

dataStream.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public void flatMap(String value, Collector<String> out)
        throws Exception {
        for(String word: value.split(" ")){
            out.collect(word);
        }
    }
});

Filter

DataStream → DataStream: evaluates a boolean function for each element and keeps those for which the function returns true.

dataStream.filter(new FilterFunction<Integer>() {
    @Override
    public boolean filter(Integer value) throws Exception {
        return value != 0;
    }
});

KeyBy

DataStream → KeyedStream: logically partitions the stream into disjoint partitions. All records with the same key are assigned to the same partition. Internally, keyBy() is implemented with hash partitioning.

dataStream.keyBy(value -> value.getSomeKey());
dataStream.keyBy(value -> value.f0);

Reduce

KeyedStream → DataStream: applies a rolling reduce aggregation to the records that share the same key.

keyedStream.reduce(new ReduceFunction<Integer>() {
    @Override
    public Integer reduce(Integer value1, Integer value2)
    throws Exception {
        return value1 + value2;
    }
});

Union

DataStream* → DataStream: creates a new stream containing all the elements from two or more data streams. Note: if you union a data stream with itself, you will get each element twice in the resulting stream.

dataStream.union(otherStream1, otherStream2, ...);

Split, Select (deprecated) -> splitting a stream via side outputs

These operators have been deprecated. The same stream splitting can be implemented through the process() interface by tagging elements with side-output tags, as sketched below.
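A minimal sketch of the side-output approach, written in the same fragment style as the other snippets in this section (it assumes dataStream is a DataStream&lt;Integer&gt; as in the Filter example, and the tag name "large" is just an illustrative placeholder; a complete demo follows in the example section):

final OutputTag<Integer> largeTag = new OutputTag<Integer>("large"){};

SingleOutputStreamOperator<Integer> mainStream = dataStream
    .process(new ProcessFunction<Integer, Integer>() {
        @Override
        public void processElement(Integer value, Context ctx, Collector<Integer> out) {
            if (value > 5) {
                ctx.output(largeTag, value);   // tag and route to the side output
            } else {
                out.collect(value);            // keep in the main output
            }
        }
    });

DataStream<Integer> largeStream = mainStream.getSideOutput(largeTag);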

Connect, CoMap, CoFlatMap

DataStream, DataStream → ConnectedStreams: "connects" two data streams while each keeps its own type. Connecting allows shared state between the two streams, and their element types may differ. Use CoMap / CoFlatMap to process the ConnectedStreams produced by connect().

DataStream<Integer> someStream = //...
DataStream<String> otherStream = //...

ConnectedStreams<Integer, String> connectedStreams = someStream.connect(otherStream);
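The connected streams are then processed with a CoMapFunction (or CoFlatMapFunction), which defines one method per input stream. A minimal sketch continuing the snippet above (the output strings are just placeholders):

DataStream<String> resultStream = connectedStreams.map(new CoMapFunction<Integer, String, String>() {
    @Override
    public String map1(Integer value) {
        // elements coming from someStream
        return "int: " + value;
    }

    @Override
    public String map2(String value) {
        // elements coming from otherStream
        return "str: " + value;
    }
});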

Example Demos
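Every demo below reads from a custom DataGenerator source and relies on a TemperatureBean POJO plus a FlinkEnv helper whose sources are not included in this post. A minimal sketch of what they might look like, reconstructed from how the demos use them (the exact field set, accessors, and the helper body are assumptions):

package com.ali.flink.demo.bean;

// Assumed shape of TemperatureBean, inferred from the demos and their printed output.
public class TemperatureBean {
    private String id;
    private int temperatureNum;
    private String eventTime;
    private long wartermark;    // field name kept as it appears in the demo output

    public TemperatureBean() {
    }

    public TemperatureBean(String id, int temperatureNum, String eventTime, long wartermark) {
        this.id = id;
        this.temperatureNum = temperatureNum;
        this.eventTime = eventTime;
        this.wartermark = wartermark;
    }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public int getTemperatureNum() { return temperatureNum; }
    public void setTemperatureNum(int temperatureNum) { this.temperatureNum = temperatureNum; }

    public String getEventTime() { return eventTime; }
    public void setEventTime(String eventTime) { this.eventTime = eventTime; }

    public long getWartermark() { return wartermark; }
    public void setWartermark(long wartermark) { this.wartermark = wartermark; }

    @Override
    public String toString() {
        return "TemperatureBean{" +
                "id='" + id + '\'' +
                ", temperatureNum=" + temperatureNum +
                ", eventTime='" + eventTime + '\'' +
                ", wartermark=" + wartermark +
                '}';
    }
}

package com.ali.flink.demo.utils;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// Assumed FlinkEnv helper; presumably it just creates a StreamExecutionEnvironment.
public class FlinkEnv {
    public static StreamExecutionEnvironment FlinkDataStreamRunEnv() {
        return StreamExecutionEnvironment.getExecutionEnvironment();
    }
}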

  • Source data stream:
package com.ali.flink.demo.utils;

import com.ali.flink.demo.bean.TemperatureBean;
import org.apache.commons.math3.random.RandomDataGenerator;
import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.runtime.state.FunctionInitializationContext;
import org.apache.flink.streaming.api.functions.source.datagen.DataGenerator;

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Random;

// Emits a random TemperatureBean every 0-9 seconds; used as the source for all demos below.
public class DataGeneratorImpl004 implements DataGenerator<TemperatureBean> {

    RandomDataGenerator generator;

    @Override
    public void open(String s, FunctionInitializationContext functionInitializationContext, RuntimeContext runtimeContext) throws Exception {
        generator = new RandomDataGenerator();
    }

    @Override
    public boolean hasNext() {
        return true;
    }

    @Override
    public TemperatureBean next() {
        Random random = new Random();
        int sleep_cnt = random.nextInt(10);
        try {
            Thread.sleep(1000 * sleep_cnt);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        String[] ids = new String[]{"T1","T2","T3","T4"};
        SimpleDateFormat simpleDateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        Date date = new Date();

        TemperatureBean temperatureBean = new TemperatureBean(ids[random.nextInt(4)], random.nextInt(10), simpleDateFormat.format(date), date.getTime());

        return temperatureBean;
    }
}
  • Map
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;

/**
 * Flink map operator demo
 */
public class FlinkTransformerFunctionDemo01 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());

        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("source");

        // Add 10 to the temperatureNum of each TemperatureBean
        DataStream<TemperatureBean> mapStream = dataGeneratorStream.map(new MapFunction<TemperatureBean, TemperatureBean>() {
            @Override
            public TemperatureBean map(TemperatureBean temperatureBean) throws Exception {
                temperatureBean.setTemperatureNum(temperatureBean.getTemperatureNum() + 10);
                return temperatureBean;
            }
        });

        mapStream.print("mapStream");
        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
source> TemperatureBean{id='T4', temperatureNum=9, eventTime='2022-07-13 18:16:34', wartermark=1657707394228}
mapStream> TemperatureBean{id='T4', temperatureNum=19, eventTime='2022-07-13 18:16:34', wartermark=1657707394228}
  • FlatMap
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;
import org.apache.flink.util.Collector;

/**
 * Flink flatMap operator demo
 */
public class FlinkTransformerFunctionDemo02 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());

        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("source");

        // Use the map operator to build a comma-separated string
        DataStream<String> mapStream = dataGeneratorStream.map(new MapFunction<TemperatureBean, String>() {
            @Override
            public String map(TemperatureBean temperatureBean) throws Exception {
                return temperatureBean.getId() + "," + temperatureBean.getTemperatureNum();
            }
        });

        // flatMap operator: split each record of the map stream on commas
        DataStream<String> flapMapStream = mapStream.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public void flatMap(String s, Collector<String> collector) throws Exception {
                String[] split = s.split(",");
                for (String value : split) {
                    collector.collect(value);
                }
            }
        });

        mapStream.print("mapStream");
        flapMapStream.print("flapMapStream");

        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
source> TemperatureBean{id='T1', temperatureNum=3, eventTime='2022-07-13 18:15:10', wartermark=1657707310157}
mapStream> T1,3
flapMapStream> T1
flapMapStream> 3
  • Filter
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;

/**
 * Flink filter operator demo
 */
public class FlinkTransformerFunctionDemo03 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());

        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("source");

        // Filter out records with temperatureNum < 5 (keep those >= 5)
        SingleOutputStreamOperator<TemperatureBean> filterStream = dataGeneratorStream.filter(new FilterFunction<TemperatureBean>() {
            @Override
            public boolean filter(TemperatureBean temperatureBean) throws Exception {
                return temperatureBean.getTemperatureNum() >= 5;
            }
        });

        filterStream.print("filterStream");


        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
source> TemperatureBean{id='T3', temperatureNum=3, eventTime='2022-07-13 18:21:50', wartermark=1657707710937}
source> TemperatureBean{id='T3', temperatureNum=0, eventTime='2022-07-13 18:21:58', wartermark=1657707718951}
source> TemperatureBean{id='T4', temperatureNum=5, eventTime='2022-07-13 18:21:58', wartermark=1657707718951}
filterStream> TemperatureBean{id='T4', temperatureNum=5, eventTime='2022-07-13 18:21:58', wartermark=1657707718951}
source> TemperatureBean{id='T2', temperatureNum=2, eventTime='2022-07-13 18:22:00', wartermark=1657707720957}
source> TemperatureBean{id='T1', temperatureNum=6, eventTime='2022-07-13 18:22:00', wartermark=1657707720957}
filterStream> TemperatureBean{id='T1', temperatureNum=6, eventTime='2022-07-13 18:22:00', wartermark=1657707720957}
  • KeyBy
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

import java.time.Duration;

/**
 * Flink keyBy operator demo
 */
public class FlinkTransformerFunctionDemo04 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());

        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("source");

        DataStream<TemperatureBean> maxStream = dataGeneratorStream.assignTimestampsAndWatermarks(WatermarkStrategy.<TemperatureBean>forBoundedOutOfOrderness(Duration.ofSeconds(2))
                .withTimestampAssigner(new SerializableTimestampAssigner<TemperatureBean>() {
                    @Override
                    public long extractTimestamp(TemperatureBean temperatureBean, long l) {
                        return temperatureBean.getWartermark();
                    }
                }))
                // Key by id
                .keyBy(new KeySelector<TemperatureBean, String>() {
                    @Override
                    public String getKey(TemperatureBean temperatureBean) throws Exception {
                        return temperatureBean.getId();
                    }
                })
                // 20-second tumbling (processing-time) window
                .window(TumblingProcessingTimeWindows.of(Time.seconds(20)))
                // Take the maximum temperature within each key and window
                .max("temperatureNum");

        maxStream.print("max temperature");

        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
source> TemperatureBean{id='T2', temperatureNum=3, eventTime='2022-07-13 18:33:43', wartermark=1657708423591}
source> TemperatureBean{id='T2', temperatureNum=1, eventTime='2022-07-13 18:33:49', wartermark=1657708429601}
source> TemperatureBean{id='T4', temperatureNum=2, eventTime='2022-07-13 18:33:55', wartermark=1657708435610}
source> TemperatureBean{id='T3', temperatureNum=6, eventTime='2022-07-13 18:33:55', wartermark=1657708435610}
max temperature> TemperatureBean{id='T2', temperatureNum=3, eventTime='2022-07-13 18:33:43', wartermark=1657708423591}
max temperature> TemperatureBean{id='T3', temperatureNum=6, eventTime='2022-07-13 18:33:55', wartermark=1657708435610}
max temperature> TemperatureBean{id='T4', temperatureNum=2, eventTime='2022-07-13 18:33:55', wartermark=1657708435610}
  • Reduce
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

import java.time.Duration;

/**
 * Flink reduce operator demo
 */
public class FlinkTransformerFunctionDemo05 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());

        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("source");

        DataStream<TemperatureBean> reduceStream = dataGeneratorStream.assignTimestampsAndWatermarks(WatermarkStrategy.<TemperatureBean>forBoundedOutOfOrderness(Duration.ofSeconds(2))
                .withTimestampAssigner(new SerializableTimestampAssigner<TemperatureBean>() {
                    @Override
                    public long extractTimestamp(TemperatureBean temperatureBean, long l) {
                        return temperatureBean.getWartermark();
                    }
                }))
                // Key by id
                .keyBy(new KeySelector<TemperatureBean, String>() {
                    @Override
                    public String getKey(TemperatureBean temperatureBean) throws Exception {
                        return temperatureBean.getId();
                    }
                })
                // 20-second tumbling (processing-time) window
                .window(TumblingProcessingTimeWindows.of(Time.seconds(20)))
                // Keep the record with the maximum temperature within each key and window
                .reduce(new ReduceFunction<TemperatureBean>() {
                    @Override
                    public TemperatureBean reduce(TemperatureBean temperatureBean, TemperatureBean t1) throws Exception {
                        return temperatureBean.getTemperatureNum() > t1.getTemperatureNum() ? temperatureBean : t1;
                    }
                });

        reduceStream.print("reduceStream");

        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
source> TemperatureBean{id='T1', temperatureNum=2, eventTime='2022-07-13 18:38:29', wartermark=1657708709906}
source> TemperatureBean{id='T3', temperatureNum=7, eventTime='2022-07-13 18:38:32', wartermark=1657708712916}
source> TemperatureBean{id='T3', temperatureNum=2, eventTime='2022-07-13 18:38:39', wartermark=1657708719921}
reduceStream> TemperatureBean{id='T1', temperatureNum=2, eventTime='2022-07-13 18:38:29', wartermark=1657708709906}
reduceStream> TemperatureBean{id='T3', temperatureNum=7, eventTime='2022-07-13 18:38:32', wartermark=1657708712916}
  • Union
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;

/**
 * Flink union operator demo
 */
public class FlinkTransformerFunctionDemo06 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource1 = new DataGeneratorSource<>(new DataGeneratorImpl004());
        DataStream<TemperatureBean> dataGeneratorStream1 = env.addSource(dataGeneratorSource1).returns(TemperatureBean.class);
        dataGeneratorStream1.print("temperature1 source");

        DataGeneratorSource<TemperatureBean> dataGeneratorSource2 = new DataGeneratorSource<>(new DataGeneratorImpl004());
        DataStream<TemperatureBean> dataGeneratorStream2 = env.addSource(dataGeneratorSource2).returns(TemperatureBean.class);
        dataGeneratorStream2.print("temperature2 source");

        DataStream<TemperatureBean> unionStream = dataGeneratorStream1.union(dataGeneratorStream2);

        unionStream.print("union stream");


        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
temperature2 source> TemperatureBean{id='T1', temperatureNum=4, eventTime='2022-07-13 18:45:19', wartermark=1657709119888}
union stream> TemperatureBean{id='T1', temperatureNum=4, eventTime='2022-07-13 18:45:19', wartermark=1657709119888}
union stream> TemperatureBean{id='T4', temperatureNum=0, eventTime='2022-07-13 18:45:20', wartermark=1657709120888}
temperature1 source> TemperatureBean{id='T4', temperatureNum=0, eventTime='2022-07-13 18:45:20', wartermark=1657709120888}
union stream> TemperatureBean{id='T4', temperatureNum=5, eventTime='2022-07-13 18:45:27', wartermark=1657709127900}
temperature2 source> TemperatureBean{id='T1', temperatureNum=8, eventTime='2022-07-13 18:45:27', wartermark=1657709127900}
temperature1 source> TemperatureBean{id='T4', temperatureNum=5, eventTime='2022-07-13 18:45:27', wartermark=1657709127900}
union stream> TemperatureBean{id='T1', temperatureNum=8, eventTime='2022-07-13 18:45:27', wartermark=1657709127900}
  • Split, Select (via side outputs)
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

/**
 * The Flink split/select operators have been deprecated; stream splitting can
 * instead be implemented with side outputs, as shown below.
 */
public class FlinkTransformerFunctionDemo07 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());
        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("temperature source");

        // Output tags for the low-temperature and high-temperature side streams
        final OutputTag<TemperatureBean> lowOutputTag = new OutputTag<TemperatureBean>("low_temperature"){};
        final OutputTag<TemperatureBean> highOutputTag = new OutputTag<TemperatureBean>("high_temperature"){};


        SingleOutputStreamOperator<TemperatureBean> processStream = dataGeneratorStream.process(new ProcessFunction<TemperatureBean, TemperatureBean>() {
            @Override
            public void processElement(TemperatureBean temperatureBean, Context context, Collector<TemperatureBean> collector) throws Exception {
                if (temperatureBean.getTemperatureNum() > 5) {
                    context.output(highOutputTag, temperatureBean);
                } else {
                    context.output(lowOutputTag, temperatureBean);
                }
            }
        });

        DataStream<TemperatureBean> highStream = processStream.getSideOutput(highOutputTag);
        DataStream<TemperatureBean> lowStream = processStream.getSideOutput(lowOutputTag);

        highStream.print("high temperature");
        lowStream.print("low temperature");

        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
temperature source> TemperatureBean{id='T4', temperatureNum=9, eventTime='2022-07-13 19:08:59', wartermark=1657710539773}
high temperature> TemperatureBean{id='T4', temperatureNum=9, eventTime='2022-07-13 19:08:59', wartermark=1657710539773}
temperature source> TemperatureBean{id='T2', temperatureNum=2, eventTime='2022-07-13 19:09:08', wartermark=1657710548774}
low temperature> TemperatureBean{id='T2', temperatureNum=2, eventTime='2022-07-13 19:09:08', wartermark=1657710548774}
  • Connect, CoMap, CoFlatMap
package com.ali.flink.demo.driver;

import com.ali.flink.demo.bean.TemperatureBean;
import com.ali.flink.demo.utils.DataGeneratorImpl004;
import com.ali.flink.demo.utils.FlinkEnv;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.api.java.tuple.Tuple4;
import org.apache.flink.streaming.api.datastream.ConnectedStreams;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.streaming.api.functions.co.CoMapFunction;
import org.apache.flink.streaming.api.functions.source.datagen.DataGeneratorSource;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

/**
 * Flink connect / CoMap operator demo
 */
public class FlinkTransformerFunctionDemo08 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = FlinkEnv.FlinkDataStreamRunEnv();

        env.setParallelism(1);

        DataGeneratorSource<TemperatureBean> dataGeneratorSource = new DataGeneratorSource<>(new DataGeneratorImpl004());
        DataStream<TemperatureBean> dataGeneratorStream = env.addSource(dataGeneratorSource).returns(TemperatureBean.class);
        dataGeneratorStream.print("temperature source");

        // Output tags for the low-temperature and high-temperature side streams
        final OutputTag<TemperatureBean> lowOutputTag = new OutputTag<TemperatureBean>("low_temperature"){};
        final OutputTag<TemperatureBean> highOutputTag = new OutputTag<TemperatureBean>("high_temperature"){};


        SingleOutputStreamOperator<TemperatureBean> processStream = dataGeneratorStream.process(new ProcessFunction<TemperatureBean, TemperatureBean>() {
            @Override
            public void processElement(TemperatureBean temperatureBean, Context context, Collector<TemperatureBean> collector) throws Exception {
                if (temperatureBean.getTemperatureNum() > 5) {
                    context.output(highOutputTag, temperatureBean);
                } else {
                    context.output(lowOutputTag, temperatureBean);
                }
            }
        });

        DataStream<TemperatureBean> highStream = processStream.getSideOutput(highOutputTag);
        DataStream<TemperatureBean> lowStream = processStream.getSideOutput(lowOutputTag);

        highStream.print("high temperature");
        lowStream.print("low temperature");

        DataStream<Tuple3<String, Integer, String>> mapStream = highStream.map(new MapFunction<TemperatureBean, Tuple3<String, Integer, String>>() {
            @Override
            public Tuple3<String, Integer, String> map(TemperatureBean temperatureBean) throws Exception {
                return Tuple3.of(temperatureBean.getId(), temperatureBean.getTemperatureNum(), temperatureBean.getEventTime());
            }
        });

        mapStream.print("mapStream");

        ConnectedStreams<Tuple3<String, Integer, String>, TemperatureBean> connectStream = mapStream.connect(lowStream);

        SingleOutputStreamOperator<Object> resultStream = connectStream.map(new CoMapFunction<Tuple3<String, Integer, String>, TemperatureBean, Object>() {
            @Override
            public Object map1(Tuple3<String, Integer, String> t3) throws Exception {
                return Tuple4.of(t3.f0, t3.f1, t3.f2, "high");
            }

            @Override
            public Object map2(TemperatureBean temperatureBean) throws Exception {
                return Tuple3.of(temperatureBean.getId(), temperatureBean.getTemperatureNum(), temperatureBean.getEventTime());
            }
        });

        resultStream.print("connectStream");

        env.execute("tumble window test");
    }
}

----------------------------- Result ----------------------------------
temperature source> TemperatureBean{id='T2', temperatureNum=8, eventTime='2022-07-13 19:19:32', wartermark=1657711172552}
high temperature> TemperatureBean{id='T2', temperatureNum=8, eventTime='2022-07-13 19:19:32', wartermark=1657711172552}
mapStream> (T2,8,2022-07-13 19:19:32)
connectStream> (T2,8,2022-07-13 19:19:32,high)
temperature source> TemperatureBean{id='T2', temperatureNum=3, eventTime='2022-07-13 19:19:35', wartermark=1657711175562}
low temperature> TemperatureBean{id='T2', temperatureNum=3, eventTime='2022-07-13 19:19:35', wartermark=1657711175562}
connectStream> (T2,3,2022-07-13 19:19:35)