How to split a stream correctly in Flink

Splitting methods

  1. filter-based splitting
  2. split-based splitting (only one level of splitting is allowed; the resulting streams cannot be split again)
  3. side-output splitting (recommended)

Scenario

 

Input data:

 

{"key":"001","type":"1","data":"data1"}
{"key":"001","type":"11","data":"data11"}
{"key":"001","type":"12","data":"data12"}
{"key":"002","type":"2","data":"data2"}
{"key":"002","type":"21","data":"data21"}
{"key":"002","type":"22","data":"data22"}

Processing in the operator: parse each line into a JSONObject and add a stream-tag field to it.

The pom needs the fastjson dependency (see a fastjson tutorial):

 

<dependency>
  <groupId>com.alibaba</groupId>
  <artifactId>fastjson</artifactId>
  <version>1.2.60</version>
</dependency>
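All three approaches below implement the same routing rule: a record's "type" field determines its top-level stream. As a plain-Java sketch of that rule, independent of Flink (the `route` helper and the class name are made up for illustration, not part of the original code):

```java
import java.util.Optional;

public class RouteByType {

    // Map a record's "type" field to the name of its top-level stream,
    // mirroring the predicates used by all three splitting approaches.
    static Optional<String> route(String type) {
        switch (type) {
            case "1": case "11": case "12":
                return Optional.of("stream1");
            case "2": case "21": case "22":
                return Optional.of("stream2");
            default:
                return Optional.empty(); // unmatched records go to no stream
        }
    }

    public static void main(String[] args) {
        System.out.println(route("11").orElse("none")); // stream1
        System.out.println(route("22").orElse("none")); // stream2
        System.out.println(route("99").orElse("none")); // none
    }
}
```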

The expected output (the `N>` prefix is the index of the parallel subtask that printed the record; the exact numbers depend on your parallelism):

11> {"data":"data1","stream":"stream1","type":"1","key":"001"}
12> {"data":"data11","stream":"stream1","type":"11","key":"001"}
12> {"data":"data11","stream":"stream11","type":"11","key":"001"}
1> {"data":"data12","stream":"stream1","type":"12","key":"001"}
1> {"data":"data12","stream":"stream12","type":"12","key":"001"}
2> {"data":"data2","stream":"stream2","type":"2","key":"002"}
3> {"data":"data21","stream":"stream2","type":"21","key":"002"}
3> {"data":"data21","stream":"stream21","type":"21","key":"002"}
4> {"data":"data22","stream":"stream2","type":"22","key":"002"}
4> {"data":"data22","stream":"stream22","type":"22","key":"002"}

Data-sending client

Data is sent with the `nc` (netcat) program.
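For reference, the sample records can be generated and served to the job roughly like this (a sketch; the `gen_input` function name is made up for illustration):

```shell
#!/bin/sh
# Emit the sample records, one JSON object per line.
gen_input() {
  printf '%s\n' \
    '{"key":"001","type":"1","data":"data1"}' \
    '{"key":"001","type":"11","data":"data11"}' \
    '{"key":"002","type":"2","data":"data2"}'
}

# To serve the records on port 9000 for the job's socketTextStream source:
#   gen_input | nc -lk 9000
gen_input
```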

Filter-based splitting

Implementation:

 

import com.alibaba.fastjson.JSONObject;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FilterStreamTest {

    public static final String TYPE = "type";

    public static void main(String[] args) throws Exception {

        // get the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        final ParameterTool params = ParameterTool.fromArgs(args);
        String hostName = params.get("hostname", "localhost");
        int port = params.getInt("port", 9000);

        DataStream<String> sourceStream = env.socketTextStream(hostName, port, "\n");

        SingleOutputStreamOperator<JSONObject> jsonObjectStream = sourceStream.map(s -> JSONObject.parseObject(s));

        DataStream<JSONObject> stream1 = jsonObjectStream
                .filter(e -> e.get(TYPE).equals("1") || e.get(TYPE).equals("11") || e.get(TYPE).equals("12"));
        DataStream<JSONObject> stream11 = stream1.filter(e -> e.get(TYPE).equals("11"));
        DataStream<JSONObject> stream12 = stream1.filter(e -> e.get(TYPE).equals("12"));

        DataStream<JSONObject> stream2 = jsonObjectStream
                .filter(e -> e.get(TYPE).equals("2") || e.get(TYPE).equals("21") || e.get(TYPE).equals("22"));
        DataStream<JSONObject> stream21 = stream2.filter(e -> e.get(TYPE).equals("21"));
        DataStream<JSONObject> stream22 = stream2.filter(e -> e.get(TYPE).equals("22"));

        // tag each stream and print it
        stream1.map(e -> {e.put("stream", "stream1");return e;}).print();
        stream11.map(e -> {e.put("stream", "stream11");return e;}).print();
        stream12.map(e -> {e.put("stream", "stream12");return e;}).print();
        stream2.map(e -> {e.put("stream", "stream2");return e;}).print();
        stream21.map(e -> {e.put("stream", "stream21");return e;}).print();
        stream22.map(e -> {e.put("stream", "stream22");return e;}).print();

        env.execute("SocketStreamTest");
    }
}

Output (matches the expected result). Note that every `filter` re-evaluates each record of its input stream, so the data is scanned once per branch, which is the main cost of this approach:

11> {"data":"data1","stream":"stream1","type":"1","key":"001"}
12> {"data":"data11","stream":"stream1","type":"11","key":"001"}
12> {"data":"data11","stream":"stream11","type":"11","key":"001"}
1> {"data":"data12","stream":"stream1","type":"12","key":"001"}
1> {"data":"data12","stream":"stream12","type":"12","key":"001"}
2> {"data":"data2","stream":"stream2","type":"2","key":"002"}
3> {"data":"data21","stream":"stream2","type":"21","key":"002"}
3> {"data":"data21","stream":"stream21","type":"21","key":"002"}
4> {"data":"data22","stream":"stream2","type":"22","key":"002"}
4> {"data":"data22","stream":"stream22","type":"22","key":"002"}

Split-based splitting

Implementation:

 

import java.util.ArrayList;
import java.util.List;

import com.alibaba.fastjson.JSONObject;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.collector.selector.OutputSelector;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.datastream.SplitStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SplitStreamTest {

    public static final String TYPE = "type";

    public static void main(String[] args) throws Exception {

        // get the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        final ParameterTool params = ParameterTool.fromArgs(args);
        String hostName = params.get("hostname", "localhost");
        int port = params.getInt("port", 9000);

        DataStream<String> sourceStream = env.socketTextStream(hostName, port, "\n");

        SingleOutputStreamOperator<JSONObject> jsonObjectStream = sourceStream.map(s -> JSONObject.parseObject(s));

        // Flink's split/select API has been deprecated
        SplitStream<JSONObject> splitStream = jsonObjectStream.split(new OutputSelector<JSONObject>() {
            @Override
            public Iterable<String> select(JSONObject jsonObject) {
                List<String> tags = new ArrayList<>();
                String type = jsonObject.get("type").toString();
                if (type.equals("1") || type.equals("11") || type.equals("12")) {
                    tags.add("stream1");
                } else if (type.equals("2") || type.equals("21") || type.equals("22")) {
                    tags.add("stream2");
                }
                return tags;
            }
        });

        DataStream<JSONObject> stream1 = splitStream.select("stream1");

        DataStream<JSONObject> stream2 = splitStream.select("stream2");

        SplitStream<JSONObject> splitStream1 = stream1.split(new OutputSelector<JSONObject>() {
            @Override
            public Iterable<String> select(JSONObject jsonObject) {
                List<String> tags = new ArrayList<>();
                String type = jsonObject.get("type").toString();
                if (type.equals("11")) {
                    tags.add("stream11");
                } else if (type.equals("12")) {
                    tags.add("stream12");
                }
                return tags;
            }
        });

        SplitStream<JSONObject> splitStream2 = stream2.split(new OutputSelector<JSONObject>() {
            @Override
            public Iterable<String> select(JSONObject jsonObject) {
                List<String> tags = new ArrayList<>();
                String type = jsonObject.get("type").toString();
                if (type.equals("21")) {
                    tags.add("stream21");
                } else if (type.equals("22")) {
                    tags.add("stream22");
                }
                return tags;
            }
        });

        DataStream<JSONObject> stream11 = splitStream1.select("stream11");
        DataStream<JSONObject> stream12 = splitStream1.select("stream12");
        DataStream<JSONObject> stream21 = splitStream2.select("stream21");
        DataStream<JSONObject> stream22 = splitStream2.select("stream22");

        // tag each stream and print it
        stream1.map(e -> {e.put("stream", "stream1");return e;}).print();
        stream11.map(e -> {e.put("stream", "stream11");return e;}).print();
        stream12.map(e -> {e.put("stream", "stream12");return e;}).print();
        stream2.map(e -> {e.put("stream", "stream2");return e;}).print();
        stream21.map(e -> {e.put("stream", "stream21");return e;}).print();
        stream22.map(e -> {e.put("stream", "stream22");return e;}).print();

        env.execute("SocketStreamTest");
    }

}

Output:

Before Flink 1.9 this runs, but the second split simply has no effect; from Flink 1.9 on, a consecutive split fails immediately with an error telling you to use side outputs (and the split API was removed entirely in Flink 1.12):

Exception in thread "main" java.lang.IllegalStateException: Consecutive multiple splits are not supported. Splits are deprecated. Please use side-outputs.

Side-output splitting (recommended)

Steps:

  1. Define an OutputTag for each side stream
  2. Emit records to the side outputs from a process-style operator; side outputs are available in:
    1. ProcessFunction
    2. KeyedProcessFunction
    3. CoProcessFunction
    4. ProcessWindowFunction
    5. ProcessAllWindowFunction

Implementation:

 

import com.alibaba.fastjson.JSONObject;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.apache.flink.util.OutputTag;

public class SideOutputTest {

    public static final String TYPE = "type";

    public static void main(String[] args) throws Exception{
        // get the execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        final ParameterTool params = ParameterTool.fromArgs(args);
        String hostName = params.get("hostname", "localhost");
        int port = params.getInt("port", 9000);

        DataStream<String> sourceStream = env.socketTextStream(hostName, port, "\n");

        SingleOutputStreamOperator<JSONObject> jsonObjectStream = sourceStream.map(s -> JSONObject.parseObject(s));

        // define the OutputTags (anonymous subclasses so the type information is preserved)
        OutputTag<JSONObject> outputTag1 = new OutputTag<JSONObject>("stream1") {};
        OutputTag<JSONObject> outputTag2 = new OutputTag<JSONObject>("stream2") {};

        // every record leaves via a side output; nothing is emitted to the main output
        SingleOutputStreamOperator<JSONObject> outputStream = jsonObjectStream.process(new ProcessFunction<JSONObject, JSONObject>() {

            @Override
            public void processElement(JSONObject jsonObject, Context context, Collector<JSONObject> collector)
                    throws Exception {
                String type = jsonObject.getString(TYPE);
                if (type.equals("1") || type.equals("11") || type.equals("12")) {
                    context.output(outputTag1, jsonObject);
                } else if (type.equals("2") || type.equals("21") || type.equals("22")) {
                    context.output(outputTag2, jsonObject);
                }
            }
        });

        DataStream<JSONObject> stream1 = outputStream.getSideOutput(outputTag1);
        DataStream<JSONObject> stream2 = outputStream.getSideOutput(outputTag2);

        // define the OutputTags for the second-level split of stream1
        OutputTag<JSONObject> outputTag11 = new OutputTag<JSONObject>("stream11") {};
        OutputTag<JSONObject> outputTag12 = new OutputTag<JSONObject>("stream12") {};

        SingleOutputStreamOperator<JSONObject> outputStream1 = stream1.process(new ProcessFunction<JSONObject, JSONObject>() {

            @Override
            public void processElement(JSONObject jsonObject, Context context, Collector<JSONObject> collector)
                    throws Exception {
                String type = jsonObject.getString(TYPE);
                if (type.equals("11")) {
                    context.output(outputTag11, jsonObject);
                } else if (type.equals("12")) {
                    context.output(outputTag12, jsonObject);
                }
            }
        });

        DataStream<JSONObject> stream11 = outputStream1.getSideOutput(outputTag11);
        DataStream<JSONObject> stream12 = outputStream1.getSideOutput(outputTag12);

        // define the OutputTags for the second-level split of stream2
        OutputTag<JSONObject> outputTag21 = new OutputTag<JSONObject>("stream21") {};
        OutputTag<JSONObject> outputTag22 = new OutputTag<JSONObject>("stream22") {};

        SingleOutputStreamOperator<JSONObject> outputStream2 = stream2.process(new ProcessFunction<JSONObject, JSONObject>() {

            @Override
            public void processElement(JSONObject jsonObject, Context context, Collector<JSONObject> collector)
                    throws Exception {
                String type = jsonObject.getString(TYPE);
                if (type.equals("21")) {
                    context.output(outputTag21, jsonObject);
                } else if (type.equals("22")) {
                    context.output(outputTag22, jsonObject);
                }
            }
        });

        DataStream<JSONObject> stream21 = outputStream2.getSideOutput(outputTag21);
        DataStream<JSONObject> stream22 = outputStream2.getSideOutput(outputTag22);

        // tag each stream and print it
        stream1.map(e -> {e.put("stream", "stream1");return e;}).print();
        stream11.map(e -> {e.put("stream", "stream11");return e;}).print();
        stream12.map(e -> {e.put("stream", "stream12");return e;}).print();
        stream2.map(e -> {e.put("stream", "stream2");return e;}).print();
        stream21.map(e -> {e.put("stream", "stream21");return e;}).print();
        stream22.map(e -> {e.put("stream", "stream22");return e;}).print();

        env.execute("SocketStreamTest");
    }
}

Output (matches the expected result):

3> {"data":"data1","stream":"stream1","type":"1","key":"001"}
4> {"data":"data11","stream":"stream11","type":"11","key":"001"}
4> {"data":"data11","stream":"stream1","type":"11","key":"001"}
5> {"data":"data12","stream":"stream12","type":"12","key":"001"}
5> {"data":"data12","stream":"stream1","type":"12","key":"001"}
6> {"data":"data2","stream":"stream2","type":"2","key":"002"}
7> {"data":"data21","stream":"stream21","type":"21","key":"002"}
7> {"data":"data21","stream":"stream2","type":"21","key":"002"}
8> {"data":"data22","stream":"stream22","type":"22","key":"002"}
8> {"data":"data22","stream":"stream2","type":"22","key":"002"}



Author: 蜗牛写java
Link: https://www.jianshu.com/p/274d0b78d378
Source: 简书 (Jianshu)
The copyright belongs to the author. For commercial reuse, contact the author for authorization; for non-commercial reuse, credit the source.

