
The full error message: Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: This type (GenericType&lt;cn.yan.streaming.TestSocketWordCount.WordCount&gt;) cannot be used as key.

Here is my JavaBean:

// An inner POJO class
    public static class WordCount {
        public String word;
        public long count;



        public WordCount(String word, long count) {
            this.word = word;
            this.count = count;
        }

        @Override
        public String toString() {
            return "WordCount{" +
                    "word='" + word + '\'' +
                    ", count=" + count +
                    '}';
        }
    }

Note that code like this usually looks perfectly fine, but while writing Flink test code today I ran into this error: the WordCount class was missing a no-arg constructor. Without a public no-arg constructor, Flink's type extractor cannot recognize the class as a POJO, so it falls back to GenericType (serialized with Kryo), and a GenericType cannot be keyed by field expressions such as keyBy("word"). After adding the no-arg constructor the error disappeared. A whole tragedy triggered by a single missing no-arg constructor; a hard-earned lesson.
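To see why Flink cares, note that its type extractor relies on reflection to find a public no-arg constructor; when none exists, the class is classified as a GenericType instead of a POJO. The classes below (WithoutDefault, WithDefault) are hypothetical, stdlib-only stand-ins used purely to demonstrate the reflection behavior:

```java
import java.lang.reflect.Constructor;

public class PojoCheck {
    // Like the buggy WordCount: only a parameterized constructor is declared,
    // so the compiler does NOT generate a default one
    static class WithoutDefault {
        public String word;
        public WithoutDefault(String word) { this.word = word; }
    }

    // Same class with an explicit no-arg constructor added
    static class WithDefault {
        public String word;
        public WithDefault() {}
        public WithDefault(String word) { this.word = word; }
    }

    // Mirrors the reflection check a type extractor performs
    static boolean hasNoArgConstructor(Class<?> clazz) {
        try {
            clazz.getDeclaredConstructor(); // throws if no no-arg constructor exists
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(hasNoArgConstructor(WithoutDefault.class)); // false
        System.out.println(hasNoArgConstructor(WithDefault.class));    // true
    }
}
```

Running this prints false for the first class and true for the second, which is exactly the difference between the failing and working versions of WordCount below.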

My full code is below:

package cn.yan.streaming;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.util.Collector;

/**
 * Flink word count over a socket stream
 */
public class TestSocketWordCount {

    public static void main(String[] args) throws Exception {
        // Get the port number (from the input arguments)
        int port;
        try {
            ParameterTool tool = ParameterTool.fromArgs(args);
            port = tool.getInt("port");
        } catch (Exception e) {
            port = 9000;
        }


        // Get the Flink execution environment
        StreamExecutionEnvironment environment = StreamExecutionEnvironment.getExecutionEnvironment();
        // Define the hostname (the IP of the server providing the socket source)
        String hostname = "192.168.1.13";
        // Delimiter
        String delimiter = "\n";
        // Get the data source: read data over a socket
        DataStreamSource<String> text = environment.socketTextStream(hostname, port, delimiter);

        // Apply the operators
        DataStream<WordCount> results = text.flatMap(new FlatMapFunction<String, WordCount>() {

            @Override
            public void flatMap(String value, Collector<WordCount> collector) {
                String[] splits = value.split("\\s");
                for (String word : splits) {
                    collector.collect(new WordCount(word, 1L));
                }
            }
        }).keyBy("word")
                // Every second, emit the aggregate over the previous two seconds
                .timeWindow(Time.seconds(2), Time.seconds(1))
                .sum("count"); // aggregation can be done with sum(), or with reduce() instead
//                .reduce(new ReduceFunction<WordCount>() {
//                    @Override
//                    public WordCount reduce(WordCount t1, WordCount t2) throws Exception {
//                        return new WordCount(t1.word, t1.count + t2.count);
//                    }
//                });


        // Print the data to the console and set the parallelism
        results.print().setParallelism(1);
        // Call execute to run the program
        environment.execute("socket");
    }

    // An inner POJO class (missing the no-arg constructor, which causes the error)
    public static class WordCount {
        public String word;
        public long count;



        public WordCount(String word, long count) {
            this.word = word;
            this.count = count;
        }

        @Override
        public String toString() {
            return "WordCount{" +
                    "word='" + word + '\'' +
                    ", count=" + count +
                    '}';
        }
    }
}

The correct code is:

package cn.yan.streaming;

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.util.Collector;

/**
 * Flink word count over a socket stream
 */
public class TestSocketWordCount {

    public static void main(String[] args) throws Exception {
        // Get the port number (from the input arguments)
        int port;
        try {
            ParameterTool tool = ParameterTool.fromArgs(args);
            port = tool.getInt("port");
        } catch (Exception e) {
            port = 9000;
        }


        // Get the Flink execution environment
        StreamExecutionEnvironment environment = StreamExecutionEnvironment.getExecutionEnvironment();
        // Define the hostname (the IP of the server providing the socket source)
        String hostname = "192.168.1.13";
        // Delimiter
        String delimiter = "\n";
        // Get the data source: read data over a socket
        DataStreamSource<String> text = environment.socketTextStream(hostname, port, delimiter);

        // Apply the operators
        DataStream<WordCount> results = text.flatMap(new FlatMapFunction<String, WordCount>() {

            @Override
            public void flatMap(String value, Collector<WordCount> collector) {
                String[] splits = value.split("\\s");
                for (String word : splits) {
                    collector.collect(new WordCount(word, 1L));
                }
            }
        }).keyBy("word")
                // Every second, emit the aggregate over the previous two seconds
                .timeWindow(Time.seconds(2), Time.seconds(1))
                .sum("count"); // aggregation can be done with sum(), or with reduce() instead
//                .reduce(new ReduceFunction<WordCount>() {
//                    @Override
//                    public WordCount reduce(WordCount t1, WordCount t2) throws Exception {
//                        return new WordCount(t1.word, t1.count + t2.count);
//                    }
//                });


        // Print the data to the console and set the parallelism
        results.print().setParallelism(1);
        // Call execute to run the program
        environment.execute("socket");
    }

    // An inner POJO class (now with the required no-arg constructor)
    public static class WordCount {
        public String word;
        public long count;


        public WordCount() {
        }

        public WordCount(String word, long count) {
            this.word = word;
            this.count = count;
        }

        @Override
        public String toString() {
            return "WordCount{" +
                    "word='" + word + '\'' +
                    ", count=" + count +
                    '}';
        }
    }
}
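Why does the serializer need the no-arg constructor at all? When deserializing a POJO, Flink first creates an empty instance and then populates its public fields; without a no-arg constructor there is no way to create that empty instance. The sketch below mimics that two-step reconstruction with only the JDK's reflection API (the WordCount class here mirrors the corrected one above; this is an illustration, not Flink's actual serializer code):

```java
import java.lang.reflect.Field;

public class PojoRebuild {
    public static class WordCount {
        public String word;
        public long count;
        public WordCount() {}  // required: reconstruction starts from an empty instance
        public WordCount(String word, long count) { this.word = word; this.count = count; }
    }

    // Mimics POJO deserialization: instantiate via the no-arg constructor,
    // then set the public fields one by one
    public static WordCount rebuild(String word, long count) throws Exception {
        WordCount wc = WordCount.class.getDeclaredConstructor().newInstance();
        Field wordField = WordCount.class.getField("word");
        wordField.set(wc, word);
        Field countField = WordCount.class.getField("count");
        countField.setLong(wc, count);
        return wc;
    }

    public static void main(String[] args) throws Exception {
        WordCount wc = rebuild("flink", 3L);
        System.out.println(wc.word + "=" + wc.count); // flink=3
    }
}
```

If you delete the no-arg constructor from this WordCount, the getDeclaredConstructor() call in rebuild throws NoSuchMethodException, which is the same structural problem Flink hits.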

As the screenshot showed, the program ran normally after startup.

Always add the no-arg constructor!

Always add the no-arg constructor!

Always add the no-arg constructor!

Important things deserve saying three times!!!
