Troubleshooting: Flink SQL "Could not find a suitable table factory"

While learning Flink SQL I ran an example that works fine locally but fails when submitted to the Flink cluster.
The error:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.

Reason: Required context properties mismatch.

The matching candidates:
org.apache.flink.table.sources.CsvAppendTableSourceFactory
Mismatched properties:
'connector.type' expects 'filesystem', but is 'kafka'
'format.type' expects 'csv', but is 'json'

The job consumes data from Kafka, aggregates it in tumbling windows, and writes the results to MySQL.

/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package myflink;

import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.*;
import org.apache.flink.table.api.java.StreamTableEnvironment;

/**
 * Skeleton for a Flink Streaming Job.
 * <p>
 * <p>For a tutorial how to write a Flink streaming application, check the
 * tutorials and examples on the <a href="https://flink.apache.org/docs/stable/">Flink Website</a>.
 * <p>
 * <p>To package your application into a JAR file for execution, run
 * 'mvn clean package' on the command line.
 * <p>
 * <p>If you change the name of the main class (with the public static void main(String[] args))
 * method, change the respective entry in the POM.xml file (simply search for 'mainClass').
 */
public class StreamingJob {
    public static void main(String[] args) throws Exception {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
//        TableEnvironment tEnv = TableEnvironment.create(settings);

        StreamExecutionEnvironment bsEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        bsEnv.setParallelism(1);
        bsEnv.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(bsEnv, settings);

        String ddl = "CREATE TABLE user_log (\n" +
                "    userId BIGINT,\n" +
                "    itemId BIGINT,\n" +
                "    categoryId BIGINT,\n" +
                "    behavior STRING,\n" +
                "    ts TIMESTAMP(3),\n" +
                "    t as TO_TIMESTAMP(FROM_UNIXTIME(itemId / 1000,'yyyy-MM-dd HH:mm:ss'))," +
                "    proctime as PROCTIME(),\n" +
                "    WATERMARK FOR t as t - INTERVAL '0.001' SECOND \n" +
                ") WITH (\n" +
                "    'connector.type' = 'kafka',\n" +
                "    'connector.version' = 'universal',\n" +
                "    'connector.topic' = 'myTest1',\n" +
                "    'connector.startup-mode' = 'latest-offset',\n" +
                "    'connector.properties.0.key' = 'zookeeper.connect',\n" +
                "    'connector.properties.0.value' = 'localhost:2181',\n" +
                "    'connector.properties.1.key' = 'bootstrap.servers',\n" +
                "    'connector.properties.1.value' = 'localhost:9092',\n" +
                "    'update-mode' = 'append',\n" +
                "    'format.type' = 'json'\n" +
//                "    'format.derive-schema' = 'true'\n" +
                ")";
        tableEnv.sqlUpdate(ddl);


        String ddlMysql = "CREATE TABLE pvuv_sink (\n" +
                "    dt TIMESTAMP(3),\n" +
                "    pv BIGINT,\n" +
                "    uv BIGINT\n" +
                ") WITH (\n" +
                "    'connector.type' = 'jdbc',\n" +
                "    'connector.url' = 'jdbc:mysql://localhost:3306/dota2',\n" +
                "    'connector.table' = 'pvuv_sink',\n" +
                "    'connector.username' = 'root',\n" +
                "    'connector.password' = '',\n" +
                "    'connector.write.flush.max-rows' = '1'\n" +
                ")";
        tableEnv.sqlUpdate(ddlMysql);

        String dml = "INSERT INTO pvuv_sink \n" +
                "SELECT\n" +
                "  TUMBLE_START(t, INTERVAL '1' MINUTE) AS dt,\n" +
                "  count(categoryId) AS pv,\n" +
                "  userId AS uv\n" +
                "FROM user_log GROUP BY TUMBLE(t, INTERVAL '1' MINUTE), userId";
        tableEnv.sqlUpdate(dml);
        tableEnv.execute("SQL Job");
    }
}
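The DDLs above use the kafka, json, and jdbc connectors, so the job jar needs the matching connector and format dependencies in addition to the Table API ones. A minimal sketch of the pom.xml entries for Flink 1.10 / Scala 2.11 (versions and the MySQL driver version are assumptions; adjust to your setup):

```xml
<!-- Kafka source connector ('connector.type' = 'kafka') -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka_2.11</artifactId>
  <version>${flink.version}</version>
</dependency>
<!-- JSON format ('format.type' = 'json') -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-json</artifactId>
  <version>${flink.version}</version>
</dependency>
<!-- JDBC sink ('connector.type' = 'jdbc'); in Flink 1.10 the artifact is still flink-jdbc -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-jdbc_2.11</artifactId>
  <version>${flink.version}</version>
</dependency>
<!-- MySQL driver for the JDBC sink (version is an assumption) -->
<dependency>
  <groupId>mysql</groupId>
  <artifactId>mysql-connector-java</artifactId>
  <version>5.1.48</version>
</dependency>
```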

After a lot of digging, it turned out that the packaged JAR was missing something.
It is related to Java's SPI (ServiceLoader) mechanism.
https://ci.apache.org/projects/flink/flink-docs-release-1.10/zh/dev/table/connect.html

Quick workaround (treats the symptom):
Manually edit the service file inside the packaged JAR (org.apache.flink.table.factories.TableFactory under META-INF/services) and add the two missing factory entries.
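For reference, given the kafka/json properties in the DDL, the two missing entries are most likely the following class names from the Flink 1.10 Kafka connector and JSON format modules (stated here as my best guess, since the original screenshot is unavailable):

```
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
org.apache.flink.formats.json.JsonRowFormatFactory
```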
Proper fix (treats the root cause):
Update 2020-05-07
Thanks to a tip from XiangYida:

https://stackoverflow.com/questions/52500048/flink-could-not-find-a-suitable-table-factory-for-org-apache-flink-table-facto

This Stack Overflow question has the solution.
Add the following to pom.xml:

<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
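For context, the transformer goes inside the maven-shade-plugin configuration of the job's pom.xml; a sketch (plugin version is an assumption):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <transformers>
          <!-- Merge all META-INF/services files from the dependency jars
               instead of keeping only the first one encountered -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```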

The root cause is SPI-related, as noted above: without this transformer, the shade plugin keeps only one of the identically named META-INF/services files from the dependency jars instead of merging them, so some factories never get registered.
Inspecting the rebuilt jar confirms that org.apache.flink.table.factories.TableFactory now contains the merged factory entries.

Problem solved!
