Pitfalls of Submitting FlinkSQL Jobs to YARN

    I recently needed to build a FlinkSQL execution engine: the front end only writes SQL, while the back end automatically submits the FlinkSQL job to a YARN cluster. Since this was my first time working with FlinkSQL and I hadn't studied it in depth, I ran into quite a few problems along the way. One of them took me a long time to figure out, and I suspect many people new to FlinkSQL will hit it too, so I'm sharing the solution here.

    I had the FlinkSQL Kafka-to-MySQL example running locally, and then tried to submit it to the YARN cluster with the flink run command, only to hit the following error:

------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: findAndCreateTableSource failed.
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
        at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
        at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:662)
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
        at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:893)
        at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:966)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
        at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:966)
Caused by: org.apache.flink.table.api.TableException: findAndCreateTableSource failed.
        at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:55)
        at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:92)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.findAndCreateTableSource(CatalogSourceTable.scala:162)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.tableSource$lzycompute(CatalogSourceTable.scala:65)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.tableSource(CatalogSourceTable.scala:65)
        at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.scala:82)
        at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3328)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2357)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2051)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2005)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:646)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:627)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3181)
        at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:563)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:148)
        at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:135)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:535)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:439)
        at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:157)
        at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:66)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:464)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:321)
        ... 11 more
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.

Reason: Required context properties mismatch.

The matching candidates:
org.apache.flink.table.sources.CsvAppendTableSourceFactory
Mismatched properties:
'connector.type' expects 'filesystem', but is 'kafka'
'format.type' expects 'csv', but is 'json'

The following properties are requested:
connector.properties.bootstrap.servers=172.31.100.21:9092
connector.properties.zookeeper.connect=172.31.96.26:2181
connector.startup-mode=earliest-offset
connector.topic=test1
connector.type=kafka
connector.version=universal
format.derive-schema=true
format.type=json
schema.0.data-type=VARCHAR(2147483647)
schema.0.name=address
schema.1.data-type=VARCHAR(2147483647)
schema.1.name=age
schema.2.data-type=VARCHAR(2147483647)
schema.2.name=city
schema.3.data-type=VARCHAR(2147483647)
schema.3.name=name
update-mode=append
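For context, the "requested properties" in the error correspond to a Kafka source table defined with the legacy `connector.*` DDL syntax. The following is a hypothetical reconstruction from the property list (the table name is made up; the field names and options come straight from the error output):

```sql
-- Hypothetical reconstruction of the source table behind the properties above
CREATE TABLE user_source (
    address VARCHAR,
    age     VARCHAR,
    city    VARCHAR,
    name    VARCHAR
) WITH (
    'connector.type' = 'kafka',
    'connector.version' = 'universal',
    'connector.topic' = 'test1',
    'connector.properties.zookeeper.connect' = '172.31.96.26:2181',
    'connector.properties.bootstrap.servers' = '172.31.100.21:9092',
    'connector.startup-mode' = 'earliest-offset',
    'format.type' = 'json',
    'format.derive-schema' = 'true',
    'update-mode' = 'append'
);
```

The error is not complaining about this DDL itself: it says no table factory matching `connector.type=kafka` plus `format.type=json` could be discovered on the classpath, which points at a packaging problem rather than a SQL problem.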

At first I assumed it was a jar conflict, but no amount of excluding dependencies made the error go away. Eventually I found the fix, which has two parts:

1. First, exclude from your application jar any dependencies that duplicate jars already present under the lib directory of the Flink installation.

2. Change the Maven packaging configuration as shown below. The key part is the ServicesResourceTransformer: the JDK's service discovery mechanism is based on the META-INF/services/ directory, and when several jars provide implementations of the same interface, this transformer merges their service files instead of letting one jar's file overwrite the others during shading.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <filters>
                    <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                            <exclude>META-INF/*.SF</exclude>
                            <exclude>META-INF/*.DSA</exclude>
                            <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                    </filter>
                </filters>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
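This also explains the NoMatchingTableFactoryException: without the ServicesResourceTransformer, only one dependency's META-INF/services/org.apache.flink.table.factories.TableFactory file survives in the shaded jar, so the Kafka and JSON factories are never discovered. After merging, the service file lists one factory class per line, roughly like this (class names are illustrative of Flink 1.10's Kafka connector and JSON format modules):

```
# META-INF/services/org.apache.flink.table.factories.TableFactory (merged)
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
org.apache.flink.formats.json.JsonRowFormatFactory
```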
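Step 1 can be handled in the pom as well: mark the dependencies that Flink already ships in its lib directory as provided, so they are compiled against but not bundled into the shaded jar. A sketch (the artifact IDs and version below are examples only; match them to the jars in your own Flink installation):

```xml
<!-- Example only: align artifactId/version with the jars in flink/lib -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.11</artifactId>
    <version>1.10.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.11</artifactId>
    <version>1.10.0</version>
    <scope>provided</scope>
</dependency>
```

Connector and format jars (Kafka, JSON) usually are not in flink/lib, so those still belong in the shaded jar at compile scope.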
