flink cdc 集成mysql

Running flink.cdc.SQL_02 from the IDE fails; the tail of the launch command and the stack trace are shown below:

3.8.1\repository\org\apache\lucene\lucene-analyzers-common\4.7.2\lucene-analyzers-common-4.7.2.jar" flink.cdc.SQL_02
Exception in thread "main" org.apache.flink.table.api.ValidationException: Unable to create a source for reading table 'default_catalog.default_database.flink_test01'.

Table options are:

'connector'='mysql-cdc'
'database-name'='flink_test'
'hostname'='10.108.6.218'
'password'='123456'
'port'='3306'
'table-name'='test01'
'username'='root'
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:137)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.createDynamicTableSource(CatalogSourceTable.java:116)
at org.apache.flink.table.planner.plan.schema.CatalogSourceTable.toRel(CatalogSourceTable.java:82)
at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:169)
at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:161)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:989)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:958)
at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:283)
at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:101)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:704)
at flink.cdc.SQL_02.main(SQL_02.java:30)
Caused by: java.lang.NoSuchMethodError: org.apache.flink.table.factories.DynamicTableFactory$Context.getCatalogTable()Lorg/apache/flink/table/catalog/CatalogTable;
at com.alibaba.ververica.cdc.connectors.mysql.table.MySQLTableSourceFactory.createDynamicTableSource(MySQLTableSourceFactory.java:144)
at org.apache.flink.table.factories.FactoryUtil.createTableSource(FactoryUtil.java:134)
… 19 more

Process finished with exit code 1

Cause: the job was running with flink-connector-mysql-cdc-1.3.0.jar, while my Flink version is 1.13.2. After some digging it turns out that Flink 1.13 needs the CDC connector at version 1.4.0 or later, so the old jar triggers the NoSuchMethodError above.
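After upgrading the connector to 1.4.0 the table source can be created again. Below is a minimal sketch of what an SQL_02-style job looks like with that setup; the class name and the table options are taken from the error output above, while the dependency coordinates in the comment, the column list, and the print sink are my own assumptions for illustration, not the original code.

// Assumed Maven coordinates after the fix (not shown in the original post):
//   org.apache.flink:flink-table-planner-blink_2.11:1.13.2
//   com.alibaba.ververica:flink-connector-mysql-cdc:1.4.0
package flink.cdc;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class SQL_02 {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Same WITH options as in the stack trace above; the column list is a
        // placeholder because the real schema of test01 is not shown in the post.
        tEnv.executeSql(
            "CREATE TABLE flink_test01 (" +
            "  id INT," +
            "  name STRING," +
            "  PRIMARY KEY (id) NOT ENFORCED" +
            ") WITH (" +
            "  'connector' = 'mysql-cdc'," +
            "  'hostname' = '10.108.6.218'," +
            "  'port' = '3306'," +
            "  'username' = 'root'," +
            "  'password' = '123456'," +
            "  'database-name' = 'flink_test'," +
            "  'table-name' = 'test01'" +
            ")");

        // Continuously print the change stream captured from the MySQL binlog.
        tEnv.executeSql("SELECT * FROM flink_test01").print();
    }
}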

Flink CDC is a tool for capturing data changes from the MySQL binlog. By adding the Flink CDC jar and writing a main method, you can start pulling change events from a specified position: the call .startupOptions(StartupOptions.specificOffset("mysql-bin.000013", 1260)) tells the connector to begin reading from that binlog file and offset. Note that Flink CDC 1.4.0 supports the specificOffset startup mode, whereas at the time of testing the newer versions did not yet support it. Java, a language that consistently sits at the top of global rankings, underpins much of the big-data stack: Hadoop, Spark, and Flink are all written in Java or Scala. Writing Flink CDC jobs in Java therefore integrates naturally with that ecosystem when capturing the MySQL binlog. [1][2][3] A sketch of the specificOffset usage follows the references below.

References:
[1][2] FlinkCdcMysql 指定的binlog日志offsetPos位置开始读取数据 (reading from a specified binlog offsetPos): https://blog.csdn.net/shy_snow/article/details/122879590
[3] Java + 数组 + 初始化 (Java + arrays + initialization): https://download.csdn.net/download/weixin_51202460/88254379
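For completeness, here is a minimal DataStream-style sketch of starting from a specific binlog offset with the 1.4.x connector, as described in [1][2]. The binlog file name and position come from the quote above, and the connection settings reuse the table options from the error log; the class name, the String deserializer, and the print sink are assumptions for illustration only.

import com.alibaba.ververica.cdc.connectors.mysql.MySQLSource;
import com.alibaba.ververica.cdc.connectors.mysql.table.StartupOptions;
import com.alibaba.ververica.cdc.debezium.StringDebeziumDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

public class BinlogOffsetDemo { // hypothetical class name
    public static void main(String[] args) throws Exception {
        SourceFunction<String> source = MySQLSource.<String>builder()
                .hostname("10.108.6.218")
                .port(3306)
                .databaseList("flink_test")
                .tableList("flink_test.test01")
                .username("root")
                .password("123456")
                // Start reading from the given binlog file and offset
                // instead of taking an initial snapshot.
                .startupOptions(StartupOptions.specificOffset("mysql-bin.000013", 1260))
                .deserializer(new StringDebeziumDeserializationSchema())
                .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.addSource(source).print();
        env.execute("mysql-cdc-specific-offset");
    }
}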
