Flink SQL 1.12.1: writing data from Kafka to MySQL — NoClassDefFoundError troubleshooting

1. Background

Using Flink SQL 1.12.1 to implement the pipeline: Kafka --> ETL --> MySQL

create table if not exists kafka_ods_trainlog (
    `trainid` BIGINT,
    `servenumber` BIGINT,
    `ordernumber` BIGINT,
    `stationid` BIGINT,
    `stopareaid` BIGINT,
    `isstopped` BOOLEAN,
    `direction` STRING,  -- a bare VARCHAR means VARCHAR(1) in Flink SQL; use STRING for unbounded text
    `schedulediff` BIGINT,
    `trainarrflg` BIGINT,
    `ts` TIMESTAMP(3)
    --`ts2` as TO_TIMESTAMP(FROM_UNIXTIME(UNIX_TIMESTAMP(ts,'yyyy-MM-dd HH:mm:ss')/1000,'yyyy-MM-dd HH:mm:ss')),
    --WATERMARK FOR ts AS ts - INTERVAL '20' SECOND  -- the interval literal must be quoted
)
with (
    'connector' = 'kafka',
    'topic' = 'ods_trainlog',
    'properties.bootstrap.servers' = 'bj-xxxx.com-06:9092',
    'properties.group.id' = 'atsdriving',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'csv'
);

-------------------------- Only the Kafka source table is shown here; the ETL step and the MySQL sink table are omitted.
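For completeness, the omitted sink side can be sketched as below. This is a hypothetical completion, not the author's actual code: the JDBC URL, table name, credentials, and column selection are placeholder assumptions that mirror the Kafka source table above.

```sql
-- Hypothetical JDBC sink table (URL, table name, and credentials are placeholders)
create table if not exists mysql_dwd_trainlog (
    `trainid` BIGINT,
    `servenumber` BIGINT,
    `stationid` BIGINT,
    `schedulediff` BIGINT,
    `ts` TIMESTAMP(3),
    PRIMARY KEY (`trainid`, `ts`) NOT ENFORCED  -- enables upsert writes into MySQL
)
with (
    'connector' = 'jdbc',
    'url' = 'jdbc:mysql://localhost:3306/ods?useSSL=false',
    'table-name' = 'dwd_trainlog',
    'username' = 'flink',
    'password' = '******'
);

-- A minimal "ETL" step: filter the stream and write it to the sink
insert into mysql_dwd_trainlog
select `trainid`, `servenumber`, `stationid`, `schedulediff`, `ts`
from kafka_ods_trainlog
where `isstopped` = true;
```

The primary key makes the JDBC connector issue upsert statements instead of plain inserts, which is usually what you want when replaying from `earliest-offset`.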

Dependency JARs (in Flink's lib/ directory)

flink-connector-hive_2.11-1.12.1.jar
flink-connector-jdbc_2.11-1.12.1.jar
flink-connector-kafka_2.11-1.12.1.jar
flink-csv-1.12.1.jar
flink-dist_2.11-1.12.1.jar
flink-json-1.12.1.jar
flink-shaded-zookeeper-3.4.14.jar
flink-sql-connector-elasticsearch6_2.11-1.12.1.jar
flink-sql-connector-hive-2.2.0_2.11-1.12.0.jar
flink-sql-connector-kafka_2.11-1.12.1.jar
flink-table_2.11-1.12.1.jar
flink-table-blink_2.11-1.12.1.jar
log4j-1.2-api-2.12.1.jar
log4j-api-2.12.1.jar
log4j-core-2.12.1.jar
log4j-slf4j-impl-2.12.1.jar
mysql-connector-java-5.1.6.jar

2. Error log

2021-03-22 21:03:12
java.lang.NoClassDefFoundError: org/apache/kafka/clients/consumer/ConsumerRecord
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.getDeclaredMethod(Class.java:2128)
	at java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1475)
	at java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:72)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:498)
	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:472)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:472)
	at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:369)
	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:598)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1843)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1713)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2000)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
	at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
	at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperatorFactory(StreamConfig.java:323)
	at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:143)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:509)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:565)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:755)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:570)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.consumer.ConsumerRecord
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 39 more

3. Fix: add the JAR below. The NoClassDefFoundError shows that the Kafka client classes are missing from the classpath — the thin flink-connector-kafka JAR does not bundle kafka-clients. Place the JAR in Flink's lib/ directory and restart the cluster; match its version to the Kafka brokers in your environment.

kafka-clients-2.1.0.jar

Author: 潘永青