1. 'connector.version' expects 'universal', but is '2.2.0'
Error message:
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.
Reason: Required context properties mismatch.
The matching candidates:
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
Mismatched properties:
'connector.version' expects 'universal', but is '2.2.0'
Code:
tableEnv.connect(new Kafka()
        .version("2.2.0") // error here: must be "universal", not the broker version
        .topic("topic_producer")
        .property("zookeeper.connect", "192.168.xxx.xxx:2181")
        .property("bootstrap.servers", "192.168.xxx.xxx:9092")
    )
    .withFormat(new Csv())
    .withSchema(new Schema()
        .field("id", DataTypes.STRING())
        .field("timestamp", DataTypes.BIGINT())
        .field("temperature", DataTypes.DOUBLE())
    ).createTemporaryTable("inputTable");
Note:
The .version() method should be passed the universal Kafka connector's version string, "universal", not the version of the Kafka broker itself.
Of course, using the universal Kafka connector requires the corresponding dependency:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
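For reference, the only code change needed is the version string. A sketch of the corrected descriptor follows (the topic and addresses are the placeholders from the snippet above; it assumes the same Flink 1.10 table environment and a reachable Kafka broker, so it is not runnable standalone):

```java
// Same descriptor as above with the one fix applied:
new Kafka()
    .version("universal") // generic connector version string, works for Kafka 0.11+
    .topic("topic_producer")
    .property("zookeeper.connect", "192.168.xxx.xxx:2181")
    .property("bootstrap.servers", "192.168.xxx.xxx:9092")
```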
2. Window aggregate can only be defined over a time attribute column, but TIMESTAMP(3) encountered
Error message:
Exception in thread "main" org.apache.flink.table.api.TableException: Window aggregate can only be defined over a time attribute column, but TIMESTAMP(3) encountered
Code:
// Read the Kafka data into an input table
tableEnv.connect(new Kafka()
        .version("universal")
        .topic("hotItems")
        .startFromLatest()
        .property("zookeeper.connect", "192.168.149.131:2181")
        .property("bootstrap.servers", "192.168.149.131:9092")
    )
    .withFormat(new Csv())
    .withSchema(new Schema()
        .field("userId", DataTypes.BIGINT())
        .field("itemId", DataTypes.BIGINT())
        .field("categoryId", DataTypes.INT())
        .field("behavior", DataTypes.STRING())
        .field("ts", DataTypes.TIMESTAMP(3)) // error here: a plain TIMESTAMP(3) column is not a time attribute
        .rowtime(new Rowtime()
            .timestampsFromField("ts")
            .watermarksPeriodicBounded(1000)
        )
    ).createTemporaryTable("inputTable");
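The original stops at the error without showing its resolution. One commonly reported workaround in Flink 1.10 is to define the table with a SQL DDL statement instead of the descriptor API, because the DDL's WATERMARK clause declares ts as a genuine event-time attribute that window aggregates accept. This is a sketch under that assumption, not the author's confirmed fix (topic and broker address are taken from the snippet above; it needs a Blink-planner TableEnvironment and a reachable Kafka broker, so it is not runnable standalone):

```java
// Hypothetical fix sketch: declare the event-time attribute via DDL.
tableEnv.sqlUpdate(
    "CREATE TABLE inputTable (" +
    "  userId BIGINT," +
    "  itemId BIGINT," +
    "  categoryId INT," +
    "  behavior STRING," +
    "  ts TIMESTAMP(3)," +
    // WATERMARK turns ts into a time attribute (1 s bounded out-of-orderness)
    "  WATERMARK FOR ts AS ts - INTERVAL '1' SECOND" +
    ") WITH (" +
    "  'connector.type' = 'kafka'," +
    "  'connector.version' = 'universal'," +
    "  'connector.topic' = 'hotItems'," +
    "  'connector.properties.bootstrap.servers' = '192.168.149.131:9092'," +
    "  'format.type' = 'csv'" +
    ")");
```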
3. Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
Error message:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
Caused by: java.lang.ClassNotFoundException: scala.Product$class
Clicking through the error shows a lot more detail: packages built against both Scala 2.11 and Scala 2.12 are being pulled onto the classpath.
Cause:
A Scala dependency conflict.
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>${flink.version}</version>
</dependency>
Here flink-clients requires the Scala 2.11 build, while the POM's globally configured Scala version is 2.12.
<!-- globally configured versions -->
<properties>
    <flink.version>1.10.1</flink.version>
    <scala.binary.version>2.12</scala.binary.version>
    <kafka.version>2.2.0</kafka.version>
</properties>
Fix:
Set the global Scala version to 2.11.
<properties>
    <flink.version>1.10.1</flink.version>
    <scala.binary.version>2.11</scala.binary.version>
    <kafka.version>2.2.0</kafka.version>
</properties>
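To keep this from recurring, a common Maven practice (a suggestion, not from the original) is to reference the property in every Scala-suffixed Flink artifactId instead of hardcoding the suffix, so all modules follow the one global setting:

```xml
<!-- artifactId suffix follows scala.binary.version, avoiding a 2.11/2.12 mix -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
```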