Common sinks in Flink stream processing
Sinking data to a local file, a local collection, HDFS, and so on works the same way in streaming as in the batch operations covered earlier.
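For instance, a minimal sketch of the simple sinks on a stream (the in-memory elements and the output path are placeholders, not taken from the original):

```scala
import org.apache.flink.streaming.api.scala._

object DataSink_local {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    // A small in-memory stream stands in for any real source
    val data: DataStream[String] = env.fromElements("spark", "flink", "kafka")
    // Sink to stdout
    data.print()
    // Sink to a local text file (path is a placeholder)
    data.writeAsText("file:///tmp/flink-out")
    env.execute("DataSink_local")
  }
}
```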
- Sink to Kafka (read data from MySQL and write it to Kafka)
Create the Kafka topic: kafka-topics.sh --create --topic test2 --zookeeper node01:2181,node02:2181,node03:2181 --partitions 1 --replication-factor 1

Then start a console consumer to verify the data that lands in the topic: kafka-console-consumer.sh --from-beginning --topic test2 --zookeeper node01:2181,node02:2181,node03:2181
```scala
import org.apache.flink.streaming.api.scala._

object DataSink_kafka {
  def main(args: Array[String]): Unit = {
    // 1. Create the stream execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // 2. Set the parallelism
    env.setParallelism(1)
    // 3. Add the custom MySQL source
    val source: DataStream[(Int, String, String, String)] = env.addSource(new MySql_source)
    // 4. Convert each tuple to a string
    val strDataStream: DataStream[String] = source.map(
      line => line._1 + line._2 + line._3 + line._4
    )
```
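The snippet above stops after the map step. A minimal sketch of the remaining sink step, assuming the FlinkKafkaProducer011 connector and a broker list of node01:9092,node02:9092,node03:9092 (both are assumptions, not taken from the original):

```scala
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011

    // 5. Sink the string stream to the Kafka topic test2
    //    (broker list is an assumption; adjust to your cluster)
    strDataStream.addSink(new FlinkKafkaProducer011[String](
      "node01:9092,node02:9092,node03:9092", // bootstrap servers
      "test2",                               // target topic
      new SimpleStringSchema()               // serialize each record as a plain string
    ))
    // 6. Trigger the job
    env.execute("DataSink_kafka")
  }
}
```

Once the job is running, the console consumer started earlier should print each concatenated MySQL row as it arrives in the topic.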