1. Create an SBT project and add the following dependencies
// https://mvnrepository.com/artifact/org.apache.flink/flink-table
libraryDependencies += "org.apache.flink" %% "flink-table" % "1.2.0"
// https://mvnrepository.com/artifact/org.apache.flink/flink-scala
libraryDependencies += "org.apache.flink" %% "flink-scala" % "1.2.0"
2. Write the Flink SQL program in Scala
package Table

import org.apache.flink.api.scala.ExecutionEnvironment
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.sinks.CsvTableSink

/**
 * Author: HuangWei
 * Date: 2018-06-27 6:10 PM
 */
case class ID(id: String)

case class Mark(col1: String, col2: String, col3: String, col4: String, col5: String,
                col6: String, col7: String, col8: String, col9: String, col10: String,
                col11: String, col12: String, col13: String, col14: String, col15: String,
                col16: String, col17: String, col18: String, col19: String, col20: String)
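
The body of the program is not shown above. Below is a minimal sketch of how it might continue, assuming the job reads a CSV file into the Mark case class, registers it as a table, runs a SQL query over it, and writes the result out with CsvTableSink. The object name, file paths, table name, and query text are illustrative placeholders, not taken from the original article.

object FlinkSqlJob {                                     // hypothetical object name
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    // Brings the implicit TypeInformation for case classes into scope.
    import org.apache.flink.api.scala._

    // Read the input CSV into the Mark case class (path is a placeholder).
    val marks = env.readCsvFile[Mark]("/path/to/input.csv")

    // Register the DataSet as a table so it can be queried with SQL.
    tableEnv.registerDataSet("marks", marks)

    // Run a SQL query against the registered table (query is illustrative).
    val result = tableEnv.sql("SELECT col1, col2, col3 FROM marks")

    // Write the query result out as CSV with a comma delimiter.
    result.writeToSink(new CsvTableSink("/path/to/output.csv", ","))

    env.execute("Flink SQL example")
  }
}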

This article walks through creating an SBT project, adding the Flink dependencies, and writing Flink SQL in Scala for data processing, with code covering everything from project initialization to executing the Flink SQL job.