1. Introduction
User-defined functions in Spark SQL come in two flavors: UDF and UDAF.
UDF: one row in, one value out
UDAF: many rows in, one value out (an aggregation)
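As a rough plain-Scala analogy (not the Spark API itself), a UDF maps each input value to one output, while a UDAF folds many input values into a single result. The function names below are purely illustrative:

```scala
object UdfVsUdaf {
  // UDF-style: one value in, one value out
  val toUpper: String => String = _.toUpperCase

  // UDAF-style: many values in, one value out (an aggregation)
  val longest: Seq[String] => String = _.maxBy(_.length)

  def main(args: Array[String]): Unit = {
    println(toUpper("leo"))
    println(longest(Seq("leo", "Marry", "Jack")))
  }
}
```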
2. UDF Functions
Here we implement a custom function, strSub, that extracts a substring of a field.
package SparkSQL

import org.apache.spark.sql.types.{StringType, StructField, StructType}
import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.{SparkConf, SparkContext}

object UDF {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("UDF")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    // Build some mock data
    val names = Array("leo", "Marry", "Jack", "Tom")
    val nameRDD = sc.parallelize(names, 5)
    val namesRowRDD = nameRDD.map { name => Row(name) }
    val structType = StructType(Array(StructField("name", StringType, true)))
    val namesDF = sqlContext.createDataFrame(namesRowRDD, structType)
    // Register a temporary table
    namesDF.registerTempTable("names")
    // Define and register the user-defined function
    sqlContext.udf.register("strSub", (str: String) => str.substring(0, 2))
    // Use the user-defined function in a SQL query
    sqlContext.sql("select name, strSub(name) from names").show()
  }
}
Execution output:

19/07/11 10:43:55 INFO CodeGenerator: Code generated in 27.6355 ms
+-----+----------------+
| name|UDF:strSub(name)|
+-----+----------------+
|  leo|              le|
|Marry|              Ma|
| Jack|              Ja|
|  Tom|              To|
+-----+----------------+
19/07/11 10:43:55 INFO SparkContext: Invoking stop() from shutdown hook
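One caveat worth noting: `str.substring(0, 2)` throws if a name is null or shorter than two characters. A defensive variant of the function body could look like the sketch below (the name `strSubSafe` is my own, not from the original post); it can be registered with `sqlContext.udf.register` in exactly the same way:

```scala
object SafeSub {
  // Hedged sketch: guard the substring against null and short inputs
  def strSubSafe(s: String): String =
    if (s == null) null else s.substring(0, math.min(2, s.length))

  def main(args: Array[String]): Unit = {
    println(strSubSafe("leo"))
    println(strSubSafe("A"))
    println(strSubSafe(null))
  }
}
```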
Takeaway: Having written custom functions for Hive before, which must be packaged and deployed into Hive, I find user-defined functions in Spark SQL considerably more convenient to implement!