Hive Spark API: Querying, Writing, Registering UDFs, and Simple Spark SQL Operations

Hive Spark Insert/Overwrite

import org.apache.spark.sql.{SaveMode, SparkSession}

object ReadKerberosHive {

  case class Employee(name: String, phone: String)

  def main(args: Array[String]): Unit = {

    val spark = SparkSession.builder().appName("ReadKerberosHive").master("local[3]")
      .enableHiveSupport()
      .getOrCreate()
	
    import spark.implicits._
    val empDF = Seq(
      Employee("revanth", "07-06-2016-06-08-27"),
      Employee("shyam", "07-06-2016-06-08-27"),
      Employee("hari", "07-06-2016-06-08-27"),
      Employee("kiran", "08-06-2016-07-08-27"),
      Employee("nandha", "08-06-2016-07-08-27"),
      Employee("pawan", "08-06-2016-07-08-27"),
      Employee("kalyan", "09-06-2016-08-08-27"),
      Employee("satish", "09-06-2016-08-08-27"),
      Employee("arun", "09-06-2016-08-08-27"),
      Employee("ram", "10-06-2016-08-08-27"),
      Employee("suda", "10-06-2016-08-08-27"),
      Employee("sunder", "10-06-2016-08-08-27"),
      Employee("charan", "12-06-2016-08-08-27"),
      Employee("ravi", "11-06-2016-08-08-27"),
      Employee("arjun", "11-06-2016-08-08-27")).toDF()

    // coalesce(1) keeps the output to a single file; SaveMode.Append adds rows to tmp.hivetest
    empDF.coalesce(1).write.mode(SaveMode.Append).insertInto("tmp.hivetest")

  }
}  
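
The section heading mentions overwrite as well; a minimal sketch of the overwrite path, assuming the same tmp.hivetest table:

// SaveMode.Overwrite issues an INSERT OVERWRITE, replacing the
// table's existing rows instead of appending to them
empDF.coalesce(1).write.mode(SaveMode.Overwrite).insertInto("tmp.hivetest")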

Hive Spark Select

import org.apache.spark.sql.{DataFrame, SparkSession}

object ReadKerberosHive {

  case class Employee(name: String, phone: String)

  def main(args: Array[String]): Unit = {
  
    val spark = SparkSession.builder().appName("ReadKerberosHive").master("local[3]")
      .enableHiveSupport()
      .getOrCreate()
      
    val df: DataFrame = spark.sql("select name,phone from tmp.hivetest")
    df.show()
    // print each row's name and phone; in local mode the executor output reaches the console
    df.rdd.foreach(row => println(row(0) + " " + row(1)))
  }
}  
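
The object is named ReadKerberosHive, but none of the examples show the Kerberos login itself. A minimal sketch of one common approach using Hadoop's UserGroupInformation, run before building the SparkSession; the krb5.conf path, principal, and keytab below are hypothetical placeholders:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation

// hypothetical values; substitute your cluster's krb5.conf, principal, and keytab
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf")
val conf = new Configuration()
conf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(conf)
UserGroupInformation.loginUserFromKeytab("hive@EXAMPLE.COM", "/etc/security/keytabs/hive.keytab")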

Hive Spark: Registering a Temp View / Spark SQL on a DataFrame

import org.apache.spark.sql.{DataFrame, SparkSession}

object ReadKerberosHive {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ReadKerberosHive").master("local[3]")
      .enableHiveSupport()
      .getOrCreate()
      
    val df: DataFrame = spark.sql("select name,phone from tmp.hivetest")
    // registerTempTable is deprecated since Spark 2.0; createOrReplaceTempView is its replacement
    df.createOrReplaceTempView("zhuce")
    val df_zhuce: DataFrame = spark.sql("select count(1) from zhuce")
    df_zhuce.show()
    
  }
} 
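
For a count this simple, the temp view is optional; the DataFrame API returns the same number directly:

// equivalent to "select count(1) from zhuce", without registering a view
println(df.count())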

Hive Spark: Registering a UDF

import org.apache.spark.sql.{DataFrame, SparkSession}

object ReadKerberosHive {

  case class Employee(name: String, phone: String)

  def main(args: Array[String]): Unit = {

    val spark = SparkSession.builder().appName("ReadKerberosHive").master("local[3]")
      .enableHiveSupport()
      .getOrCreate()

    // register a UDF that strips the dashes from the phone column
    spark.udf.register("remove_1", (str: String) => str.replace("-", ""))
    
    val df: DataFrame = spark.sql("select name,remove_1(phone) from tmp.hivetest")
    df.createOrReplaceTempView("zhuce")
    val df_zhuce: DataFrame = spark.sql("select count(1) from zhuce")
    df_zhuce.show()

    df.show()
    df.rdd.foreach(row => println(row(0) + " " + row(1)))
  }
}
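
spark.udf.register makes remove_1 callable from SQL; the same logic can also be applied through the DataFrame API. A short sketch, where removeDashes is a hypothetical name for the equivalent typed UDF:

import org.apache.spark.sql.functions.{col, udf}

// same strip-the-dashes logic as the remove_1 SQL UDF above
val removeDashes = udf((s: String) => s.replace("-", ""))
spark.table("tmp.hivetest")
  .withColumn("phone", removeDashes(col("phone")))
  .show()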