Windowing in Hive with over()

Preliminaries


over() defines a window for aggregate functions, window functions, and analytic functions.
Once a window is defined, each row is associated with its own data window, and the contents of that window may change from row to row.

Parameters that can be specified inside over(...)

The data a window operates on is the query result after everything except the windowing itself has executed. Put differently, the windowing step runs after the rest of the select statement (from, where, group by, having), so the windows are carved out of the current query's result set.

The parameters that can appear inside over (officially, the WINDOW specification):

(ROWS | RANGE) BETWEEN (UNBOUNDED | [num]) PRECEDING AND ([num] PRECEDING | CURRENT ROW | (UNBOUNDED | [num]) FOLLOWING)
(ROWS | RANGE) BETWEEN CURRENT ROW AND (CURRENT ROW | (UNBOUNDED | [num]) FOLLOWING)
(ROWS | RANGE) BETWEEN [num] FOLLOWING AND (UNBOUNDED | [num]) FOLLOWING
rows bounds the frame by physical row positions, while range bounds the frame by the values of the order by column (see the sketch after the list below).

  • partition by col: partitions the rows by col (similar to grouping); each partition gets its own independent data window
  • order by col: sorts the rows within the window by col
  • current row: the current row
  • n preceding: the n rows before the current row
  • n following: the n rows after the current row
  • unbounded: two boundary values: unbounded preceding denotes the first row of the window (the lower bound), and unbounded following denotes the last row of the window (the upper bound).
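
As a minimal sketch of the rows/range distinction (written against the business table defined below): both frames end at the current row, and they differ only when the order by column contains ties.

select
  name, orderdate, cost,
  -- rows: the frame is bounded by physical row positions
  sum(cost) over(order by cost rows between unbounded preceding and current row) as rows_sum,
  -- range: the frame is bounded by the order-by value, so every row tied with the current cost is included
  sum(cost) over(order by cost range between unbounded preceding and current row) as range_sum
from business;

With rows, two rows that tie on cost receive different running totals; with range, both tied rows receive the same total.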

Two common window functions, used together with over:

  1. lag(col, n): returns the value of col from the row n rows before the current row; a default value can be supplied as a third argument.
  2. lead(col, n): returns the value of col from the row n rows after the current row.

Common analytic functions

  1. RANK: ties receive the same rank and leave gaps, so the total count is preserved: 1, 1, 3, 4
  2. ROW_NUMBER: numbers rows by their order; ties still receive distinct numbers: 1, 2, 3, 4
  3. DENSE_RANK: ties receive the same rank but leave no gaps, so the highest rank shrinks: 1, 1, 2, 3
  4. NTILE(n): assigns every row in the window a bucket number, where n is the number of buckets, and returns that number for each row. When the rows cannot be distributed evenly, extra rows are assigned to the leading buckets one at a time, so bucket sizes differ by at most one.

The examples below make the concepts above more concrete.

Data preparation


name	orderdate	cost
------------------------
jack	2017-01-01	10
tony	2017-01-02	15
jack	2017-02-03	23
tony	2017-01-04	29
jack	2017-01-05	46
jack	2017-04-06	42
tony	2017-01-07	50
jack	2017-01-08	55
mart	2017-04-08	62
mart	2017-04-09	68
neil	2017-05-10	12
mart	2017-04-11	75
neil	2017-06-12	80
mart	2017-04-13	94

Create the table and load the data

create table business(
name string, 
orderdate string,
cost int
) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

load data local inpath "/opt/module/datas/business.txt" into table business;

Query all rows in the table (select * from business):

business.name	business.orderdate	business.cost
jack	2017-01-01	10
tony	2017-01-02	15
jack	2017-02-03	23
tony	2017-01-04	29
jack	2017-01-05	46
jack	2017-04-06	42
tony	2017-01-07	50
jack	2017-01-08	55
mart	2017-04-08	62
mart	2017-04-09	68
neil	2017-05-10	12
mart	2017-04-11	75
neil	2017-06-12	80
mart	2017-04-13	94

Customers who made a purchase in April 2017, plus the total count of such customers (a simple use of over())

select
	name,
	count(*) over()
from business
where substring(orderdate, 1, 7)='2017-04'
group by name;

Result:
+----+------------------------------------------------------------------------+
|name|count(1) OVER (ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)|
+----+------------------------------------------------------------------------+
|mart|2                                                                       |
|jack|2                                                                       |
+----+------------------------------------------------------------------------+

Because group by name runs before the window is applied, the window sees only the two grouped rows (jack and mart), so count(*) over() returns 2, the number of customers who bought in April. The queries here are run through Spark on Hive to speed up execution; without Spark, the same HQL statements can be run directly in the hive shell.
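
For example, assuming the Hive CLI is available, the same statement runs unchanged at the hive prompt:

hive> select name, count(*) over()
    > from business
    > where substring(orderdate, 1, 7)='2017-04'
    > group by name;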

Testing partition by col: each customer's purchase details plus the monthly purchase total

    // build a SparkSession with Hive support so Spark SQL can read the Hive table business
    val spark: SparkSession = SparkSession.builder()
      .appName("HiveRead").master("local[*]")
      .enableHiveSupport()
      .config("spark.sql.warehouse.dir", "hdfs://hadoop102:9000/user/hive/warehouse")
      .getOrCreate()
    spark.sql(
      """
        |select
        | name,
        | orderdate,
        | cost,
        | sum(cost) over(partition by month(orderdate))
        |from business
        |""".stripMargin
    ).show(false)

Result:
+----+----------+----+-------------------------------------------------------------------------------------------------------------------------------------+
|name|orderdate |cost|sum(CAST(cost AS BIGINT)) OVER (PARTITION BY month(CAST(orderdate AS DATE)) ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING)|
+----+----------+----+-------------------------------------------------------------------------------------------------------------------------------------+
|jack|2017-01-01|10  |205                                                                                                                                  |
|tony|2017-01-02|15  |205                                                                                                                                  |
|tony|2017-01-04|29  |205                                                                                                                                  |
|jack|2017-01-05|46  |205                                                                                                                                  |
|tony|2017-01-07|50  |205                                                                                                                                  |
|jack|2017-01-08|55  |205                                                                                                                                  |
|neil|2017-06-12|80  |80                                                                                                                                   |
|neil|2017-05-10|12  |12                                                                                                                                   |
|jack|2017-04-06|42  |341                                                                                                                                  |
|mart|2017-04-08|62  |341                                                                                                                                  |
|mart|2017-04-09|68  |341                                                                                                                                  |
|mart|2017-04-11|75  |341                                                                                                                                  |
|mart|2017-04-13|94  |341                                                                                                                                  |
|jack|2017-02-03|23  |23                                                                                                                                   |
+----+----------+----+-------------------------------------------------------------------------------------------------------------------------------------+



partition by splits the windowed data into groups by month; each group corresponds to its own independent data window.

order by test (the example above modified so that cost accumulates)

    spark.sql(
      """
        |select
        | name,orderdate,cost,
        | sum(cost) over(partition by name order by orderdate)
        |from business
        |""".stripMargin
    ).show(false)
Result:
+----+----------+----+---------------------------------------------------------------------------------------------------------------------------------------+
|name|orderdate |cost|sum(CAST(cost AS BIGINT)) OVER (PARTITION BY name ORDER BY orderdate ASC NULLS FIRST RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)|
+----+----------+----+---------------------------------------------------------------------------------------------------------------------------------------+
|mart|2017-04-08|62  |62                                                                                                                                     |
|mart|2017-04-09|68  |130                                                                                                                                    |
|mart|2017-04-11|75  |205                                                                                                                                    |
|mart|2017-04-13|94  |299                                                                                                                                    |
|jack|2017-01-01|10  |10                                                                                                                                     |
|jack|2017-01-05|46  |56                                                                                                                                     |
|jack|2017-01-08|55  |111                                                                                                                                    |
|jack|2017-02-03|23  |134                                                                                                                                    |
|jack|2017-04-06|42  |176                                                                                                                                    |
|tony|2017-01-02|15  |15                                                                                                                                     |
|tony|2017-01-04|29  |44                                                                                                                                     |
|tony|2017-01-07|50  |94                                                                                                                                     |
|neil|2017-05-10|12  |12                                                                                                                                     |
|neil|2017-06-12|80  |92                                                                                                                                     |
+----+----------+----+---------------------------------------------------------------------------------------------------------------------------------------+



Additional examples

select name,orderdate,cost,
sum(cost) over() as sample1, -- sum over all rows
sum(cost) over(partition by name) as sample2, -- sum within each name group
sum(cost) over(partition by name order by orderdate) as sample3, -- running total within each name group
sum(cost) over(partition by name order by orderdate rows between UNBOUNDED PRECEDING and current row) as sample4, -- same as sample3: aggregate from the partition start to the current row
sum(cost) over(partition by name order by orderdate rows between 1 PRECEDING and current row) as sample5, -- current row plus the preceding row
sum(cost) over(partition by name order by orderdate rows between 1 PRECEDING AND 1 FOLLOWING) as sample6, -- preceding row, current row, and following row
sum(cost) over(partition by name order by orderdate rows between current row and UNBOUNDED FOLLOWING) as sample7 -- current row and all following rows
from business;

A note on order by inside the over window clause (from the official documentation)

Looking closely at the order by example above raises a question: why does adding order by turn the aggregate into a running total?
The official Hive documentation gives the answer: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+WindowingAndAnalytics

  1. When ORDER BY is specified with missing WINDOW clause, the WINDOW specification defaults to RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW. (That is: when order by appears alone in over without a window clause, the window clause takes the default RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW; see the sketch after this list.)

  2. When both ORDER BY and WINDOW clauses are missing, the WINDOW specification defaults to ROW BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING. (That is: when neither order by nor a window clause is present, over defaults to ROW BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING.)
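
As a minimal sketch of rule 1 against the business table, the two columns below define the same window and therefore return identical results:

select
  name, orderdate, cost,
  -- order by with no frame clause: the frame defaults per rule 1
  sum(cost) over(partition by name order by orderdate) as implicit_frame,
  -- the same frame written out explicitly
  sum(cost) over(partition by name order by orderdate
                 range between unbounded preceding and current row) as explicit_frame
from business;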

Each customer's previous purchase date (lag test)

// TODO: query each customer's previous purchase date
spark.sql(
  """
    |select
    | name, orderdate, cost,
    | lag(orderdate, 1, '1997-03-15') over(partition by name order by orderdate)
    |from business
    |""".stripMargin
).show(false)

Result:
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------------+
|name|orderdate |cost|lag(orderdate, 1, 1997-03-15) OVER (PARTITION BY name ORDER BY orderdate ASC NULLS FIRST ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING)|
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------------+
|mart|2017-04-08|62  |1997-03-15                                                                                                                        |
|mart|2017-04-09|68  |2017-04-08                                                                                                                        |
|mart|2017-04-11|75  |2017-04-09                                                                                                                        |
|mart|2017-04-13|94  |2017-04-11                                                                                                                        |
|jack|2017-01-01|10  |1997-03-15                                                                                                                        |
|jack|2017-01-05|46  |2017-01-01                                                                                                                        |
|jack|2017-01-08|55  |2017-01-05                                                                                                                        |
|jack|2017-02-03|23  |2017-01-08                                                                                                                        |
|jack|2017-04-06|42  |2017-02-03                                                                                                                        |
|tony|2017-01-02|15  |1997-03-15                                                                                                                        |
|tony|2017-01-04|29  |2017-01-02                                                                                                                        |
|tony|2017-01-07|50  |2017-01-04                                                                                                                        |
|neil|2017-05-10|12  |1997-03-15                                                                                                                        |
|neil|2017-06-12|80  |2017-05-10                                                                                                                        |
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------------+



An alternative way to write the over window: the window definition can be factored out into a window clause that follows the from clause, written window <alias> as (<window specification>), as below:

// TODO: the previous-purchase query, alternative syntax (note that lag(orderdate, 2) looks two purchases back)
spark.sql(
  """
    |select
    | name, orderdate, cost,
    | lag(orderdate, 2) over w
    |from business
    |window w as (partition by name order by orderdate)
    |""".stripMargin
).show(false)

Result:
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------+
|name|orderdate |cost|lag(orderdate, 2, NULL) OVER (PARTITION BY name ORDER BY orderdate ASC NULLS FIRST ROWS BETWEEN 2 PRECEDING AND 2 PRECEDING)|
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------+
|mart|2017-04-08|62  |null                                                                                                                        |
|mart|2017-04-09|68  |null                                                                                                                        |
|mart|2017-04-11|75  |2017-04-08                                                                                                                  |
|mart|2017-04-13|94  |2017-04-09                                                                                                                  |
|jack|2017-01-01|10  |null                                                                                                                        |
|jack|2017-01-05|46  |null                                                                                                                        |
|jack|2017-01-08|55  |2017-01-01                                                                                                                  |
|jack|2017-02-03|23  |2017-01-05                                                                                                                  |
|jack|2017-04-06|42  |2017-01-08                                                                                                                  |
|tony|2017-01-02|15  |null                                                                                                                        |
|tony|2017-01-04|29  |null                                                                                                                        |
|tony|2017-01-07|50  |2017-01-02                                                                                                                  |
|neil|2017-05-10|12  |null                                                                                                                        |
|neil|2017-06-12|80  |null                                                                                                                        |
+----+----------+----+----------------------------------------------------------------------------------------------------------------------------+



lead test

    // TODO: query each customer's next purchase date
    spark.sql(
      """
        |select
        | name, orderdate, cost,
        | lead(orderdate, 1) over overcondition
        |from business
        |window overcondition as (partition by name order by orderdate)
        |""".stripMargin
    ).show(false)

Result:
+----+----------+----+-----------------------------------------------------------------------------------------------------------------------------+
|name|orderdate |cost|lead(orderdate, 1, NULL) OVER (PARTITION BY name ORDER BY orderdate ASC NULLS FIRST ROWS BETWEEN 1 FOLLOWING AND 1 FOLLOWING)|
+----+----------+----+-----------------------------------------------------------------------------------------------------------------------------+
|mart|2017-04-08|62  |2017-04-09                                                                                                                   |
|mart|2017-04-09|68  |2017-04-11                                                                                                                   |
|mart|2017-04-11|75  |2017-04-13                                                                                                                   |
|mart|2017-04-13|94  |null                                                                                                                         |
|jack|2017-01-01|10  |2017-01-05                                                                                                                   |
|jack|2017-01-05|46  |2017-01-08                                                                                                                   |
|jack|2017-01-08|55  |2017-02-03                                                                                                                   |
|jack|2017-02-03|23  |2017-04-06                                                                                                                   |
|jack|2017-04-06|42  |null                                                                                                                         |
|tony|2017-01-02|15  |2017-01-04                                                                                                                   |
|tony|2017-01-04|29  |2017-01-07                                                                                                                   |
|tony|2017-01-07|50  |null                                                                                                                         |
|neil|2017-05-10|12  |2017-06-12                                                                                                                   |
|neil|2017-06-12|80  |null                                                                                                                         |
+----+----------+----+-----------------------------------------------------------------------------------------------------------------------------+



ntile(n) test: assigns each row a bucket number, which makes it easy to select the first (or last) m% of the rows; n is the number of buckets

Example: query the orders falling in the earliest 20% of order dates. ntile(5) with bucket 1 maps directly to 20%; the code below uses ntile(6), which on this 14-row dataset happens to place the same three earliest rows in bucket 1.

    // TODO: ntile(n) test: take the earliest 20% of rows by date. Number the rows into buckets and take bucket 1 as the earliest ~20%.
    spark.sql(
      """
        |select
        | name,orderdate, cost,
        | ntile(6) over(order by orderdate) as gn
        |from business
        |""".stripMargin
    ).createOrReplaceTempView("t1")

    println("=============")
    spark.sql(
      """
        |select
        | name,orderdate, cost,
        | ntile(6) over(order by orderdate) as gn
        |from business
        |""".stripMargin
    ).show(false)

    println("=============")
    spark.sql(
      """
        |select
        | name,orderdate,cost
        |from t1
        |where gn=1
        |""".stripMargin
    ).show(false)


Result:
=============
+----+----------+----+---+
|name|orderdate |cost|gn |
+----+----------+----+---+
|jack|2017-01-01|10  |1  |
|tony|2017-01-02|15  |1  |
|tony|2017-01-04|29  |1  |
|jack|2017-01-05|46  |2  |
|tony|2017-01-07|50  |2  |
|jack|2017-01-08|55  |2  |
|jack|2017-02-03|23  |3  |
|jack|2017-04-06|42  |3  |
|mart|2017-04-08|62  |4  |
|mart|2017-04-09|68  |4  |
|mart|2017-04-11|75  |5  |
|mart|2017-04-13|94  |5  |
|neil|2017-05-10|12  |6  |
|neil|2017-06-12|80  |6  |
+----+----------+----+---+

=============
+----+----------+----+
|name|orderdate |cost|
+----+----------+----+
|jack|2017-01-01|10  |
|tony|2017-01-02|15  |
|tony|2017-01-04|29  |
+----+----------+----+



Ranking function test (the output below includes several extra rows with duplicate orderdate values, evidently loaded into the table beforehand to produce ties)

    //TODO rank, row_number, dense_rank
    spark.sql(
      """
        |select
        | name,orderdate,cost,
        | rank() over(partition by name order by orderdate) as rk,
        | dense_rank() over(partition by name order by orderdate) as dr,
        | row_number() over(partition by name order by orderdate) as rn
        |from business
        |""".stripMargin
    ).show(false)

Result:
+----+----------+----+---+---+---+
|name|orderdate |cost|rk |dr |rn |
+----+----------+----+---+---+---+
|mart|2017-04-08|62  |1  |1  |1  |
|mart|2017-04-09|500 |2  |2  |2  |
|mart|2017-04-09|68  |2  |2  |3  |
|mart|2017-04-11|75  |4  |3  |4  |
|mart|2017-04-13|94  |5  |4  |5  |
|jack|2017-01-01|10  |1  |1  |1  |
|jack|2017-01-05|46  |2  |2  |2  |
|jack|2017-01-08|50  |3  |3  |3  |
|jack|2017-01-08|55  |3  |3  |4  |
|jack|2017-02-03|23  |5  |4  |5  |
|jack|2017-04-06|50  |6  |5  |6  |
|jack|2017-04-06|42  |6  |5  |7  |
|tony|2017-01-02|15  |1  |1  |1  |
|tony|2017-01-04|29  |2  |2  |2  |
|tony|2017-01-07|50  |3  |3  |3  |
|neil|2017-05-10|100 |1  |1  |1  |
|neil|2017-05-10|12  |1  |1  |2  |
|neil|2017-06-12|80  |3  |2  |3  |
+----+----------+----+---+---+---+