[Repost] Hive: row_number, dense_rank, and percent_rank take no arguments; rank requires an argument

Reposted from https://blog.csdn.net/u010002184/article/details/109499535 with thanks to the original author.

Correct:

row_number() OVER (partition by session_id order by request_time) as row_num          -- no argument allowed
rank(request_time) OVER (partition by session_id order by request_time) as row_num    -- argument required
dense_rank() OVER (partition by session_id order by request_time) as row_num          -- no argument allowed
percent_rank() OVER (partition by session_id order by request_time) as row_num        -- no argument allowed


Incorrect:

row_number(request_time) OVER (partition by session_id order by request_time) as row_num
rank() OVER (partition by session_id order by request_time) as row_num
dense_rank(request_time) OVER (partition by session_id order by request_time) as row_num
percent_rank(request_time) OVER (partition by session_id order by request_time) as row_num
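
For a side-by-side check, the four functions can be combined in a single query. The sketch below reuses the same hypothetical table_test1 and columns from the examples above:

select
    session_id, request_time,
    row_number()       OVER (partition by session_id order by request_time) as rn,
    rank(request_time) OVER (partition by session_id order by request_time) as rnk,   -- argument required on Hive 2.0.0
    dense_rank()       OVER (partition by session_id order by request_time) as drnk,
    percent_rank()     OVER (partition by session_id order by request_time) as prnk
from table_test1
where dt = '2020-03-16';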

 

All of the following was tested on Hive 2.0.0.

select
    request_time, session_id, c3, c4,
    row_number() OVER (partition by request_time, session_id, c3, c4
                       order by request_time, session_id, c3, c4) as row_num
from table_test1
where dt = '2099-99-99';
-- OK
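
Because the partition by and order by lists are identical here, every duplicate row gets a distinct sequential number within its group, which is the classic deduplication setup. A minimal sketch (the wrapping subquery and its alias t are illustrative):

select request_time, session_id, c3, c4
from (
    select
        request_time, session_id, c3, c4,
        row_number() OVER (partition by request_time, session_id, c3, c4
                           order by request_time) as row_num
    from table_test1
    where dt = '2099-99-99'
) t
where t.row_num = 1;  -- keep one representative per duplicate group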
 
 
select
    request_time, session_id, c3, c4,
    row_number(request_time, session_id, c3, c4) OVER (partition by request_time, session_id, c3, c4
                                                       order by request_time, session_id, c3, c4) as row_num
from table_test1
where dt = '2099-99-99';
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException: No argument is expected.
 
 
 
select
    request_time, session_id, c3, c4,
    row_number(request_time) OVER (partition by request_time, session_id, c3, c4
                                   order by request_time, session_id, c3, c4) as row_num
from table_test1
where dt = '2099-99-99';
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException: No argument is expected.
 
 
 
select
    request_time, session_id, c3, c4,
    row_number(request_time) OVER (partition by request_time, session_id, c3, c4
                                   order by request_time) as row_num
from table_test1
where dt = '2099-99-99';
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException: No argument is expected.
 
 
===========================
 
select
    request_time, session_id, c3, c4,
    rank(request_time) OVER (partition by request_time, session_id, c3, c4
                             order by request_time) as row_num
from table_test1
where dt = '2099-99-99';
-- OK
 
 
 
select
    session_id, request_time, c3, c4,
    rank(request_time) OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- OK
 
 
select
    session_id, request_time, c3, c4,
    rank() OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException: One or more arguments are expected.
 
 
===========================
 
select
    request_time, session_id, c3, c4,
    dense_rank(request_time) OVER (partition by request_time, session_id, c3, c4
                                   order by request_time) as row_num
from table_test1
where dt = '2099-99-99';
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: Ranking Functions can take no arguments
 
 
 
select
    session_id, request_time, c3, c4,
    dense_rank(request_time) OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: Ranking Functions can take no arguments
 
 
select
    session_id, request_time, c3, c4,
    dense_rank() OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- OK
 
===========================
 
select
    request_time, session_id, c3, c4,
    percent_rank(request_time) OVER (partition by request_time, session_id, c3, c4
                                     order by request_time) as row_num
from table_test1
where dt = '2099-99-99';
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: Ranking Functions can take no arguments
 
 
 
select
    session_id, request_time, c3, c4,
    percent_rank(request_time) OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- FAILED: SemanticException Failed to breakup Windowing invocations into Groups. At least 1 group must only depend on input columns. Also check for circular dependencies.
-- Underlying error: Ranking Functions can take no arguments
 
 
 
select
    session_id, request_time, c3, c4,
    percent_rank() OVER (partition by session_id order by request_time) as row_num
from table_test1
where dt = '2020-03-16'
  and session_id in ('158430718024490880175|1', '15837004010552038104667|17');
-- OK
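
Beyond the argument quirk, the three ranking functions differ in how ties are numbered. The self-contained sketch below fabricates four rows with a union-all subquery; the expected outputs in the comments follow standard SQL semantics, with percent_rank defined as (rank - 1) / (rows in partition - 1):

select
    v,
    rank(v)        OVER (order by v) as rnk,   -- 1, 2, 2, 4: ties share a rank, then a gap
    dense_rank()   OVER (order by v) as drnk,  -- 1, 2, 2, 3: ties share a rank, no gap
    percent_rank() OVER (order by v) as prnk   -- 0.0, 0.3333, 0.3333, 1.0
from (
    select 10 as v union all
    select 20      union all
    select 20      union all
    select 30
) t;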

Hive 2.0.0
