hive explain

Execution order of a Hive statement
Rough order:
from ... where ... select ... group by ... having ... order by ...
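As a rough sketch of that order (the table my_logs and its columns are hypothetical, used only for illustration), the clauses of an aggregate query can be annotated like this; the "select" step is the column projection Hive performs before aggregation, which is why a Select Operator appears before the Group By Operator in the plan shown further down:

    select city, count(*) as pv        -- 3. project the needed columns (aggregation comes with step 4)
    from my_logs                       -- 1. load the table
    where day = '2016-05-28'           -- 2. filter rows
    group by city                      -- 4. group by the keys and aggregate
    having count(*) > 100              -- 5. filter the groups
    order by pv desc                   -- 6. sort the final result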
Viewing the execution plan with explain

Both Hive and MySQL can display an execution plan by prefixing a statement with explain, which lets you see the order in which it is executed. For example:

explain
    select city,ad_type,device,sum(cnt) as cnt
    from tb_pmp_raw_log_basic_analysis
    where day = '2016-05-28' and type = 0 and media = 'sohu' and (deal_id = '' or deal_id = '-' or deal_id is NULL)    
    group by city,ad_type,device

The execution plan it prints is as follows:

STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: tb_pmp_raw_log_basic_analysis
            Statistics: Num rows: 8195357 Data size: 580058024 Basic stats: COMPLETE Column stats: NONE
            Filter Operator
              predicate: (((deal_id = '') or (deal_id = '-')) or deal_id is null) (type: boolean)
              Statistics: Num rows: 8195357 Data size: 580058024 Basic stats: COMPLETE Column stats: NONE
              Select Operator
                expressions: city (type: string), ad_type (type: string), device (type: string), cnt (type: bigint)
                outputColumnNames: city, ad_type, device, cnt
                Statistics: Num rows: 8195357 Data size: 580058024 Basic stats: COMPLETE Column stats: NONE
                Group By Operator
                  aggregations: sum(cnt)
                  keys: city (type: string), ad_type (type: string), device (type: string)
                  mode: hash
                  outputColumnNames: _col0, _col1, _col2, _col3
                  Statistics: Num rows: 8195357 Data size: 580058024 Basic stats: COMPLETE Column stats: NONE
                  Reduce Output Operator
                    key expressions: _col0 (type: string), _col1 (type: string), _col2 (type: string)
                    sort order: +++
                    Map-reduce partition columns: _col0 (type: string), _col1 (type: string), _col2 (type: string)
                    Statistics: Num rows: 8195357 Data size: 580058024 Basic stats: COMPLETE Column stats: NONE
                    value expressions: _col3 (type: bigint)
      Reduce Operator Tree:
        Group By Operator
          aggregations: sum(VALUE._col0)
          keys: KEY._col0 (type: string), KEY._col1 (type: string), KEY._col2 (type: string)
          mode: mergepartial
          outputColumnNames: _col0, _col1, _col2, _col3
          Statistics: Num rows: 4097678 Data size: 290028976 Basic stats: COMPLETE Column stats: NONE
          Select Operator
            expressions: _col0 (type: string), _col1 (type: string), _col2 (type: string), _col3 (type: bigint)
            outputColumnNames: _col0, _col1, _col2, _col3
            Statistics: Num rows: 4097678 Data size: 290028976 Basic stats: COMPLETE Column stats: NONE
            File Output Operator
              compressed: false
              Statistics: Num rows: 4097678 Data size: 290028976 Basic stats: COMPLETE Column stats: NONE
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

  Stage: Stage-0
    Fetch Operator
      limit: -1

A walkthrough of the plan:


**Stage-1, map phase (Map Operator Tree)**
- TableScan: loads the table (the from clause); the description includes the row count, data size, and so on.
- Filter Operator: filters rows with the where condition; the description shows the concrete predicate plus the resulting row count and size.
- Select Operator: projects the needed columns; the description lists the column names and types, the output column names, the size, and so on.
- Group By Operator: groups the data; aggregations lists the functions to compute after grouping, keys lists the grouping columns, and outputColumnNames the output columns. Note that columns default to fixed aliases such as _col0.
- Reduce Output Operator: the map-side local "reduce" (partial aggregation); it does the local computation and then partitions rows to the corresponding reducers by the key columns.

**Stage-1, reduce phase (Reduce Operator Tree)**
- Group By Operator: the overall grouping and aggregation, merging on the reduce side the partial results computed on the map side; the description is similar to the map side. mode: mergepartial means the map-side partial results are merged here (the map side grouped in hash mode).
- Select Operator: the final column projection for the output.
- File Output Operator: writes the result to a temporary file; the description gives the compression setting, the output file format, and the SerDe.

**Stage-0 (Fetch Operator)**
- This second stage does no real work here (limit: -1); it simply fetches the result. A limit 100 in the query would be applied at this step, as sketched below.
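For example (a sketch; the exact plan text depends on the Hive version), if the same query is run with a limit clause appended:

    explain
        select city,ad_type,device,sum(cnt) as cnt
        from tb_pmp_raw_log_basic_analysis
        where day = '2016-05-28' and type = 0 and media = 'sohu' and (deal_id = '' or deal_id = '-' or deal_id is NULL)
        group by city,ad_type,device
        limit 100

then the Stage-0 section of the plan would typically show "limit: 100" on the Fetch Operator instead of "limit: -1" (and a Limit Operator may also appear in the reduce tree).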

Summary

1. Each stage is an independent MapReduce job. A complex HQL statement can produce multiple stages, and the plan's descriptions show what the concrete steps are; see the sketch after this list.
2. The row counts and data sizes in the plan are estimates made before execution, not numbers from an actual run, so they can be inaccurate.
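As a sketch of point 1 (the dimension table dim_city and its column city_name are hypothetical), a query that combines a join, a group by, and an order by is typically split into several dependent MapReduce stages:

    explain
        select a.city, sum(a.cnt) as cnt
        from tb_pmp_raw_log_basic_analysis a
        join dim_city b on (a.city = b.city_name)
        where a.day = '2016-05-28'
        group by a.city
        order by cnt desc

Unless Hive merges stages or converts the join to a map join, the join, the group by, and the order by each usually need their own MapReduce stage (order by runs on a single reducer), so the STAGE DEPENDENCIES section lists several stages with lines such as "Stage-2 depends on stages: Stage-1" rather than only root stages.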


Reposted from: https://my.oschina.net/u/2000675/blog/1920468
