Spark SQL Execution Plans

The Spark SQL Architecture

[Figure: Spark SQL (Catalyst) architecture diagram]

Catalyst turns a SQL statement into an executable plan in four stages, and `explain extended` prints one section per stage: the parser produces an unresolved (parsed) logical plan, the analyzer resolves tables and columns against the catalog, the optimizer applies rule-based rewrites such as predicate pushdown, and the planner finally selects physical operators.

A Worked Example

spark-sql> explain extended select * from emp e inner join dept d on e.deptno = d.deptno where e.deptno > 10;
20/02/04 20:16:31 INFO CodeGenerator: Code generated in 22.286318 ms
== Parsed Logical Plan ==
'Project [*]
+- 'Filter ('e.deptno > 10)
   +- 'Join Inner, ('e.deptno = 'd.deptno)
      :- 'SubqueryAlias `e`
      :  +- 'UnresolvedRelation `emp`
      +- 'SubqueryAlias `d`
         +- 'UnresolvedRelation `dept`

== Analyzed Logical Plan ==
empno: int, ename: string, position: string, managerid: int, hiredate: string, salary: double, allowance: double, deptno: int, deptno: int, ename: string, dname: string, city: int
Project [empno#18, ename#19, position#20, managerid#21, hiredate#22, salary#23, allowance#24, deptno#25, deptno#26, ename#27, dname#28, city#29]
+- Filter (deptno#25 > 10)
   +- Join Inner, (deptno#25 = deptno#26)
      :- SubqueryAlias `e`
      :  +- SubqueryAlias `h_demo`.`emp`
      :     +- HiveTableRelation `h_demo`.`emp`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [empno#18, ename#19, position#20, managerid#21, hiredate#22, salary#23, allowance#24, deptno#25]
      +- SubqueryAlias `d`
         +- SubqueryAlias `h_demo`.`dept`
            +- HiveTableRelation `h_demo`.`dept`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [deptno#26, ename#27, dname#28, city#29]

== Optimized Logical Plan ==
Join Inner, (deptno#25 = deptno#26)
:- Filter (isnotnull(deptno#25) && (deptno#25 > 10))
:  +- HiveTableRelation `h_demo`.`emp`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [empno#18, ename#19, position#20, managerid#21, hiredate#22, salary#23, allowance#24, deptno#25]
+- Filter ((deptno#26 > 10) && isnotnull(deptno#26))
   +- HiveTableRelation `h_demo`.`dept`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [deptno#26, ename#27, dname#28, city#29]

== Physical Plan ==
*(2) BroadcastHashJoin [deptno#25], [deptno#26], Inner, BuildRight
:- *(2) Filter (isnotnull(deptno#25) && (deptno#25 > 10))
:  +- Scan hive h_demo.emp [empno#18, ename#19, position#20, managerid#21, hiredate#22, salary#23, allowance#24, deptno#25], HiveTableRelation `h_demo`.`emp`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [empno#18, ename#19, position#20, managerid#21, hiredate#22, salary#23, allowance#24, deptno#25]
+- BroadcastExchange HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)))
   +- *(1) Filter ((deptno#26 > 10) && isnotnull(deptno#26))
      +- Scan hive h_demo.dept [deptno#26, ename#27, dname#28, city#29], HiveTableRelation `h_demo`.`dept`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [deptno#26, ename#27, dname#28, city#29]
Time taken: 0.352 seconds, Fetched 1 row(s)
20/02/04 20:16:31 INFO SparkSQLCLIDriver: Time taken: 0.352 seconds, Fetched 1 row(s)
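Two things stand out in the Optimized Logical Plan above. First, the filter `e.deptno > 10` has been pushed below the join on both sides: because the join condition equates `deptno#25` and `deptno#26`, the optimizer can infer `deptno#26 > 10` for `dept` as well. Second, `isnotnull` guards appear on both keys, since an inner equi-join can never match a null key. The shape of such a pushdown rule can be sketched on a toy plan tree like this (plain Python; the node and function names are invented for illustration and are not Catalyst's actual API):

```python
from dataclasses import dataclass

# Toy logical-plan nodes; Catalyst's real classes live in
# org.apache.spark.sql.catalyst.plans.logical (this is only a sketch).
@dataclass
class Relation:
    name: str

@dataclass
class Join:
    left: object
    right: object
    left_key: str   # e.g. "e.deptno"
    right_key: str  # e.g. "d.deptno"

@dataclass
class Filter:
    column: str
    op: str
    value: int
    child: object

def push_down_predicate(plan):
    """Rewrite Filter(Join(l, r)) into Join(Filter(l), Filter(r)) when the
    filter column is one of the equi-join keys: the equality lets the same
    predicate be inferred on the other side, as in the plan above."""
    if isinstance(plan, Filter) and isinstance(plan.child, Join):
        join = plan.child
        if plan.column in (join.left_key, join.right_key):
            left = Filter(join.left_key, plan.op, plan.value, join.left)
            right = Filter(join.right_key, plan.op, plan.value, join.right)
            return Join(left, right, join.left_key, join.right_key)
    return plan  # rule does not apply; leave the plan unchanged

# Filter(e.deptno > 10) sitting on top of Join(emp, dept), as in the query:
before = Filter("e.deptno", ">", 10,
                Join(Relation("emp"), Relation("dept"),
                     "e.deptno", "d.deptno"))
after = push_down_predicate(before)
print(type(after).__name__)  # the Join is now the root of the plan
```

Catalyst applies many such rules repeatedly until the plan stops changing; predicate pushdown matters because it shrinks both inputs before the (expensive) join runs.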
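In the Physical Plan, Spark chose `BroadcastHashJoin` with `BuildRight`: the filtered `dept` side is small enough to broadcast to every executor as a hash table keyed on `deptno`, so `emp` rows can be streamed against it without a shuffle. The core build-and-probe loop is, in spirit, the following (a plain-Python sketch with made-up sample rows, not Spark's implementation):

```python
from collections import defaultdict

def broadcast_hash_join(stream_side, build_side, stream_key, build_key):
    """Inner equi-join: build a hash table on the (broadcast) build side,
    then probe it once per streamed row -- the BuildRight strategy in the
    physical plan above, with dept as the build side."""
    table = defaultdict(list)
    for row in build_side:                 # e.g. the filtered dept rows
        table[row[build_key]].append(row)
    for row in stream_side:                # e.g. the filtered emp rows
        for match in table.get(row[stream_key], []):
            yield {**row, **match}

# Hypothetical sample rows, loosely shaped like the emp/dept tables above.
emp = [{"empno": 7369, "ename": "SMITH", "deptno": 20},
       {"empno": 7499, "ename": "ALLEN", "deptno": 30}]
dept = [{"deptno": 20, "dname": "RESEARCH"},
        {"deptno": 30, "dname": "SALES"}]

rows = list(broadcast_hash_join(emp, dept, "deptno", "deptno"))
print(rows[0]["ename"], rows[0]["dname"])  # prints: SMITH RESEARCH
```

Whether Spark broadcasts at all is governed by `spark.sql.autoBroadcastJoinThreshold` (10 MB by default); above that size it falls back to a shuffle-based join.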
