How Hive Translates HQL into MapReduce Execution

Running `explain extended` on a simple query shows the artifacts Hive produces on the way to a MapReduce job: the abstract syntax tree, the stage dependencies, and the per-stage operator plans.

```sql
explain extended
select *
from mytest_staff_info_demo4_cp1
where statis_date = '20180229'
order by name
limit 3;
```

```
OK
ABSTRACT SYNTAX TREE:
  TOK_QUERY
    TOK_FROM
      TOK_TABREF
        TOK_TABNAME
          mytest_staff_info_demo4_cp1
    TOK_INSERT
      TOK_DESTINATION
        TOK_DIR
          TOK_TMP_FILE
      TOK_SELECT
        TOK_SELEXPR
          TOK_ALLCOLREF
      TOK_WHERE
        =
          TOK_TABLE_OR_COL
            statis_date
          '20180229'
      TOK_ORDERBY
        TOK_TABSORTCOLNAMEASC
          TOK_TABLE_OR_COL
            name
      TOK_LIMIT
        3
```
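Reading the tree top-down, each clause of the query maps onto a `TOK_*` subtree: `FROM` becomes `TOK_TABREF`/`TOK_TABNAME`, the implicit write to a temporary directory becomes `TOK_DESTINATION`/`TOK_TMP_FILE`, and the `WHERE`, `ORDER BY`, and `LIMIT` clauses each get their own subtree. As a minimal sketch (plain Python, not Hive internals), the same structure can be modeled as nested tuples and printed in the indented form shown above:

```python
# Minimal model of the AST above as nested tuples: (token, *children).
# This is illustrative only; Hive builds its AST with ANTLR, not like this.
ast = ("TOK_QUERY",
       ("TOK_FROM",
        ("TOK_TABREF", ("TOK_TABNAME", ("mytest_staff_info_demo4_cp1",)))),
       ("TOK_INSERT",
        ("TOK_DESTINATION", ("TOK_DIR", ("TOK_TMP_FILE",))),
        ("TOK_SELECT", ("TOK_SELEXPR", ("TOK_ALLCOLREF",))),
        ("TOK_WHERE",
         ("=", ("TOK_TABLE_OR_COL", ("statis_date",)), ("'20180229'",))),
        ("TOK_ORDERBY",
         ("TOK_TABSORTCOLNAMEASC", ("TOK_TABLE_OR_COL", ("name",)))),
        ("TOK_LIMIT", ("3",))))

def dump(node, depth=0):
    """Print the tree in the same indented style as the EXPLAIN output."""
    print("  " * depth + node[0])
    for child in node[1:]:
        dump(child, depth + 1)

dump(ast)
```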

```
STAGE DEPENDENCIES:
  Stage-1 is a root stage
  Stage-0 is a root stage

STAGE PLANS:
  Stage: Stage-1
    Map Reduce
      Map Operator Tree:
          TableScan
            alias: mytest_staff_info_demo4_cp1
            Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
            GatherStats: false
            Select Operator
              expressions: name (type: string), deptcode (type: string), id (type: int), account (type: string), areacode (type: string), statis_date (type: string)
              outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5
              Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
              Reduce Output Operator
                key expressions: _col0 (type: string)
                sort order: +
                Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
                tag: -1
                value expressions: _col0 (type: string), _col1 (type: string), _col2 (type: int), _col3 (type: string), _col4 (type: string), _col5 (type: string)
      Path -> Alias:
        hdfs://SuningHadoop2/user/finance/hive/warehouse/fdm_sor.db/mytest_staff_info_demo4_cp1/statis_date=20180229 [mytest_staff_info_demo4_cp1]
      Path -> Partition:
        hdfs://SuningHadoop2/user/finance/hive/warehouse/fdm_sor.db/mytest_staff_info_demo4_cp1/statis_date=20180229
          Partition
            base file name: statis_date=20180229
            input format: org.apache.hadoop.mapred.TextInputFormat
            output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
            partition values:
              statis_date 20180229
            properties:
              bucket_count -1
              columns name,deptcode,id,account,areacode
              columns.comments
              columns.types string:string:int:string:string
              field.delim
              file.inputformat org.apache.hadoop.mapred.TextInputFormat
              file.outputformat org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              location hdfs://SuningHadoop2/user/finance/hive/warehouse/fdm_sor.db/mytest_staff_info_demo4_cp1/statis_date=20180229
              name fdm_sor.mytest_staff_info_demo4_cp1
              partition_columns statis_date
              partition_columns.types string
              serialization.ddl struct mytest_staff_info_demo4_cp1 { string name, string deptcode, i32 id, string account, string areacode}
              serialization.format
              serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              transient_lastDdlTime 1521166440
            serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe

              input format: org.apache.hadoop.mapred.TextInputFormat
              output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
              properties:
                bucket_count -1
                columns name,deptcode,id,account,areacode
                columns.comments
                columns.types string:string:int:string:string
                field.delim
                file.inputformat org.apache.hadoop.mapred.TextInputFormat
                file.outputformat org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                location hdfs://SuningHadoop2/user/finance/hive/warehouse/fdm_sor.db/mytest_staff_info_demo4_cp1
                name fdm_sor.mytest_staff_info_demo4_cp1
                partition_columns statis_date
                partition_columns.types string
                serialization.ddl struct mytest_staff_info_demo4_cp1 { string name, string deptcode, i32 id, string account, string areacode}
                serialization.format
                serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                transient_lastDdlTime 1519780670
              serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              name: fdm_sor.mytest_staff_info_demo4_cp1
            name: fdm_sor.mytest_staff_info_demo4_cp1
      Truncated Path -> Alias:
        /fdm_sor.db/mytest_staff_info_demo4_cp1/statis_date=20180229 [mytest_staff_info_demo4_cp1]
      Needs Tagging: false
      Reduce Operator Tree:
        Extract
          Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
          Limit
            Number of rows: 3
            Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
            File Output Operator
              compressed: true
              GlobalTableId: 0
              directory: alluxio-ft://namenode1-sit.cnsuning.com:19998/user/finance/tmp/hive-finance/hive_2018-03-19_09-35-04_376_4971094765120649963-1/-ext-10001
              NumFilesPerFileSink: 1
              Statistics: Num rows: 0 Data size: 195 Basic stats: PARTIAL Column stats: NONE
              Stats Publishing Key Prefix: alluxio-ft://namenode1-sit.cnsuning.com:19998/user/finance/tmp/hive-finance/hive_2018-03-19_09-35-04_376_4971094765120649963-1/-ext-10001/
              table:
                  input format: org.apache.hadoop.mapred.TextInputFormat
                  output format: org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat
                  properties:
                    columns _col0,_col1,_col2,_col3,_col4,_col5
                    columns.types string:string:int:string:string:string
                    escape.delim \
                    hive.serialization.extend.nesting.levels true
                    serialization.format 1
                    serialization.lib org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
                  serde: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
              TotalFiles: 1
              GatherStats: false
              MultiFileSpray: false

  Stage: Stage-0
    Fetch Operator
      limit: 3

Time taken: 0.151 seconds, Fetched: 133 row(s)
```
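The plan reads as a pipeline: on the map side, `TableScan` reads only the pruned partition `statis_date=20180229`, `Select Operator` projects the columns, and `Reduce Output Operator` emits `name` (`_col0`) as the shuffle key with ascending sort order (`sort order: +`); on the reduce side, `Extract` receives the sorted rows and `Limit` keeps the first 3 before `File Output Operator` writes the result. As a rough model of what Stage-1 computes (a minimal sketch in plain Python with made-up sample rows, not Hive internals):

```python
# Illustrative model of the Stage-1 MR job, not Hive code.
# Sample rows are invented for the sketch.
rows = [
    {"name": "wang", "deptcode": "d2", "statis_date": "20180229"},
    {"name": "li",   "deptcode": "d1", "statis_date": "20180229"},
    {"name": "zhao", "deptcode": "d3", "statis_date": "20180301"},
    {"name": "chen", "deptcode": "d1", "statis_date": "20180229"},
]

# Map side: partition pruning on statis_date, then Select + Reduce Output
# Operator emit the sort key (_col0 = name) with the full row as the value.
mapped = [(r["name"], r) for r in rows if r["statis_date"] == "20180229"]

# Shuffle/reduce side: rows arrive sorted by key ("sort order: +"),
# then the Limit operator keeps the first 3.
reduced = [r for _, r in sorted(mapped, key=lambda kv: kv[0])][:3]
print([r["name"] for r in reduced])  # ['chen', 'li', 'wang']
```

Stage-0 then corresponds to the client simply fetching those (at most 3) rows from the job's output directory.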
