[Ruoze Big Data in Practice, Day 18] Hive: Creating and Using the dual Test Table

-- Create the dual table (use INSERT only for testing)

hive> create table dual(x string);
OK
Time taken: 0.282 seconds
hive> insert into table dual values('');
Query ID = hadoop_20180611233030_645e070e-77f9-4ea4-8b32-ee306424c16b
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1528730871092_0001, Tracking URL = http://hadoop000:8088/proxy/application_1528730871092_0001/
Kill Command = /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin/hadoop job  -kill job_1528730871092_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2018-06-11 23:31:32,290 Stage-1 map = 0%,  reduce = 0%
2018-06-11 23:31:37,712 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 1.54 sec
MapReduce Total cumulative CPU time: 1 seconds 540 msec
Ended Job = job_1528730871092_0001
Stage-4 is selected by condition resolver.
Stage-3 is filtered out by condition resolver.
Stage-5 is filtered out by condition resolver.
Moving data to: hdfs://hadoop000:9000/user/hive/warehouse/hive3.db/dual/.hive-staging_hive_2018-06-11_23-31-19_987_4145860992197710987-1/-ext-10000
Loading data to table hive3.dual
Table hive3.dual stats: [numFiles=1, numRows=1, totalSize=1, rawDataSize=0]
MapReduce Jobs Launched: 
Stage-Stage-1: Map: 1   Cumulative CPU: 1.54 sec
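
With its single row loaded, dual plays the same role as Oracle's DUAL: a scratch table that gives SELECT expressions something to run against, so you can evaluate functions without touching real data. A few typical checks (a minimal sketch; the queries below use only standard Hive built-ins):

-- Smoke-test expressions and built-in functions against the one-row table
select 1 + 1 from dual;
select concat('hello', ',', 'hive') from dual;
select unix_timestamp() from dual;

Because dual holds exactly one row, each of these queries returns exactly one result row, which also makes it a convenient target for trying out a freshly registered UDF before pointing it at production tables.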