hive> desc extended s_path;
-----------------------
hive> desc formatted s_path;
-----------------------
desc formatted r_ku6_rss_vid_daily partition(pt='2013-08-14');
Advantages of partitioned tables:
Obviously, a query can be restricted to the data of a specified partition instead of scanning the entire Hive table.
Partitioned table operations:
Partitions must be declared in the table definition.
a. Single-partition create table statement:
create table day_table (id int, content string) partitioned by (dt string);
A single-partition table, partitioned by day; the table schema contains three columns: id, content, and dt.
Data is stored in separate directories keyed by dt.
b. Two-level partition create table statement:
create table day_hour_table (id int, content string) partitioned by (dt string, hour string);
A two-level partitioned table, partitioned by day and hour; dt and hour are added as two columns in the table schema.
Directories are split first by dt, then into hour subdirectories.
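The directory layout that two-level partitioning produces can be sketched with plain local directories (a stand-in for the real HDFS warehouse path, which is an assumption here):

```shell
# Hypothetical local warehouse root standing in for HDFS's warehouse directory.
WAREHOUSE=$(mktemp -d)
# Hive stores each (dt, hour) value pair as nested key=value directories:
mkdir -p "$WAREHOUSE/day_hour_table/dt=2008-08-08/hour=08"
mkdir -p "$WAREHOUSE/day_hour_table/dt=2008-08-08/hour=09"
# One dt directory, containing one subdirectory per hour:
find "$WAREHOUSE/day_hour_table" -mindepth 1 -type d | sort
```

Partition pruning works precisely because a predicate on dt or hour maps directly onto this directory tree.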
Syntax for adding partitions (the table already exists; partitions are added to it):
ALTER TABLE table_name ADD
partition_spec [ LOCATION 'location1' ]
partition_spec [ LOCATION 'location2' ] ...
-- for example:
ALTER TABLE day_table ADD
PARTITION (dt='2008-08-08', hour='08')
LOCATION '/path/pv1';  -- the location must be a directory, not a file (see below)
alter table t_pvorder add IF NOT EXISTS partition (start_date='20130123')
location '/workspace/tda/tunion/thive/t_pvorder/start_date=20130123'
When adding a partition, the partition path must be a directory, not a specific file; otherwise an exception is thrown:
alter table anjian_low_risk add IF NOT EXISTS partition (datecol=20130321)
> LOCATION "/workspace/yda/fuxi/anjian/20130321/td_low_risk"
> ;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException java.io.FileNotFoundException: Parent path is not a directory: /workspace/yda/fuxi/anjian/20130321/td_low_risk
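The FileNotFoundException above occurs because a component of the LOCATION path already exists on HDFS as a regular file, so nothing can be created beneath it. The same failure mode can be sketched locally (paths are illustrative):

```shell
# A regular file occupies the parent path, so no subdirectory can exist under it.
tmp=$(mktemp -d)
touch "$tmp/20130321"                        # 20130321 exists as a FILE
if mkdir "$tmp/20130321/td_low_risk" 2>/dev/null; then
  echo "unexpected: mkdir succeeded"
else
  # Same root cause as Hive's "Parent path is not a directory"
  echo "mkdir failed: parent path is not a directory"
fi
```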
Syntax for dropping partitions:
ALTER TABLE table_name DROP
partition_spec, partition_spec,...
Use ALTER TABLE ... DROP PARTITION to drop a partition. The partition's metadata and data are deleted together.
Examples (note the parentheses):
alter table t_pvorder drop partition (start_date='20130123') ;
ALTER TABLE day_hour_table DROP PARTITION (dt='2008-08-08', hour='09');
Syntax for loading data into a partitioned table:
LOAD DATA [LOCAL] INPATH 'filepath' [OVERWRITE] INTO TABLE tablename [PARTITION (partcol1=val1, partcol2=val2 ...)]
Examples:
LOAD DATA INPATH '/user/pv.txt' INTO TABLE day_hour_table PARTITION (dt='2008-08-08', hour='08');
LOAD DATA LOCAL INPATH '/user/hua/*' INTO TABLE day_hour PARTITION (dt='2010-07-07');
When data is loaded into a table, no transformation is applied to it: the LOAD operation simply copies (LOCAL) or moves (HDFS) the files into the location corresponding to the Hive table. When loading into a partition, the partition directory is created under the table directory automatically.
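The file-level semantics of a LOAD into a partition can be sketched with plain directories (paths and file contents are illustrative):

```shell
work=$(mktemp -d)
printf '1\thello\n2\tworld\n' > "$work/pv.txt"       # tab-delimited source file
part="$work/day_hour_table/dt=2008-08-08/hour=08"
mkdir -p "$part"               # Hive creates the partition directory itself
mv "$work/pv.txt" "$part/"     # LOAD DATA INPATH moves the file as-is
cat "$part/pv.txt"             # bytes are unchanged; no transformation happened
```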
A partition-based query:
SELECT day_table.* FROM day_table WHERE day_table.dt>= '2008-08-08';
Listing partitions:
hive> show partitions day_hour_table;
OK
dt=2008-08-08/hour=08
dt=2008-08-08/hour=09
dt=2008-08-09/hour=09
Viewing table information:
Viewing the create-table statement:
hiveddl p_sdo_data_etl.r_ku6_rss_vid_daily
A real-world usage example (shell):
#!/bin/sh
WORKDIR='/work/tda/tunion/unionshell'
date=`date -d yesterday +%Y%m%d`
if [ $# -eq 1 ]
then
date=$1
fi
#Hql='drop table tunion.t_pvjoinvv;'
#echo $Hql
#$WORKDIR/RunHQL "$Hql"
Hql='create external table IF NOT EXISTS tunion.t_pvjoinvv
(
juid string
,sessionid string
,uid string
,pid string
)
partitioned by (start_date string)                 -- partition column
row format delimited fields terminated by "\t"     -- field delimiter
stored as rcfile                                   -- table file storage format
LOCATION "/workspace/tda/tunion/thive/t_pvjoinvv"  -- table data path on HDFS
;
'
$WORKDIR/RunHQL "$Hql"
Hql="set mapred.reduce.tasks=8;
set mapred.job.priority=VERY_HIGH;
use tunion;
insert overwrite table t_pvjoinvv PARTITION (start_date='$date')
select  -- select list trimmed to match the table columns (juid, sessionid, uid, pid)
t1.juid
,t1.sessionid
,t1.uid
,t1.pid
from tunion.t_pvorder t1
join
t_vvformat t2
on t1.juid=t2.juid
and t1.pvid=t2.pvid
where t1.start_date='$date'
and t2.start_date='$date'
;"
#echo $Hql
$WORKDIR/RunHQL "$Hql"