[Hive] Specifying the Output Delimiter for Hive Query Results


Business Scenario

When doing data analysis, we often run hive -e "sql" > xxx.txt, or use the plain Hive CLI, to capture query results and then paste them into Excel or a similar tool. Hive's default output is tab-separated, and when the query returns many columns the pasted data frequently ends up with misaligned columns, which is a real headache.

Solution 1: Replace the output delimiter with a Linux pipe

For example:

# Method 1: sed
hive -e "select * from pms.pms_algorithm_desc" | sed 's/\t/,/g' > ./aaa.txt

# Method 2: tr
hive -e "select * from pms.pms_tp_config" | tr "\t" ","

The result:

$ cat aaa.txt 
id,algorithm_id,algorithm_name,algorithm_desc,is_delete,update_time
1,0,默认,,0,2015-11-02 18:14:25.0
2,1,相关,相关分类或者买了还买,0,2015-11-02 18:14:25.0
3,2,相似,,0,2015-11-02 18:14:25.0
4,3,购物车商品为空时类目热销,,0,2015-11-02 18:14:25.0
5,4,热销补余(销量,GMV),,0,2015-11-02 18:14:25.0
6,5,指定类目选品,APP首页价比JD低补余逻辑中指定CE类目选品,0,2015-11-02 18:14:25.0
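The pipe step can be checked without a Hive cluster by simulating Hive's tab-separated stdout with printf (the two sample rows below are made up for illustration):

```shell
# Fake two tab-separated rows the way Hive prints them,
# then swap tabs for commas, exactly as the tr pipe above does.
printf '1\tdefault\t0\n2\trelated\t0\n' | tr '\t' ','
```

One caveat: if a field itself contains a comma, as in 热销补余(销量,GMV) in the output above, naive replacement produces ambiguous CSV. In that case pick a rarer target delimiter for tr/sed, such as | or the \x01 control character.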

Solution 2: Use Hive's INSERT ... DIRECTORY syntax

For example:

insert overwrite local directory '/home/pms/workspace/ouyangyewei/data/bi_lost'
row format delimited
fields terminated by ','
select xxxx 
from xxxx;

The SQL above writes the query result into the /home/pms/workspace/ouyangyewei/data/bi_lost directory, with fields separated by commas.

The result:

$ ls ~/workspace/ouyangyewei/data/bi_lost
000000_0

$ cat ~/workspace/ouyangyewei/data/bi_lost/000000_0 
125171836,11565,6225443584836
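Note that Hive writes one output file per reducer (000000_0, 000001_0, ...), so larger queries can leave several part files in the directory. A minimal sketch of merging them into one CSV, using a hypothetical /tmp directory and fabricated rows in place of the real output:

```shell
# Lay out two fake reducer part files the way Hive names them,
# then concatenate them into a single CSV.
dir=/tmp/bi_lost_demo   # hypothetical stand-in for the real output dir
mkdir -p "$dir"
printf '125171836,11565,6225443584836\n' > "$dir/000000_0"
printf '225171836,21565,7225443584836\n' > "$dir/000001_0"
cat "$dir"/0000* > "$dir/merged.csv"
```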

The Apache Hive documentation describes the syntax as follows:

Standard syntax:
INSERT OVERWRITE [LOCAL] DIRECTORY directory1
  [ROW FORMAT row_format] [STORED AS file_format] (Note: Only available starting with Hive 0.11.0)
  SELECT ... FROM ...

Hive extension (multiple inserts):
FROM from_statement
INSERT OVERWRITE [LOCAL] DIRECTORY directory1 select_statement1
[INSERT OVERWRITE [LOCAL] DIRECTORY directory2 select_statement2] ...


row_format
  : DELIMITED [FIELDS TERMINATED BY char [ESCAPED BY char]] [COLLECTION ITEMS TERMINATED BY char]
        [MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]
        [NULL DEFINED AS char] (Note: Only available starting with Hive 0.13)
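One more detail worth knowing: in text output Hive renders SQL NULL as the literal two characters \N. On Hive versions before 0.13, where NULL DEFINED AS is not available, the same pipe trick from Solution 1 can strip the marker after the fact. A sketch with a fabricated sample line:

```shell
# Hive prints NULL columns as \N in text output.
# Delete that marker so Excel shows a blank cell instead.
printf '1,\\N,foo\n' | sed 's/\\N//g'
# → 1,,foo
```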