Importing Data into Hive with Sqoop 1.4.7

Creating Hive Tables

1. Create a Hive managed (internal) table

create table mvs_temp_13(
id string,
user_id string,
dept_id string,
device_code string,
temp_body string,
timepoint string,
date_point string
)
row format delimited fields terminated by ',' lines terminated by '\n' ;

Connect to Hive (via Beeline):

!connect jdbc:hive2://localhost:10000

load data local inpath '/data/hive_file/temp13.txt' into table mvs_temp_13;
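After the load, a quick query confirms the rows landed in the table (a minimal check run in the same Beeline session, assuming the sample file above was loaded):

-- verify that the managed table now contains the loaded rows
select * from mvs_temp_13 limit 10;
select count(*) from mvs_temp_13;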

2. Create a Hive external table

create external table mvs_temp_13_external(
id string,
user_id string,
dept_id string,
device_code string,
temp_body double,
timepoint string,
date_point string
)
row format delimited fields terminated by ',' lines terminated by '\n' location '/lgfTest/mvstemp13';

Note: external tables are usually used together with HDFS. Upload the file to HDFS first, then create the Hive external table pointing at that location to associate it with the data, as sketched below.
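A minimal sketch of that workflow, using the /lgfTest/mvstemp13 location from the DDL above (the local file name temp13.txt is an assumption):

# create the HDFS directory the external table will point at
hdfs dfs -mkdir -p /lgfTest/mvstemp13
# upload the delimited file into that directory
hdfs dfs -put /data/hive_file/temp13.txt /lgfTest/mvstemp13/
# the external table reads the file in place; dropping the table leaves the file in HDFS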

3. Create a Hive partitioned table

create table mvs_temp_p(
id string,
user_id string,
dept_id string,
device_code string,
temp_body string,
timepoint string
) partitioned by (date_point string)
row format delimited fields terminated by ',' lines terminated by '\n' ;

Loading data:

Method 1: insert into the partition from an existing non-partitioned table, e.g. mvs_temp_13 above:

insert into table mvs_temp_p partition(date_point='2021-12-12') select id,user_id,dept_id,device_code,temp_body,timepoint from mvs_temp_13 where date_point='2021-12-12';

Method 2: load a local file directly into the partition:

load data local inpath '/data/hive_file/temp12.txt' into table mvs_temp_p partition(date_point='2021-12-12');
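Either way, the partition can be verified afterwards (a quick check against the partitioned table created above):

show partitions mvs_temp_p;
select count(*) from mvs_temp_p where date_point='2021-12-12';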

Note: when loading a file into a Hive table, if a column's declared type does not match the data in the file (for example, temp_body is declared as double but the file contains the quoted value "360"), Hive cannot convert the string to a double, and the corresponding temp_body value ends up as NULL after the load.
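If that happens, the affected rows are easy to spot afterwards (shown here against mvs_temp_13_external, whose temp_body column is declared as double):

-- rows whose temp_body string could not be converted to double show up as NULL
select id, temp_body from mvs_temp_13_external where temp_body is null limit 10;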

Sqoop commands:

Importing from MySQL into HDFS

Test the MySQL connection:
sqoop-list-databases --connect jdbc:mysql://192.168.0.57:3307/mvs_base_test --username root --password Rf3m2RXOWBi2YyMT
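If the connection works, listing the tables of the source database is a similar sanity check (sqoop-list-tables ships alongside sqoop-list-databases; the database name below follows the import commands further down):

sqoop-list-tables --connect jdbc:mysql://192.168.0.57:3307/mvs_202112 --username root --password Rf3m2RXOWBi2YyMT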

MySQL to HDFS, producing a single output file (one mapper):
sqoop import --connect jdbc:mysql://192.168.0.57:3307/mvs_202112 --username root --password Rf3m2RXOWBi2YyMT --delete-target-dir  --target-dir /lgfTest/mvstemp13  --fields-terminated-by ',' --query 'select id,user_id,dept_id,device_code,temp_body,timepoint,date_point from mvs_temp_13 where version=0 and $CONDITIONS' -m 1

MySQL to HDFS, producing two output files (two mappers, split by id):
sqoop import --connect jdbc:mysql://192.168.0.57:3307/mvs_202112 --username root --password Rf3m2RXOWBi2YyMT --delete-target-dir  --target-dir /lgfTest/mvstemp13  --fields-terminated-by ',' --query 'select id,user_id,dept_id,device_code,temp_body,timepoint,date_point from mvs_temp_13 where version=0 and $CONDITIONS' -m 2 --split-by id
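The number of mappers determines how many part files land in the target directory; they can be inspected directly in HDFS (a quick check using the target path from the commands above):

# two mappers with --split-by id produce part-m-00000 and part-m-00001
hdfs dfs -ls /lgfTest/mvstemp13
hdfs dfs -cat /lgfTest/mvstemp13/part-m-00000 | head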

Importing from MySQL into Hive

Create the Hive table first, then import the MySQL data into it:

create table mvs_temp_14(
id string,
user_id string,
dept_id string,
device_code string,
temp_body double,
timepoint string,
date_point string
)
row format delimited fields terminated by ',' lines terminated by '\n' ;

sqoop import --connect jdbc:mysql://192.168.0.57:3307/mvs_202112 --username root --password Rf3m2RXOWBi2YyMT --query 'select id,user_id,dept_id,device_code,temp_body,timepoint,date_point from mvs_temp_14 where version=0 and $CONDITIONS' --hive-import --hive-overwrite --hive-database lgfdb --hive-table mvs_temp_14 -m 1  --fields-terminated-by ',' --delete-target-dir   --target-dir /lgfTest/mvstemp14
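Once the import finishes, the data can be verified in the pre-created table (a minimal check, using the lgfdb database named in the command above):

select count(*) from lgfdb.mvs_temp_14;
select * from lgfdb.mvs_temp_14 limit 10;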

MySQL to Hive, letting Sqoop create the Hive managed table automatically (column types mapped from the MySQL schema):
sqoop import --connect jdbc:mysql://192.168.0.57:3307/mvs_202112 --username root --password Rf3m2RXOWBi2YyMT --query 'select id,user_id,dept_id,device_code,temp_body,timepoint,date_point from mvs_temp_15 where version=0 and $CONDITIONS' --hive-import --hive-overwrite --hive-database lgfdb --hive-table mvs_temp_15 -m 1  --fields-terminated-by ',' --delete-target-dir   --target-dir /lgfTest/mvstemp15
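Since Sqoop creates mvs_temp_15 itself in this variant, it is worth inspecting the generated table definition to confirm how the MySQL column types were mapped (a quick check in Beeline):

describe formatted lgfdb.mvs_temp_15;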
