Import:
1. Method 1: insert values directly into a partition
   insert into table score3 partition(dt='2022-10-01') values ('001','002',100);
2. Method 2: insert the result of a query, overwriting the partition
   insert overwrite table score4 partition(dt='2022-10-01') select sid,cid,sscore from score;
3. Method 3: load a local file into a partition
   load data local inpath '/export/data/hivedatas/score.txt' overwrite into table score5 partition(dt='2022-10-01');
4. Method 4: create a table from a query (CTAS)
   create table score5 as select * from score;
5. Method 5: create an external table over an existing HDFS location
   create external table score6 (sid string,cid string,sscore int) row format delimited fields terminated by '\t' location '/myscore6';
6. Method 6: put the data file directly into the table's HDFS warehouse directory
   hadoop fs -put stu.txt /user/hive/warehouse/myhive.db/stu
7. Method 7: import data directly into Hive with the Sqoop framework
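Method 7 lists no command in these notes; a minimal Sqoop sketch is shown below. The MySQL host, database name, credentials, and table names are placeholders, not values from this document:

```shell
# Sketch only: import a MySQL table into a Hive table via Sqoop.
# jdbc URL, username, password, and table names are assumed placeholders.
sqoop import \
  --connect jdbc:mysql://mysql_host:3306/source_db \
  --username root \
  --password 123456 \
  --table stu \
  --hive-import \
  --hive-table myhive.stu \
  --fields-terminated-by '\t' \
  -m 1
```

`--hive-import` makes Sqoop load the fetched rows into the named Hive table instead of leaving them as plain HDFS files; `-m 1` runs a single map task, appropriate for small tables or tables without a suitable split column.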
Export:
-- Export the result of a SQL query to local disk
insert overwrite local directory '/export/data/exporthive' row format delimited fields terminated by '\t' select * from score where sscore > 85;
-- Export the result of a SQL query to HDFS
insert overwrite directory '/export/data/exporthive' row format delimited fields terminated by '\t' select * from score where sscore > 85;
-- Redirect the output of a hive -e command to a local file
hive -e "select * from myhive.score;" > /export/data/exporthive/score.txt
-- Export an entire table (data plus metadata) to HDFS
export table score to '/export/exporthive/score';
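The `export table` command above has a counterpart, `import table`, which recreates the table (data and metadata) from the exported directory, for example on another cluster. A minimal sketch, where the target table name `score_imported` is an assumed placeholder:

```sql
-- Recreate a table from an EXPORT dump; score_imported is a placeholder name
import table score_imported from '/export/exporthive/score';
```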
# 博学谷 IT technical support