Prepare the data
SELECT * FROM audi;
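The export steps below read from a table named auditmp. Assuming it is simply a staging copy of audi, it could be created like this (this CTAS statement is an illustrative assumption, not a required step):
-- Assumption: auditmp is a copy of audi prepared for the export examples
CREATE TABLE auditmp AS SELECT * FROM audi;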
1. Export with INSERT
1.1 Export data to the local filesystem
INSERT OVERWRITE LOCAL DIRECTORY '/root/data/auditmp' SELECT * FROM auditmp;
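By default the files written by this statement use Hive's '\001' field separator. If a human-readable delimiter is wanted, a row format clause can be added to the same statement; the comma below is only an illustrative choice:
-- Assumes a comma-separated layout is acceptable for whoever reads the files
INSERT OVERWRITE LOCAL DIRECTORY '/root/data/auditmp'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM auditmp;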
Check the exported result on Linux:
(base) [root@192 data]# ll *audi*
-rw-r--r--. 1 root root 534089 Apr 22 06:00 audi_202204220558.csv
auditmp:
total 512
-rw-r--r--. 1 root root 523336 Apr 23 19:46 000000_0
(base) [root@192 data]# pwd
/root/data
1.2 Export data to HDFS
INSERT OVERWRITE DIRECTORY '/auditmpinsertexport' SELECT * FROM auditmp;
Check the result on HDFS:
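For example, the standard HDFS shell commands can be used to inspect the export directory (the exact listing depends on the cluster):
# list the files produced by the INSERT ... DIRECTORY statement
hdfs dfs -ls /auditmpinsertexport
# peek at the first rows of the first part file
hdfs dfs -cat /auditmpinsertexport/000000_0 | head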
2. Download to the local filesystem with Hadoop commands
2.1 Method 1
(base) [root@192 data]# pwd
/root/data
(base) [root@192 data]# find bac*
find: ‘bac*’: No such file or directory
(base) [root@192 data]# hdfs dfs -get /user/hive/warehouse/hive_test_one.db/auditmp /root/data/bac1auditmp
(base) [root@192 data]# find bac*
bac1auditmp
bac1auditmp/000000_0
2.2 Method 2
(base) [root@192 data]# hadoop fs -get /user/hive/warehouse/hive_test_one.db/auditmp /root/data/bac2auditmp
(base) [root@192 data]# find bac*
bac1auditmp
bac1auditmp/000000_0
bac2auditmp
bac2auditmp/000000_0
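hdfs dfs and hadoop fs are interchangeable for HDFS paths, and both copy the table directory together with its part files. If a single local file is preferred instead of a directory, getmerge can concatenate the part files on the way down; the target path below is only an example:
# merge all part files of the table into one local file
hadoop fs -getmerge /user/hive/warehouse/hive_test_one.db/auditmp /root/data/bac4auditmp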
3. Export data with the hive command
The database and table need to be fully qualified in the query:
(base) [root@192 data]# hive -e 'select * from hive_test_one.auditmp;' > /root/data/bac3auditmp
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache-hive-2.3.9-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.7/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/opt/apache-hive-2.3.9-bin/lib/hive-common-2.3.9.jar!/hive-log4j2.properties Async: true
OK
Time taken: 6.399 seconds, Fetched: 10668 row(s)
(base) [root@192 data]# find bac*
bac1auditmp
bac1auditmp/000000_0
bac2auditmp
bac2auditmp/000000_0
bac3auditmp
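The SLF4J warnings, the "Logging initialized" line, and the OK / "Time taken" summary go to stderr, which is why only the query rows end up in bac3auditmp. To quiet the console as well, the hive CLI's silent flag can be added; bac3auditmp_silent is just an example filename:
# -S (silent mode) suppresses the informational console messages
hive -S -e 'select * from hive_test_one.auditmp;' > /root/data/bac3auditmp_silent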
4. Export data with EXPORT
EXPORT TABLE audi TO '/exportaudi';
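EXPORT writes the table metadata together with the data files under '/exportaudi', so the dump can later be restored with the matching IMPORT statement; the target table name below is only an example:
-- Restore the exported table under a new (hypothetical) name
IMPORT TABLE audi_copy FROM '/exportaudi';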