Sqoop import and export

1. List the databases on the MySQL server
sqoop list-databases \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@
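
A similar check lists the tables inside the hadoop database (a quick sketch, reusing the same connection options):
sqoop list-tables \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@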

2. Simple import into HDFS
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--table t_student \
-m 1 \
--target-dir /sqo/01 --delete-target-dir;

Checking the result on HDFS:
[root@localhost conf]# hadoop fs -ls /sqo/01;
Found 2 items
-rw-r--r-- 1 root supergroup 0 2018-07-18 20:31 /sqo/01/_SUCCESS
-rw-r--r-- 1 root supergroup 39 2018-07-18 20:31 /sqo/01/part-m-00000
[root@localhost conf]# hadoop fs -cat /sqo/01/part-m-00000;
11,111,1111
11,11,11
222,222,222
3,3,3
[root@localhost conf]#

3. Import the result of a MySQL query into HDFS
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--target-dir /sqo/03 \
--query 'select id,name from t_student where $CONDITIONS and id="3"' \
--split-by id \
--fields-terminated-by '\t' \
-m 4;
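
With -m 4 the job writes up to four split files under /sqo/03; the output can be inspected the same way as above (assuming the job finished successfully):
hadoop fs -ls /sqo/03
hadoop fs -cat /sqo/03/part-m-*
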
In addition to the basic environment configuration, the Hive common jar must be copied into Sqoop's lib directory (required for the Hive import below):
cp /home/hive/lib/hive-common-2.3.3.jar /home/sqoop1.4.7/lib
Otherwise the import fails with:
Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
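
An alternative to copying the jar (a sketch, assuming Hive is installed under /home/hive) is to expose Hive's libraries through the environment Sqoop reads, e.g. in ~/.bashrc or conf/sqoop-env.sh:
export HIVE_HOME=/home/hive
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*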

4. Import into Hive
sqoop import \
--connect jdbc:mysql://192.168.85.3:3306/hadoop \
--username root \
--password Caofeng2012@ \
--query 'select id,name from t_student where $CONDITIONS and id="3"' \
--split-by id \
--fields-terminated-by '\t' \
--create-hive-table --hive-import --hive-overwrite \
--target-dir /temp3/ \
--hive-table hivetest.student16 --delete-target-dir ;
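
To confirm the load, the Hive table can be spot-checked from the command line (assuming the hivetest database already exists):
hive -e "select * from hivetest.student16 limit 10;"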

The following examples are adapted from other references.
Import MySQL data into HDFS with Sqoop

sqoop import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ \
--table user --columns 'uid,uname' -m 1 \
--target-dir '/sqoop/user';
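
With a single mapper the output lands in one file, which can be checked afterwards (assuming the job succeeded):
hadoop fs -cat /sqoop/user/part-m-00000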

Import MySQL data into Hive with Sqoop

sqoop import --hive-import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ \
--table user \
--columns 'uid,uname' -m 1 ;

Import MySQL data into Hive with Sqoop, using a query
sqoop import --hive-import --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 --hive-table user6 \
--query 'select * from user where uid<10 and $CONDITIONS' \
--target-dir /sqoop/user5;
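
The resulting Hive table can be spot-checked afterwards (assuming it was created in the default database):
hive -e "select * from user6 limit 10;"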

Export data from Hive to MySQL with Sqoop

sqoop export --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 \
--table user5 --export-dir /user/hive/warehouse/user6 ;
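
Note that sqoop export requires the target MySQL table (user5 here) to exist beforehand. If the Hive table kept Hive's default ^A field delimiter (which Sqoop applies when --hive-import is used without explicit delimiters), the export also needs the matching input delimiter; a variant of the command above, under that assumption:
sqoop export --connect jdbc:mysql://localhost:3306/hadoop \
--username root --password Caofeng2012@ -m 1 \
--table user5 --export-dir /user/hive/warehouse/user6 \
--input-fields-terminated-by '\001';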
