Sqoop import data to Hive and HDFS

Can we import data with Sqoop into both HDFS and Hive? I have tried it with two different users (root and admin). Below are the commands I ran as root and as admin:

[Root]

sqoop import --connect jdbc:mysql://10.6.162.58/test_alpha --username pbd -P --table posts --hive-import --hive-database test_root --hive-table posts1 --hive-drop-import-delims --null-string '\N' --null-non-string '\N' --target-dir test_hive_root/2

[Admin]

sqoop import --connect jdbc:mysql://10.6.162.58/test_alpha --username pbd -P --table posts --hive-import --hive-database test_admin --hive-table posts1 --hive-drop-import-delims --null-string '\N' --null-non-string '\N' --target-dir test_hive_admin/2
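
For comparison (this is not one of the two runs above), a plain HDFS-only import without --hive-import would look like the command below; test_hdfs_only/2 is just an illustrative target directory. A command like this normally leaves the _SUCCESS file plus the part files under --target-dir.

[HDFS only, for comparison]

sqoop import --connect jdbc:mysql://10.6.162.58/test_alpha --username pbd -P --table posts --target-dir test_hdfs_only/2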

Here is what happens:

Both commands successfully import the data into Hive, but both fail to import it into HDFS.

For root, the directory I defined, test_hive_root/2, is not created at all.

For admin, the directory test_hive_admin/2 is created, but it contains only the _SUCCESS file and none of the data (normally, when I import to HDFS, Sqoop creates the _SUCCESS file plus four part files).
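
As a minimal sketch of what I am comparing, these are the two HDFS listings I would look at (the second path assumes the default hive.metastore.warehouse.dir of /user/hive/warehouse, so the exact location may differ on your cluster):

hdfs dfs -ls test_hive_admin/2
hdfs dfs -ls /user/hive/warehouse/test_admin.db/posts1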

How can I solve this problem? Can Sqoop import to both HDFS and Hive?
