Running a HIVE/HDFS to RDBMS export in Sqoop

Sqoop data export:
In Sqoop, "export" refers to transferring data from the big-data cluster (HDFS, HIVE, HBASE) to a non-big-data store (an RDBMS).
This is called an export, i.e., it uses the export keyword.
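
For reference, the general shape of an export invocation looks like the following sketch (the connection string, table, and directory are placeholders for illustration, not values taken from this environment):

bin/sqoop export \
  --connect jdbc:mysql://<mysql-host>:3306/<database> \
  --username <user> --password <password> \
  --table <target_table> \
  --num-mappers 1 \
  --export-dir <hdfs-dir-with-source-data> \
  --input-fields-terminated-by "\t"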

When I run the export in Sqoop, I get the following error:

[admin@hadoop102 sqoop-1.4.6.bin__hadoop-2.0.4-alpha]$ bin/sqoop export --connect jdbc:mysql://hadoop102:3306/company --username root --password root --table staff --num-mappers 1 --export-dir /user/hive/warehouse/staff_hive --input-fields-terminated-by "\t"
Warning: /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/08/08 07:06:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
18/08/08 07:06:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/08/08 07:06:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/08/08 07:06:56 INFO tool.CodeGenTool: Beginning code generation
18/08/08 07:06:57 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff` AS t LIMIT 1
18/08/08 07:06:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `staff` AS t LIMIT 1
18/08/08 07:06:58 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/module/hadoop-2.7.2
Note: /tmp/sqoop-atguigu/compile/0e689538e1c720e90b6c94377089c476/staff.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/08/08 07:07:02 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-atguigu/compile/0e689538e1c720e90b6c94377089c476/staff.jar
18/08/08 07:07:02 INFO mapreduce.ExportJobBase: Beginning export of staff
18/08/08 07:07:02 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/08/08 07:07:04 WARN mapreduce.ExportJobBase: Input path hdfs://hadoop102:9000/user/hive/warehouse/staff_hive does not exist
18/08/08 07:07:04 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
18/08/08 07:07:04 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
18/08/08 07:07:04 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/08/08 07:07:05 INFO client.RMProxy: Connecting to ResourceManager at hadoop103/192.168.1.103:8032
18/08/08 07:07:17 INFO hdfs.DFSClient: Could not complete /tmp/hadoop-yarn/staging/atguigu/.staging/job_1533681366197_0002/libjars/parquet-avro-1.4.1.jar retrying...
18/08/08 07:07:23 INFO hdfs.DFSClient: Could not complete /tmp/hadoop-yarn/staging/atguigu/.staging/job_1533681366197_0002/libjars/parquet-avro-1.4.1.jar retrying...
18/08/08 07:07:23 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/atguigu/.staging/job_1533681366197_0002
18/08/08 07:07:23 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: Unable to close file because the last block does not have enough number of replicas
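
The WARN above reports that the input path hdfs://hadoop102:9000/user/hive/warehouse/staff_hive does not exist. A generic way to double-check that directory on HDFS (shown here for illustration, not part of the original run) is:

hdfs dfs -ls /user/hive/warehouse/staff_hive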

Could anyone point me in the right direction? Thanks!
