org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive Intercepting System.exit(1)

  Running a Sqoop job in the HUE Job Designer that invokes the command:

sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password mysql-password --table t1 --hive-import

fails, with the following error:

 Sqoop command arguments :
  import
  --connect
  jdbc:mysql://192.168.7.74:3306/test
  --username
  test
  --password
  test
  --table
  user_info
  --hive-import
  =================================================================
  
  >>> Invoking Sqoop command line now >>>
  
  3323 [main] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
  3367 [main] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version: 1.4.5-cdh5.2.0
  3389 [main] WARN  org.apache.sqoop.tool.BaseSqoopTool  - Setting your password on the command-line is insecure. Consider using -P instead.
  3390 [main] INFO  org.apache.sqoop.tool.BaseSqoopTool  - Using Hive-specific delimiters for output. You can override
  3390 [main] INFO  org.apache.sqoop.tool.BaseSqoopTool  - delimiters with --fields-terminated-by, etc.
  3431 [main] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
  3580 [main] INFO  org.apache.sqoop.manager.SqlManager  - Using default fetchSize of 1000
  3586 [main] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
  4036 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
  4084 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
  4087 [main] INFO  org.apache.sqoop.orm.CompilationManager  - HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH-5.2.0-1.cdh5.2.0.p0.36/lib/hadoop-mapreduce
  6090 [main] INFO  org.apache.sqoop.orm.CompilationManager  - Writing jar file: /tmp/sqoop-yarn/compile/f9a1056980029d03e32f75e1b231f4b5/user_info.jar
  6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - It looks like you are importing from mysql.
  6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - This transfer can be faster! Use the --direct
  6112 [main] WARN  org.apache.sqoop.manager.MySQLManager  - option to exercise a MySQL-specific fast path.
  6112 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Setting zero DATETIME behavior to convertToNull (mysql)
  6116 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of user_info
  6155 [main] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
  6839 [main] INFO  org.apache.sqoop.mapreduce.db.DBInputFormat  - Using read commited transaction isolation
  6840 [main] INFO  org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat  - BoundingValsQuery: SELECT MIN(`uid`), MAX(`uid`) FROM `user_info`
  Heart beat
  33737 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Transferred 14 bytes in 27.57 seconds (0.5078 bytes/sec)
  33747 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Retrieved 1 records.
  33770 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL statement: SELECT t.* FROM `user_info` AS t LIMIT 1
  33782 [main] INFO  org.apache.sqoop.hive.HiveImport  - Loading uploaded data into Hive
  Intercepting System.exit(1)
  
  <<< Invocation of Main class completed <<<
  
  Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
  
  Oozie Launcher failed, finishing Hadoop job gracefully
  
  Oozie Launcher, uploading action data to HDFS sequence file: hdfs://rhel072:8020/user/admin/oozie-oozi/0000003-141029091352918-oozie-oozi-W/mysqlTableData2hive--sqoop/action-data.seq
  
  Oozie Launcher ends
Analysis shows the job is interrupted while writing data into Hive. Further testing shows that reading from Hive works, but writing to Hive does not. Any pointers would be appreciated.
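
The "Intercepting System.exit(1)" line right after "Loading uploaded data into Hive" means the Hive step that Sqoop launches inside the Oozie launcher exited with status 1; the launcher output above does not include Hive's own error message. A minimal troubleshooting sketch, assuming shell access to a cluster node as the job's user (the warehouse path and the application ID placeholder below are illustrative, not taken from this job):

  # Run the same import directly on a node to rule out an Oozie/HUE environment
  # issue such as hive-site.xml not being visible to the launcher. --verbose
  # prints the Hive statements Sqoop generates; -P prompts for the password
  # instead of putting it on the command line.
  sqoop import \
    --connect jdbc:mysql://192.168.7.74:3306/test \
    --username test -P \
    --table user_info \
    --hive-import --verbose

  # Check that this user can write to the Hive warehouse directory
  # (the path below is the CDH default and may differ on your cluster).
  hdfs dfs -ls /user/hive/warehouse

  # Pull the full launcher logs to find Hive's actual error; replace the
  # placeholder with the application ID from the ResourceManager UI.
  yarn logs -applicationId <application_id>

If the direct run succeeds, the failure is usually an environment difference in the Oozie action, for example the Hive configuration (hive-site.xml) not being available to the Sqoop action; a common approach is to add it to the job's Files in the HUE Job Designer.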
