Sqoop Error Troubleshooting

 ./sqoop import --connect jdbc:mysql://localhost:3306/xxxx --username dba --password 123456 --direct --table ehm_hosts --target-dir /data/ehm_hosts -m1


The following error appeared:

 

java.net.ConnectException
MESSAGE: Connection refused

STACKTRACE:

java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
        at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
        at java.net.Socket.connect(Socket.java:529)
        at java.net.Socket.connect(Socket.java:478)
        at java.net.Socket.<init>(Socket.java:375)
        at java.net.Socket.<init>(Socket.java:218)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:173)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:267)
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2739)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1553)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:266)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:278)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:723)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)


** END NESTED EXCEPTION **



Last packet sent to the server was 22 ms ago.
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:193)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
        ... 9 more
Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception: 

** BEGIN NESTED EXCEPTION ** 

java.net.ConnectException
MESSAGE: Connection refused

STACKTRACE:

java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
        at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
        at java.net.Socket.connect(Socket.java:529)
        at java.net.Socket.connect(Socket.java:478)
        at java.net.Socket.<init>(Socket.java:375)
        at java.net.Socket.<init>(Socket.java:218)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:173)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:267)
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2739)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1553)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:266)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:278)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:723)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)


** END NESTED EXCEPTION **



Last packet sent to the server was 22 ms ago.
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2814)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1553)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:266)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:278)
        at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
        ... 10 more


 

Solution: the map tasks run on the cluster's worker nodes, so `localhost` in the JDBC URL resolves to each worker itself, not to the machine running MySQL. Change the URL to the MySQL server's actual IP address:

./sqoop import --connect jdbc:mysql://192.168.205.101:3306/xxxx --username dba --password 123456 --direct --table ehm_hosts --target-dir /data/ehm_hosts -m1


Problem solved!
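Before re-running, it can also help to confirm that MySQL is reachable from the worker nodes and is not bound to the loopback interface only. A minimal diagnostic sketch, assuming worker hostnames `slave1`/`slave2` and a config at `/etc/my.cnf` (both are assumptions for this cluster, not from the original post):

```shell
# If bind-address is 127.0.0.1, remote workers get "Connection refused"
# even though local clients on the MySQL host connect fine.
grep -n "bind-address" /etc/my.cnf

# From each worker node (hypothetical hostnames), probe the MySQL port:
for host in slave1 slave2; do
    ssh "$host" "nc -zv 192.168.205.101 3306" \
        || echo "$host cannot reach MySQL on 3306"
done
```

If the port probe fails from a worker but succeeds on the MySQL host itself, the problem is the bind address or a firewall, not the Sqoop command.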

 

Next problem:

13/06/26 00:37:18 INFO mapred.JobClient: Task Id : attempt_201306250027_0021_m_000000_0, Status : FAILED
java.io.IOException: Cannot run program "mysqldump": java.io.IOException: error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
        at java.lang.Runtime.exec(Runtime.java:593)
        at java.lang.Runtime.exec(Runtime.java:466)
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:403)
        at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:47)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
        at java.lang.ProcessImpl.start(ProcessImpl.java:65)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
        ... 12 more


Solution:

Check the MapReduce task logs:

50027_0021_m_000000_0	task_201306250027_0021_m_000000	slave1	FAILED	

java.io.IOException: Cannot run program "mysqldump": java.io.IOException: error=2, No such file or directory
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
	at java.lang.Runtime.exec(Runtime.java:593)
	at java.lang.Runtime.exec(Runtime.java:466)
	at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:403)
	at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:47)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
	at java.lang.ProcessImpl.start(ProcessImpl.java:65)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
	... 12 more


The map task that dumps the data ran on slave1, so mysqldump must be installed on slave1. After installing it, the job ran successfully:

Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: $HADOOP_HOME is deprecated.

13/06/26 00:51:23 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/06/26 00:51:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/06/26 00:51:23 INFO tool.CodeGenTool: Beginning code generation
13/06/26 00:51:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `ehm_hosts` AS t LIMIT 1
13/06/26 00:51:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `ehm_hosts` AS t LIMIT 1
13/06/26 00:51:23 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr
Note: /tmp/sqoop-hadoop/compile/a067fc87107ca67800cb30e3e4bd56f9/ehm_hosts.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/06/26 00:51:27 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/a067fc87107ca67800cb30e3e4bd56f9/ehm_hosts.jar
13/06/26 00:51:27 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
13/06/26 00:51:27 INFO mapreduce.ImportJobBase: Beginning import of ehm_hosts
13/06/26 00:51:30 INFO mapred.JobClient: Running job: job_201306250027_0023
13/06/26 00:51:31 INFO mapred.JobClient:  map 0% reduce 0%
13/06/26 00:51:51 INFO mapred.JobClient:  map 100% reduce 0%
13/06/26 00:51:56 INFO mapred.JobClient: Job complete: job_201306250027_0023
13/06/26 00:51:56 INFO mapred.JobClient: Counters: 18
13/06/26 00:51:56 INFO mapred.JobClient:   Job Counters 
13/06/26 00:51:56 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19622
13/06/26 00:51:56 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/06/26 00:51:56 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/06/26 00:51:56 INFO mapred.JobClient:     Launched map tasks=1
13/06/26 00:51:56 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/06/26 00:51:56 INFO mapred.JobClient:   File Output Format Counters 
13/06/26 00:51:56 INFO mapred.JobClient:     Bytes Written=332
13/06/26 00:51:56 INFO mapred.JobClient:   FileSystemCounters
13/06/26 00:51:56 INFO mapred.JobClient:     HDFS_BYTES_READ=87
13/06/26 00:51:56 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=31929
13/06/26 00:51:56 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=332
13/06/26 00:51:56 INFO mapred.JobClient:   File Input Format Counters 
13/06/26 00:51:56 INFO mapred.JobClient:     Bytes Read=0
13/06/26 00:51:56 INFO mapred.JobClient:   Map-Reduce Framework
13/06/26 00:51:56 INFO mapred.JobClient:     Map input records=1
13/06/26 00:51:56 INFO mapred.JobClient:     Physical memory (bytes) snapshot=62537728
13/06/26 00:51:56 INFO mapred.JobClient:     Spilled Records=0
13/06/26 00:51:56 INFO mapred.JobClient:     CPU time spent (ms)=1360
13/06/26 00:51:56 INFO mapred.JobClient:     Total committed heap usage (bytes)=16252928
13/06/26 00:51:56 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=417800192
13/06/26 00:51:56 INFO mapred.JobClient:     Map output records=3
13/06/26 00:51:56 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
13/06/26 00:51:56 INFO mapreduce.ImportJobBase: Transferred 332 bytes in 28.6117 seconds (11.6037 bytes/sec)
13/06/26 00:51:56 INFO mapreduce.ImportJobBase: Retrieved 3 records.
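The --direct fast path works by shelling out to mysqldump on whichever node runs the map task, so the MySQL client tools must exist on every worker node, not only on the machine where Sqoop is launched. A sketch of checking and installing the client across the workers (the hostnames and the RHEL/CentOS package name are assumptions; adjust for your distribution):

```shell
# mysqldump ships with the MySQL client package ("mysql" on
# RHEL/CentOS, "mysql-client" on Debian/Ubuntu).
for host in slave1 slave2; do
    ssh "$host" 'command -v mysqldump >/dev/null \
        || sudo yum install -y mysql'
done
```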


 

 

 

 

Another problem:

Note: Recompile with -Xlint:deprecation for details.
13/06/26 00:49:52 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/2a8b237421375fb4414d761cf7c7c998/ehm_hosts.jar
13/06/26 00:49:52 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
13/06/26 00:49:52 INFO mapreduce.ImportJobBase: Beginning import of ehm_hosts
13/06/26 00:49:54 INFO mapred.JobClient: Cleaning up the staging area hdfs://master:9000/tmp/hadoop-hadoop/mapred/staging/hadoop/.staging/job_201306250027_0022
13/06/26 00:49:54 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /data/ehm_hosts already exists
13/06/26 00:49:54 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory /data/ehm_hosts already exists
        at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:137)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:887)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
        at org.apache.sqoop.manager.DirectMySQLManager.importTable(DirectMySQLManager.java:92)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)


 

Solution:

The output directory was already created on HDFS by the earlier failed run, so it must be deleted before re-running:

 hadoop fs -rmr  /data/ehm_hosts
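The cleanup can be scripted, and later Sqoop 1.4.x releases can remove the stale directory automatically via --delete-target-dir (verify the flag exists in your version before relying on it):

```shell
# "hadoop fs -rmr" is the old (pre-Hadoop 2.x) form; newer releases
# use -rm -r instead:
hadoop fs -rm -r /data/ehm_hosts

# Or let Sqoop delete the stale target directory before importing:
./sqoop import --connect jdbc:mysql://192.168.205.101:3306/xxxx \
  --username dba --password 123456 --direct \
  --table ehm_hosts --target-dir /data/ehm_hosts \
  --delete-target-dir -m 1
```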


 

 

 

Sqoop is a tool for moving data within the Hadoop ecosystem: it can import tables from a relational database into Hadoop for processing, and export Hadoop data back into a relational database. When exporting data to MySQL, the following errors are common:

1. "java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/test" — Sqoop could not locate the MySQL JDBC driver. Add the --driver option to the Sqoop command to specify the driver class, and make sure the driver jar is on Sqoop's classpath.

2. "ERROR tool.ExportTool: Error during export: Export job failed!" — a generic failure during export. Inspect the Sqoop log for the exceptions raised during the export; adding the --verbose option produces more detailed output that makes the underlying cause easier to find.

3. "ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Table 'test.table_name' doesn't exist" — the table named in the export command does not exist. Check that the table name is spelled correctly and actually exists in the target database, and pass it explicitly with the --table option.

4. "ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Incorrect string value: '\xF0\x9F\x98\x82\xF0\x9F...' for column 'column_name' at row 1" — the exported data contains characters the MySQL column's character set does not support. Try the --mysql-delimiters option to use MySQL's field and line delimiters, or use --input-null-string and --input-null-non-string to control how null string and non-string values are interpreted.

5. "ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Data too long for column 'column_name'" — the exported data exceeds the MySQL column's maximum length. Map the column to a larger Java type with --map-column-java, or use --mysql-delimiters to set shorter field and line delimiters and reduce the volume of exported data.

In short, resolving Sqoop-to-MySQL export errors means reading the specific error message and adjusting the export command's options accordingly.
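For the first error above, a hedged sketch of an export command that names the JDBC driver class explicitly (the database, table, and export directory are placeholders, and the Connector/J jar must already be in Sqoop's lib/ directory):

```shell
# --driver names the JDBC driver class when Sqoop cannot auto-detect it;
# -P prompts for the password instead of putting it on the command line.
./sqoop export \
  --connect jdbc:mysql://192.168.205.101:3306/test \
  --driver com.mysql.jdbc.Driver \
  --username dba -P \
  --table table_name \
  --export-dir /data/table_name
```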