Using Sqoop to export Hive table data into a MySQL table: resolving the errors

[hadoop@Master bin]$ sqoop export --connect jdbc:mysql://localhost:3306/test --username dyh --password 000000 --table users --export-dir /user/hive/warehouse/users/part-m-00000 --input-fields-terminated-by '\0001'
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: $HADOOP_HOME is deprecated.

13/12/12 19:50:38 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/12/12 19:50:38 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/12/12 19:50:38 INFO tool.CodeGenTool: Beginning code generation
13/12/12 19:50:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `users` AS t LIMIT 1
13/12/12 19:50:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `users` AS t LIMIT 1
13/12/12 19:50:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hadoop
Note: /tmp/sqoop-hadoop/compile/9731783979d46a3414a9f86d700bec33/users.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/12/12 19:50:39 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/9731783979d46a3414a9f86d700bec33/users.jar
13/12/12 19:50:39 INFO mapreduce.ExportJobBase: Beginning export of users
13/12/12 19:50:41 INFO input.FileInputFormat: Total input paths to process : 1
13/12/12 19:50:41 INFO input.FileInputFormat: Total input paths to process : 1
13/12/12 19:50:41 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/12/12 19:50:41 WARN snappy.LoadSnappy: Snappy native library not loaded
13/12/12 19:50:42 INFO mapred.JobClient: Running job: job_201312051716_0034
13/12/12 19:50:43 INFO mapred.JobClient:  map 0% reduce 0%
13/12/12 19:50:51 INFO mapred.JobClient:  map 25% reduce 0%
13/12/12 19:50:53 INFO mapred.JobClient:  map 50% reduce 0%
13/12/12 19:50:58 INFO mapred.JobClient: Task Id : attempt_201312051716_0034_m_000002_0, Status : FAILED
java.io.IOException: java.sql.SQLException: Access denied for user 'dyh'@'localhost' (using password: YES)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.sql.SQLException: Access denied for user 'dyh'@'localhost' (using password: YES)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1078)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4190)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4122)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:927)
        at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1709)
        at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1252)
        at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2486)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2519)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2304)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:834)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:416)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:346)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
        ... 8 more

13/12/12 19:50:58 INFO mapred.JobClient: Task Id : attempt_201312051716_0034_m_000003_0, Status : FAILED
java.io.IOException: java.sql.SQLException: Access denied for user 'dyh'@'localhost' (using password: YES)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.sql.SQLException: Access denied for user 'dyh'@'localhost' (using password: YES)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1078)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4190)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4122)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:927)
        at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1709)
        at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1252)
        at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2486)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2519)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2304)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:834)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:416)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:346)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
        ... 8 more

13/12/12 19:51:01 INFO mapred.JobClient:  map 75% reduce 0%
13/12/12 19:51:06 INFO mapred.JobClient: Task Id : attempt_201312051716_0034_m_000003_1, Status : FAILED
java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1136)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1121)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:357)
        at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2482)
        at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2519)
        at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2304)
        at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:834)
        at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
        at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:416)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:346)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
        ... 8 more
Caused by: java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
        at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
        at java.net.Socket.connect(Socket.java:519)
        at java.net.Socket.connect(Socket.java:469)
        at java.net.Socket.<init>(Socket.java:366)
        at java.net.Socket.<init>(Socket.java:209)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:259)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:307)
        ... 26 more

13/12/12 19:51:10 INFO mapred.JobClient:  map 100% reduce 0%
13/12/12 19:51:11 INFO mapred.JobClient: Job complete: job_201312051716_0034
13/12/12 19:51:11 INFO mapred.JobClient: Counters: 19
13/12/12 19:51:11 INFO mapred.JobClient:   Job Counters
13/12/12 19:51:11 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=42036
13/12/12 19:51:11 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/12/12 19:51:11 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/12/12 19:51:11 INFO mapred.JobClient:     Rack-local map tasks=6
13/12/12 19:51:11 INFO mapred.JobClient:     Launched map tasks=7
13/12/12 19:51:11 INFO mapred.JobClient:     Data-local map tasks=1
13/12/12 19:51:11 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/12/12 19:51:11 INFO mapred.JobClient:   File Output Format Counters
13/12/12 19:51:11 INFO mapred.JobClient:     Bytes Written=0
13/12/12 19:51:11 INFO mapred.JobClient:   FileSystemCounters
13/12/12 19:51:11 INFO mapred.JobClient:     HDFS_BYTES_READ=759
13/12/12 19:51:11 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=129632
13/12/12 19:51:11 INFO mapred.JobClient:   File Input Format Counters
13/12/12 19:51:11 INFO mapred.JobClient:     Bytes Read=0
13/12/12 19:51:11 INFO mapred.JobClient:   Map-Reduce Framework
13/12/12 19:51:11 INFO mapred.JobClient:     Map input records=3
13/12/12 19:51:11 INFO mapred.JobClient:     Physical memory (bytes) snapshot=327516160
13/12/12 19:51:11 INFO mapred.JobClient:     Spilled Records=0
13/12/12 19:51:11 INFO mapred.JobClient:     CPU time spent (ms)=5180
13/12/12 19:51:11 INFO mapred.JobClient:     Total committed heap usage (bytes)=127205376
13/12/12 19:51:11 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=2333941760
13/12/12 19:51:11 INFO mapred.JobClient:     Map output records=3
13/12/12 19:51:11 INFO mapred.JobClient:     SPLIT_RAW_BYTES=661
13/12/12 19:51:11 INFO mapreduce.ExportJobBase: Transferred 759 bytes in 30.8765 seconds (24.5818 bytes/sec)
13/12/12 19:51:11 INFO mapreduce.ExportJobBase: Exported 3 records.


Problem description:

While exporting data from Hive to MySQL, the errors above are reported, yet the final result is correct: all of the data lands in the MySQL table as expected. The exceptions are simply annoying.

Note that the same command (shown above) sometimes runs without the following error:

"java.io.IOException: java.sql.SQLException: Access denied for user 'dyh'@'localhost' (using password: YES)"

The error comes and goes. On closer analysis: the command was run without the "-m" option, so the job defaults to 4 map tasks. Any map task that does not run on the same machine as the MySQL server fails, because the connect URL points at localhost. Since the data set is tiny (only a few rows), the single map task that happens to run on the database host can export everything by itself, which is why errors are reported but the result is still correct.
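
For reference, the analysis above can be made concrete by setting the map-task count explicitly with -m. This is an illustration only, not the fix: with -m 1 only one task opens a JDBC connection, and it still fails unless that task happens to run on the MySQL host, because the URL still says localhost. (-P is used here instead of --password, as the warning in the log above suggests.)

```
# Illustration only: the same export with an explicit number of map tasks (-m).
# The default of 4 tasks is what produced the intermittent failures above.
sqoop export \
  --connect jdbc:mysql://localhost:3306/test \
  --username dyh -P \
  --table users \
  --export-dir /user/hive/warehouse/users/part-m-00000 \
  --input-fields-terminated-by '\0001' \
  -m 1
```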


Solution:

Change localhost in jdbc:mysql://localhost:3306/test to the actual IP address of the MySQL server, and the errors go away.
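
A minimal sketch of the corrected command, assuming 192.168.1.100 is the address of the MySQL server (a placeholder; substitute the real IP of the Master node):

```
# Placeholder address: replace 192.168.1.100 with the actual IP of the MySQL host.
sqoop export \
  --connect jdbc:mysql://192.168.1.100:3306/test \
  --username dyh -P \
  --table users \
  --export-dir /user/hive/warehouse/users/part-m-00000 \
  --input-fields-terminated-by '\0001'
```

With a real address in the URL, every map task connects to the same MySQL instance, so the dyh account must also be allowed to connect from the other cluster nodes. If it is not, a grant along these lines (MySQL 5.x syntax; the '%' host pattern and privilege scope are assumptions to tighten as needed) opens access:

```
# Run on the MySQL host; adjust the host pattern ('%') and privileges to your policy.
mysql -u root -p -e "GRANT ALL PRIVILEGES ON test.* TO 'dyh'@'%' IDENTIFIED BY '000000'; FLUSH PRIVILEGES;"
```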



To import MySQL data into Hive with Sqoop, follow these steps:

1. Make sure Sqoop and Hive are installed.

2. Create a database in Hive to hold the imported data.

3. Create a directory in HDFS to hold the imported data.

4. Use Sqoop's import command to load the MySQL data into HDFS:

```
sqoop import \
  --connect jdbc:mysql://mysql-server:3306/mydatabase \
  --username mysqluser \
  --password mysqlpassword \
  --table mytable \
  --target-dir /user/hive/warehouse/mydatabase.db/mytable \
  --fields-terminated-by ',' \
  --hive-import \
  --hive-table mydatabase.mytable
```

Here `jdbc:mysql://mysql-server:3306/mydatabase` is the MySQL connection URL, `mysqluser` and `mysqlpassword` are the MySQL username and password, `mytable` is the table to import, `/user/hive/warehouse/mydatabase.db/mytable` is the HDFS directory the data is written to, `--fields-terminated-by ','` sets the field delimiter to a comma, `--hive-import` tells Sqoop to load the data into Hive, and `--hive-table mydatabase.mytable` names the target Hive table and the database it lives in. Note that with `--hive-import` Sqoop creates and loads the Hive table itself; the next two steps are only needed when importing into a plain HDFS directory without that option.

5. Create an external table in Hive that maps the data in the HDFS directory to a Hive table:

```
CREATE EXTERNAL TABLE mydatabase.mytable (
  column1 datatype1,
  column2 datatype2,
  ...
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse/mydatabase.db/mytable';
```

Note that the table structure must match the structure of the MySQL table.

6. If the data should additionally be copied into a separate managed Hive table, use Hive's INSERT ... SELECT, reading from the external table defined in step 5:

```
-- Placeholder names: the managed target table must differ from the external table
-- in step 5, otherwise the statement would read from and write to the same table.
INSERT INTO TABLE mydatabase.mytable_managed
SELECT * FROM mydatabase.mytable;
```

This completes importing the MySQL data into Hive.
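
After the import finishes, a quick sanity check from the command line (using the placeholder table name from the steps above) confirms that the rows arrived in Hive:

```
# Placeholder table name; hive -e runs a single HiveQL statement and exits.
hive -e "SELECT COUNT(*) FROM mydatabase.mytable;"
```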