Sqoop table import error

3. Copy the table structure from the relational database into Hive. Only the structure is copied; the table's rows are not.
   sqoop create-hive-table --connect jdbc:mysql://node06:3306/gygh --table fact --username gygh --password gygh123 --hive-table fact 
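
To confirm the structure was copied, you can describe the new table from the Hive CLI (a minimal check; this assumes the hive command is on the PATH and that fact landed in the default Hive database):

   hive -e "DESCRIBE fact;"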



4. Import data from the relational database into Hive (the Hive table must already exist).

   sqoop import --connect jdbc:mysql://node06:3306/gygh --username gygh --password gygh123 --table fact --hive-import --hive-table fact -m 1

Running the import above fails with the following error:

ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@5ba3f27a is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@5ba3f27a is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
        at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
        at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
        at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
        at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
        at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:305)
        at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
        at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
        at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:328)
        at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1853)
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1653)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
18/07/02 11:01:04 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
        at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1659)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:243)


Solution: add --driver com.mysql.jdbc.Driver to both commands. When --driver is given explicitly (and no --connection-manager), Sqoop falls back to its generic JDBC connection manager instead of the MySQL-specific one, which avoids the streaming result set that the MySQL manager leaves open while reading column metadata.

3. Copy the table structure from the relational database into Hive (structure only, no data), this time with the driver specified:
   sqoop create-hive-table --connect jdbc:mysql://node06:3306/gygh --table fact --username gygh --password gygh123 --hive-table fact --driver com.mysql.jdbc.Driver
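
With --driver set but no --connection-manager, Sqoop prints a warning confirming the fallback, something like the line below (exact wording varies by Sqoop version):

   WARN sqoop.ConnFactory: Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.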




4. Import data from the relational database into Hive (the Hive table must already exist), this time with the driver specified:
   sqoop import --connect jdbc:mysql://node06:3306/gygh --username gygh --password gygh123 --table fact --hive-import --hive-table fact -m 1 --driver com.mysql.jdbc.Driver
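
As a quick sanity check after the import (again assuming fact is in the default Hive database), you can count the rows in Hive and compare against the source table in MySQL:

   hive -e "SELECT COUNT(*) FROM fact;"
   mysql -h node06 -u gygh -pgygh123 -e "SELECT COUNT(*) FROM gygh.fact;"

If adding --driver alone does not resolve the error, upgrading the mysql-connector-java JAR under $SQOOP_HOME/lib to a newer 5.1.x release has also been reported to fix this streaming-result-set issue, as older connector versions had bugs in this area.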
