1. Delete a Job
sqoop job \
--delete testjob
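To confirm the delete took effect, you can list the jobs saved in the metastore with `sqoop job --list`. A minimal sketch (`testjob` is the job name used throughout this article; the block skips itself when sqoop is not on the PATH):

```shell
# List saved Sqoop jobs; after --delete, "testjob" should no longer appear.
if command -v sqoop >/dev/null 2>&1; then
  sqoop job --list
else
  echo "sqoop not on PATH - skipping"
fi
```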
2. Create a Job

Note: when creating the job we use the --password-file parameter rather than --password. The main reason is that running a job with --password prints a warning and prompts for the database password every time the job executes; with --password-file, the job runs without prompting for the database password.

Note: there must be a space between "import" and the preceding "--".
sqoop job \
--create testjob \
-- \
import \
--connect jdbc:mysql://192.168.137.130:3306/jepsondb \
--username root \
--password-file /input/sqoop.pwd \
--table abc \
--target-dir /input/abc \
--delete-target-dir \
-m 1
3. Execute the Job
sqoop job \
--exec testjob
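The Sqoop job tool also lets you override a saved job's arguments for a single run by appending options after a `--` separator. A sketch (the `-m 2` override is purely illustrative; the block skips itself when sqoop is not on the PATH):

```shell
# Execute the saved job, overriding the saved mapper count for this run only.
if command -v sqoop >/dev/null 2>&1; then
  sqoop job --exec testjob -- -m 2
else
  echo "sqoop not on PATH - skipping"
fi
```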
4. Create the password file needed by the --password-file parameter (note: Sqoop requires the password file to be placed on HDFS, with permission 400)

1) Commands:
echo -n "123456" > sqoop.pwd                # -n: no trailing newline
hdfs dfs -rm sqoop.pwd /input/sqoop.pwd     # remove any old copies from HDFS
hdfs dfs -put sqoop.pwd /input
hdfs dfs -chmod 400 /input/sqoop.pwd
hdfs dfs -ls /input
-r--------   1 hadoop supergroup          6 2018-01-15 18:38 /input/sqoop.pwd
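The -n flag on echo matters: without it the file gains a trailing newline, which then becomes part of the password that Sqoop reads. A quick local check of the byte counts (the 6 matches the file size shown by hdfs dfs -ls above):

```shell
# echo appends a newline; echo -n does not.
echo    "123456" > with_newline.pwd   # 7 bytes: "123456\n"
echo -n "123456" > no_newline.pwd     # 6 bytes: "123456"
wc -c < with_newline.pwd              # 7
wc -c < no_newline.pwd                # 6
```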
2) Check whether the Sqoop configuration file (sqoop-site.xml) contains the following property, with its value set to true:
<property>
  <name>sqoop.metastore.client.record.password</name>
  <value>true</value>
  <description>If true, allow saved passwords in the metastore.
  </description>
</property>
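To verify the property on your own installation you can grep sqoop-site.xml (in practice something like `grep -A 1 'sqoop.metastore.client.record.password' $SQOOP_HOME/conf/sqoop-site.xml`; that path is an assumption). The sketch below writes a sample fragment to a temp file only so that the command is self-contained:

```shell
# Self-contained demo: grep the property out of a sample sqoop-site.xml fragment.
cat > /tmp/sqoop-site-sample.xml <<'EOF'
<property>
  <name>sqoop.metastore.client.record.password</name>
  <value>true</value>
</property>
EOF
grep -A 1 'sqoop.metastore.client.record.password' /tmp/sqoop-site-sample.xml
```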
Note: if the password file is in the wrong format, the error message is as follows:
Warning: /app/sqoop-1.4.6-cdh5.7.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /app/sqoop-1.4.6-cdh5.7.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /app/sqoop-1.4.6-cdh5.7.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /app/sqoop-1.4.6-cdh5.7.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
18/01/15 18:33:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
18/01/15 18:33:53 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/01/15 18:33:53 INFO tool.CodeGenTool: Beginning code generation
18/01/15 18:33:54 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'root'@'spark220' (using password: YES)
java.sql.SQLException: Access denied for user 'root'@'spark220' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:965)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3973)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3909)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1710)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1226)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2188)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2219)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2014)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:776)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:386)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:330)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
    at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1846)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1646)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
18/01/15 18:33:54 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1652)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
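The "Access denied" above is typically what a malformed password file produces, for example one written with a plain echo and therefore carrying a trailing newline. Before re-uploading, it can help to sanity-check the local file; the same ideas apply on HDFS via `hdfs dfs -ls` or `hdfs dfs -stat`. A sketch (`stat -c` is the GNU coreutils form):

```shell
# Recreate and verify the password file locally before putting it on HDFS.
rm -f sqoop.pwd                 # a leftover 400-mode file would block the redirect
echo -n "123456" > sqoop.pwd
chmod 400 sqoop.pwd
wc -c < sqoop.pwd               # 6 bytes: no trailing newline
stat -c %a sqoop.pwd            # 400
```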
For the fix, see section 4 above.

Ruoze Big Data discussion QQ group: 671914634
From "ITPUB Blog", link: http://blog.itpub.net/31511218/viewspace-2150175/. Please credit the source when reposting; otherwise legal liability may be pursued.