While importing the table table2 from the MySQL database test into Hive (database user: hive) with:

sqoop import --connect jdbc:mysql://localhost:3306/test --username hive --password ×××× --table table2 --hive-import

the following error appeared:

ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://master:9000/home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/jackson-databind-2.3.1.jar

This means the jar is missing from the HDFS filesystem.
Sqoop is installed on the master machine at /home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/. After installing and configuring Sqoop following the usual online instructions, running the import only created an empty directory /home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha on HDFS, so Sqoop failed at runtime because it could not find the jars there.
The fix is as follows. First, create the matching lib directory on HDFS:
hadoop fs -mkdir /home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib
Since there are many jars, a short Python script uploads them all (Python 2 syntax, matching the original):

import os

filepath = "/home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/"
files = os.listdir(filepath)
for jarfile in files:
    jarfile = filepath + jarfile                       # local path, reused as the HDFS destination
    order = "hadoop fs -put " + jarfile + " " + jarfile
    print order                                        # show each command before running it
    os.system(order)

Then upload the Sqoop jar itself:

hadoop fs -put /home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/sqoop-1.4.6.jar /home/hadoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/sqoop-1.4.6.jar

Problem solved.