Importing Cache Data into Hive with Sqoop

1 Import Statement

sqoop import \
--connect jdbc:Cache://192.168.0.115:1972/WEBSRC \
--driver com.intersys.jdbc.CacheDriver \
--username _system \
--password dhcc \
--query 'select * from SQLUser.CT_Loc where $CONDITIONS' \
--num-mappers 1 \
--target-dir /user/dhcc/SQLUser.CT_Loc \
--hive-import \
--hive-database source_db \
--hive-table ct_loc \
--as-parquetfile
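
A note on the flags: $CONDITIONS is mandatory in a --query import; Sqoop replaces it at runtime with each mapper's split predicate, and the single quotes keep the shell from expanding it. With --num-mappers 1 no split column is needed, but to parallelize the import Sqoop additionally requires --split-by. A sketch of a parallel variant (the split column ID is an assumption; pick an evenly distributed numeric column of CT_Loc):

sqoop import \
--connect jdbc:Cache://192.168.0.115:1972/WEBSRC \
--driver com.intersys.jdbc.CacheDriver \
--username _system \
--password dhcc \
--query 'select * from SQLUser.CT_Loc where $CONDITIONS' \
--split-by ID \
--num-mappers 4 \
--target-dir /user/dhcc/SQLUser.CT_Loc \
--hive-import \
--hive-database source_db \
--hive-table ct_loc \
--as-parquetfile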

2 Error

19/03/04 15:17:16 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.intersys.jdbc.CacheDriver
java.lang.RuntimeException: Could not load db driver class: com.intersys.jdbc.CacheDriver
	at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:874)
	at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
	at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:762)
	at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:785)
	at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:288)
	at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:259)
	at org.apache.sqoop.manager.SqlManager.getColumnTypesForQuery(SqlManager.java:252)
	at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:342)
	at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
	at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)
	at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

3 Solution

Copying cachejdbc.jar into the /var/lib/sqoop/ directory lets Sqoop load the driver and connect to the Cache database.
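
For reference, the copy step is just the following (the source path of the jar is environment-specific; InterSystems ships the JDBC driver with the Cache server installation):

cp /path/to/cachejdbc.jar /var/lib/sqoop/

With the driver in place the import connected, but the next run failed with an HDFS permission error.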

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x

Either switch to the hdfs user and run the import again, or uncheck the dfs.permissions option in Cloudera Manager.
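
That checkbox maps to the dfs.permissions property in hdfs-site.xml (named dfs.permissions.enabled on newer Hadoop releases); roughly:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>

Disabling permission checks cluster-wide is a blunt instrument, though, so switching users is the safer choice: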

[root@cdh3 sqoop]# su - hdfs 
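
A third option is to keep running as root but give root a writable home directory in HDFS; a sketch, assuming sudo access to the hdfs account:

sudo -u hdfs hdfs dfs -mkdir -p /user/root
sudo -u hdfs hdfs dfs -chown root:root /user/root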

Rerunning the import then hit a new error.

Error: QueryResult : Unsupported major.minor version 52.0
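
Bytecode major version 52.0 corresponds to Java 8 (Java 7 classes are version 51.0). One way to confirm what a class was compiled for is javap; the path below is hypothetical, since Sqoop writes the generated QueryResult class to a temporary compile directory:

javap -verbose /tmp/sqoop-root/compile/<job-dir>/QueryResult.class | grep 'major version'
# "major version: 52" means the class targets Java 8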

The cause is a JDK mismatch: CDH ships with JDK 1.7, while the JDK installed system-wide is 1.8. Sqoop compiled the generated QueryResult class with the system javac (Java 8), but the MapReduce tasks run on CDH's Java 7 runtime, which cannot load Java 8 class files. Pointing JAVA_HOME at the CDH JDK makes compilation match the runtime:

find / -name java                  # locate the CDH-bundled JDK
/usr/java/jdk1.7.0_67-cloudera/bin/java
/usr/java/jdk1.7.0_67-cloudera/jre/bin/java

vi /etc/profile                    # add the export below, then reload
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
source /etc/profile
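
A quick sanity check that the right JDK is picked up (the export above only sets JAVA_HOME, so the java on the PATH may still be the system 1.8; check through $JAVA_HOME instead):

echo $JAVA_HOME
$JAVA_HOME/bin/java -version       # expect: java version "1.7.0_67"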

Running the import one more time, it finally succeeded.
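
As a final check that the data landed, a quick count from the Hive CLI (a sketch; database and table names are the ones used in the import command above):

hive -e 'select count(*) from source_db.ct_loc'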
