Sqoop import from Oracle to Hive hangs at hive.HiveImport: Connecting to jdbc:hive2

Environment:

HDP-3.1.4

The ojdbc8.jar JDBC driver has already been downloaded and placed in /usr/hdp/current/sqoop-client/lib/.

When Sqoop reads data from Oracle and imports it into Hive, the job hangs at INFO hive.HiveImport: Connecting to jdbc:hive2:// and never proceeds. The log stops at the point shown below:

20/10/19 05:16:56 WARN hive.TableDefWriter: Column SJYRQ had to be cast to a less precise type in Hive

20/10/19 05:16:56 WARN hive.TableDefWriter: Column TYSJ had to be cast to a less precise type in Hive

20/10/19 05:16:56 WARN hive.TableDefWriter: Column HFSJ had to be cast to a less precise type in Hive

20/10/19 05:16:56 INFO hive.HiveImport: Loading uploaded data into Hive

20/10/19 05:16:57 INFO hive.HiveImport: SLF4J: Class path contains multiple SLF4J bindings.

20/10/19 05:16:57 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.4.0-315/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

20/10/19 05:16:57 INFO hive.HiveImport: SLF4J: Found binding in [jar:file:/usr/hdp/3.1.4.0-315/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]

20/10/19 05:16:57 INFO hive.HiveImport: SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

20/10/19 05:16:57 INFO hive.HiveImport: SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

20/10/19 05:16:59 INFO hive.HiveImport: Connecting to jdbc:hive2://node93.prpq:2181,node94.prpq:2181,node92.prpq:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2
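For reference, a Sqoop Hive import along the following lines triggers this hang. The connection string, credentials, and table names below are placeholders for illustration, not the actual values from this environment:

```shell
# Hypothetical reproduction: Oracle -> Hive import via Sqoop.
# Connection string, credentials, and table names are placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
  --username scott \
  --password tiger \
  --table SOME_TABLE \
  --hive-import \
  --hive-table default.some_table \
  -m 1
# On HDP 3, the Hive load step is performed through beeline, and the job
# stalls at "Connecting to jdbc:hive2://..." as shown in the log above.
```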

Solution 1:

On the node where the sqoop command runs, create the configuration file /etc/hive/conf/beeline-hs2-connection.xml with the following content. (In HDP 3, Sqoop's Hive import goes through beeline; the hang typically occurs because beeline waits for credentials that the non-interactive Sqoop process never supplies, so providing them in configuration unblocks it.)

<configuration>
  <property>
    <name>beeline.hs2.connection.user</name>
    <value>hdfs</value>
  </property>
  <property>
    <name>beeline.hs2.connection.password</name>
    <value>hdfs</value>
  </property>
</configuration>

Since we run the sqoop import as the hdfs account, both the user and the password are set to hdfs. The advantage over Solution 2 is that this file is not overwritten if the Hive configuration changes.
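With the file in place, you can confirm that beeline now connects without prompting for credentials. The ZooKeeper quorum URL below is the one from the log above; adjust it for your own cluster:

```shell
# Should connect non-interactively and list databases, instead of
# blocking on a username/password prompt.
beeline \
  -u "jdbc:hive2://node93.prpq:2181,node94.prpq:2181,node92.prpq:2181/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2" \
  -e "SHOW DATABASES;"
```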

Solution 2:

Following the HiveServer2 Clients documentation, edit /etc/hive/3.1.4.0-315/0/beeline-site.xml. The original entry is:

<property>
  <name>beeline.hs2.jdbc.url.container</name>
  <value>jdbc:hive2://cdh-m1:2181,cdh-n1:2181,cdh-n2:2181/;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2</value>
</property>

Change it to include the username and password:

<property>
  <name>beeline.hs2.jdbc.url.container</name>
  <value>jdbc:hive2://cdh-m1:2181,cdh-n1:2181,cdh-n2:2181/;user=hdfs;password=hdfs;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2</value>
</property>

After this change, sqoop import runs to completion.
