Errors creating a database in Hive (beginner's notes)


I recently decided to brush up on Hive; in the year since my postgraduate entrance exams I had barely touched big data. Naturally, it refused to go smoothly: I started Hive, typed "create database test;", and hit Enter.

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.lang.IllegalArgumentException: java.net.UnknownHostException: server.node2)

My hostname isn't server.node2, so where did that come from? Then I remembered that I had copied the Hadoop installation and configuration straight over from another machine, which is why hostname resolution broke. I opened core-site.xml, and sure enough:

<configuration>
        <!-- The file system schema (URI) Hadoop uses, i.e. the address of the HDFS NameNode -->
        <property>
                <name>fs.defaultFS</name>
                <value>hdfs://server.node2:9000</value>
        </property>
        <!-- Directory where Hadoop stores the files it generates at runtime -->
        <property>
                <name>hadoop.tmp.dir</name>
                <value>/home/hadoop/hadoop-2.4.1/tmp</value>
        </property>
</configuration>
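A quick way to make sure no other config file still references the stale hostname is to grep the whole config directory (a sketch, assuming HADOOP_HOME points at the install directory shown in hadoop.tmp.dir above):

# Search every Hadoop config file for the stale hostname
grep -r "server.node2" $HADOOP_HOME/etc/hadoop/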

So I quickly changed server.node2 to the machine's actual hostname (the hostname also needs an IP mapping in the hosts file).
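For reference, a minimal sketch of the two pieces, assuming the hostname is mini1 (which matches the error message below; the IP is made up):

In /etc/hosts:

192.168.33.101   mini1

And in core-site.xml:

<value>hdfs://mini1:9000</value>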
Running Hive again and creating the database, it failed with a new error:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.net.ConnectException Call From mini1/127.0.0.1 to server.mini1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused)

Why would the connection be refused? Creating a database in Hive has to create a directory under the warehouse path on HDFS, so HDFS must actually be running. I went to start HDFS, and it threw the following errors:

warning:: ssh: Could not resolve hostname warning:: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
HotSpot(TM): ssh: Could not resolve hostname HotSpot(TM): Name or service not known
library: ssh: Could not resolve hostname library: Name or service not known
You: ssh: Could not resolve hostname You: No address associated with hostname
VM: ssh: Could not resolve hostname VM: Name or service not known
loaded: ssh: Could not resolve hostname loaded: Name or service not known
The: ssh: Could not resolve hostname The: Name or service not known
stack: ssh: Could not resolve hostname stack: Name or service not known
which: ssh: Could not resolve hostname which: Name or service not known
Client: ssh: Could not resolve hostname Client: Name or service not known
have: ssh: Could not resolve hostname have: Name or service not known
guard.: ssh: Could not resolve hostname guard.: Name or service not known
fix: ssh: Could not resolve hostname fix: Name or service not known
VM: ssh: Could not resolve hostname VM: Name or service not known
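Looking closely, those unresolvable "hostnames" (warning, You, have, loaded, library, stack, guard., fix, VM...) are just the words of the JVM's native-library warning. Reassembled, the original message was presumably something like this (the library path is a placeholder, not from the original log):

Java HotSpot(TM) Client VM warning: You have loaded library <path-to-libhadoop.so> which might have disabled stack guard. The VM will try to fix the stack guard now.

The HDFS start script apparently splits that warning on whitespace and tries to ssh to each word as if it were a slave hostname, hence the flood of errors.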

A web search turned up the fix: add the following to /etc/profile:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

Those two lines can be copied and pasted as-is, with no modification needed. They point the JVM at Hadoop's native libraries, which seems to silence the stack-guard warning above and, with it, the bogus ssh attempts; I suspect the warning itself comes down to how the native library interacts with this particular Linux build. In any case, this part was solved.
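To make the new variables take effect in the current shell (a sketch; hadoop checknative ships with Hadoop 2.4+ builds, as far as I know):

# Reload the profile so the exports apply to this shell
source /etc/profile

# Optional sanity check: does the native hadoop library load now?
hadoop checknative -a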
With that, HDFS started up normally; I launched Hive again, and creating the database finally worked.
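For completeness, the final round trip looks roughly like this (a sketch; the process list is what a typical pseudo-distributed setup shows):

# Bring up HDFS and confirm the daemons are running
start-dfs.sh
jps    # expect NameNode, DataNode, SecondaryNameNode

# Back in Hive, the DDL now succeeds
hive -e "create database test; show databases;"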
