Setting up a Hive environment (Ubuntu 20.04 + Java 1.8.0_281 + Hadoop 3.1.0 + Hive 3.1.2)

Hive depends on Hadoop, and Hadoop depends on Java. So to set up a Hive environment you first install Java, then Hadoop, and finally Hive.

  1. Java installation

JDK 8 download page: https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html
Download the file jdk-8u281-linux-x64.tar.gz, then:

tar -zxvf jdk-8u281-linux-x64.tar.gz
sudo mkdir /usr/java
sudo mv jdk1.8.0_281 /usr/java/
sudo vim /etc/profile.d/java.sh
# add the following two lines, then save and quit
#export JAVA_HOME=/usr/java/jdk1.8.0_281
#export PATH=$PATH:$JAVA_HOME/bin
. /etc/profile
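
To confirm in the current shell that the two variables took effect (a quick sanity check, assuming the paths above):

echo $JAVA_HOME   # should print /usr/java/jdk1.8.0_281
which java        # should resolve into $JAVA_HOME/bin, unless another JDK sits earlier on the PATH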

Verify the installation (check the Java version):

java -version

The output should look like:

java version "1.8.0_281"
Java(TM) SE Runtime Environment (build 1.8.0_281-b09)
Java HotSpot(TM) 64-Bit Server VM (build 25.281-b09, mixed mode)

  2. Hadoop installation
    Hadoop download site: https://archive.apache.org/dist/hadoop/common/
    Find the version you need and download it, e.g.: https://archive.apache.org/dist/hadoop/common/hadoop-3.1.0/
tar -zxvf hadoop-3.1.0.tar.gz
sudo mkdir /usr/hadoop
sudo mv hadoop-3.1.0 /usr/hadoop/
sudo vim /etc/profile.d/hadoop.sh
# add the following two lines, then save and quit
#export HADOOP_HOME=/usr/hadoop/hadoop-3.1.0
#export PATH=$PATH:$HADOOP_HOME/bin
. /etc/profile
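
If hadoop commands later complain that JAVA_HOME is not set, a common extra step (optional; assuming the install paths above) is to also set it in Hadoop's own environment file:

sudo vim /usr/hadoop/hadoop-3.1.0/etc/hadoop/hadoop-env.sh
# add (or uncomment) the following line
#export JAVA_HOME=/usr/java/jdk1.8.0_281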

Verify the installation:
a. Check the Hadoop version

hadoop version

You should get output like:

Hadoop 3.1.0
Source code repository https://github.com/apache/hadoop -r 16b70619a24cdcf5d3b0fcf4b58ca77238ccbe6d
Compiled by centos on 2018-03-30T00:00Z
Compiled with protoc 2.5.0
From source with checksum 14182d20c972b3e2105580a1ad6990
This command was run using /usr/hadoop/hadoop-3.1.0/share/hadoop/common/hadoop-common-3.1.0.jar

b. Test Hadoop with WordCount

mkdir wc-in
echo "bla bla" > wc-in/a.txt
echo "bla wa wa " > wc-in/b.txt
hadoop jar /usr/hadoop/hadoop-3.1.0/share/hadoop/mapreduce/sources/hadoop-mapreduce-examples-3.1.0-sources.jar org.apache.hadoop.examples.WordCount  wc-in wc-out
cat wc-out/*
#bla     3
#wa      2
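
The output directory must not already exist when the job runs, and the WordCount class is also contained in the regular (non-sources) examples jar, so an equivalent invocation (assuming the same install path) is:

rm -r wc-out   # MapReduce refuses to write into an existing output directory
hadoop jar /usr/hadoop/hadoop-3.1.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.1.0.jar wordcount wc-in wc-out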

  3. Hive installation
    Hive mirror list: http://www.apache.org/dyn/closer.cgi/hive/
    Pick any mirror and download version 3.1.2, e.g.: https://mirror.bit.edu.cn/apache/hive/hive-3.1.2/
tar -zxvf apache-hive-3.1.2-bin.tar.gz 
sudo mkdir /usr/hive
sudo mv apache-hive-3.1.2-bin /usr/hive/
sudo vim /etc/profile.d/hive.sh
# add the following two lines, then save and quit
#export HIVE_HOME=/usr/hive/apache-hive-3.1.2-bin
#export PATH=$PATH:$HIVE_HOME/bin
. /etc/profile
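
Before starting the CLI, a quick check that the hive launcher is on the PATH and can locate Hadoop:

hive --version
# should report Hive 3.1.2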

Verify that Hive is installed correctly:

cd $HIVE_HOME
hive

Run the following operations (basically just creating and dropping a table); you should get similar results:

root@mylaptop:/usr/hive/apache-hive-3.1.2-bin$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hive/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/hadoop-3.1.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 6210aa3c-a0ff-4a26-bce3-a581ea2a2793
Logging initialized using configuration in jar:file:/usr/hive/apache-hive-3.1.2-bin/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
Hive Session ID = f172faa6-202f-4ec1-89a5-9a913139084a
hive> show tables;
OK
Time taken: 2.13 seconds
hive> create table x (a int);
OK
Time taken: 1.843 seconds
hive> show tables;
OK
x
Time taken: 0.083 seconds, Fetched: 1 row(s)
hive> drop table x;
OK
Time taken: 2.339 seconds
hive> show tables;
OK
Time taken: 0.037 seconds
hive> exit;
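
Beyond creating and dropping an empty table, a slightly deeper sanity check (a minimal sketch; the table name t is arbitrary) is to insert a row and read it back, which exercises an actual execution job:

hive> create table t (a int);
hive> insert into t values (1);
hive> select * from t;
hive> drop table t;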

Possible problems:

1) Running show tables in hive fails with: FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Reference: https://www.cnblogs.com/lfri/p/13099126.html

Solution:

cd $HIVE_HOME
rm -rf metastore_db
schematool -initSchema -dbType derby
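
Optionally, schematool can also report the metastore schema version afterwards to confirm the state (run it from the same directory, since the embedded Derby database is created in the working directory):

schematool -dbType derby -info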

If you get the following messages, the initialization succeeded:

Initialization script completed
schemaTool completed

2) Creating a table in hive fails:
CREATE TABLE x (a INT);
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/user/hive/warehouse/x is not a directory or unable to create one)

Solution:
Create the directory /user/hive/ directly and give all users read and write permissions:

sudo mkdir /user/hive 
sudo chmod -R 777 /user/hive
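
This works because Hive in this setup writes its warehouse to the local filesystem (note the file:/user/hive/warehouse prefix in the error message). If Hive is later configured to store data in HDFS instead, the equivalent fix would be to create the warehouse directory there, for example:

hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod -R 777 /user/hive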