Hadoop must be installed beforehand.
1. Download Hive
https://mirrors.bfsu.edu.cn/apache/hive/
From this mirror, download:
apache-hive-2.3.8-bin.tar.gz
2. Extract the archive
tar -xzvf apache-hive-2.3.8-bin.tar.gz
3. Configure environment variables
vi /etc/profile
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64
export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_HOME=/home/hadoop-2.10.1
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HIVE_HOME=/home/apache-hive-2.3.8-bin
export PATH=$PATH:$HIVE_HOME/bin
source /etc/profile
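After re-sourcing the profile, it is worth confirming that the new directories actually ended up on PATH. A minimal sketch; the on_path helper is illustrative, not part of Hive or Hadoop:

```shell
# Illustrative helper: report whether a directory is on the current PATH.
on_path() {
  case ":$PATH:" in
    *":$1:"*) echo "on PATH: $1" ;;
    *)        echo "missing from PATH: $1" ;;
  esac
}

on_path "/usr/bin"   # sanity check against a directory that is always present
```

After source /etc/profile, run on_path "$HIVE_HOME/bin" (and likewise for the two Hadoop bin directories); any "missing from PATH" line means the corresponding export did not take effect.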
4. Start Hive
Initialize the metastore schema (embedded Derby here):
schematool -dbType derby -initSchema
Start Hive:
hive
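One Derby-specific detail worth knowing: with the embedded Derby metastore used above, schematool creates a metastore_db directory in the current working directory, and hive resolves it relative to wherever it is launched. A minimal pre-flight check, assuming the default embedded-Derby setup:

```shell
# With embedded Derby, the metastore lives in ./metastore_db relative to
# the directory where schematool was run; start hive from that same
# directory, or it will create (or fail against) a fresh schema instead.
if [ -d metastore_db ]; then
  echo "metastore found in $PWD"
else
  echo "no metastore here; run schematool (or cd to where you ran it) first"
fi
```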
Notes:
1. If running show databases; after starting Hive fails with a safe-mode error,
you can run the following (on newer Hadoop versions, hdfs dfsadmin -safemode leave is the preferred form):
hadoop dfsadmin -safemode leave
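HDFS normally leaves safe mode on its own once enough block reports have arrived, so forcing it out is mainly appropriate on small single-node setups. A hedged sketch of checking the state before forcing anything; the helper name and the mocked status string are illustrative, and the real status comes from hdfs dfsadmin -safemode get:

```shell
# Illustrative helper: given the output of `hdfs dfsadmin -safemode get`
# (normally "Safe mode is ON" or "Safe mode is OFF"), print the command
# to run, or report that nothing needs to be done.
leave_if_safemode() {
  case "$1" in
    *"Safe mode is ON"*) echo "run: hdfs dfsadmin -safemode leave" ;;
    *)                   echo "already out of safe mode" ;;
  esac
}

leave_if_safemode "Safe mode is ON"
```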
2. If you get Error: FUNCTION 'NUCLEUS_ASCII' already exists.
find the affected file(s) in the following directory and comment out the 'NUCLEUS_ASCII' statement:
/home/apache-hive-2.3.8-bin/scripts/metastore/upgrade/derby
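Rather than editing each file by hand, the offending lines can be SQL-commented out with sed. A sketch, demonstrated on a throwaway stand-in file so nothing real is touched; the stand-in content is simplified, not the exact statement from the Hive scripts. Point the same sed at the .sql files under the directory above after backing them up:

```shell
# Stand-in for one of the Derby upgrade scripts (simplified content).
printf '%s\n' \
  'CREATE FUNCTION NUCLEUS_ASCII(C CHAR(1)) RETURNS INTEGER;' \
  'CREATE TABLE DEMO (ID INT);' > demo.sql

# Prefix every line mentioning NUCLEUS_ASCII with the SQL comment marker "--".
sed -i 's/^\(.*NUCLEUS_ASCII.*\)$/-- \1/' demo.sql

cat demo.sql
```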