1. Download and extract the Hive package
https://downloads.apache.org/hive/hive-3.1.2/
tar -zxvf apache-hive-3.1.2-bin.tar.gz
2. Set environment variables
vim /etc/profile
export HIVE_HOME=/george/soft/apache-hive-3.1.2-bin
export PATH=$PATH:$HIVE_HOME/bin
source /etc/profile
3. MySQL configuration
Hive's metastore keeps its metadata in a database. The default embedded Derby store can be replaced with a relational database such as MySQL, which moves metadata storage out of the Hive process so it can be shared across multiple service instances.
For installing MySQL on Ubuntu, see the separate article on CSDN.
Download the JDBC driver (Connector/J) from the MySQL website:
wget https://cdn.mysql.com//Downloads/Connector-J/mysql-connector-java-8.0.27.tar.gz
Extract the archive, then copy the MySQL JDBC driver jar into Hive's lib directory.
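The extract-and-copy step can be sketched as a small shell helper (a hedged sketch; `install_connector` is a hypothetical name, and the tarball and lib paths are passed in rather than hardcoded):

```shell
# Hypothetical helper (not from the original guide): extract a Connector/J
# tarball and copy the driver jar into Hive's lib directory.
install_connector() {
  tarball="$1"    # e.g. mysql-connector-java-8.0.27.tar.gz
  libdir="$2"     # e.g. "$HIVE_HOME/lib"
  tmpdir=$(mktemp -d)
  tar -xzf "$tarball" -C "$tmpdir"
  # The archive unpacks into a versioned directory that contains the jar.
  find "$tmpdir" -name 'mysql-connector-java-*.jar' -exec cp {} "$libdir"/ \;
  rm -rf "$tmpdir"
}

# Typical usage on the box from this guide:
# install_connector mysql-connector-java-8.0.27.tar.gz "$HIVE_HOME/lib"
```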
4. Configure the Hive files
cd /george/soft/apache-hive-3.1.2-bin/conf
Copy the template configuration files and rename them:
cp hive-env.sh.template hive-env.sh
cp hive-default.xml.template hive-site.xml
cp hive-log4j2.properties.template hive-log4j2.properties
cp hive-exec-log4j2.properties.template hive-exec-log4j2.properties
5. Configure hive-env.sh
vi hive-env.sh
# Set HADOOP_HOME to point to a specific hadoop install directory
export HADOOP_HOME=/export/servers/hadoop-3.3.1  # Hadoop installation path
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/george/soft/apache-hive-3.1.2-bin/conf  # Hive configuration directory
export HIVE_AUX_JARS_PATH=/george/soft/apache-hive-3.1.2-bin/lib  # Hive auxiliary jars directory
Configure the hive-site.xml file
First create three directories in HDFS:
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -mkdir -p /user/hive/tmp
hdfs dfs -mkdir -p /user/hive/log
Then open up their permissions:
hdfs dfs -chmod -R 777 /user/hive/warehouse
hdfs dfs -chmod -R 777 /user/hive/tmp
hdfs dfs -chmod -R 777 /user/hive/log
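These three directories can then be referenced from hive-site.xml. A hedged sketch (the property names are standard Hive settings; adjust the values to match your own paths):

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/user/hive/tmp</value>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/user/hive/log</value>
</property>
```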
6. Configure the metastore connection to MySQL in hive-site.xml
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://192.168.44.137:3306/hahive?autoReconnect=true&amp;useUnicode=true&amp;createDatabaseIfNotExist=true&amp;characterEncoding=utf8&amp;useSSL=false&amp;serverTimezone=UTC</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.cj.jdbc.Driver</value> <!-- for Connector/J 8.x; use com.mysql.jdbc.Driver only with 5.x -->
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>root</value>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>false</value>
</property>
<property>
<name>hive.server2.webui.host</name>
<!-- hostname or IP -->
<value>192.168.44.137</value>
</property>
<property>
<name>hive.server2.webui.port</name>
<value>10002</value>
</property>
<property>
<name>hive.metastore.uris</name>
<value>thrift://192.168.33.75:9083</value>
</property>
</configuration>
7. Initialize the Hive metastore schema:
schematool -dbType mysql -initSchema
8. With everything configured, start Hadoop, then start Hive
sudo ./start-all.sh  # start Hadoop (run from $HADOOP_HOME/sbin)
Start Hive:
nohup hive --service metastore &  # run the metastore service, since hive.metastore.uris is set
hive  # start the Hive CLI