Part II: Installing Hive (with MySQL as the Metastore)

======I. Installing Hive: Prerequisites======

1. First, install the Hadoop environment described earlier; Hive runs on top of Hadoop.

2. Install MySQL to store Hive's metadata. By default the metadata is kept in Derby, which supports only a single connection and is intended for testing; use MySQL in real environments.

3. The installation environment is CentOS 6.5, host IP 192.168.0.12.

======II. Installing MySQL to Store the Hive Metadata======

yum install -y mysql-server

service mysqld start

mysql -uroot -p

create database hive;

update mysql.user set password=PASSWORD ('root') where User='root';

flush privileges;
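The steps above reuse MySQL's root account for the metastore. A common alternative is a dedicated account; the `hive`/`hive` credentials below are illustrative, not part of the original setup:

```sql
-- Hypothetical: a dedicated metastore account instead of reusing root.
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
```

If you create this account, point javax.jdo.option.ConnectionUserName and ConnectionPassword in hive-site.xml at it instead of root/root.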

======III. Installing Hive======

Hive needs a Java environment, which the Hadoop setup above already provides.

cd /data/hadoop

wget -c http://114.242.101.2:808/hive/apache-hive-2.3.2-bin.tar.gz

tar xf apache-hive-2.3.2-bin.tar.gz

mv apache-hive-2.3.2-bin hive

chown -R hadoop:hadoop hive

Set the Hive environment variables (the Hadoop variables were set earlier):

vim /etc/profile

#hive

export HIVE_HOME=/data/hadoop/hive

export PATH=$HIVE_HOME/bin:$PATH
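The effect of the two exports can be checked once the profile is sourced. The snippet below sketches the check against a stub directory so that it is self-contained; on the real host, HIVE_HOME is /data/hadoop/hive:

```shell
# Stub HIVE_HOME purely for illustration; the real value is /data/hadoop/hive.
HIVE_HOME="$(mktemp -d)"
mkdir -p "$HIVE_HOME/bin"
printf '#!/bin/sh\necho hive-stub\n' > "$HIVE_HOME/bin/hive"
chmod +x "$HIVE_HOME/bin/hive"
export HIVE_HOME
export PATH="$HIVE_HOME/bin:$PATH"

# These checks apply verbatim after `source /etc/profile` on the real host:
test -n "$HIVE_HOME" && echo "HIVE_HOME is set"
command -v hive            # should resolve to $HIVE_HOME/bin/hive
```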

source /etc/profile

======IV. Modifying the Hive Configuration File======

su - hadoop

cd /data/hadoop/hive/conf

mv hive-default.xml.template hive-site.xml

Clear everything between the <configuration> and </configuration> tags in the file and add the following properties:

<property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
</property>

<property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
</property>

<property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
</property>

<property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
</property>

Copy the MySQL JDBC driver jar into Hive's lib directory:

cd /data/hadoop/hive/lib/

wget -c http://114.242.101.2:808/hive/mysql-connector-java-5.1.44-bin.jar
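Hive picks up every jar in $HIVE_HOME/lib at startup, so the only requirement is that the connector jar ends up there. A small sanity check (the touch stands in for the wget above, so the snippet runs anywhere):

```shell
# Stub layout for illustration; on the real host HIVE_HOME=/data/hadoop/hive
# and the jar comes from the wget above.
HIVE_HOME="$(mktemp -d)"
mkdir -p "$HIVE_HOME/lib"
touch "$HIVE_HOME/lib/mysql-connector-java-5.1.44-bin.jar"

if ls "$HIVE_HOME"/lib/mysql-connector-java-*.jar >/dev/null 2>&1; then
    echo "MySQL JDBC driver present"
else
    echo "driver missing" >&2
    exit 1
fi
```

Without the jar, schematool and the Hive CLI typically abort with a ClassNotFoundException for com.mysql.jdbc.Driver.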

======V. Hive's Default Storage Path on HDFS======

From the official documentation:

Hive uses Hadoop, so:

you must have Hadoop in your path OR

export HADOOP_HOME=

In addition, you must use below HDFS commands to create

/tmp and /user/hive/warehouse (aka hive.metastore.warehouse.dir)

and set them chmod g+w before you can create a table in Hive.

su - hadoop

cd /data/hadoop/hadoop-2.7.4

./bin/hadoop fs -mkdir       /tmp

./bin/hadoop fs -mkdir  -p   /user/hive/warehouse

./bin/hadoop fs -chmod g+w   /tmp

./bin/hadoop fs -chmod g+w   /user/hive/warehouse
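The g+w bit matters because Hive creates a subdirectory under the warehouse path for every database and table it manages. The same permission pattern, demonstrated on a local directory since the commands above need a running HDFS:

```shell
# Local illustration of the warehouse permissions; the real commands above
# operate on HDFS paths instead.
WAREHOUSE="$(mktemp -d)/user/hive/warehouse"
mkdir -p "$WAREHOUSE"
chmod g+w "$WAREHOUSE"

# The group-write bit (6th character) should now be 'w', e.g. drwxrwxr-x.
ls -ld "$WAREHOUSE"
```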

======VI. Running Hive======

If output like the following appears, Hive is running:

[hadoop@localhost hadoop]$ hive

which: no hbase in (/data/hadoop/hadoop-2.7.4/bin:/data/hadoop/hive/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/bin:/home/hadoop/bin)

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/data/hadoop/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/data/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/data/hadoop/hive/lib/hive-common-2.3.2.jar!/hive-log4j2.properties Async: true

Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.

hive>


======VII. Initializing the Hive Metastore Schema======

su - hadoop

schematool -initSchema -dbType mysql
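If initialization succeeds, it can be verified in two ways. These commands need the running MySQL instance from section II, so they are shown for reference only:

```shell
schematool -info -dbType mysql      # reports the metastore connection URL and schema version
mysql -uroot -proot -e 'USE hive; SHOW TABLES;'   # metastore tables such as DBS, TBLS, COLUMNS_V2
```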

======VIII. Running Hive Commands======

hive> CREATE TABLE pokes (foo INT, bar STRING);

hive> CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (ds STRING);

hive> SHOW TABLES;

hive> SHOW TABLES '.*s';

hive> DESCRIBE invites;

hive> ALTER TABLE events RENAME TO 3koobecaf;

hive> ALTER TABLE pokes ADD COLUMNS (new_col INT);

hive> ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');

hive> ALTER TABLE invites REPLACE COLUMNS (foo INT, bar STRING, baz INT COMMENT 'baz replaces new_col2');

hive> ALTER TABLE invites REPLACE COLUMNS (foo INT COMMENT 'only keep the first column');

hive> DROP TABLE pokes;
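For completeness, the tables can also be populated before the final DROP TABLE. The sample key/value files ship in the Hive distribution's examples/files directory, and the paths below are relative to where the CLI was started (this is the same LOAD syntax as the official Getting Started guide):

```sql
LOAD DATA LOCAL INPATH './examples/files/kv1.txt' OVERWRITE INTO TABLE pokes;
LOAD DATA LOCAL INPATH './examples/files/kv2.txt' OVERWRITE INTO TABLE invites PARTITION (ds='2008-08-15');
SELECT COUNT(*) FROM pokes;
```

OVERWRITE replaces any existing data in the table; omitting it appends instead.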
