Hive 2.1.0 Installation and Environment Configuration

1: Installation environment
    hadoop2.6.4
    hive2.1.0
    mysql5.5.46
    JDBC driver package: mysql-connector-java-5.1.38.tar.gz

2: Install MySQL
  
  
huang@ubuntu:~$ sudo apt-get install mysql-server
    After the installation finishes, change the password of the MySQL root user to 123.
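    On Ubuntu, apt-get usually prompts for this root password during installation; if it did not, one way to set it afterwards is with mysqladmin (a minimal sketch, using the same password 123 as the rest of this guide):

huang@ubuntu:~$ mysqladmin -u root password 123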
    Log in to the database:
   
   
huang@ubuntu:~$ mysql -u root -p
    Once inside the MySQL shell, create the hive database:
  
  
mysql> create database hive;


3: Install Hive
    First download apache-hive-2.1.0-bin.tar.gz from the official website and put it in Ubuntu's Downloads directory.
    Go into the Downloads directory and extract the archive to /usr/local:
  
  
huang@ubuntu:~/Downloads$ tar -zxvf apache-hive-2.1.0-bin.tar.gz -C /usr/local
    Go to /usr/local and rename the folder apache-hive-2.1.0-bin to hive (easier to work with):
  
  
huang@ubuntu:/usr/local$ mv apache-hive-2.1.0-bin hive
    Configure the Hive environment variables (this step can be skipped):
   
   
huang@ubuntu:~$ sudo vim /etc/profile
     Add the following lines:
   
   
##hive
export HIVE_HOME=/usr/local/hive
export PATH=${PATH}:${HIVE_HOME}/bin
export CLASSPATH=${CLASSPATH}:${HIVE_HOME}/lib
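    After saving, reload the profile so the new variables take effect in the current shell, and check that HIVE_HOME points at the right place:

huang@ubuntu:~$ source /etc/profile
huang@ubuntu:~$ echo $HIVE_HOME
/usr/local/hive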
    Configure the hive-env.sh file.
          Go to the directory /usr/local/hive/conf (Hive's configuration directory).
          Copy hive-env.sh.template, rename the copy to hive-env.sh, and edit it:
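          For example:

huang@ubuntu:/usr/local/hive/conf$ cp hive-env.sh.template hive-env.sh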
   
   
huang@ubuntu:/usr/local/hive/conf$ vim hive-env.sh
           Set HADOOP_HOME to HADOOP_HOME=/usr/local/hadoop-2.6.4
           Set HIVE_CONF_DIR to HIVE_CONF_DIR=/usr/local/hive/conf
           Then save and exit.
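           Note that these lines are commented out in the template; with the directory layout used in this guide, the uncommented lines in hive-env.sh should end up as:

HADOOP_HOME=/usr/local/hadoop-2.6.4
HIVE_CONF_DIR=/usr/local/hive/conf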
  Configure the hive-site.xml file.
          Go to the directory /usr/local/hive/conf.
          Copy hive-default.xml.template, rename the copy to hive-site.xml, and edit it:
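          For example:

huang@ubuntu:/usr/local/hive/conf$ cp hive-default.xml.template hive-site.xml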
        Four settings need to be changed, as follows:
         (1) The JDBC connection URL of the metastore database (host and port):


<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive</value>
  <description>
    JDBC connect string for a JDBC metastore.
    To use SSL to encrypt/authenticate the connection, provide database-specific SSL flag in the connection URL.
    For example, jdbc:postgresql://myhost/db?ssl=true for postgres database.
  </description>
</property>
           (2) The JDBC driver class name:


<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
         (3) The connection user name:


<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>Username to use against metastore database</description>
</property>
         (4) The connection password:


<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123</value>
  <description>password to use against metastore database</description>
</property>
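          One more step before initializing the metastore: Hive must be able to load the com.mysql.jdbc.Driver class configured above, so the MySQL JDBC driver listed in section 1 has to be placed under Hive's lib directory. A minimal sketch, assuming the tarball sits in Downloads and contains mysql-connector-java-5.1.38-bin.jar (the usual layout of that release):

huang@ubuntu:~/Downloads$ tar -zxvf mysql-connector-java-5.1.38.tar.gz
huang@ubuntu:~/Downloads$ cp mysql-connector-java-5.1.38/mysql-connector-java-5.1.38-bin.jar /usr/local/hive/lib/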

          After saving and exiting, run the following command:
   
   
schematool -dbType mysql -initSchema
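         If the initialization succeeds, schematool should report that it completed, and the metastore tables will have been created in the hive database; this can be checked from the MySQL shell:

mysql> use hive;
mysql> show tables;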
         Then type hive to check whether it runs correctly (the Hadoop services must be started first).
         However, it failed with the following error:
    
    
huang@ubuntu:/usr/local/hive/conf$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.6.4/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
 
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at org.apache.hadoop.fs.Path.initialize(Path.java:206)
at org.apache.hadoop.fs.Path.<init>(Path.java:172)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:631)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:550)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
at java.net.URI.checkPath(URI.java:1823)
at java.net.URI.<init>(URI.java:745)
at org.apache.hadoop.fs.Path.initialize(Path.java:203)
... 12 more
            The error points at ${system:java.io.tmpdir}, so a new folder named iotmp was created under the hive directory:
    
    
huang@ubuntu:/usr/local/hive$ mkdir iotmp
             Edit the hive-site.xml file again.
             Replace every occurrence of ${system:java.io.tmpdir} in it with the path of the newly created folder, /usr/local/hive/iotmp (a one-liner for this is sketched below).
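             One way to do the replacement in bulk is a sed one-liner (a sketch; editing the file by hand in vim works just as well):

huang@ubuntu:/usr/local/hive/conf$ sed -i 's|${system:java.io.tmpdir}|/usr/local/hive/iotmp|g' hive-site.xml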
             After the change, type hive again (the Hadoop services must be started first):
    
    
huang@ubuntu:/usr/local/hive/conf$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.6.4/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
 
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
         Hive starts successfully, which completes the Hive installation and configuration.
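         As a quick sanity check, a simple statement can be run at the new prompt; it should list at least the built-in default database:

hive> show databases;
OK
default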

    
    
    


    