Hive: configuring the MySQL metastore connection, a username/password for Java JDBC connections, and a custom class to validate the username and password

Setting up the metastore database connection

With the default Derby database there is no username or password, so the connection credentials can be left empty.

The configuration below connects to a MySQL database instead.

Configure hive-site.xml:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>mysql</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
  <description>password to use against metastore database</description>
</property>


The MySQL driver jar, mysql-connector-java-5.1.30.jar, needs to be placed in Hive's lib directory ($HIVE_HOME/lib).
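
Before starting Hive, it can be worth confirming that the URL, driver, username and password above actually work against MySQL. Below is a minimal, hypothetical sketch (the class name is illustrative) that reuses exactly the values from hive-site.xml and assumes mysql-connector-java-5.1.30.jar is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Illustrative helper, not part of Hive: checks the metastore MySQL credentials
public class MetastoreConnectionCheck {
    public static void main(String[] args) throws ClassNotFoundException, SQLException {
        // Same driver, URL, user and password as configured in hive-site.xml above
        Class.forName("com.mysql.jdbc.Driver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true",
                "mysql", "123456")) {
            System.out.println("MySQL metastore connection OK: " + !conn.isClosed());
        }
    }
}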

Then start Hive:

It errors out because the metastore schema scripts have not been run against MySQL yet; the schema needs to be initialized first.

Logging initialized using configuration in file:/usr/tools/hive/conf/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591)
	at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql))
	at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
	at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)

Run:

[root@master bin]# ./schematool -initSchema -dbType mysql
which: no hbase in (/usr/tools/hadoop-2.7.3/bin/:/usr/java/jdk1.7.0_79/bin/:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/tools/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/tools/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:	 jdbc:mysql://120.55.190.57:3306/hive?characterEncoding=UTF-8
Metastore Connection Driver :	 com.mysql.jdbc.Driver
Metastore connection User:	 mysql
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Initialization script completed
schemaTool completed
[root@master bin]# 
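
Optionally, you can double-check from Java that the schema really was created: schematool records the version it installed in a VERSION table inside the metastore database. A small sketch (the class name is illustrative; it assumes the same MySQL credentials as above and the standard metastore schema created by hive-schema-2.1.0.mysql.sql):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Illustrative helper: reads the schema version that schematool wrote into MySQL
public class MetastoreSchemaCheck {
    public static void main(String[] args) throws Exception {
        Class.forName("com.mysql.jdbc.Driver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/hive", "mysql", "123456");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT SCHEMA_VERSION FROM VERSION")) {
            while (rs.next()) {
                System.out.println("metastore schema version: " + rs.getString(1));
            }
        }
    }
}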

With initialization complete, Hive can be started in CLI mode. No errors are reported and the hive> prompt appears, so startup succeeded.

[root@master bin]# ./hive
which: no hbase in (/usr/tools/hadoop-2.7.3/bin/:/usr/java/jdk1.7.0_79/bin/:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/tools/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/tools/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in file:/usr/tools/hive/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases.
hive> 

Create an apachelog table, parsed according to a regular expression:

CREATE TABLE apachelog (
  ipaddress STRING, identity STRING, t_user STRING, time STRING,
  request STRING, protocol STRING, status STRING, size STRING,
  referer STRING, agent STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*) ([^ ]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\".*\") ([^ \"]*|\".*\"))?",
  "output.format.string" = "%1$s %2$s %3$s %4$s %5$s %6$s %7$s %8$s %9$s %10$s")
STORED AS TEXTFILE;

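Before loading real data it can help to sanity-check input.regex directly in Java, since the contrib RegexSerDe uses java.util.regex underneath. The sketch below (the class name is illustrative) uses the same string literal as the HiveQL above, which escapes backslashes and quotes the same way; the sample line is taken from the query output further below, with the user agent shortened:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustrative helper: tests the SerDe's input.regex against one access-log line
public class LogRegexCheck {
    // Same pattern literal as in the SERDEPROPERTIES above
    private static final String LOG_REGEX =
            "([^ ]*) ([^ ]*) ([^ ]*) (-|\\[[^\\]]*\\]) ([^ \"]*|\"[^\"]*) ([^ ]*\") (-|[0-9]*) (-|[0-9]*)(?: ([^ \"]*|\".*\") ([^ \"]*|\".*\"))?";

    public static void main(String[] args) {
        String sample = "58.60.168.164 - - [07/Jan/2016:09:09:43 +0800] "
                + "\"GET / HTTP/1.1\" 200 11250 \"-\" \"Mozilla/5.0 (Windows NT 6.1; Win64; x64)\"";
        Matcher m = Pattern.compile(LOG_REGEX).matcher(sample);
        if (m.matches()) {
            // Groups 1..10 map to the columns: ipaddress, identity, t_user, time,
            // request, protocol, status, size, referer, agent
            for (int i = 1; i <= m.groupCount(); i++) {
                System.out.println(i + " -> " + m.group(i));
            }
        } else {
            System.out.println("line does not match input.regex");
        }
    }
}
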

Take an nginx access.log file and load it into the apachelog table (for more detailed steps see another post: http://blog.csdn.net/lr131425/article/details/72457202):

load data local inpath '/usr/toos/access.log' into table apachelog; 


Query a few rows to check that the load succeeded.

hive> 
    > select * from apachelog limit 5;
OK
58.60.168.164	-	-	[07/Jan/2016:09:09:43 +0800]	"GET /	HTTP/1.1"	200	11250	"-"	"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-	-	[07/Jan/2016:09:09:43 +0800]	"GET /tomcat.css	HTTP/1.1"	304	0	"http://120.55.190.57/"	"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-	-	[07/Jan/2016:09:09:43 +0800]	"GET /tomcat.png	HTTP/1.1"	304	0	"http://120.55.190.57/"	"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-	-	[07/Jan/2016:09:09:44 +0800]	"GET /	HTTP/1.1"	200	11250	"-"	"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-	-	[07/Jan/2016:09:09:44 +0800]	"GET /tomcat.css	HTTP/1.1"	304	0	"http://120.55.190.57/"	"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
Time taken: 1.903 seconds, Fetched: 5 row(s)
hive> 

Exit the CLI:

hive> exit;


Start Hive as a service instead: launch hiveserver2, then connect to Hive from Java code.

[root@master bin]# ./hiveserver2 
which: no hbase in (/usr/tools/hadoop-2.7.3/bin/:/usr/java/jdk1.7.0_79/bin/:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/tools/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/tools/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
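
By default HiveServer2 accepts JDBC/Thrift clients on port 10000 (hive.server2.thrift.port). Before writing any JDBC code, a plain socket probe is enough to confirm the service is reachable; in the sketch below (the class name is illustrative) the host is taken from the JDBC URL used by the client further down:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Illustrative helper: checks that something is listening on the HiveServer2 port
public class HiveServer2PortCheck {
    public static void main(String[] args) {
        String host = "192.168.184.130";  // host from the JDBC URL used below
        int port = 10000;                 // default hive.server2.thrift.port
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), 3000);
            System.out.println("HiveServer2 is reachable on " + host + ":" + port);
        } catch (IOException e) {
            System.out.println("Cannot reach HiveServer2: " + e.getMessage());
        }
    }
}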





Setting a username and password for Hive connections, using CUSTOM authentication mode

Configure hive-site.xml:

<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>


Configure the custom authentication class:

<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>

Configure the user's password in hive-site.xml: username zhangsan, password 123456789.

<property>
    <name>hive.jdbc_passwd.auth.zhangsan</name>
    <value>123456789</value>
    <description/>
</property>

The implementation of the custom CustomPasswdAuthenticator class is shown below. Compile it, package it into a jar, and place the jar on HiveServer2's classpath (for example under $HIVE_HOME/lib), then restart hiveserver2 so the class can be loaded.

package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

    private static final Logger LOG = LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

    // Passwords are looked up from hive-site.xml under hive.jdbc_passwd.auth.<username>
    private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

    private Configuration conf = null;

    @Override
    public void Authenticate(String userName, String passwd) throws AuthenticationException {
        LOG.info("user: " + userName + " try login.");
        String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
        if (passwdConf == null) {
            String message = "user's ACL configuration is not found. user: " + userName;
            LOG.info(message);
            throw new AuthenticationException(message);
        }
        if (!passwd.equals(passwdConf)) {
            String message = "username and password do not match. user: " + userName;
            throw new AuthenticationException(message);
        }
    }

    public Configuration getConf() {
        if (conf == null) {
            // Load hive-site.xml so the hive.jdbc_passwd.auth.* properties are visible
            this.conf = new Configuration(new HiveConf());
        }
        return conf;
    }

    public void setConf(Configuration conf) {
        this.conf = conf;
    }
}
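
The authenticator can also be exercised locally before deploying it; a rough sketch (the test class is hypothetical, and it assumes the Hive jars plus a hive-site.xml containing the hive.jdbc_passwd.auth.zhangsan property are on the classpath, so HiveConf can see it):

package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;

// Hypothetical smoke test for the authenticator above
public class CustomPasswdAuthenticatorTest {
    public static void main(String[] args) throws AuthenticationException {
        CustomPasswdAuthenticator auth = new CustomPasswdAuthenticator();
        // Succeeds when the password matches hive.jdbc_passwd.auth.zhangsan
        auth.Authenticate("zhangsan", "123456789");
        System.out.println("zhangsan authenticated");
        try {
            auth.Authenticate("zhangsan", "wrong-password");
        } catch (AuthenticationException expected) {
            System.out.println("rejected as expected: " + expected.getMessage());
        }
    }
}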


The Java client below connects to HiveServer2 over JDBC using the username (zhangsan) and password configured above:

package com.hive;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HiveManage {

    // HiveServer2 JDBC URL
    private static final String URLHIVE = "jdbc:hive2://192.168.184.130:10000/default";

    // volatile is required for the double-checked locking below to be safe
    private static volatile Connection connection = null;

    public static Connection getHiveConnection() {
        if (null == connection) {
            synchronized (HiveManage.class) {
                if (null == connection) {
                    try {
                        Class.forName("org.apache.hive.jdbc.HiveDriver");
                        // Username and password as configured in hive-site.xml
                        connection = DriverManager.getConnection(URLHIVE, "zhangsan", "123456789");
                    } catch (SQLException e) {
                        e.printStackTrace();
                    } catch (ClassNotFoundException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        return connection;
    }

    public static void main(String[] args) throws SQLException {

//        String sql = "select ipaddress,count(ipaddress) as count from apachelog "
//                + "group by ipaddress order by count desc";
        String sql1 = "select ipaddress, t_user, request, agent from apachelog limit 5";
        PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
        // The statement already carries the SQL, so call executeQuery() with no argument
        ResultSet rs = pstm.executeQuery();

        while (rs.next()) {
            System.out.println(rs.getString(1) + "\t" + rs.getString(2)
                    + "\t\t" + rs.getString(3) + "\t\t" + rs.getString(4));
        }
        rs.close();
        pstm.close();
    }
}


Run output:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
58.60.168.164	-		"GET /		"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-		"GET /tomcat.css		"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-		"GET /tomcat.png		"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-		"GET /		"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"
58.60.168.164	-		"GET /tomcat.css		"Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36"