HiveServer2 Custom Authentication

I'm also new to Hadoop/Hive and still figuring a lot of things out; the latest issue I ran into was testing Hive's custom login authentication.

First, the setup: Hadoop 2.7.0 is deployed and my jobs already produce correct results. What's left is exposing the data, so I installed Hive 2 to pull the data out over JDBC.

That's where the authentication problem appeared. Testing with beeline gives the following error:

beeline> !connect jdbc:hive2://localhost:10000 xiaokang kangyun9413
Connecting to jdbc:hive2://localhost:10000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Error: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate xiaokang (state=,code=0)


Some searching turned up the cause: HiveServer2 is running as root here and impersonates the connecting user, so Hadoop must be told that root may proxy other users. Add the following to Hadoop's core-site.xml:

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>

Sure enough, after adding these it worked!
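One note: on a running cluster the proxyuser settings are not picked up automatically after editing core-site.xml. If you don't want a full daemon restart, the standard refresh commands should do it (assuming HDFS and YARN daemons are already up):

```shell
# Reload proxyuser (impersonation) settings on a live cluster,
# as an alternative to restarting the Hadoop daemons.
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration
```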


But the important part, the custom authentication itself, is still to come.

HiveServer2 supports several authentication modes: NONE, NOSASL, KERBEROS, LDAP, PAM, CUSTOM, and so on. Only CUSTOM fit my situation, so that's the one I looked into.

A custom setup needs two things: 1) write an authentication class, and 2) add two pieces of configuration.

The authentication class:

package hive.server2.auth;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

import javax.security.sasl.AuthenticationException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hive.service.auth.PasswdAuthenticationProvider;

public class CustomHiveServer2Auth implements PasswdAuthenticationProvider {
	@Override
	public void Authenticate(String username, String password) throws AuthenticationException {

		boolean ok = false;
		String passMd5 = new MD5().md5(password);
		HiveConf hiveConf = new HiveConf();
		Configuration conf = new Configuration(hiveConf);
		String filePath = conf.get("hive.server2.custom.authentication.file");
		System.out.println("hive.server2.custom.authentication.file [" + filePath + "] ..");
		File file = new File(filePath);
		BufferedReader reader = null;
		try {
			reader = new BufferedReader(new FileReader(file));
			String line;
			while ((line = reader.readLine()) != null) {
				// each line has the form "username,md5(password)"
				String[] fields = line.split(",", -1);
				if (fields.length != 2) {
					continue;
				}
				// credentials match
				if (fields[0].equals(username) && fields[1].equals(passMd5)) {
					ok = true;
					break;
				}
			}
		} catch (Exception e) {
			e.printStackTrace();
			throw new AuthenticationException("read auth config file error, [" + filePath + "] ..", e);
		} finally {
			if (reader != null) {
				try {
					reader.close();
				} catch (IOException e1) {
					// ignore failures while closing
				}
			}
		}
		if (ok) {
			System.out.println("user [" + username + "] auth check ok .. ");
		} else {
			System.out.println("user [" + username + "] auth check fail .. ");
			throw new AuthenticationException("user [" + username + "] auth check fail .. ");
		}
	}

	// MD5 hashing helper
	class MD5 {
		private MessageDigest digest;
		private char hexDigits[] = { '0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a', 'b', 'c', 'd', 'e', 'f' };

		public MD5() {
			try {
				digest = MessageDigest.getInstance("MD5");
			} catch (NoSuchAlgorithmException e) {
				throw new RuntimeException(e);
			}
		}

		public String md5(String str) {
			byte[] btInput = str.getBytes();
			digest.reset();
			digest.update(btInput);
			byte[] md = digest.digest();
			// convert the digest bytes to a hex string
			int j = md.length;
			char strChar[] = new char[j * 2];
			int k = 0;
			for (int i = 0; i < j; i++) {
				byte byte0 = md[i];
				strChar[k++] = hexDigits[byte0 >>> 4 & 0xf];
				strChar[k++] = hexDigits[byte0 & 0xf];
			}
			return new String(strChar);
		}
	}

}

Compile-time dependencies: hadoop-common-2.7.0.jar, hive-common-2.1.0.jar, hive-service-2.1.0.jar

Then package it as HiveServer2Auth.jar and upload it to $HIVE_HOME/lib.


2 Add the configuration

   1 Add the following to Hive's conf/hive-site.xml:

        <property>
                <name>hive.server2.authentication</name>
                <value>CUSTOM</value>
        </property>
        <property>
                <name>hive.server2.custom.authentication.class</name>
                <value>hive.server2.auth.CustomHiveServer2Auth</value>
        </property>
        <property>
                <name>hive.server2.custom.authentication.file</name>
                <value>/home/hive/apache-hive-2.1.0-bin/conf/user.password.conf</value>
        </property>

   2 Create the file /home/hive/apache-hive-2.1.0-bin/conf/user.password.conf

Format: root,cd4d65c296a84cb09e928cf9dba6f751

root is the username; what follows is the MD5 hash of the password.
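To add more users you need the MD5 hex of each password. A minimal standalone sketch that matches the hex encoding used by the MD5 class above (the `Md5Hex` class name and the sample user/password are just for illustration):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.nio.charset.StandardCharsets;

public class Md5Hex {
	// compute the lowercase hex MD5 digest of a string,
	// same output format as the MD5 inner class above
	public static String md5(String s) {
		try {
			byte[] md = MessageDigest.getInstance("MD5")
					.digest(s.getBytes(StandardCharsets.UTF_8));
			StringBuilder sb = new StringBuilder(md.length * 2);
			for (byte b : md) {
				sb.append(Character.forDigit((b >>> 4) & 0xf, 16));
				sb.append(Character.forDigit(b & 0xf, 16));
			}
			return sb.toString();
		} catch (NoSuchAlgorithmException e) {
			throw new RuntimeException(e);
		}
	}

	public static void main(String[] args) {
		// print a ready-to-paste line for user.password.conf: "user,md5"
		System.out.println("xiaokang," + md5("kangyun9413"));
	}
}
```

Run it once per user and append the printed line to user.password.conf.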

When logging in with beeline, you enter the plaintext password!


Next up: testing a query over JDBC :)
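For that, something along these lines should work; this is only a sketch (it needs hive-jdbc and its dependencies on the classpath and a running HiveServer2, and `test_table` is a placeholder name):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcTest {
	public static void main(String[] args) throws Exception {
		// register the Hive JDBC driver
		Class.forName("org.apache.hive.jdbc.HiveDriver");
		// pass the plaintext password; the server MD5-hashes it
		// and checks it against user.password.conf
		try (Connection conn = DriverManager.getConnection(
				"jdbc:hive2://localhost:10000/default", "xiaokang", "kangyun9413");
			 Statement stmt = conn.createStatement();
			 ResultSet rs = stmt.executeQuery("SELECT * FROM test_table LIMIT 10")) {
			while (rs.next()) {
				System.out.println(rs.getString(1));
			}
		}
	}
}
```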




