Hive has no username by default: setting a connection username and password for Hive (Java/HiveServer2)

This post shows how to configure custom username/password authentication in Hive: editing `hive-site.xml`, implementing a `CustomPasswdAuthenticator` class to validate passwords, and packaging that class into a jar under Hive's lib directory. It also covers the Java connection setup, an HDFS permission problem encountered along the way and its fix, and finally verifies the whole setup through Beeline.

To set a connection username and password for Hive, follow these steps:

In hive-site.xml, change hive.server2.authentication (default NONE) to CUSTOM:

<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
  <description>
    Expects one of [nosasl, none, ldap, kerberos, pam, custom].
    Client authentication types.
      NONE: no authentication check
      LDAP: LDAP/AD based authentication
      KERBEROS: Kerberos/GSSAPI authentication
      CUSTOM: Custom authentication provider
              (Use with property hive.server2.custom.authentication.class)
      PAM: Pluggable authentication module
      NOSASL: Raw transport
  </description>
</property>

Then configure the custom authentication class, also in hive-site.xml:

<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticator</value>
</property>

The custom authenticator class:

package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.conf.HiveConf;
import org.slf4j.Logger;

public class CustomPasswdAuthenticator implements org.apache.hive.service.auth.PasswdAuthenticationProvider {

    private static final Logger LOG = org.slf4j.LoggerFactory.getLogger(CustomPasswdAuthenticator.class);

    // Per-user password entries in hive-site.xml take the form hive.jdbc_passwd.auth.<username>
    private static final String HIVE_JDBC_PASSWD_AUTH_PREFIX = "hive.jdbc_passwd.auth.%s";

    private Configuration conf = null;

    @Override
    public void Authenticate(String userName, String passwd) throws AuthenticationException {
        LOG.info("user: " + userName + " try login.");
        String passwdConf = getConf().get(String.format(HIVE_JDBC_PASSWD_AUTH_PREFIX, userName));
        if (passwdConf == null) {
            String message = "user's ACL configuration is not found. user: " + userName;
            LOG.info(message);
            throw new AuthenticationException(message);
        }
        if (!passwd.equals(passwdConf)) {
            String message = "user name and password do not match. user: " + userName;
            throw new AuthenticationException(message);
        }
    }

    public Configuration getConf() {
        if (conf == null) {
            this.conf = new Configuration(new HiveConf());
        }
        return conf;
    }

    public void setConf(Configuration conf) {
        this.conf = conf;
    }
}
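The core check in Authenticate can be exercised without a running HiveServer2 by letting a plain Map stand in for the Hadoop Configuration. This is only a sketch to illustrate the key format; the AuthSketch class and its map contents are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Standalone sketch of the password lookup used by CustomPasswdAuthenticator.
// The Map stands in for the Hadoop Configuration; keys follow the
// hive.jdbc_passwd.auth.<username> pattern configured in hive-site.xml.
public class AuthSketch {
    private static final String KEY_PREFIX = "hive.jdbc_passwd.auth.%s";

    static boolean authenticate(Map<String, String> conf, String user, String passwd) {
        String expected = conf.get(String.format(KEY_PREFIX, user));
        // An unknown user (no ACL entry) and a wrong password both fail.
        return expected != null && expected.equals(passwd);
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        conf.put("hive.jdbc_passwd.auth.weisongyi", "123456789");
        System.out.println(authenticate(conf, "weisongyi", "123456789")); // true
        System.out.println(authenticate(conf, "weisongyi", "wrong"));     // false
        System.out.println(authenticate(conf, "nobody", "123456789"));    // false
    }
}
```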

Package this class into a jar and put it in the hive/lib directory; otherwise HiveServer2 fails with a ClassNotFoundException. To demonstrate, I deliberately misspelled the class name in the configuration:

2017-05-22T09:42:20,241 ERROR [HiveServer2-Handler-Pool: Thread-41] server.TThreadPoolServer: Error occurred during processing of message.

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticatve not found

at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)

at org.apache.hive.service.auth.CustomAuthenticationProviderImpl.<init>(CustomAuthenticationProviderImpl.java:40)

at org.apache.hive.service.auth.AuthenticationProviderFactory.getAuthenticationProvider(AuthenticationProviderFactory.java:70)

at org.apache.hive.service.auth.AuthenticationProviderFactory.getAuthenticationProvider(AuthenticationProviderFactory.java:61)

at org.apache.hive.service.auth.PlainSaslHelper$PlainServerCallbackHandler.handle(PlainSaslHelper.java:106)

at org.apache.hive.service.auth.PlainSaslServer.evaluateResponse(PlainSaslServer.java:103)

at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:539)

at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:283)

at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)

at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)

at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

at java.lang.Thread.run(Thread.java:745)

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.auth.CustomPasswdAuthenticatve not found

at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)

at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)

... 13 more

Configure the user's password in hive-site.xml. Here the username is weisongyi and the password is 123456789; if there are multiple users, add one property per user:

<property>
  <name>hive.jdbc_passwd.auth.weisongyi</name>
  <value>123456789</value>
</property>
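Note that the Java and Beeline examples below log in as zhangsan, so that account needs its own entry as well. A sketch with a second, hypothetical zhangsan account (the password value here is assumed, not from the original configuration):

```xml
<property>
  <name>hive.jdbc_passwd.auth.weisongyi</name>
  <value>123456789</value>
</property>
<property>
  <name>hive.jdbc_passwd.auth.zhangsan</name>
  <value>123456789</value>
</property>
```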

Java connection code:

package com.hive;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class HiveManage {

    private static final String URLHIVE = "jdbc:hive2://192.168.184.130:10000/default";

    private static Connection connection = null;

    public static Connection getHiveConnection() {
        if (null == connection) {
            synchronized (HiveManage.class) {
                if (null == connection) {
                    try {
                        Class.forName("org.apache.hive.jdbc.HiveDriver");
                        connection = DriverManager.getConnection(URLHIVE, "zhangsan", "123456789");
                    } catch (SQLException e) {
                        e.printStackTrace();
                    } catch (ClassNotFoundException e) {
                        e.printStackTrace();
                    }
                }
            }
        }
        return connection;
    }

    public static void main(String[] args) throws SQLException {
        // String sql = "select ipaddress,count(ipaddress) as count from apachelog "
        //         + "group by ipaddress order by count desc";
        String sql1 = "select ipaddress, t_user, request, agent from apachelog limit 5";
        PreparedStatement pstm = getHiveConnection().prepareStatement(sql1);
        // The SQL was already bound in prepareStatement, so call the no-arg executeQuery().
        ResultSet rs = pstm.executeQuery();
        while (rs.next()) {
            System.out.println(rs.getString(1) + " " + rs.getString(2)
                    + " " + rs.getString(3) + " " + rs.getString(4));
        }
        // Close the ResultSet before the statement that produced it.
        rs.close();
        pstm.close();
    }
}
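One caveat about the code above: for double-checked locking to be safe under the Java memory model, the lazily initialized field should be declared volatile, or another thread may observe a partially constructed object. A minimal sketch of the corrected pattern, with a placeholder Resource class standing in for the Hive Connection (the names here are illustrative, not from the original post):

```java
// Minimal double-checked locking sketch; Resource stands in for the Hive Connection.
public class LazyHolder {
    static class Resource {}

    // volatile ensures other threads see a fully constructed instance.
    private static volatile Resource instance = null;

    public static Resource getInstance() {
        if (instance == null) {
            synchronized (LazyHolder.class) {
                if (instance == null) {
                    instance = new Resource();
                }
            }
        }
        return instance;
    }

    public static void main(String[] args) {
        // Repeated calls return the same cached instance.
        System.out.println(getInstance() == getInstance()); // true
    }
}
```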

If you see the following error, the connecting user lacks permission on the HDFS directory:

java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=zhangsan, access=EXECUTE, inode="/tmp/hive":root:supergroup:drwx------

at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)

at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)

at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)

at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)

at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1728)

at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:108)

at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3857)

at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1012)

at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:843)

at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)

at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)

at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)

at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:415)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)

at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

Go into the hadoop/bin directory and run:

[root@master bin]# ./hdfs dfs -chmod -R 777 /tmp

The Eclipse console then shows the query results (original screenshot not reproduced).

If the username or password is wrong, the Hive log shows exactly the message printed by our custom class (original screenshot not reproduced).

Connecting through Beeline:

beeline> !connect jdbc:hive2://localhost:10000/default

Connecting to jdbc:hive2://localhost:10000/default

Enter username for jdbc:hive2://localhost:10000/default: zhangsan

Enter password for jdbc:hive2://localhost:10000/default: *********

Connected to: Apache Hive (version 2.1.1)

Driver: Hive JDBC (version 2.1.1)

17/05/22 10:50:45 [main]: WARN jdbc.HiveConnection: Request to set autoCommit to false; Hive does not support autoCommit=false.

Transaction isolation: TRANSACTION_REPEATABLE_READ
