By default, if nothing is configured, no authentication is performed: when connecting to the Thrift Server with beeline, any username/password combination will authenticate successfully.
The steps below configure CUSTOM authentication:
1. Create a new project and add spark-assembly-1.3.0-hadoop2.4.0.jar to its classpath.
You need to implement the PasswdAuthenticationProvider interface; the code is as follows:
package org.apache.hadoop.hive.contrib.auth;

import javax.security.sasl.AuthenticationException;

import org.apache.hadoop.conf.Configurable;
import org.apache.hadoop.conf.Configuration;
import org.apache.hive.service.auth.PasswdAuthenticationProvider;

public class Authticator implements PasswdAuthenticationProvider, Configurable {

    private Configuration conf = null;
    // Passwords are looked up under keys of the form "org.spark.auth.<username>".
    private String prefix = "org.spark.auth.";

    @Override
    public Configuration getConf() {
        // Lazily create a Configuration so that hive-site.xml entries are loaded.
        if (this.conf == null) {
            this.conf = new Configuration();
        }
        return this.conf;
    }

    @Override
    public void setConf(Configuration conf) {
        this.conf = conf;
    }

    @Override
    public void Authenticate(String username, String password) throws AuthenticationException {
        if (username == null) {
            throw new AuthenticationException("Login fail");
        }
        // Use getConf() rather than this.conf directly, so a missing setConf()
        // call cannot cause a NullPointerException.
        String realPass = getConf().get(prefix + username);
        if (realPass == null || password == null || !realPass.equals(password)) {
            throw new AuthenticationException("Login fail");
        }
    }
}
What the provider reads here is actually content configured in hive-site.xml; that stored value is compared against the submitted password. (Properly, you should store an MD5 digest and compare digests rather than plaintext; I skipped that here for simplicity.)
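The MD5 variant mentioned above could look like the following sketch. The `Md5Check` class and its `md5Hex`/`passwordMatches` helpers are hypothetical names, not part of the original code; they use only the JDK's MessageDigest:

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Check {

    // Hex-encode the MD5 digest of a string (lower-case, zero-padded to 32 chars).
    static String md5Hex(String s) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(s.getBytes(StandardCharsets.UTF_8));
            return String.format("%032x", new BigInteger(1, digest));
        } catch (NoSuchAlgorithmException e) {
            // MD5 is always available in the JDK.
            throw new RuntimeException(e);
        }
    }

    // Compare the stored digest against the digest of the submitted password.
    static boolean passwordMatches(String storedMd5, String submitted) {
        return storedMd5 != null && submitted != null
                && storedMd5.equals(md5Hex(submitted));
    }

    public static void main(String[] args) {
        System.out.println(md5Hex("abc")); // 900150983cd24fb0d6963f7d28e17f72
    }
}
```

With this approach, hive-site.xml would hold the digest instead of the plaintext, and `Authenticate` would call `passwordMatches(realPass, password)`.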
2. Configuration in hive-site.xml:
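The original snippet is missing here. Based on the code above, the configuration would plausibly look like this (the password value is a placeholder, not from the original post):

```xml
<property>
  <name>hive.server2.authentication</name>
  <value>CUSTOM</value>
</property>
<property>
  <name>hive.server2.custom.authentication.class</name>
  <value>org.apache.hadoop.hive.contrib.auth.Authticator</value>
</property>
<!-- Password entry read by the provider: "org.spark.auth." + username -->
<property>
  <name>org.spark.auth.pijing</name>
  <value>your-password-here</value>
</property>
```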
3. Start the Thrift Server:
./start-thriftserver.sh --master local --driver-class-path /home/pijing/spark/auxlib/postgresql-9.4-1201.jdbc41.jar --jars /home/pijing/spark/lib/Authticator.jar
4. At this point, you must log in with the username "pijing" and the password configured for "pijing".
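For example, a beeline connection might look like this (host and port are assumptions; 10000 is the Thrift Server's default port):

```shell
./beeline -u jdbc:hive2://localhost:10000 -n pijing -p your-password-here
```

With any other username, or a password that does not match the configured value, the provider throws AuthenticationException and the login fails.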