Connecting to and Working with HiveMetastore from Java in a Kerberos Environment

Background

Connect to HiveMetastore and listen for events.

Prerequisites

The Kerberos principal is authenticated with a keytab, so no password has to be typed manually; the keytab must match the principal it is used for.
A core-site.xml file is available.
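
To rule out keytab/principal mismatches early, here is a minimal standalone sketch that performs only the keytab login (KeytabLoginCheck is a made-up class name; the krb5.conf path, keytab path, and principal are the same ones used in the code below):

```
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginCheck {
    public static void main(String[] args) throws Exception {
        // Point the JVM at the Kerberos client configuration.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");

        // Switch Hadoop security to Kerberos before logging in.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Throws if the keytab does not contain a valid key for this principal.
        UserGroupInformation.loginUserFromKeytab(
                "hive/cdh-master@LCC.COM",
                "/Users/yl/JavaProject/hiveApi/conf/hive.keytab");
        System.out.println("Logged in as: " + UserGroupInformation.getCurrentUser());
    }
}
```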

Straight to the code:

package org.example;

import lombok.extern.slf4j.Slf4j;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.hive.metastore.api.CurrentNotificationEventId;
import org.apache.hadoop.security.UserGroupInformation;

@Slf4j
public class HiveClientDemo {
    private HiveMetaStoreClient hiveMetaStoreClient;
    public static String hiveMetastoreUris = "thrift://cdh-master:9083";
    public static String hadoopConfDir = "/Users/yl/JavaProject/hiveApi/conf/";
    public static String krb5Conf = "/etc/krb5.conf";
    public static String keytab = "/Users/yl/JavaProject/hiveApi/conf/hive.keytab";
    public static String keytabPrincipal = "hive/cdh-master@LCC.COM";



    void setHiveMetaStoreConf() throws Exception {
        HiveConf hiveConf = new HiveConf();
        hiveConf.addResource(new org.apache.hadoop.fs.Path(hadoopConfDir + "hive-site.xml"));
        hiveConf.addResource(new org.apache.hadoop.fs.Path(hadoopConfDir + "core-site.xml"));

        log.info("-------------------------------------------");
        log.info("DEFAULT_CONFIG: hadoop.rpc.protection -> " + hiveConf.get("hadoop.rpc.protection"));
        if (hiveConf.getVar(HiveConf.ConfVars.METASTOREURIS).isEmpty()) {
            hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, hiveMetastoreUris);
        }

        handleKerberos(hiveConf);

        try {
            this.hiveMetaStoreClient = new HiveMetaStoreClient(hiveConf);
            ping();
        } catch (Exception e) {
            log.error("setHiveMetaStoreConf error", e);
            throw e;
        }
    }

    private void handleKerberos(HiveConf hiveConf) throws Exception {
        // Point the JVM at the Kerberos client configuration.
        System.setProperty("java.security.krb5.conf", krb5Conf);
        log.info("CONFIG: hive.metastore.sasl.enabled -> " + hiveConf.getVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL));
        log.info("CONFIG: hive.server2.authentication -> " + hiveConf.getVar(HiveConf.ConfVars.HIVE_SERVER2_AUTHENTICATION));

        // If SASL is not enabled on the metastore, no Kerberos login is needed.
        if (!hiveConf.getBoolVar(HiveConf.ConfVars.METASTORE_USE_THRIFT_SASL)) {
            return;
        }
        // Enable Hadoop security on the same configuration object that is handed to UGI,
        // so values already loaded from core-site.xml (e.g. hadoop.rpc.protection) are kept.
        hiveConf.setBoolean("hadoop.security.authorization", true);
        hiveConf.set("hadoop.security.authentication", "kerberos");

        UserGroupInformation.setConfiguration(hiveConf);
        log.info("UserGroupInformation.loginUserFromKeytab keytabPrincipal ->" + keytabPrincipal + " keytab -> " +
                 keytab);
        UserGroupInformation.loginUserFromKeytab(keytabPrincipal, keytab);
    }

    private boolean ping()  throws Exception {
        log.info("ping");
        log.info("show databases");
        for (String database : this.hiveMetaStoreClient.getAllDatabases()) {
            log.info(database);
        }

        CurrentNotificationEventId event = this.hiveMetaStoreClient.getCurrentNotificationEventId();
        log.info("CurrentNotificationEventId -> " + event.getEventId());

        return true;
    }


    public static void main(String[] args) {
        try {
            HiveClientDemo client = new HiveClientDemo();
            client.setHiveMetaStoreConf();
            client.ping();
        } catch (Exception e) {
            log.error("error", e);
        }
    }
}
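
Since the stated goal is to listen for metastore events, a natural next step after ping() is to tail the notification log. The sketch below is a hypothetical extra method for HiveClientDemo (listenForEvents is not part of the original class); it assumes the metastore is configured with a notification event listener such as DbNotificationListener, otherwise the log stays empty:

```
// Additional imports needed in HiveClientDemo:
// import org.apache.hadoop.hive.metastore.api.NotificationEvent;
// import org.apache.hadoop.hive.metastore.api.NotificationEventResponse;

void listenForEvents() throws Exception {
    // Start from the current head of the notification log.
    long lastEventId = this.hiveMetaStoreClient.getCurrentNotificationEventId().getEventId();
    while (true) {
        // Fetch up to 100 events newer than lastEventId; a null filter accepts all event types.
        NotificationEventResponse response =
                this.hiveMetaStoreClient.getNextNotification(lastEventId, 100, null);
        if (response != null && response.getEvents() != null) {
            for (NotificationEvent event : response.getEvents()) {
                log.info("event id={} type={} db={} table={}", event.getEventId(),
                        event.getEventType(), event.getDbName(), event.getTableName());
                lastEventId = event.getEventId();
            }
        }
        // Poll every few seconds; adjust to taste.
        Thread.sleep(5000);
    }
}
```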

Problems

  • javax.security.sasl.SaslException: No common protection layer between client and server
    Reference: "HDFS file read fails with 'No common protection layer'" (MapReduce Service, MRS)
    The client was using the default value of hadoop.rpc.protection, which did not match the server, and this setting could not be changed in code via setVar.
    Since hadoop.rpc.protection cannot be set explicitly in code, point HADOOP_CONF_DIR at the configuration directory and let the jar read it on its own, e.g. HADOOP_CONF_DIR=/Users/zhouhao/JavaProject/hiveApi/src/main/resources (a quick way to check the effective value is sketched after this list).
  • no supported default etypes for default_tkt_enctypes
    Reference: "Starting Trino after configuring Hive fails with KrbException: no supported default etypes for default_tkt_enctypes"
    • krb5.conf configuration
      • System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
      • -Djava.security.krb5.conf=./krb5.conf
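
For the first problem, a quick way to see which hadoop.rpc.protection value the client actually ends up with (so it can be compared against the server) is a minimal sketch like the one below; RpcProtectionCheck is a made-up class name, and it assumes HADOOP_CONF_DIR points at the directory holding core-site.xml:

```
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class RpcProtectionCheck {
    public static void main(String[] args) {
        // HADOOP_CONF_DIR should point at the directory containing core-site.xml.
        String confDir = System.getenv("HADOOP_CONF_DIR");
        Configuration conf = new Configuration();
        if (confDir != null) {
            conf.addResource(new Path(confDir, "core-site.xml"));
        }
        // Must match the server-side value (authentication / integrity / privacy);
        // a mismatch surfaces as "No common protection layer between client and server".
        System.out.println("hadoop.rpc.protection = " + conf.get("hadoop.rpc.protection"));
    }
}
```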
  • 5
    点赞
  • 10
    收藏
    觉得还不错? 一键收藏
  • 0
    评论
Enabling Kerberos authentication in Hive 2.1.1

To enable Kerberos authentication in Hive 2.1.1, the following steps are required:

1. Configure the Kerberos client
   - Install the Kerberos client and configure krb5.conf.
   - Create Kerberos principals for the Hive and Metastore services (a principal is a user or service in Kerberos).

2. Configure Hadoop
   - Add the following property to core-site.xml:
     ```
     <property>
       <name>hadoop.security.authentication</name>
       <value>kerberos</value>
     </property>
     ```
   - Add the following properties to hdfs-site.xml (if HDFS has Kerberos enabled):
     ```
     <property>
       <name>dfs.namenode.kerberos.principal</name>
       <value>hdfs/_HOST@EXAMPLE.COM</value>
     </property>
     <property>
       <name>dfs.datanode.kerberos.principal</name>
       <value>hdfs/_HOST@EXAMPLE.COM</value>
     </property>
     ```
   - Add the following properties to yarn-site.xml (if YARN has Kerberos enabled):
     ```
     <property>
       <name>yarn.resourcemanager.principal</name>
       <value>yarn/_HOST@EXAMPLE.COM</value>
     </property>
     <property>
       <name>yarn.nodemanager.principal</name>
       <value>yarn/_HOST@EXAMPLE.COM</value>
     </property>
     ```

3. Configure the Hive Metastore
   - Add the following properties to hive-site.xml:
     ```
     <property>
       <name>javax.jdo.option.ConnectionUserName</name>
       <value>hive</value>
     </property>
     <property>
       <name>javax.jdo.option.ConnectionPassword</name>
       <value>hivepassword</value>
     </property>
     <property>
       <name>javax.jdo.option.ConnectionURL</name>
       <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
     </property>
     <property>
       <name>javax.jdo.option.ConnectionDriverName</name>
       <value>com.mysql.jdbc.Driver</value>
     </property>
     <property>
       <name>hive.metastore.sasl.enabled</name>
       <value>true</value>
     </property>
     <property>
       <name>hive.metastore.kerberos.keytab.file</name>
       <value>/path/to/metastore.keytab</value>
     </property>
     <property>
       <name>hive.metastore.kerberos.principal</name>
       <value>metastore/_HOST@EXAMPLE.COM</value>
     </property>
     ```
   Here, hive.metastore.kerberos.keytab.file and hive.metastore.kerberos.principal specify the Kerberos keytab file and principal of the Metastore service.

4. Configure HiveServer2
   - Add the following properties to hive-site.xml:
     ```
     <property>
       <name>hive.server2.authentication</name>
       <value>kerberos</value>
     </property>
     <property>
       <name>hive.server2.authentication.kerberos.keytab</name>
       <value>/path/to/hive.keytab</value>
     </property>
     <property>
       <name>hive.server2.authentication.kerberos.principal</name>
       <value>hive/_HOST@EXAMPLE.COM</value>
     </property>
     ```
   Here, hive.server2.authentication.kerberos.keytab and hive.server2.authentication.kerberos.principal specify the Kerberos keytab file and principal of the HiveServer2 service.

After completing these steps, start the Hive Metastore and HiveServer2 services and log in with a Kerberos principal.
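
To verify the HiveServer2 part of this setup end to end, the following is a minimal sketch of a Kerberos-authenticated JDBC connection. The host (hs2-host), realm, keytab path, and principals below are placeholders (mirroring the EXAMPLE.COM examples above) and need to be adjusted to the actual cluster:

```
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HiveServer2KerberosCheck {
    public static void main(String[] args) throws Exception {
        // Obtain Kerberos credentials from a keytab; the Hive JDBC driver reuses this login.
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "hive/hs2-host@EXAMPLE.COM", "/path/to/hive.keytab");

        // The principal in the URL is the HiveServer2 service principal from hive-site.xml.
        String url = "jdbc:hive2://hs2-host:10000/default;principal=hive/hs2-host@EXAMPLE.COM";
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```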