Reading Kerberos-Secured HBase with Spark/Java


1. Grant the drguo user the required permissions
2. Create the drguo principal in the KDC and export its keytab file
[root@bigdata28 ~]# kadmin.local 
Authenticating as principal drguo/admin@AISINO.COM with password.
kadmin.local:  addprinc drguo/bigdata28
WARNING: no policy specified for drguo/bigdata28@AISINO.COM; defaulting to no policy
Enter password for principal "drguo/bigdata28@AISINO.COM": 
Re-enter password for principal "drguo/bigdata28@AISINO.COM": 
Principal "drguo/bigdata28@AISINO.COM" created.
kadmin.local:  xst -norandkey -k /home/drguo/drguo_bigdata28.keytab drguo/bigdata28@AISINO.COM
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type aes256-cts-hmac-sha1-96 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type aes128-cts-hmac-sha1-96 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des3-cbc-sha1 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type arcfour-hmac added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des-hmac-sha1 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
Entry for principal drguo/bigdata28@AISINO.COM with kvno 1, encryption type des-cbc-md5 added to keytab WRFILE:/home/drguo/drguo_bigdata28.keytab.
kadmin.local:  q

3. Copy krb5.conf and the keytab file to your local machine for easy testing
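A common stumbling block at this step is simply that the JVM cannot read the copied files. Before running the code in the next two steps, it can save time to sanity-check the paths; below is a minimal standalone sketch (plain JDK; the KerberosFileCheck class is a hypothetical helper, and the d:/ paths are just the ones used in this article -- substitute your own):

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class KerberosFileCheck {

    // True if the given path exists and is readable by this JVM.
    static boolean readable(String path) {
        return Files.isReadable(Paths.get(path));
    }

    public static void main(String[] args) {
        // Same paths the examples below pass to System.setProperty
        // and loginUserFromKeytab.
        for (String p : new String[]{"d:/krb5.conf", "d:/drguo_bigdata28.keytab"}) {
            System.out.println(p + " -> " + (readable(p) ? "OK" : "missing or unreadable"));
        }
    }
}
```

If either file is missing or unreadable, the Kerberos login fails later with a much less obvious error, so checking here is cheap insurance.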
4. Reading HBase with Spark
package drguo.test

import java.io.IOException

import com.google.protobuf.ServiceException
import dguo.test.HBaseKerb
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.{SparkConf, SparkContext}

/**
  * Created by drguo on 2018/7/18.
  */
object SparkExecHBase {


  def main(args: Array[String]): Unit = {
//    HBaseKerb.getAllRows("XMJZ")
    System.setProperty("java.security.krb5.conf", "d:/krb5.conf")
    val sparkConf = new SparkConf().setAppName("SparkExecHBase").setMaster("local")
    val sc = new SparkContext(sparkConf)

    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "XMJZ")
    conf.set("hbase.zookeeper.quorum","172.19.6.28,172.19.6.29,172.19.6.30")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.set("hadoop.security.authentication", "Kerberos")

    UserGroupInformation.setConfiguration(conf)
    try {
      UserGroupInformation.loginUserFromKeytab("drguo/bigdata28@AISINO.COM", "d:/drguo_bigdata28.keytab")
      HBaseAdmin.checkHBaseAvailable(conf)
    } catch {
      case e: IOException =>
        e.printStackTrace()
      case e: ServiceException =>
        e.printStackTrace()
    }

    val hbaseRdd = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])
//    println(hbaseRdd.toString())
    hbaseRdd
      .map(_._2)
      .map(result => (result.getRow, result.getValue(Bytes.toBytes("Info"), Bytes.toBytes("ADDTIME"))))
      .map(row => (new String(row._1), new String(row._2)))
      .collect()
      .foreach(r => println(r._1 + ":" + r._2))

  }

}
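One detail worth noting in the final map above: new String(bytes) decodes with the platform default charset, while HBase's Bytes.toString always decodes UTF-8, so on a non-UTF-8 Windows JVM the two can disagree for non-ASCII values. A small pure-JDK sketch of the UTF-8 round trip (the BytesDemo class is illustrative only and has no HBase dependency):

```java
import java.nio.charset.StandardCharsets;

public class BytesDemo {

    // Equivalent of HBase's Bytes.toBytes(String): UTF-8 encode.
    static byte[] toBytes(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    // Equivalent of HBase's Bytes.toString(byte[]): UTF-8 decode.
    static String toString(byte[] b) {
        return new String(b, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] row = toBytes("row-0001");
        byte[] value = toBytes("2018-07-18 10:00:00");
        // Same "rowkey:value" formatting as the foreach in the Spark job.
        System.out.println(BytesDemo.toString(row) + ":" + BytesDemo.toString(value));
    }
}
```

Passing an explicit charset (or using Bytes.toString directly) keeps the Spark output consistent with what the shell or Java client shows.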

5. Reading with Java (there are plenty of examples online, but most contain some duplicated or redundant code)
package dguo.test;

import com.google.protobuf.ServiceException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;

/**
 * Created by drguo on 2018/7/18.
 */
public class HBaseKerb {

    private static Configuration conf = null;
    static {
        System.setProperty("java.security.krb5.conf", "d:/krb5.conf" );
        // Instantiate the configuration via HBaseConfiguration's factory method
        conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "172.19.6.28,172.19.6.29,172.19.6.30");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        conf.set("hadoop.security.authentication" , "Kerberos" );

        UserGroupInformation.setConfiguration(conf);

        try {
            UserGroupInformation.loginUserFromKeytab("drguo/bigdata28@AISINO.COM", "d:/drguo_bigdata28.keytab");
            HBaseAdmin.checkHBaseAvailable(conf);
        } catch (IOException | ServiceException e) {
            e.printStackTrace();
        }

    }

    public static void getAllRows(String tableName) throws IOException{
        HTable hTable = new HTable(conf, tableName);
        // Scan object describing the scan over the table's regions
        Scan scan = new Scan();
        // Use the HTable to obtain a ResultScanner
        ResultScanner resultScanner = hTable.getScanner(scan);
        for(Result result : resultScanner){
            Cell[] cells = result.rawCells();
            for(Cell cell : cells){
                // row key
                System.out.println("Row key: " + Bytes.toString(CellUtil.cloneRow(cell)));
                // column family
                System.out.println("Column family: " + Bytes.toString(CellUtil.cloneFamily(cell)));
                System.out.println("Column: " + Bytes.toString(CellUtil.cloneQualifier(cell)));
                System.out.println("Value: " + Bytes.toString(CellUtil.cloneValue(cell)));
            }
        }
    }

    public static void main(String[] args) throws IOException{
        getAllRows("XMJZ");
    }
}

PS:
The error below usually means the krb5.conf file referenced by System.setProperty("java.security.krb5.conf", "d:/krb5.conf") could not be found (e.g. the path is wrong), or the kdc / admin_server addresses configured inside it are incorrect.

Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm 
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65) 
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:319) 
at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:374) 
at drguo.test.SparkExecHBase$.main(SparkExecHBase.scala:32) 
at drguo.test.SparkExecHBase.main(SparkExecHBase.scala) 
Caused by: java.lang.reflect.InvocationTargetException 
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
at java.lang.reflect.Method.invoke(Method.java:498) 
at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:84) 
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63) 
… 4 more 
Caused by: KrbException: Cannot locate default realm 
at sun.security.krb5.Config.getDefaultRealm(Config.java:1029) 
… 10 more
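Since "Cannot locate default realm" almost always traces back to the krb5.conf contents, a quick way to narrow it down is to check the file for its default_realm entry. Below is a minimal standalone sketch (plain JDK; Krb5ConfCheck is a hypothetical helper, not part of the code above):

```java
import java.util.Arrays;
import java.util.List;

public class Krb5ConfCheck {

    // Returns the realm from the first "default_realm = ..." line, or null if absent.
    static String defaultRealm(List<String> krb5ConfLines) {
        for (String line : krb5ConfLines) {
            String s = line.trim();
            if (s.startsWith("default_realm")) {
                String[] kv = s.split("=", 2);
                if (kv.length == 2) {
                    return kv[1].trim();
                }
            }
        }
        return null; // with no default_realm, krb5 fails with "Cannot locate default realm"
    }

    public static void main(String[] args) {
        // In practice, load the lines with Files.readAllLines(Paths.get("d:/krb5.conf")).
        List<String> sample = Arrays.asList("[libdefaults]", "  default_realm = AISINO.COM");
        System.out.println(defaultRealm(sample)); // prints AISINO.COM
    }
}
```

If this returns null for your krb5.conf, fix the [libdefaults] section before suspecting anything else; if it returns a realm, the next things to verify are the kdc and admin_server entries under [realms].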
--------------------- 
Author: 光于前裕于后 
Source: CSDN 
Original: https://blog.csdn.net/Dr_Guo/article/details/81097197 
Copyright: this is the author's original post; please include a link to the original when reposting.
