- Download the configuration files: log in to the CDH console, go to the HBase service page, and download the client configuration files to your local machine.
- Create a new Maven project and put the downloaded configuration files under the `src/main/resources` directory.
- Log in to the Kerberos host and generate a keytab file for HBase; then download it, together with `krb5.conf`, and place both under `src/main/resources` as well.
- Add the CDH cluster's host entries to your local `hosts` file.
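For the keytab step above, the file can be exported on the KDC host. A minimal sketch, assuming an MIT Kerberos KDC and the `hbase@CATTSOFT.COM` principal from this article (adjust the principal and realm to your own cluster):

```shell
# Run on the KDC host: export the hbase principal's credentials into
# hbase.keytab without re-randomizing its key (-norandkey), so existing
# tickets keep working.
kadmin.local -q "xst -norandkey -k hbase.keytab hbase@CATTSOFT.COM"

# Verify the keytab contents before copying it into src/main/resources.
klist -kt hbase.keytab
```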
- The `pom.xml` dependencies are as follows:

```xml
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <scala.version>2.10</scala.version>
    <spark.version>1.6.3</spark.version>
    <java.version>1.8</java.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.6</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.4</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.6.3</version>
        <!--<scope>provided</scope>-->
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
</dependencies>
```
- The code is as follows:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

public class TestHbase {

    private static Configuration conf = null;

    static {
        String rootPath = System.getProperty("user.dir");
        String keytab = rootPath + "/src/main/resources/hbase.keytab";
        String krb5 = rootPath + "/src/main/resources/krb5.conf";
        System.setProperty("java.security.krb5.conf", krb5);

        conf = HBaseConfiguration.create();
        conf.set("hadoop.security.authentication", "Kerberos");
        // hbase.keytab was copied down from the remote server; it holds the
        // credential material, so no interactive password prompt is needed.
        conf.set("keytab.file", keytab);
        // This is effectively the user identity, i.e. the Kerberos principal.
        conf.set("kerberos.principal", "hbase@CATTSOFT.COM");

        UserGroupInformation.setConfiguration(conf);
        try {
            UserGroupInformation.loginUserFromKeytab("hbase@CATTSOFT.COM", keytab);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void scanSpan(String tableName) throws Exception {
    }

    public static void main(String[] args) {
        try {
            TestHbase.scanSpan("user");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```
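The `scanSpan` method is left empty in the article. A minimal sketch of what a full-table scan could look like with the HBase 2.x client API, written as a standalone helper (the method shape and the printed row keys are assumptions, not part of the original code):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanSketch {

    // Scans every row of the given table and prints each row key.
    // Assumes the Kerberos login from the static block above has already
    // succeeded and conf carries the downloaded cluster configuration.
    public static void scanSpan(Configuration conf, String tableName) throws IOException {
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf(tableName));
             ResultScanner scanner = table.getScanner(new Scan())) {
            for (Result result : scanner) {
                System.out.println(Bytes.toString(result.getRow()));
            }
        }
    }
}
```

The try-with-resources blocks ensure the connection, table, and scanner are closed even if the scan fails partway through, which matters for a long-lived Kerberized client.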
- Run the main method; the log confirms that the Kerberos login succeeded:

```
19/11/11 09:56:37 INFO security.UserGroupInformation: Login successful for user hbase@CATTSOFT.COM using keytab file D:\code\ownProject\hbase_kerberos_test/src/main/resources/hbase.keytab
```
Connecting to HBase from Java in a Kerberized CDH environment