1. The error
After integrating Spark with Hive, run the following code:
```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

def readHive(): Unit = {
  val conf: SparkConf = new SparkConf().setMaster("local").setAppName("my")
  val sc: SparkSession = SparkSession.builder()
    .enableHiveSupport() // enable Hive support
    // Location where Spark stores Hive table data; if unset, it defaults to
    // a spark-warehouse directory under the application's working directory
    .config("spark.sql.warehouse.dir", "hdfs://hadoop102:9820/user/hive/warehouse")
    .config(conf)
    .getOrCreate()
  // Import implicits; note this only works on a stable identifier (a val)
  import sc.implicits._
  // Create a database
  sc.sql("create database sparkHiveDB")
  // Show all Hive databases
  sc.sql("show databases").show()
  // Use the database
  //sc.sql("use sparkHiveDB")
  // Create a table
  //sc.sql("create table student(name string, age int)")
  // Insert data
  //sc.sql("insert into student values('lisi',23),('wangwu',34)")
  // Query data
  //sc.sql("select * from student").show()
}
```
When the database is created, the following error is thrown:
```text
org.apache.hadoop.security.AccessControlException: Permission denied: user=zhijm, access=WRITE, inode="/user/hive/warehouse":hadoop1:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:317)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:223)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:199)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1736)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1719)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:69)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3861)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:634)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)
```
2. Cause
The key part of the message is inode="/user/hive/warehouse":hadoop1:supergroup:drwxr-xr-x. When the program runs on Windows, it authenticates to HDFS as the local Windows account by default (user=zhijm above). That account is neither the directory's owner (hadoop1) nor in its group, so under mode drwxr-xr-x it has no WRITE permission on /user/hive/warehouse and cannot create the database directory there.
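Because the denied identity comes from the client side, another common workaround is to override the user the HDFS client reports before the SparkSession is built. With Hadoop's default SIMPLE authentication, the client reads the HADOOP_USER_NAME environment variable or system property. A minimal sketch (the user "hadoop1" is taken from the error message above; substitute any account with write access to the warehouse directory):

```scala
// Sketch: make the HDFS client authenticate as the warehouse owner.
// This must run before any HDFS/Spark access triggers a Hadoop login.
// Only effective with SIMPLE auth (no Kerberos).
object HiveUserFix {
  def main(args: Array[String]): Unit = {
    // "hadoop1" is the directory owner from the error trace; adjust as needed
    System.setProperty("HADOOP_USER_NAME", "hadoop1")
    // ...then build the session as usual, e.g.:
    // val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    println(System.getProperty("HADOOP_USER_NAME"))
  }
}
```

This avoids loosening the directory permissions, but it is impersonation, not real security, and does not apply on Kerberized clusters.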
3. Fix
Grant write permission on /user/hive/warehouse, for example: hadoop fs -chmod 777 /user/hive/warehouse
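The command can be run from any node with an HDFS client, followed by a quick check that the mode actually changed (the -ls verification step is an addition, not from the original note):

```shell
# Open the warehouse directory to all users.
# 777 is convenient on a single-user dev cluster, but too permissive for
# production; there, prefer "hadoop fs -chown" or HDFS ACLs instead.
hadoop fs -chmod 777 /user/hive/warehouse

# Verify: the permissions column for warehouse should now read drwxrwxrwx
hadoop fs -ls /user/hive
```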