Problems encountered while running the program

While trying to run a Spark program, I hit an org.apache.spark.SparkException: the database default could not be created because Spark failed to create its directory /user/hive/warehouse. The root cause was a permission problem: the user lenovo had no write permission on that path. The issue touches both HDFS permission management and Ranger access control.
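One way to diagnose and fix the missing write permission is from the HDFS command line. This is a minimal sketch, assuming you can run commands as an HDFS superuser (e.g. the hdfs user) and that the warehouse path is the default /user/hive/warehouse; the user name lenovo comes from the error above:

```shell
# Check who owns the warehouse directory and what its permission bits are
hdfs dfs -ls /user/hive

# Create the directory if it does not exist yet
hdfs dfs -mkdir -p /user/hive/warehouse

# Option A: hand ownership to the user running the Spark job
hdfs dfs -chown -R lenovo:lenovo /user/hive/warehouse

# Option B (coarser): open the directory to group/other writes
hdfs dfs -chmod -R 777 /user/hive/warehouse
```

Note that if Ranger is enforcing access on this path, its policies are evaluated in addition to (or instead of) the plain HDFS permission bits, so you may also need a Ranger policy granting the user write access to /user/hive/warehouse.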
py4j.protocol.Py4JJavaError: An error occurred while calling o36.load.

org.apache.spark.SparkException: Unable to create database default as failed to create its directory /user/hive/warehouse
        at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:115)
        at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.doCreateDatabase(InMemoryCatalog.scala:109)
        at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createDatabase(ExternalCatalog.scala:69)
        at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
        at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog$lzycompute(BaseSessionStateBuilder.scala:133)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog(BaseSessionStateBuilder.scala:131)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anon$1.<init>(BaseSessionStateBuilder.scala:157)
        at org.apache.spark.sql.internal.BaseSessionStateBuilder.analyzer(BaseSessionStateBuilder.scala:157)
        ...
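If the HDFS permissions cannot be changed (for example, because Ranger policies are managed centrally), a workaround is to point Spark at a warehouse directory the job's user can already write to. A hedged sketch: spark.sql.warehouse.dir is a standard Spark SQL setting, but the local path and the script name my_job.py are illustrative assumptions:

```shell
# Point the Spark SQL warehouse at a writable directory instead of
# the default /user/hive/warehouse
spark-submit \
  --conf spark.sql.warehouse.dir=file:///tmp/spark-warehouse \
  my_job.py
```

The same setting can be passed programmatically via SparkSession.builder.config("spark.sql.warehouse.dir", ...) before the session is created; it only takes effect for the first SparkSession in the JVM.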
