py4j.protocol.Py4JJavaError: An error occurred while calling o36.load.
: org.apache.spark.SparkException: Unable to create database default as failed to create its directory /user/hive/warehouse
	at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.liftedTree1$1(InMemoryCatalog.scala:115)
	at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.doCreateDatabase(InMemoryCatalog.scala:109)
	at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.createDatabase(ExternalCatalog.scala:69)
	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:117)
	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog$lzycompute(BaseSessionStateBuilder.scala:133)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder.catalog(BaseSessionStateBuilder.scala:131)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anon$1.<init>(BaseSessionStateBuilder.scala:157)
	at org.apache.spark.sql.internal.BaseSessionStateBuilder.analyzer(BaseSessionStateBuilder.scala:157)
	at org. ...
Problem encountered while running the program
While attempting to run a Spark program, an org.apache.spark.SparkException was thrown: the database default could not be created because Spark failed to create its directory /user/hive/warehouse. The root cause is a permissions problem: the user lenovo has no write access to that path. Resolving it involves HDFS permission management and, where applicable, Ranger access control.
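The permission fix described above can be sketched with standard HDFS shell commands. This is a hedged sketch, not a verified recipe for this specific cluster: the path /user/hive/warehouse and the user name lenovo are taken from the error message, and the commands assume you can run them as an HDFS superuser (commonly the hdfs account).

```shell
# Inspect current ownership and permissions of the warehouse directory
hdfs dfs -ls /user/hive

# Option 1: create the directory (if missing) and hand ownership to the
# user that runs the Spark program
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chown -R lenovo:lenovo /user/hive/warehouse

# Option 2: relax permissions so any user can write
# (acceptable on development clusters only)
hdfs dfs -chmod -R 777 /user/hive/warehouse

# Option 3: sidestep HDFS entirely by pointing the Spark SQL warehouse at
# a writable local path (useful for local testing; app.py is a placeholder)
spark-submit --conf spark.sql.warehouse.dir=file:///tmp/spark-warehouse app.py
```

If Apache Ranger governs HDFS access on the cluster, POSIX-style ownership changes alone may not be enough: a Ranger policy granting read/write/execute on /user/hive/warehouse to the user is also required, since the Ranger HDFS plugin evaluates its policies before falling back to native HDFS permissions.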