Failed to start database 'metastore_db' with class loader

Solution:
I was running the job locally. The fix was to change the permissions on /user/hive/warehouse and then copy the Hive configuration file hive-site.xml into the program's resources directory. Without hive-site.xml on the classpath, Spark falls back to an embedded Derby metastore (the metastore_db directory under the project), and embedded Derby allows only one process to open that database at a time, which is exactly the XSDB6 error in the log below. Once hive-site.xml was in place, the error was gone.
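As an illustration, here is a minimal sketch of a local Spark application that picks up hive-site.xml from the classpath; the object name, app name, and the assumption that hive-site.xml was copied to src/main/resources are mine, not from the original post.

// Minimal sketch: assumes hive-site.xml has been copied to src/main/resources
// so that it lands on the application classpath. Without it, Spark would boot
// an embedded Derby metastore_db in the working directory, which only one JVM
// can open at a time.
import org.apache.spark.sql.SparkSession

object HiveMetastoreCheck {  // hypothetical name, for illustration only
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveMetastoreCheck")
      .master("local[*]")      // local run, as described above
      .enableHiveSupport()     // use the metastore configured in hive-site.xml
      .getOrCreate()

    // A trivial metastore query to confirm the Hive connection works.
    spark.sql("SHOW DATABASES").show()

    spark.stop()
  }
}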
19/01/15 22:53:00 ERROR Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73e5e4cc, see the next exception for details.
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$transform$1.apply(JavaDStreamLike.scala:335)
at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$transform$1.apply(JavaDStreamLike.scala:335)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:654)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$1$$anonfun$apply$21.apply(DStream.scala:654)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:668)
at org.apache.spark.streaming.dstream.DStream$$anonfun$transform$2$$anonfun$5.apply(DStream.scala:666)
at org.apache.spark.streaming.dstream.TransformedDStream.compute(TransformedDStream.scala:41)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1$$anonfun$apply$7.apply(DStream.scala:350)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1$$anonfun$1.apply(DStream.scala:349)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:344)
at org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:342)
at scala.Option.orElse(Option.scala:257)
at org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:339)
at org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:38)
at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:120)
at org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:120)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
at org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:120)
at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:247)
at org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:245)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:245)
at org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:181)
at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:87)
at org.apache.spark.streaming.scheduler.JobGenerator$$anon$1.onReceive(JobGenerator.scala:86)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@73e5e4cc, see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
… 119 more
Caused by: java.sql.SQLException: Another instance of Derby may have already booted the database F:\IdeaProject\Ibfsparkpro\metastore_db.
at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
… 116 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database F:\IdeaProject\Ibfsparkpro\metastore_db.
at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
… 116 more
