java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration


I packaged a Spark program with sbt but did not bundle all of its dependencies into the jar. When I ran the application on the cluster, it failed with this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at SparkHbase$.main(SparkHbase.scala:34)
at SparkHbase.main(SparkHbase.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
… 11 more
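For context, a NoClassDefFoundError like this fires at the first runtime reference to the missing class, and the trace points at line 34 of SparkHbase.scala. A minimal Scala sketch of what that code probably looks like (hypothetical, reconstructed from the stack trace; the table name is made up):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

object SparkHbase {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SparkHbase"))
    // This is the first touch of an HBase class: if the HBase jars are
    // missing from the driver classpath, it throws NoClassDefFoundError here.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "some_table") // hypothetical table name
    // ... then read the table via sc.newAPIHadoopRDD(...), etc.
    sc.stop()
  }
}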
The cause of this exception is that the HBase dependencies are missing from the Spark application's classpath. My fix was to add the following to spark/conf/spark-env.sh on the cluster:

export SPARK_CLASSPATH=/home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:\
/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
Note that the jar paths must be separated by colons! Then run:

source spark-env.sh
and restart the Spark service, and everything works.

There is actually another way: pass the --driver-class-path parameter when you submit the application to set the driver's classpath:

./spark-submit \
--driver-class-path /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:\
/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/guava-12.0.1.jar \
--class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar
Note: you cannot both configure SPARK_CLASSPATH in spark/conf/spark-env.sh and pass --driver-class-path when submitting the job; doing so raises this exception:

15/08/14 09:22:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
at com.dtxy.data.SqlTest.main(SqlTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
(the same stack trace is printed again)
15/08/14 09:22:23 INFO Utils: Shutdown hook called
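As the error message hints, SPARK_CLASSPATH is the legacy mechanism and spark.driver.extraClassPath is its replacement. A third option, then, is to set that property in spark/conf/spark-defaults.conf instead of using either approach above. A sketch, reusing the same jar paths (an assumption on my part, not something the original setup used); if your executors also touch HBase classes, spark.executor.extraClassPath takes the same value:

spark.driver.extraClassPath /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:/home/hadoop/SW/hbase/lib/guava-12.0.1.jar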
With that, the problem is solved!
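For completeness: since the root cause is that the sbt build did not bundle the HBase dependencies, the problem can also be avoided at build time by declaring the HBase artifacts and shipping a fat jar (for example with the sbt-assembly plugin). A minimal build.sbt sketch; the HBase version matches the jars above, while the Spark and Scala versions are my assumptions:

// Hypothetical build.sbt; Spark and Scala versions are guesses.
name := "bigdata"
version := "1.0-SNAPSHOT"
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark itself is provided by the cluster, so keep it out of the fat jar.
  "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
  // HBase client-side artifacts, matching the jar versions above.
  "org.apache.hbase" % "hbase-client" % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-common" % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-server" % "0.98.12-hadoop2",
  "org.apache.hbase" % "hbase-protocol" % "0.98.12-hadoop2"
)

With sbt-assembly configured, running sbt assembly then produces a jar that already contains HBaseConfiguration, and none of the cluster-side classpath tweaks are needed.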

Reference: http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html
