java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

The Spark program was packaged with sbt without bundling all of its dependencies into the jar. When the application was deployed to the cluster, it failed with this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at SparkHbase$.main(SparkHbase.scala:34)
    at SparkHbase.main(SparkHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 11 more
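The top frame points at SparkHbase.scala:34. For context, here is a minimal hypothetical sketch (not the original source, which this post does not show) of the kind of call that fails there. Note that the class loads and main() starts normally; the JVM only resolves HBaseConfiguration at the first reference to it, which is why the error appears at runtime rather than at launch.

import org.apache.hadoop.hbase.HBaseConfiguration

object SparkHbase {
  def main(args: Array[String]): Unit = {
    // First reference to an HBase class: without the HBase jars on the
    // driver classpath, this line throws NoClassDefFoundError.
    val hbaseConf = HBaseConfiguration.create()
    hbaseConf.set("hbase.zookeeper.quorum", "localhost")
    println(hbaseConf.get("hbase.zookeeper.quorum"))
  }
}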
The exception occurs because the HBase classes are missing from the Spark application's classpath. My fix here was to add the following to spark/conf/spark-env.sh on the cluster:

export SPARK_CLASSPATH=/home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:\
/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/guava-12.0.1.jar
Be sure to separate the jars with colons! Then run:

source spark-env.sh
and restart the Spark service. Done!
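After the restart, a quick sanity check is to load the class by name from spark-shell; this is a generic technique, nothing HBase-specific:

// If this returns the Class object instead of throwing
// ClassNotFoundException, the HBase jars are visible to the driver.
Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")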

There is another approach: pass the --driver-class-path option when submitting the application to set the driver's classpath:

./spark-submit --driver-class-path /home/hadoop/SW/hbase/lib/hbase-client-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-server-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-common-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-protocol-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/htrace-core-2.04.jar:\
/home/hadoop/SW/hbase/lib/hbase-hadoop2-compat-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/hbase-it-0.98.12-hadoop2.jar:\
/home/hadoop/SW/hbase/lib/guava-12.0.1.jar \
--class com.dtxy.data.SqlTest ../lib/bigdata-1.0-SNAPSHOT.jar
Note: do not set SPARK_CLASSPATH in spark/conf/spark-env.sh and also pass --driver-class-path when submitting the job; combining the two produces this exception:

15/08/14 09:22:23 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:444)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$8.apply(SparkConf.scala:442)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:442)
    at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:430)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:430)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:365)
    at com.dtxy.data.SqlTest$.main(SqlTest.scala:27)
    at com.dtxy.data.SqlTest.main(SqlTest.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/08/14 09:22:23 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
    (the same stack trace as above is printed again for the thrown exception)
15/08/14 09:22:23 INFO Utils: Shutdown hook called
At this point the problem is solved!
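As an aside: since the root cause was that sbt package does not bundle dependencies, you could also fix this at build time by producing a fat jar. Below is a minimal build.sbt sketch; the sbt-assembly plugin and the exact versions are assumptions on my part, chosen to mirror the jars listed above, so adjust them to your environment and build with sbt assembly:

// Sketch only: assumes sbt-assembly is enabled in project/plugins.sbt.
name := "bigdata"
scalaVersion := "2.10.5" // illustrative; match the Scala build of your Spark

libraryDependencies ++= Seq(
  // Spark itself is already on the cluster, so mark it "provided".
  "org.apache.spark" %% "spark-core"   % "1.4.1" % "provided",
  // Versions mirror the HBase jars referenced in spark-env.sh above.
  "org.apache.hbase" %  "hbase-client" % "0.98.12-hadoop2",
  "org.apache.hbase" %  "hbase-common" % "0.98.12-hadoop2",
  "org.apache.hbase" %  "hbase-server" % "0.98.12-hadoop2"
)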

Reference: http://www.abcn.net/2014/07/lighting-spark-with-hbase-full-edition.html
