Spark setup: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration

Error on the first start attempt:

starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
failed to launch org.apache.spark.deploy.master.Master:
  	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  	... 7 more
full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost:   	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
localhost:   	... 7 more
localhost: full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out

Checking the master log:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

Checking the worker log:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
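Both stack traces point to the same root cause: the classes (Hadoop's `Configuration`, SLF4J's `Logger`) are simply not on the JVM classpath at startup, which is typical of a "without-hadoop" Spark build whose `jars/` directory ships no Hadoop jars. A minimal sketch of the check (the directory and jar names below are simulated, not from a real install; against a real install you would run the same `ls`/`grep` on `$SPARK_HOME/jars`):

```shell
# Sketch only: simulate the jars/ directory of a "without-hadoop" Spark
# build (filenames are hypothetical) and check for bundled Hadoop jars,
# just as you would with: ls $SPARK_HOME/jars | grep hadoop
SPARK_JARS=$(mktemp -d)
touch "$SPARK_JARS/spark-core_2.12-2.4.0.jar" \
      "$SPARK_JARS/slf4j-api-1.7.16.jar"
if ls "$SPARK_JARS" | grep -q 'hadoop'; then
    echo "Hadoop jars are bundled"
else
    echo "no bundled Hadoop jars: SPARK_DIST_CLASSPATH must supply them"
fi
```

If the check prints the second message, Spark needs `SPARK_DIST_CLASSPATH` to point at an external Hadoop installation, which is exactly the fix below.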


Solution:

At first I followed this blog post: https://blog.csdn.net/qq_41212491/article/details/87716710

and added the following line to spark-env.sh:

export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath)

That still failed, this time with:

starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: /soft/spark/conf/spark-env.sh: line 73: /bin/hadoop: No such file or directory
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost:   	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
localhost:   	... 7 more
localhost: full log in /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
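The `/bin/hadoop: No such file or directory` line reveals the cause: the start scripts launch the daemons through non-interactive ssh sessions where `HADOOP_HOME` is not set, so `${HADOOP_HOME}/bin/hadoop` expands to just `/bin/hadoop`. A minimal demonstration of that expansion:

```shell
# With HADOOP_HOME unset (as in the non-interactive shells the Spark
# start scripts spawn over ssh), the path collapses to /bin/hadoop.
unset HADOOP_HOME
echo "${HADOOP_HOME}/bin/hadoop"   # prints: /bin/hadoop
```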

So instead of relying on the environment variable, I switched to the full path:

export SPARK_DIST_CLASSPATH=$(/soft/hadoop/bin/hadoop classpath)

With that change, startup succeeded:

starting org.apache.spark.deploy.master.Master, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.master.Master-1-b1.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /soft/spark/log/spark-superahua-org.apache.spark.deploy.worker.Worker-1-b1.out
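The hard-coded path works because the command substitution no longer depends on the caller's environment. An equivalent variant (my suggestion, not from the original post) is to export `HADOOP_HOME` inside spark-env.sh itself before using it:

```shell
# spark-env.sh fragment (the /soft/hadoop path is from this setup):
# exporting HADOOP_HOME here makes the expansion safe even in the
# non-interactive ssh shells used by the start scripts.
export HADOOP_HOME=/soft/hadoop
export SPARK_DIST_CLASSPATH=$("${HADOOP_HOME}/bin/hadoop" classpath)
```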

Note: a NoClassDefFoundError generally means the class was present at compile time but missing (or duplicated) at runtime. If you hit the same error when running a Maven-built Spark application rather than the standalone daemons, another common fix is to change the scope of the Spark dependency in pom.xml from "provided" to "compile", so the required classes are on the runtime classpath. See:

- https://blog.csdn.net/a391000181/article/details/102221192
- https://blog.csdn.net/u013084266/article/details/106805574