Spark 3.0.3 error: "java.lang.NoSuchFieldError: JAVA_9"

Running a Spark program fails with the following error:

Exception in thread "main" java.lang.NoSuchFieldError: JAVA_9
	at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:207)
	at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
	at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:93)
	at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:370)
	at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:311)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:359)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:272)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:448)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2589)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:937)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(S

The code that throws is here:

https://github.com/apache/spark/blob/branch-3.0/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala

Line 207: if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9))

private[spark] object StorageUtils extends Logging {

  // In Java 8, the type of DirectBuffer.cleaner() was sun.misc.Cleaner, and it was possible
  // to access the method sun.misc.Cleaner.clean() to invoke it. The type changed to
  // jdk.internal.ref.Cleaner in later JDKs, and the .clean() method is not accessible even with
  // reflection. However sun.misc.Unsafe added a invokeCleaner() method in JDK 9+ and this is
  // still accessible with reflection.
  private val bufferCleaner: DirectBuffer => Unit =
    if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
      val cleanerMethod =
        Utils.classForName("sun.misc.Unsafe").getMethod("invokeCleaner", classOf[ByteBuffer])
      val unsafeField = classOf[Unsafe].getDeclaredField("theUnsafe")
      unsafeField.setAccessible(true)
      val unsafe = unsafeField.get(null).asInstanceOf[Unsafe]
      buffer: DirectBuffer => cleanerMethod.invoke(unsafe, buffer)
    } else {
      val cleanerMethod = Utils.classForName("sun.misc.Cleaner").getMethod("clean")
      buffer: DirectBuffer => {
        // Careful to avoid the return type of .cleaner(), which changes with JDK
        val cleaner: AnyRef = buffer.cleaner()
        if (cleaner != null) {
          cleanerMethod.invoke(cleaner)
        }
      }
    }
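The field reference JavaVersion.JAVA_9 is resolved when the StorageUtils$ class is loaded, so if the JavaVersion enum that happens to be on the classpath predates the JAVA_9 constant, class loading fails with NoSuchFieldError. You can confirm this diagnosis with a reflective pre-check. A minimal Java sketch (it uses a JDK enum as the example class so it runs without extra jars; on a real classpath, check "org.apache.commons.lang3.JavaVersion" for "JAVA_9"):

```java
// Mirrors the check at StorageUtils.scala:207: does the class that the
// JVM loads for this name actually define the named constant?
public class EnumFieldCheck {
    // Enum constants are public static fields, so getField() finds them.
    static boolean hasConstant(String className, String constant) {
        try {
            Class.forName(className).getField(constant);
            return true;
        } catch (ClassNotFoundException | NoSuchFieldException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // With a healthy classpath this would be true:
        //   hasConstant("org.apache.commons.lang3.JavaVersion", "JAVA_9")
        // A JDK enum is used here so the sketch runs standalone.
        System.out.println(hasConstant("java.time.DayOfWeek", "MONDAY")); // true
        System.out.println(hasConstant("java.time.DayOfWeek", "JAVA_9")); // false
    }
}
```

If this check returns false for JavaVersion/JAVA_9 inside your application, the commons-lang3 on the classpath is too old for Spark 3.x.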

The cause is an incompatible dependency pulled in through Maven. First, adjust the hadoop-common and hadoop-mapreduce-client-core dependencies, making sure to add <scope>provided</scope>:

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
            <scope>provided</scope>
        </dependency>

After the change, reimport the Maven project and run the program again. Still the same error. More digging finally turned up the key fact: Spark 3.x depends on commons-lang3, and an older commons-lang3 on the classpath simply does not have the JAVA_9 constant in its JavaVersion enum.

The straightforward fix is to resolve the conflict: if commons-lang3 is missing (or too old), declare it explicitly in the pom:

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.8.1</version>
        </dependency>
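To verify which jar actually supplies a class at runtime, you can ask the JVM directly. A minimal sketch (java.lang.String is used as the standalone example because JDK bootstrap classes report no code source; on your own classpath pass "org.apache.commons.lang3.JavaVersion" instead, and note which jar the printed location points into):

```java
import java.security.CodeSource;

// Diagnostic sketch: print the location a class was loaded from, to
// discover a second (old or shaded) copy of a class on the classpath.
public class WhichJar {
    static String locate(String className) throws ClassNotFoundException {
        CodeSource src =
            Class.forName(className).getProtectionDomain().getCodeSource();
        // JDK bootstrap classes have no code source.
        return src == null ? "<bootstrap>" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(locate("java.lang.String")); // <bootstrap>
    }
}
```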

If that still doesn't fix it, check whether hive-exec is among your dependencies. Open that jar and you will find it bundles its own copy of commons-lang3, and its JavaVersion class is different from the one Spark needs!

If hive is the culprit, excluding hive-exec solves it.
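One way to do the exclusion, sketched under the assumption that hive-exec arrives transitively via spark-hive (run mvn dependency:tree to see which dependency actually drags it in, and attach the exclusion there):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```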

Summary

        Thanks to everyone who read this far 😉

        That's all for this post; 猫头鹰数据 (Owl Data) aims to keep sharing practical technical content 😎

        If there are any mistakes or omissions above, corrections are very welcome 😅

        If this helped you, or you're interested in the topic, a like and a follow are appreciated 🙏

        You can also find and follow my WeChat official account 【猫头鹰数据分析】 and leave a comment to chat 🙏
