Fixing the logging error NoClassDefFoundError: StaticLoggerBinder while building a recommender system (fqzzzzz's CSDN blog)

Problem

While building a recommender system with Spark, the program failed with a logging-related error. The full error output:

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/impl/StaticLoggerBinder
	at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:222)
	at org.apache.spark.internal.Logging.initializeLogging(Logging.scala:127)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:111)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:105)
	at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary(Logging.scala:102)
	at org.apache.spark.internal.Logging.initializeLogIfNecessary$(Logging.scala:101)
	at org.apache.spark.SparkContext.initializeLogIfNecessary(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.log(Logging.scala:49)
	at org.apache.spark.internal.Logging.log$(Logging.scala:47)
	at org.apache.spark.SparkContext.log(SparkContext.scala:80)
	at org.apache.spark.internal.Logging.logInfo(Logging.scala:57)
	at org.apache.spark.internal.Logging.logInfo$(Logging.scala:56)
	at org.apache.spark.SparkContext.logInfo(SparkContext.scala:80)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:186)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
	at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
	at com.study.statistics.StatisticsRecommender$.main(StatisticsRecommender.scala:25)
	at com.study.statistics.StatisticsRecommender.main(StatisticsRecommender.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 21 more

Process finished with exit code 1

Analysis

Following the stack trace step by step into the Spark source confirmed it: the class org.slf4j.impl.StaticLoggerBinder really could not be found in any logging jar on the classpath.
Online answers said a dependency was missing, but the Maven dependency view showed that both the logging jar and the binding jar were imported, so for a while I suspected that an upgraded log4j package had dropped the class. (In fact, StaticLoggerBinder is supplied by an SLF4J 1.7.x binding such as slf4j-log4j12, not by log4j itself, and SLF4J 1.8+ replaced it with a ServiceLoader mechanism, so a binding/version mismatch can produce exactly this error.)
Then it occurred to me that I was connecting to a MySQL instance on Linux, and that this MySQL serves as Hive's metastore storage. Could Spark be going through Hive to reach MySQL? (This is only my guess.)
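Before touching the pom, one quick way to narrow this down is to probe the classpath for the exact class named in the stack trace. This is a sketch of my own, not from the original post; the helper name hasClass is hypothetical, while the class name comes straight from the error:

```scala
// Check whether the SLF4J binder class that Spark's Logging trait looks for
// is actually loadable from the current classpath.
def hasClass(name: String): Boolean =
  try { Class.forName(name); true }
  catch { case _: ClassNotFoundException => false }

val binder = "org.slf4j.impl.StaticLoggerBinder"
if (hasClass(binder))
  println(s"$binder is on the classpath")
else
  println(s"$binder is missing: an SLF4J 1.7.x binding (e.g. slf4j-log4j12) is not present")
```

If this prints "missing" in the same JVM/classpath the job runs with, the NoClassDefFoundError is guaranteed regardless of which Spark code path triggers it first.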

Solution

As an experiment, I added the Hive dependency to the pom:

<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>3.1.2</version>
</dependency>

Then I ran the program again:

Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2021-11-27 00:17:20 [driver-heartbeater] WARN  ProcfsMetricsGetter:69 - Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

Process finished with exit code 0

The build and run now both succeed, so importing the Hive dependency is enough to fix the error, presumably because hive-exec transitively brings an SLF4J binding containing StaticLoggerBinder onto the classpath.
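A more targeted alternative, in case you do not otherwise need Hive, would be to declare an SLF4J 1.7.x binding explicitly. This is my own suggestion, not part of the original fix; the version shown is a commonly used 1.7.x release and should be aligned with the slf4j-api version your Spark build expects:

```xml
<!-- Hypothetical direct fix: provide org.slf4j.impl.StaticLoggerBinder
     via the log4j 1.2 binding instead of pulling it in through hive-exec. -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.30</version>
</dependency>
```

Running `mvn dependency:tree` afterwards lets you confirm exactly one binding ends up on the classpath; multiple bindings trigger SLF4J's "multiple bindings" warning instead.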
