While deploying a Spark application recently, the following exception was thrown:
Exception in thread "Driver" java.lang.LinkageError: loader constraint violation: when resolving method "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;" the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, org/slf4j/LoggerFactory, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type org/slf4j/ILoggerFactory used in the signature
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:335)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:283)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:304)
    at com.coupang.fds.detection.OrderDetectionStreaming.<clinit>(OrderDetectionStreaming.java:31)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:645)
Based on earlier notes:
- By default, Spark loads the bundled dependencies listed in /etc/spark/conf/classpath.txt first;
- If a class is not found there, Spark falls back to the jars the user submitted via --jars (which end up on the driver's and executors' classpaths);
- If both locations contain a jar with the same name but a different version, a java.lang.LinkageError is thrown and the user must resolve the conflict;
- Set spark.{driver,executor}.userClassPathFirst=true (via --conf) to give the user-supplied jars priority;
- Set spark.{driver,executor}.extraClassPath=conflict-jar (via --conf) to prepend a specific jar and resolve a same-name conflict;
There are two ways to resolve such a conflict. The first is simply not to include the conflicting jar when submitting with --jars. The second, when the newer version of the jar must be used, is to set the spark.{driver,executor}.extraClassPath property so that the user's version is loaded first.
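As a sketch of the second approach, the submit command could look like the following. Note the jar file names, versions, and paths are illustrative placeholders, not taken from the original deployment; only the main class name comes from the stack trace above.

```shell
# Illustrative spark-submit invocation: make the driver and executors load
# our slf4j jar ahead of the cluster's bundled copy via extraClassPath.
# Jar names, versions, and paths below are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.coupang.fds.detection.OrderDetectionStreaming \
  --conf spark.driver.extraClassPath=slf4j-api-1.7.25.jar \
  --conf spark.executor.extraClassPath=slf4j-api-1.7.25.jar \
  --jars /path/to/slf4j-api-1.7.25.jar \
  fds-detection.jar
```

Alternatively, `--conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true` gives all user-supplied jars priority at once, at the cost of potentially surfacing other conflicts.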
From this we can tell that the problem above is an slf4j version conflict; the next step is to find out exactly which versions are in conflict.
TIPS:
To see which version of a class is actually being loaded, add the following when running spark-submit:
--driver-java-options -verbose:class
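For example, the verbose class-loading output can be filtered down to the slf4j entries (the application jar name here is a placeholder; on Java 8 each line looks like `[Loaded org.slf4j.LoggerFactory from file:/...]`, so the source jar appears on the same line):

```shell
# Print every class the driver JVM loads together with its source jar,
# then keep only the slf4j entries to see where each version comes from.
# "fds-detection.jar" is a placeholder for the actual application jar.
spark-submit \
  --driver-java-options -verbose:class \
  --class com.coupang.fds.detection.OrderDetectionStreaming \
  fds-detection.jar 2>&1 | grep 'org.slf4j'
```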
References:
https://blog.csdn.net/adorechen/article/details/80109625
https://blog.csdn.net/adorechen/article/details/80110272
https://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management