Resolving java.lang.NoClassDefFoundError
In plain terms, this exception means the code compiled successfully (the dependency was found at compile time) but the class cannot be found at runtime. Since it only surfaces at runtime, debugging it requires understanding that Java compiles against one classpath and loads classes against another, and the two need not match.
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/api/java/function/FlatMapFunction
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.function.FlatMapFunction
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 7 more
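The "Caused by" line shows the relationship between the two exceptions: the JVM wraps an underlying ClassNotFoundException in a NoClassDefFoundError when a class that existed at compile time is missing at runtime. A minimal self-contained sketch of the underlying lookup failure (the class name MissingClassDemo and the helper tryLoad are my own illustration, not from the original post):

```java
public class MissingClassDemo {
    // Ask the class loader for a class by name; report whether it was found.
    // ClassNotFoundException is thrown on an explicit lookup like this;
    // NoClassDefFoundError is the JVM's wrapper when the missing class was
    // referenced at compile time, as in the stack trace above.
    static String tryLoad(String className) {
        try {
            Class.forName(className);
            return "found";
        } catch (ClassNotFoundException e) {
            return "missing: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        // java.lang.String is always present on the classpath.
        System.out.println(tryLoad("java.lang.String"));
        // With no Spark jars on the runtime classpath, this lookup fails,
        // which is the same root cause reported in the "Caused by" line.
        System.out.println(tryLoad("org.apache.spark.api.java.function.FlatMapFunction"));
    }
}
```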
In my case, the root cause turned out to be a misconfigured pom.xml (a tiny detail, and with no helpful error message it took a long stretch of trial and error to track down), so I'm recording it here.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.2.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
The culprit is the <scope>provided</scope> element. The provided scope tells Maven the dependency is needed at compile time but will be supplied by the runtime environment, so the jar is left off the runtime classpath (it is not "hidden" so much as deliberately excluded). When you then run the program locally, where nothing actually provides the Spark jars, the class is missing at runtime and you get exactly the error above. Removing the scope element (which defaults the dependency to compile scope) fixes it.
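For reference, a sketch of the working dependency, identical except that the scope element is dropped so the jar stays on the runtime classpath when running locally:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.2.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
        </exclusion>
    </exclusions>
    <!-- no <scope> element: defaults to compile, so the jar is available at runtime -->
</dependency>
```

Note that provided is still the right choice when submitting to a real cluster with spark-submit, since the cluster supplies the Spark jars; the problem only arises when running the program directly.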