Spark error in Scala: Error initializing SparkContext

While doing log analysis, Spark hit a truly baffling problem: after updating the code, running it in the local environment failed with the error below, yet the same code ran fine on the YARN cluster.

2017-08-29 09:46:30 [org.apache.hadoop.util.NativeCodeLoader]-[WARN] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-08-29 09:46:32 []-[WARN] Your hostname, USER-20151116VQ resolves to a loopback/non-reachable address: fe80:0:0:0:e08b:ef97:c992:73ae%18, but we couldn't find any external IP address!
2017-08-29 09:46:33 [org.apache.spark.SparkContext]-[ERROR] Error initializing SparkContext.
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:944)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:658)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:786)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
    at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:129)
    at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:116)
    at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:79)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:63)
    at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:63)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:63)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:208)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:150)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:484)
    at org.richinfo.se.process.clean.ClientServiceLogClean$.run(ClientServiceLogClean.scala:136)
    at org.richinfo.se.process.clean.ClientServiceLogClean$.main(ClientServiceLogClean.scala:112)
    at org.richinfo.se.process.clean.ClientServiceLogClean.main(ClientServiceLogClean.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)

The exception tells me the jars conflict and the context cannot be initialized. Specifically, a SecurityException saying "signer information does not match" means the classpath contains two jars that both provide classes in the javax.servlet package, and one of them is signed while the other is not, so the JVM refuses to define the second class. These are my dependencies:

    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.richinfo.se</groupId>
            <artifactId>se-common-usertransforms</artifactId>
            <version>0.0.2-RELEASES</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.30</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpmime</artifactId>
            <version>4.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.noggit</groupId>
            <artifactId>noggit</artifactId>
            <version>0.5</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0-cdh5.8.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.solr</groupId>
            <artifactId>solr-solrj</artifactId>
            <version>4.6.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.0-cdh5.8.0</version>
        </dependency>
    </dependencies>

No matter how I checked, I could not pin down which jar was at fault. My guess was that something pulled in by solr-solrj was clashing with the Hadoop dependencies inside Spark. Eventually I found a somewhat similar case on Stack Overflow:

https://stackoverflow.com/questions/29742480/spark-runtime-error-spark-metrics-sink-metricsservlet-cannot-be-instantialized
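In hindsight, a quicker way to hunt down this kind of clash is to ask Maven which artifacts contribute the javax.servlet classes. The following is a standard maven-dependency-plugin invocation (the -Dincludes filter here matches on groupId); run it in the project root:

    mvn dependency:tree -Dverbose -Dincludes=javax.servlet

Any artifact that shows up more than once in the output under different parents is a candidate for the signed-vs-unsigned conflict.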
I tried changing the Maven import order, moving the Spark dependencies below the scala-library dependency, and the program ran through successfully.

Takeaway: even the order in which dependencies are declared can cause a whole series of problems, and they are hard to track down; the tiniest detail can produce a very confusing error, so patience and carefulness are essential.
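Reordering works because Maven builds the classpath in declaration order, so when two jars contain the same package, the earlier one wins. A more deterministic fix is to keep only one copy of the servlet classes on the classpath. The sketch below is one way to do that; it assumes the unsigned copy arrives transitively through javax.servlet:servlet-api (commonly dragged in by the Hadoop client), which should be confirmed with dependency:tree first:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.6.0-cdh5.8.0</version>
            <!-- Assumption: the unsigned javax.servlet classes come in via the
                 transitive javax.servlet:servlet-api of the CDH Hadoop client.
                 Excluding it leaves only Spark's signed Jetty servlet jar. -->
            <exclusions>
                <exclusion>
                    <groupId>javax.servlet</groupId>
                    <artifactId>servlet-api</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

With the exclusion in place, the build no longer depends on declaration order at all.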
