Spark Troubleshooting - scala.Predef$.$scope()Lscala/xml/TopScope$ and "not found: type Application" exceptions

Running a program with IntelliJ IDEA + Scala + Spark produces the errors below.

Problem 1: java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/03 22:35:16 INFO SparkContext: Running Spark version 2.1.0
17/10/03 22:35:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/03 22:35:17 INFO SecurityManager: Changing view acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing view acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Administrator); groups with view permissions: Set(); users  with modify permissions: Set(Administrator); groups with modify permissions: Set()
17/10/03 22:35:18 INFO Utils: Successfully started service 'sparkDriver' on port 63233.
17/10/03 22:35:18 INFO SparkEnv: Registering MapOutputTracker
17/10/03 22:35:18 INFO SparkEnv: Registering BlockManagerMaster
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/10/03 22:35:18 INFO DiskBlockManager: Created local directory at C:\Users\Administrator\AppData\Local\Temp\blockmgr-7d37f54c-7f7d-4452-bbe1-edd74a1b3cef
17/10/03 22:35:18 INFO MemoryStore: MemoryStore started with capacity 908.1 MB
17/10/03 22:35:18 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:65)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:82)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:220)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at cn.hadron.JoinDemo$.main(JoinDemo.scala:10)
    at cn.hadron.JoinDemo.main(JoinDemo.scala)
17/10/03 22:35:18 INFO DiskBlockManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877\userFiles-e03aaa25-fd89-45a5-9917-bde095172ac8
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877

Process finished with exit code 1

Solution:

The root cause is a dependency version mismatch: the _2.10 suffix in the artifact ID means spark-core was built against Scala 2.10, but the project compiles and runs with a Scala 2.11 SDK, where Predef.$scope no longer exists (the XML support was split out into the scala-xml module). Change the original pom.xml from:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.1.0</version>
</dependency>

to the build that matches the project's Scala 2.11 SDK:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.1</version>
</dependency>
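To confirm that the Scala version on the classpath really matches the artifact suffix, a quick runtime check can help (a minimal sketch; the object name is illustrative):

```scala
// Prints the version of the scala-library actually on the classpath.
// For spark-core_2.11 this should report a 2.11.x version; a mismatch
// between this number and the artifact suffix causes NoSuchMethodError
// failures like the one above.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    val v = scala.util.Properties.versionNumberString // e.g. "2.11.8"
    println(s"scala-library on classpath: $v")
  }
}
```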

Running again, the first error is gone, but a new one appears.

Problem 2: not found: type Application

Error:(7, 20) not found: type Application
object App extends Application {

Solution:

Reference:
http://stackoverflow.com/questions/26176509/why-does-2-11-1-fail-with-error-not-found-type-application
Application was deprecated in Scala 2.9 and removed in Scala 2.11 (it still exists in Scala 2.10); use App instead.

The Scala 2.11 branch on GitHub contains only App.scala, while the 2.10 branch still has both App.scala and Application.scala, the latter carrying a deprecation warning.

Since Application is obsolete and the IDE-generated App.scala extends it, simply delete the generated App.scala file (or change it to extend App instead).
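If you would rather keep an entry-point object than delete the file, the migration is a one-line change: extend App instead of Application (the object name below is illustrative, not from the original project):

```scala
// Scala 2.11 removed scala.Application; the App trait is its replacement.
// Statements in the object body are executed when main() is invoked.
object JoinDemoApp extends App {
  println("application started")
}
```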

Run again; everything works.
