Spark 1.0.1 has been released. It fixes the variable type-checking bug in 1.0.0, but introduces an import problem in the REPL.
Edit spark-1.0.1/repl/src/main/scala/org/apache/spark/repl/SparkImports.scala and comment out the following lines, which were newly added in 1.0.1:
// case x: ClassHandler =>
// I am trying to guess if the import is a defined class
// This is an ugly hack, I am not 100% sure of the consequences.
// Here we, let everything but "defined classes" use the import with val.
// The reason for this is, otherwise the remote executor tries to pull the
// classes involved and may fail.
// for (imv <- x.definedNames) {
// val objName = req.lineRep.readPath
// code.append("import " + objName + ".INSTANCE" + req.accessPath + ".`" + imv + "`\n")
// }
Recompile the modified file against the assembly jar:
scalac -cp spark-assembly-1.0.1-2.2.0.jar repl/src/main/scala/org/apache/spark/repl/SparkImports.scala
Update the corresponding class files inside the jar:
jar uvf spark-assembly-1.0.1-2.2.0.jar org/apache/spark/repl/*
Finally, replace the original jar with the patched one.
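Putting the steps above together, here is a rough shell sketch of the whole workaround. It assumes you are in the spark-1.0.1 source directory and have already commented out the lines shown above; the deployment destination at the end is an assumption and should be adjusted to your installation.

```shell
cd spark-1.0.1

# 1. Recompile the patched file against the assembly jar. By default scalac
#    writes the resulting .class files under the current directory, following
#    the package structure (org/apache/spark/repl/).
scalac -cp spark-assembly-1.0.1-2.2.0.jar \
    repl/src/main/scala/org/apache/spark/repl/SparkImports.scala

# 2. Overwrite the stale class files inside the assembly jar in place
#    (jar's "u" flag updates existing entries).
jar uvf spark-assembly-1.0.1-2.2.0.jar org/apache/spark/repl/*

# 3. Deploy: copy the patched jar over the original on every node that loads
#    it (the lib/ path below is an assumption, not from the original note).
# cp spark-assembly-1.0.1-2.2.0.jar "$SPARK_HOME/lib/"
```

Because the assembly jar is only updated in place rather than rebuilt, the rest of the distribution is untouched; the patched jar must be distributed to all machines that previously used the original one.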