Today a friend asked how to unit test Spark, so here is a quick write-up of the sbt way.
Spark's test cases can be run with sbt's test commands:
1. Run all test cases
sbt/sbt test
2. Run a single test case
sbt/sbt "test-only *DriverSuite*"
Here is an example.
The test case lives at $SPARK_HOME/core/src/test/scala/org/apache/spark/DriverSuite.scala.
FunSuite is scalatest's test suite base class, and a test class extends it. DriverSuite itself is a regression test: it checks that the driver exits properly after a Spark program finishes normally.
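For reference, a minimal FunSuite subclass looks like the sketch below. SimpleSuite and its test are made up here purely to illustrate the style; they are not part of Spark:

import org.scalatest.FunSuite

// Hypothetical suite, only to show the FunSuite style.
class SimpleSuite extends FunSuite {
  test("addition works") {
    // A failing assert here would mark this test as failed in sbt's report.
    assert(1 + 1 === 2)
  }
}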
Note: I will just use this example to walk through a passing run and a failing run. That has nothing to do with what DriverSuite actually tests; it is purely for demonstration. :)
First, the example that runs and exits normally:
package org.apache.spark
import java.io.File
import org.apache.log4j.Logger
import org.apache.log4j.Level
import org.scalatest.FunSuite
import org.scalatest.concurrent.Timeouts
import org.scalatest.prop.TableDrivenPropertyChecks._
import org.scalatest.time.SpanSugar._
import org.apache.spark.util.Utils
import scala.language.postfixOps
class DriverSuite extends FunSuite with Timeouts {
  test("driver should exit after finishing") {
    val sparkHome = sys.env.get("SPARK_HOME").orElse(sys.props.get("spark.home")).get
    // Regression test for SPARK-530: "Spark driver process doesn't exit after finishing"
    val masters = Table(("master"), ("local"), ("local-cluster[2,1,512]"))
    // Run the driver program once per master URL listed in the table above
    forAll(masters) { (master: String) =>
      // Fail the test if the driver has not exited within 60 seconds
      failAfter(60 seconds) {
        Utils.executeAndGetOutput(
          Seq("./bin/spark-class", "org.apache.spark.DriverWithoutCleanup", master),
          new File(sparkHome),
          Map("SPARK_TESTING" -> "1", "SPARK_HOME" -> sparkHome))
      }
    }
  }
}
/**
 * Program that creates a Spark driver but doesn't call SparkContext.stop() or
 * System.exit() after finishing.
 */
object DriverWithoutCleanup {
  def main(args: Array[String]) {
    // Quiet the logs, then start a driver that does no cleanup of its own
    Logger.getRootLogger().setLevel(Level.WARN)
    val sc = new SparkContext(args(0), "DriverWithoutCleanup")