Learning Spark from the Example Source Code (4): DriverSubmissionTest

First, run it:

jpan@jpan-Beijing:~/Software/spark-0.9.1$ ./bin/run-example org.apache.spark.examples.DriverSubmissionTest 3
Environment variables containing SPARK_TEST:
System properties containing spark.test:
Alive for 1 out of 3 seconds
Alive for 2 out of 3 seconds

The printed output looks a bit odd: nothing appears after "SPARK_TEST". Let's look at the source code to see why.

/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.examples

import scala.collection.JavaConversions._

/** Prints out environmental information, sleeps, and then exits. Made to
  * test driver submission in the standalone scheduler. */
object DriverSubmissionTest {
  def main(args: Array[String]) {
    if (args.size < 1) {
      println("Usage: DriverSubmissionTest <seconds-to-sleep>")
      System.exit(0)
    }
    val numSecondsToSleep = args(0).toInt

    val env = System.getenv()
    val properties = System.getProperties()

    println("Environment variables containing SPARK_TEST:")
    env.filter{case (k, v) => k.contains("SPARK_TEST")}.foreach(println)

    println("System properties containing spark.test:")
    properties.filter{case (k, v) => k.toString.contains("spark.test")}.foreach(println)

    for (i <- 1 until numSecondsToSleep) {
      println(s"Alive for $i out of $numSecondsToSleep seconds")
      Thread.sleep(1000)
    }
  }
}
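Rather than editing the example, we can also give the JVM something for the second filter to find by injecting a matching system property ourselves. A minimal sketch (the property name `spark.test.demo` is made up for illustration):

```scala
import scala.collection.JavaConverters._

object SparkTestPropDemo {
  def main(args: Array[String]): Unit = {
    // Inject a property whose key contains "spark.test", so the
    // same filter used by DriverSubmissionTest has a match.
    System.setProperty("spark.test.demo", "hello")

    System.getProperties.asScala
      .filter { case (k, _) => k.contains("spark.test") }
      .foreach(println)   // prints (spark.test.demo,hello)
  }
}
```

For the environment-variable half, you would instead export a variable whose name contains SPARK_TEST before launching, e.g. `SPARK_TEST_FOO=bar ./bin/run-example ...`.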

From the source we can see that the program prints any environment variables containing SPARK_TEST and any system properties containing spark.test. But on my machine, env and properties look like this (captured in spark-shell):

scala> val env = System.getenv()
env: java.util.Map[String,String] = {TERM=xterm, XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0, HADOOP_COMMON_LIB_NATIVE_DIR=/home/jpan/Software/hadoop-2.2.0/lib/native, SSH_AGENT_PID=2091, JAVA_HOME=/usr/share/jdk1.7.0_51, SSH_AGENT_LAUNCHER=upstart, LESSCLOSE=/usr/bin/lesspipe %s %s, UPSTART_SESSION=unix:abstract=/com/ubuntu/upstart-session/1000/2035, SESSION_MANAGER=local/jpan-Beijing:@/tmp/.ICE-unix/2199,unix/jpan-Beijing:/tmp/.ICE-unix/2199, LC_NUMERIC=zh_CN.UTF-8, GNOME_DESKTOP_SESSION_ID=this-is-deprecated, COMPIZ_CONFIG_PROFILE=ubuntu, IM_CONFIG_PHASE=1, GDMSESSION=ubuntu, MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path, PWD=/home/jpan/Software/spark-0.9.1, SESSIONTYPE=gnome-session, GTK_IM_MODULE=ibus, MASTER=spark://jpan-Beijing:7077, XDG_GREETER_DATA_DIR=/va...
scala> val properties = System.getProperties()
properties: java.util.Properties = 
{java.runtime.name=Java(TM) SE Runtime Environment, =/usr/share/jdk1.7.0_51/jre/lib/amd64, java.vm.version=24.51-b03, java.vm.vendor=Oracle Corporation, java.vendor.url=http://java.oracle.com/, path.separator=:, java.vm.name=Java HotSpot(TM) 64-Bit Server VM, file.encoding.pkg=sun.io, user.country=US, sun.java.launcher=SUN_STANDARD, sun.os.patch.level=unknown, java.vm.specification.name=Java Virtual Machine Specification, user.dir=/home/jpan/Software/spark-0.9.1, java.runtime.version=1.7.0_51-b13, java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment, java.endorsed.dirs=/usr/share/jdk1.7.0_51/jre/lib/endorsed, os.arch=amd64, java.io.tmpdir=/tmp, line.separator=
, java.vm.specification.vendor=Oracle Corporation, os.name=Linux, sun.jn...

Neither contains anything matching SPARK_TEST or spark.test, which is why nothing was printed. So I built my own project and changed SPARK_TEST to MASTER, and spark.test to sun.boot.library.path. The run output:

jpan@jpan-Beijing:~/Mywork/spark_test/DriverSubmissionTest$ sbt "project driversubmissiontest" "run 3"
[info] Set current project to DriverSubmissionTest (in build file:/home/jpan/Mywork/spark_test/DriverSubmissionTest/)
[info] Set current project to DriverSubmissionTest (in build file:/home/jpan/Mywork/spark_test/DriverSubmissionTest/)
[info] Compiling 1 Scala source to /home/jpan/Mywork/spark_test/DriverSubmissionTest/target/scala-2.10/classes...
[info] Running main.scala.DriverSubmissionTest 3
Environment variables containing SPARK_TEST:
(MASTER,spark://jpan-Beijing:7077)
System properties containing spark.test:
(sun.boot.library.path,/usr/share/jdk1.7.0_51/jre/lib/amd64)
Alive for 1 out of 3 seconds
Alive for 2 out of 3 seconds
[success] Total time: 9 s, completed Jun 4, 2014 3:49:10 PM
Now both filters find a match.
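For reference, the change amounts to swapping the two match strings. A minimal sketch of the modified filters (assuming nothing else in the example changed; `sun.boot.library.path` is a property HotSpot JVMs typically define, and MASTER is the env var my spark-shell session sets):

```scala
import scala.collection.JavaConverters._

object ModifiedFilters {
  def main(args: Array[String]): Unit = {
    // "SPARK_TEST" -> "MASTER": matches the MASTER env var, if set.
    System.getenv().asScala
      .filter { case (k, _) => k.contains("MASTER") }
      .foreach(println)

    // "spark.test" -> "sun.boot.library.path": a JVM-defined property.
    System.getProperties.asScala
      .filter { case (k, _) => k.toString.contains("sun.boot.library.path") }
      .foreach(println)
  }
}
```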


Source analysis:

The code is straightforward: it prints some system information, sleeps a few seconds, and exits. The interesting bit is the `case (k, v)` inside `filter`: it is Scala pattern-matching syntax that destructures each map entry, binding the key to `k` and the value to `v`.
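The same `case (k, v)` destructuring works on any Scala Map. A small standalone illustration (the map contents are made up):

```scala
object CaseFilterDemo {
  def main(args: Array[String]): Unit = {
    val env = Map("SPARK_TEST_FOO" -> "bar", "PATH" -> "/usr/bin")

    // { case (k, v) => ... } is a pattern-matching anonymous function:
    // each (key, value) tuple is destructured into k and v.
    val matched = env.filter { case (k, _) => k.contains("SPARK_TEST") }
    println(matched)   // Map(SPARK_TEST_FOO -> bar)
  }
}
```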
