[spark-src-core] Using SPARK_PRINT_LAUNCH_COMMAND to output more details

With both the SPARK_PRINT_LAUNCH_COMMAND environment variable and the --verbose flag enabled, spark-submit prints a much more detailed account of the launch: the full JVM command line it is about to run, the parsed arguments, and the resolved Spark properties:
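For reference, a minimal way to turn both switches on from the shell (a sketch; the class and jar are the ones used in the transcript below, so adjust paths to your own installation, and any non-empty value of the variable is enough):

export SPARK_PRINT_LAUNCH_COMMAND=1   # enables the "Spark Command: ..." dump from the launcher
spark-submit --master yarn --verbose \
    --class org.apache.spark.examples.JavaWordCount \
    lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE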

 

hadoop@GZsw04:~/spark/spark-1.4.1-bin-hadoop2.4$ spark-submit --master yarn --verbose --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE

Spark Command: /usr/local/jdk/jdk1.6.0_31/bin/java -cp /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/usr/local/hadoop/hadoop-2.5.2/etc/hadoop/ -Xms6g -Xmx6g -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master yarn --class org.apache.spark.examples.JavaWordCount --verbose lib/spark-examples-1.4.1-hadoop2.4.0-my.jar RELEASE

========================================

Using properties file: /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf

Adding default property: spark.executor.extraJavaOptions=-Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5

Adding default property: spark.eventLog.enabled=true

Adding default property: spark.ui.port=7106

Adding default property: spark.cores.max=50

Adding default property: spark.storage.memoryFraction=0.5

Adding default property: spark.driver.memory=6g

Adding default property: spark.worker.ui.port=7105

Adding default property: spark.master.ui.port=7102

Adding default property: spark.executor.memory=2g

Adding default property: spark.eventLog.dir=/home/hadoop/spark/spark-eventlog

Adding default property: spark.executor.cores=2

Adding default property: spark.driver.allowMultipleContexts=true

Parsed arguments:

  master                  yarn

  deployMode              null

  executorMemory          2g

  executorCores           2

  totalExecutorCores      50

  propertiesFile          /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf

  driverMemory            6g

  driverCores             null

  driverExtraClassPath    null

  driverExtraLibraryPath  null

  driverExtraJavaOptions  null

  supervise               false

  queue                   null

  numExecutors            null

  files                   null

  pyFiles                 null

  archives                null

  mainClass               org.apache.spark.examples.JavaWordCount

  primaryResource         file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar

  name                    org.apache.spark.examples.JavaWordCount

  childArgs               [RELEASE]

  jars                    null

  packages                null

  repositories            null

  verbose                 true

 

Spark properties used, including those specified through

 --conf and those from the properties file /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/spark-defaults.conf:

  spark.driver.memory -> 6g

  spark.executor.memory -> 2g

  spark.eventLog.enabled -> true

  spark.driver.allowMultipleContexts -> true

  spark.cores.max -> 50

  spark.ui.port -> 7106

  spark.executor.extraJavaOptions -> -Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5

  spark.eventLog.dir -> /home/hadoop/spark/spark-eventlog

  spark.worker.ui.port -> 7105

  spark.storage.memoryFraction -> 0.5

  spark.master.ui.port -> 7102

  spark.executor.cores -> 2

 

    

Main class:

org.apache.spark.examples.JavaWordCount

Arguments:

RELEASE

System properties:

spark.driver.memory -> 6g

spark.executor.memory -> 2g

spark.eventLog.enabled -> true

spark.driver.allowMultipleContexts -> true

spark.cores.max -> 50

SPARK_SUBMIT -> true

spark.ui.port -> 7106

spark.executor.extraJavaOptions -> -Xloggc:~/spark-executor.gc -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=2 -XX:CMSInitiatingOccupancyFraction=65 -XX:+UseCMSInitiatingOccupancyOnly -XX:PermSize=64m -XX:MaxPermSize=256m -XX:NewRatio=5 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+PrintGCDateStamps -XX:+PrintGCDetails -XX:ParallelGCThreads=5

spark.app.name -> org.apache.spark.examples.JavaWordCount

spark.jars -> file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar

spark.eventLog.dir -> /home/hadoop/spark/spark-eventlog

spark.worker.ui.port -> 7105

spark.master -> yarn-client

spark.executor.cores -> 2

spark.master.ui.port -> 7102

spark.storage.memoryFraction -> 0.5

Classpath elements:

 

file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0-my.jar
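To get the launch-command dump for every submission rather than just the current shell session, the variable can also be exported from conf/spark-env.sh, which the launch scripts source on each invocation; a minimal sketch:

# conf/spark-env.sh  (sourced by the spark-* launch scripts at startup)
export SPARK_PRINT_LAUNCH_COMMAND=1

The --verbose flag still has to be passed per invocation, since it is a spark-submit argument rather than an environment setting.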

 
