[spark-src-core] 2.1 Relationships between the misc Spark shell scripts

  Similar to other open source projects, Spark ships with several shell scripts; they are listed below.

sbin: server-side scripts
 start-all.sh     starts all the Spark daemons (i.e. start-master.sh, then start-slaves.sh)
 start-master.sh  starts Spark's master process; delegates to "spark-daemon.sh start master.Master"
 start-slaves.sh  starts all the workers; delegates to sbin/slaves.sh
 spark-daemon.sh  spawns any daemon, e.g. via spark-class or spark-submit. Usage:

spark-daemon.sh [--config <conf-dir>] (start|stop|status) <spark-command> <spark-instance-number> <args...>

 for start-master.sh, this delegates to bin/spark-class
 slaves.sh        logs into all the slaves defined in conf/slaves, then issues sbin/start-slave.sh on each
 spark-config.sh  exports some global variables, e.g. SPARK_HOME
 start-slave.sh   delegates to "spark-daemon.sh start deploy.worker.Worker", honoring some settings, e.g. SPARK_WORKER_INSTANCES
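The (start|stop|status) dispatch in spark-daemon.sh revolves around a pid file. A minimal sketch of that pattern (a stand-alone toy, not the real script; `daemon_ctl` and the `sleep` "daemon" are illustrative stand-ins):

```shell
#!/usr/bin/env bash
# Toy sketch of spark-daemon.sh's pid-file based (start|stop|status)
# dispatch. daemon_ctl and the sleep "daemon" are illustrative stand-ins;
# the real script launches the class via spark-class or spark-submit.
pid_file=$(mktemp)

daemon_ctl() {            # usage: daemon_ctl (start|stop|status) <class>
  case "$1" in
    start)  sleep 300 &                  # stand-in for the real daemon
            echo $! > "$pid_file" ;;
    status) if kill -0 "$(cat "$pid_file" 2>/dev/null)" 2>/dev/null; then
              echo "$2 is running."
            else
              echo "$2 is not running."
            fi ;;
    stop)   kill "$(cat "$pid_file")" 2>/dev/null
            : > "$pid_file" ;;
  esac
}

daemon_ctl start  demo.Master
daemon_ctl status demo.Master    # -> demo.Master is running.
daemon_ctl stop   demo.Master
daemon_ctl status demo.Master    # -> demo.Master is not running.
```

The pid file is the whole protocol: status just probes whether the recorded pid is still alive, which is exactly why a crashed daemon can be reported as "not running" without any cluster round-trip.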

    
    
bin
 load-spark-env.sh  loads the file conf/spark-env.sh if it exists
 spark-class        finally spawns the daemon (class) whose command line is returned by executing "org.apache.spark.launcher.Main"; the concrete class is specified by the caller (e.g. spark-daemon.sh). Usage:

Usage: spark-class <class> [<args>]
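The way spark-class consumes the launcher's output can be sketched as follows. This is a bash toy: `launcher_main` is a stub standing in for `java -cp ... org.apache.spark.launcher.Main <class> <args>`, and the real script delimits the emitted arguments with NUL bytes rather than newlines:

```shell
#!/usr/bin/env bash
# Sketch of the spark-class pattern: run the launcher to *print* the final
# command, capture it argument by argument, then execute it. launcher_main
# is a stub for "java -cp ... org.apache.spark.launcher.Main <class> <args>".
launcher_main() {
  # the real launcher inspects env vars and conf, then emits each argument
  # of the final java command on its own line (really NUL-delimited)
  printf '%s\n' echo "would launch:" "$@"
}

build_and_run() {
  local CMD=()
  while IFS= read -r arg; do
    CMD+=("$arg")
  done < <(launcher_main "$@")
  "${CMD[@]}"                # the real script execs here
}

build_and_run org.apache.spark.deploy.master.Master
# -> would launch: org.apache.spark.deploy.master.Master
```

Emitting one argument per delimiter keeps arguments containing spaces intact, which a naive `CMD=$(...)` followed by word splitting would break.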

 
  spark-shell

 an interactive interface to Spark for testing, demoing, and submitting Spark apps.

Its help message is grabbed from spark-submit, and it uses a third-party jar named JLine to give it a shell style.

 it delegates to the spark-submit script, i.e.

bin/spark-submit --class org.apache.spark.repl.Main

 spark-submit  submits a Spark app

unlike start-master.sh and start-slave.sh, this issues a new class:

"spark-class org.apache.spark.deploy.SparkSubmit .."

 run-example   runs one of the examples, given a class-name suffix; delegates to "spark-submit .."

conf
 spark-env.sh

misc Spark cluster settings corresponding to the cluster manager, e.g. SPARK_LOCAL_IP, SPARK_CLASSPATH, SPARK_MASTER_IP, ..
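A minimal conf/spark-env.sh might look like this (all values are illustrative, not recommendations; since load-spark-env.sh sources the file, plain variable assignments suffice):

```shell
# conf/spark-env.sh -- sourced on daemon startup; all values illustrative
SPARK_LOCAL_IP=192.168.0.4          # bind address of this node
SPARK_MASTER_IP=192.168.0.1         # where the master listens
SPARK_WORKER_INSTANCES=2            # worker JVMs per node (see start-slave.sh)
SPARK_WORKER_CORES=4                # cores each worker may hand out
SPARK_CLASSPATH=/opt/extra/jars/*   # extra entries appended to the classpath
```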

 

 

e.g.

hadoop@GZsw04:~$ spark-daemon.sh status org.apache.spark.deploy.master.Master 1
org.apache.spark.deploy.master.Master is running.

 launch.Main.java

  the unified entry point for issuing any class/daemon. E.g., if you want to submit a word-count app, you can do:

run-example JavaWordCount /file/to/count

  then the concrete class named "org.apache.spark.examples.JavaWordCount" is executed via reflection by Main.java. You can also dig into the source of JavaWordCount for more details.
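The suffix-to-class expansion can be sketched in a few lines (an assumption: this mirrors the prefixing run-example performs before handing off to spark-submit; the real script also locates the examples jar):

```shell
#!/usr/bin/env bash
# Sketch of run-example's class-name expansion: a bare suffix such as
# JavaWordCount is prefixed with the examples package before being handed
# to spark-submit. Prefixing logic here is an illustrative assumption.
EXAMPLE_CLASS_PREFIX=org.apache.spark.examples

expand_example_class() {
  case "$1" in
    *.*) echo "$1" ;;                          # already fully qualified
    *)   echo "$EXAMPLE_CLASS_PREFIX.$1" ;;    # bare suffix: add the prefix
  esac
}

expand_example_class JavaWordCount
# -> org.apache.spark.examples.JavaWordCount
```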

 

conclusions:

a. the complete flow of starting up the master:

 [start-all.sh] > start-master.sh > spark-daemon.sh > spark-class > launch.Main > issue the app master.Master

 and the slaves (workers):

 [start-all.sh] > start-slaves.sh > slaves.sh > start-slave.sh > spark-daemon.sh [worker.Worker] > same steps as above
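The master chain above can be replayed with stub functions, each one just logging its name and delegating like its real counterpart (a toy trace only; nothing is actually started):

```shell
#!/usr/bin/env bash
# Toy trace of the master startup chain; every stub logs itself and
# delegates, mirroring the hand-off order of the real scripts.
trace=""
start_all()    { trace+="start-all.sh > ";    start_master; }
start_master() { trace+="start-master.sh > "; spark_daemon start master.Master; }
spark_daemon() { trace+="spark-daemon.sh > "; spark_class "$2"; }
spark_class()  { trace+="spark-class > ";     launch_main "$1"; }
launch_main()  { trace+="launch.Main > issue $1"; }

start_all
echo "$trace"
# -> start-all.sh > start-master.sh > spark-daemon.sh > spark-class > launch.Main > issue master.Master
```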

b. this differs from Hadoop and HBase, since Spark supplies a unified entry point, launch.Main, to fix up the several parameters and combine them.

c. advancing step by step like this reduces duplicated code and improves reusability.
