Whether a given deploy mode is supported depends on the (clusterManager, deployMode) combination, which SparkSubmit validates up front:
// Spark 2.3.2 SparkSubmit.scala
private def doPrepareSubmitEnvironment(
    args: SparkSubmitArguments,
    conf: Option[HadoopConfiguration] = None)
    : (Seq[String], Seq[String], SparkConf, String) = {
  ...
  // Fail fast, the following modes are not supported or applicable
  (clusterManager, deployMode) match {
    case (STANDALONE, CLUSTER) if args.isPython =>
      printErrorAndExit("Cluster deploy mode is currently not supported for python " +
        "applications on standalone clusters.")
    case (STANDALONE, CLUSTER) if args.isR =>
      printErrorAndExit("Cluster deploy mode is currently not supported for R " +
        "applications on standalone clusters.")
    case (KUBERNETES, _) if args.isPython =>
      printErrorAndExit("Python applications are currently not supported for Kubernetes.")
    case (KUBERNETES, _) if args.isR =>
      printErrorAndExit("R applications are currently not supported for Kubernetes.")
    case (KUBERNETES, CLIENT) =>
      printErrorAndExit("Client mode is currently not supported for Kubernetes.")
    case (LOCAL, CLUSTER) =>
      printErrorAndExit("Cluster deploy mode is not compatible with master \"local\"")
    case (_, CLUSTER) if isShell(args.primaryResource) =>
      printErrorAndExit("Cluster deploy mode is not applicable to Spark shells.")
    case (_, CLUSTER) if isSqlShell(args.mainClass) =>
      printErrorAndExit("Cluster deploy mode is not applicable to Spark SQL shell.")
    case (_, CLUSTER) if isThriftServer(args.mainClass) =>
      printErrorAndExit("Cluster deploy mode is not applicable to Spark Thrift server.")
    case _ =>
  }
  ...
}
From this code we can see which deploy modes are supported for each cluster manager.
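As an illustration, the fail-fast pattern match above can be mimicked in a standalone sketch. The constant names mirror SparkSubmit's internal ones, but this is a hypothetical simplification, not the real Spark API: it returns the error message instead of calling printErrorAndExit, and only covers a few of the cases.

```scala
// Hypothetical, self-contained sketch of SparkSubmit's fail-fast check.
object DeployModeCheck {
  sealed trait ClusterManager
  case object STANDALONE extends ClusterManager
  case object KUBERNETES extends ClusterManager
  case object LOCAL extends ClusterManager
  case object YARN extends ClusterManager

  sealed trait DeployMode
  case object CLIENT extends DeployMode
  case object CLUSTER extends DeployMode

  // Returns Some(errorMessage) for an unsupported combination, None otherwise.
  def check(cm: ClusterManager, dm: DeployMode, isPython: Boolean): Option[String] =
    (cm, dm) match {
      case (STANDALONE, CLUSTER) if isPython =>
        Some("Cluster deploy mode is currently not supported for python " +
          "applications on standalone clusters.")
      case (KUBERNETES, CLIENT) =>
        Some("Client mode is currently not supported for Kubernetes.")
      case (LOCAL, CLUSTER) =>
        Some("Cluster deploy mode is not compatible with master \"local\"")
      case _ => None
    }

  def main(args: Array[String]): Unit = {
    // yarn-cluster is allowed; local-cluster is rejected.
    assert(check(YARN, CLUSTER, isPython = false).isEmpty)
    assert(check(LOCAL, CLUSTER, isPython = false).isDefined)
  }
}
```

Returning an Option keeps the sketch testable; the real code exits the JVM on the first unsupported match.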
Specifying the application name
$ spark-sql --name testspark
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2023-11-09 15:21:16,551 [WARN] [main] HiveConf: HiveConf of name hive.metastore.db.encoding does not exist
2023-11-09 15:21:16,552 [WARN] [main] HiveConf: HiveConf of name hive.hwi.listen.host does not exist
2023-11-09 15:21:16,552 [WARN] [main] HiveConf: HiveConf of name hive.hwi.listen.port does not exist
2023-11-09 15:21:17,732 [WARN] [main] Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
2023-11-09 15:21:18,131 [WARN] [main] Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Spark master: yarn, Application Id: application_1698156918462_0615
spark-sql>
Getting the Spark application id
scala> spark.sparkContext.applicationId
res0: String = application_1698156918462_0617
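On YARN the application id follows the fixed format application_&lt;clusterTimestamp&gt;_&lt;sequenceNumber&gt;, as seen above. If you need the parts individually (e.g. to correlate with YARN logs), a small parser is enough. AppIdParser and YarnAppId are hypothetical helper names, not a Spark or Hadoop API:

```scala
// Hypothetical helper: split a YARN application id string, such as the one
// returned by spark.sparkContext.applicationId when running on YARN, into
// its cluster timestamp and sequence number.
object AppIdParser {
  case class YarnAppId(clusterTimestamp: Long, sequence: Int)

  def parse(appId: String): Option[YarnAppId] = appId.split("_") match {
    case Array("application", ts, seq) =>
      try Some(YarnAppId(ts.toLong, seq.toInt))
      catch { case _: NumberFormatException => None }
    case _ => None // not in application_<timestamp>_<sequence> form
  }

  def main(args: Array[String]): Unit = {
    // The id from the spark-shell session above.
    assert(parse("application_1698156918462_0617")
      .contains(YarnAppId(1698156918462L, 617)))
  }
}
```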