Initializing Spark
The first thing a Spark program must do is create a SparkContext object, which tells Spark how to access a cluster. To create a SparkContext you first build a SparkConf object that contains information about your application.
- Main entry point for Spark functionality. A SparkContext represents the connection to a Spark cluster and can be used to create RDDs, accumulators, and broadcast variables on that cluster.
- Only one SparkContext may be active per JVM. You must stop() the active SparkContext before creating a new one. This limitation may eventually be removed; see SPARK-2243 for more details.
val conf = new SparkConf().setAppName(appName).setMaster(master)
val sc = new SparkContext(conf)
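Putting the pieces above together, here is a minimal self-contained sketch. The app names and the `local[2]` master URL are placeholders for illustration; in a real deployment the master would typically come from `spark-submit` rather than being hard-coded. It also demonstrates the one-active-SparkContext-per-JVM rule by stopping the first context before creating a second.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object InitDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical app name and master URL, for illustration only.
    val conf = new SparkConf().setAppName("init-demo").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // Use the context to create an RDD.
    val rdd = sc.parallelize(Seq(1, 2, 3))
    println(rdd.sum())

    // Only one SparkContext may be active per JVM:
    // stop this one before creating another.
    sc.stop()
    val sc2 = new SparkContext(
      new SparkConf().setAppName("init-demo-2").setMaster("local[2]"))
    sc2.stop()
  }
}
```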
Getting the Spark application ID
val appId = SparkSQLEnv.sparkContext.applicationId
SparkSession.getActiveSession.map(_.sparkContext.applicationId)
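Note that SparkSQLEnv is internal to the Spark Thrift server, so in an ordinary application you would go through SparkSession instead. A minimal sketch (the app name and `local[2]` master are illustrative assumptions); calling `.orNull` and then dereferencing would throw a NullPointerException when no session is active, so the Option is mapped over instead:

```scala
import org.apache.spark.sql.SparkSession

object AppIdDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("app-id-demo") // hypothetical app name
      .master("local[2]")     // hypothetical master URL
      .getOrCreate()

    // Preferred: read the application ID off the session's SparkContext.
    val appId: String = spark.sparkContext.applicationId

    // If you only have a possibly-active session, map over the Option
    // instead of .orNull followed by a field access (which would NPE).
    val maybeAppId: Option[String] =
      SparkSession.getActiveSession.map(_.sparkContext.applicationId)

    println(appId) // e.g. "local-..." when running against a local master
    spark.stop()
  }
}
```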