Method 1: hardcode the master in the code:
val conf: SparkConf = new SparkConf()
conf.setAppName("SparkWC")
conf.setMaster("spark://spark1:7077") // written this way, the job is submitted directly to the cluster (even when run from IDEA)
With the master hardcoded, spark-submit does not need the --master option:
./spark-submit \
--class day06.SparkWC \
--executor-memory 512m \
--total-executor-cores 2 /home/hadoop02/sparkTools/sparkDemo-1.0-SNAPSHOT.jar
Method 2: leave the master unset in the code:
val conf: SparkConf = new SparkConf()
conf.setAppName("SparkWC")
// conf.setMaster("local") — if this line is uncommented with "local", the job runs only locally and cannot run on the cluster from IDEA; leave it out for cluster submission
When submitting with spark-submit, the master must then be specified with the --master option:
./spark-submit \
--class day06.SparkWC \
--master spark://spark1:7077 \
--executor-memory 512m \
--total-executor-cores 2 /home/hadoop02/sparkTools/sparkDemo-1.0-SNAPSHOT.jar
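For reference, a minimal sketch of what a complete day06.SparkWC word-count application might look like using the second approach. Only the class name and app name come from the notes above; the word-count logic and the args(0)/args(1) input/output convention are assumptions for illustration:

```scala
package day06

import org.apache.spark.{SparkConf, SparkContext}

object SparkWC {
  def main(args: Array[String]): Unit = {
    val conf: SparkConf = new SparkConf()
    conf.setAppName("SparkWC")
    // No setMaster here: the master is supplied by spark-submit via --master.
    // For local debugging in IDEA, temporarily add: conf.setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Hypothetical convention: args(0) = input path, args(1) = output path
    sc.textFile(args(0))
      .flatMap(_.split(" "))           // split each line into words
      .map((_, 1))                     // pair each word with a count of 1
      .reduceByKey(_ + _)              // sum the counts per word
      .sortBy(_._2, ascending = false) // most frequent words first
      .saveAsTextFile(args(1))

    sc.stop()
  }
}
```

Because setMaster is absent, the same jar works both on the cluster (via --master spark://spark1:7077) and locally (via --master local), without recompiling.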