WordCount Example

1. WordCount Example

  • The Spark shell is used mostly for testing and verifying programs. In production you typically write the program in an IDE, package it as a jar, and submit it to the cluster. The most common approach is to create a Maven project and let Maven manage the jar dependencies.

1.1 Writing the Program

  • Create a Maven project named WordCount, with package com.spark.day01
  • Prepare the input: create an input folder, create a file inside it, and put some words in it; a sample follows below
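A hypothetical sample (any words separated by single spaces will do, since the program splits on " "):

hello spark
hello hadoop
hello scala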
  • Add the project dependencies
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
<build>
    <finalName>WordCount</finalName>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.4.6</version>
            <executions>
                <execution>
                   <goals>
                      <goal>compile</goal>
                      <goal>testCompile</goal>
                   </goals>
                </execution>
             </executions>
        </plugin>
    </plugins>
</build>
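Note: the _2.11 suffix of spark-core must match the Scala version the project compiles against. If your IDE does not already provide a Scala SDK, you may also need to declare the Scala library explicitly (2.11.8 below is an assumed 2.11.x release; use the one that matches your setup):

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.8</version>
</dependency>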
  • Create a singleton object WordCount and write the code
package com.spark.day01

import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Step-by-step version, kept for reference:
//    // Create the SparkConf configuration
//    val conf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("WordCount")
//
//    // Create the SparkContext object
//    val sc: SparkContext = new SparkContext(conf)
//
//    // Equivalent WordCount in the Spark shell:
//    // sc.textFile("hdfs://hadoop102:8020/input").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
//
//    // Read the external file
//    val textRDD: RDD[String] = sc.textFile("/Users/tiger/Desktop/JinWWProject/Spark_Project/input")
//
//    // Split each line on spaces and flatten the result into individual words
//    val flatMapRDD: RDD[String] = textRDD.flatMap(_.split(" "))
//
//    // Restructure each word into a (word, 1) pair for counting
//    val mapRDD: RDD[(String, Int)] = flatMapRDD.map((_, 1))
//
//    // Sum the counts of each distinct word
//    val reduceRDD: RDD[(String, Int)] = mapRDD.reduceByKey(_ + _)
//
//    // collect gathers the results back to the driver
//    val res: Array[(String, Int)] = reduceRDD.collect()
//
//    res.foreach(println)

    // The same logic as a single chained expression.
    // NOTE: a master set via setMaster takes precedence over the --master flag
    // passed to spark-submit, so remove setMaster("local[*]") when you really
    // want the job to run on YARN.
    val conf: SparkConf = new SparkConf().setMaster("local[*]").setAppName("WordCount")

    // Create the SparkContext object
    val sc: SparkContext = new SparkContext(conf)

    // Business logic: read args(0), count the words, write the result to args(1)
    sc.textFile(args(0)).flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).saveAsTextFile(args(1))

    // Release resources
    sc.stop()
  }
}
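One caveat: saveAsTextFile fails if the output directory already exists, so rerunning against the same args(1) throws an exception. A minimal sketch of a guard you could add inside main just before the save (these are standard Hadoop FileSystem calls; whether to delete automatically is a design choice, and it is only appropriate for throwaway test output):

import org.apache.hadoop.fs.{FileSystem, Path}

// Remove a stale output directory so reruns do not fail.
// WARNING: deletes args(1) recursively.
val fs = FileSystem.get(sc.hadoopConfiguration)
val outPath = new Path(args(1))
if (fs.exists(outPath)) {
  fs.delete(outPath, true) // true = recursive
}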
  • Packaging plugin
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>3.0.0</version>
    <configuration>
        <archive>
            <manifest>
                <mainClass>com.spark.day01.WordCount</mainClass>
            </manifest>
        </archive>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
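To build:

mvn clean package

With this configuration, packaging typically produces two jars under target/: a plain WordCount.jar (the <finalName> above) containing only the project classes, and WordCount-jar-with-dependencies.jar with Spark and its transitive dependencies bundled in. Because Spark is already installed on the cluster, the plain jar is sufficient for the spark-submit step below; the fat jar only matters when the cluster does not provide your dependencies.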
  • Package and test on the cluster
    • Click package, then inspect the resulting jar
    • Upload the jar to the server
    • On HDFS, create the input path /input and upload the data files into it, for example:
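A sketch of the upload (the file name 1.txt is taken from the input split shown in the log below; the prompt assumes the same host used for submission):

[hadoop@master spark-yarn]$ hadoop fs -mkdir -p /input
[hadoop@master spark-yarn]$ hadoop fs -put 1.txt /input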
    • Run the job
[hadoop@master spark-yarn]$ bin/spark-submit --class com.spark.day01.WordCount --master yarn ../../WordCount.jar /input /output
21/03/28 15:27:46 INFO spark.SparkContext: Running Spark version 2.1.1
21/03/28 15:27:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/03/28 15:27:46 INFO spark.SecurityManager: Changing view acls to: hadoop
21/03/28 15:27:46 INFO spark.SecurityManager: Changing modify acls to: hadoop
21/03/28 15:27:46 INFO spark.SecurityManager: Changing view acls groups to: 
21/03/28 15:27:46 INFO spark.SecurityManager: Changing modify acls groups to: 
21/03/28 15:27:46 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoop); groups with view permissions: Set(); users  with modify permissions: Set(hadoop); groups with modify permissions: Set()
21/03/28 15:27:46 INFO util.Utils: Successfully started service 'sparkDriver' on port 34163.
21/03/28 15:27:46 INFO spark.SparkEnv: Registering MapOutputTracker
21/03/28 15:27:46 INFO spark.SparkEnv: Registering BlockManagerMaster
21/03/28 15:27:46 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/03/28 15:27:46 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/03/28 15:27:46 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-e1832b63-681b-4056-9483-c57607b1b292
21/03/28 15:27:46 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
21/03/28 15:27:46 INFO spark.SparkEnv: Registering OutputCommitCoordinator
21/03/28 15:27:46 INFO util.log: Logging initialized @1368ms
21/03/28 15:27:46 INFO server.Server: jetty-9.2.z-SNAPSHOT
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ce5a68e{/jobs,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@9d157ff{/jobs/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f162cc0{/jobs/job,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5df417a7{/jobs/job/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7c041b41{/stages,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7f69d591{/stages/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61078690{/stages/stage,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1cb3ec38{/stages/stage/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@403132fc{/stages/pool,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@71c5b236{/stages/pool/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cab9998{/storage,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2f7a7219{/storage/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@669513d8{/storage/rdd,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a1d593e{/storage/rdd/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a8a60bc{/environment,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@361c294e{/environment/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7859e786{/executors,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@285d851a{/executors/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@314b8f2d{/executors/threadDump,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@664a9613{/executors/threadDump/json,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5118388b{/static,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15a902e7{/,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7876d598{/api,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a3e3e8b{/jobs/job/kill,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5af28b27{/stages/stage/kill,null,AVAILABLE,@Spark}
21/03/28 15:27:46 INFO server.ServerConnector: Started Spark@5471388b{HTTP/1.1}{0.0.0.0:4040}
21/03/28 15:27:46 INFO server.Server: Started @1464ms
21/03/28 15:27:46 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
21/03/28 15:27:46 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.23.4.221:4040
21/03/28 15:27:46 INFO spark.SparkContext: Added JAR file:/opt/software/spark-yarn/../../WordCount.jar at spark://172.23.4.221:34163/jars/WordCount.jar with timestamp 1616916466961
21/03/28 15:27:47 INFO executor.Executor: Starting executor ID driver on host localhost
21/03/28 15:27:47 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40832.
21/03/28 15:27:47 INFO netty.NettyBlockTransferService: Server created on 172.23.4.221:40832
21/03/28 15:27:47 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/03/28 15:27:47 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 172.23.4.221, 40832, None)
21/03/28 15:27:47 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.23.4.221:40832 with 366.3 MB RAM, BlockManagerId(driver, 172.23.4.221, 40832, None)
21/03/28 15:27:47 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 172.23.4.221, 40832, None)
21/03/28 15:27:47 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 172.23.4.221, 40832, None)
21/03/28 15:27:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3163987e{/metrics/json,null,AVAILABLE,@Spark}
21/03/28 15:27:47 INFO scheduler.EventLoggingListener: Logging events to hdfs://master:9000/directory/local-1616916466990
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 240.6 KB, free 366.1 MB)
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.2 KB, free 366.0 MB)
21/03/28 15:27:48 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 172.23.4.221:40832 (size: 23.2 KB, free: 366.3 MB)
21/03/28 15:27:48 INFO spark.SparkContext: Created broadcast 0 from textFile at WordCount.scala:46
21/03/28 15:27:48 INFO mapred.FileInputFormat: Total input paths to process : 1
21/03/28 15:27:48 INFO Configuration.deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
21/03/28 15:27:48 INFO Configuration.deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
21/03/28 15:27:48 INFO Configuration.deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
21/03/28 15:27:48 INFO Configuration.deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
21/03/28 15:27:48 INFO Configuration.deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
21/03/28 15:27:48 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
21/03/28 15:27:48 INFO spark.SparkContext: Starting job: saveAsTextFile at WordCount.scala:46
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Registering RDD 3 (map at WordCount.scala:46)
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Got job 0 (saveAsTextFile at WordCount.scala:46) with 2 output partitions
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Final stage: ResultStage 1 (saveAsTextFile at WordCount.scala:46)
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at map at WordCount.scala:46), which has no missing parents
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.7 KB, free 366.0 MB)
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.7 KB, free 366.0 MB)
21/03/28 15:27:48 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 172.23.4.221:40832 (size: 2.7 KB, free: 366.3 MB)
21/03/28 15:27:48 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:996
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at map at WordCount.scala:46)
21/03/28 15:27:48 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, ANY, 6029 bytes)
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, ANY, 6029 bytes)
21/03/28 15:27:48 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
21/03/28 15:27:48 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
21/03/28 15:27:48 INFO executor.Executor: Fetching spark://172.23.4.221:34163/jars/WordCount.jar with timestamp 1616916466961
21/03/28 15:27:48 INFO client.TransportClientFactory: Successfully created connection to /172.23.4.221:34163 after 20 ms (0 ms spent in bootstraps)
21/03/28 15:27:48 INFO util.Utils: Fetching spark://172.23.4.221:34163/jars/WordCount.jar to /tmp/spark-366476ad-6384-4540-a513-ee84fe92743a/userFiles-8c071240-fac9-483f-b912-9885d17c6c76/fetchFileTemp2962755931681596199.tmp
21/03/28 15:27:48 INFO executor.Executor: Adding file:/tmp/spark-366476ad-6384-4540-a513-ee84fe92743a/userFiles-8c071240-fac9-483f-b912-9885d17c6c76/WordCount.jar to class loader
21/03/28 15:27:48 INFO rdd.HadoopRDD: Input split: hdfs://master:9000/input/1.txt:0+36
21/03/28 15:27:48 INFO rdd.HadoopRDD: Input split: hdfs://master:9000/input/1.txt:36+36
21/03/28 15:27:48 INFO executor.Executor: Finished task 1.0 in stage 0.0 (TID 1). 1818 bytes result sent to driver
21/03/28 15:27:48 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 1818 bytes result sent to driver
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 229 ms on localhost (executor driver) (1/2)
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 212 ms on localhost (executor driver) (2/2)
21/03/28 15:27:48 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/03/28 15:27:48 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (map at WordCount.scala:46) finished in 0.244 s
21/03/28 15:27:48 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/03/28 15:27:48 INFO scheduler.DAGScheduler: running: Set()
21/03/28 15:27:48 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
21/03/28 15:27:48 INFO scheduler.DAGScheduler: failed: Set()
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at saveAsTextFile at WordCount.scala:46), which has no missing parents
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 73.2 KB, free 366.0 MB)
21/03/28 15:27:48 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 26.7 KB, free 365.9 MB)
21/03/28 15:27:48 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 172.23.4.221:40832 (size: 26.7 KB, free: 366.2 MB)
21/03/28 15:27:48 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:996
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at saveAsTextFile at WordCount.scala:46)
21/03/28 15:27:48 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 2 tasks
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 2, localhost, executor driver, partition 0, ANY, 5812 bytes)
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 1.0 (TID 3, localhost, executor driver, partition 1, ANY, 5812 bytes)
21/03/28 15:27:48 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 2)
21/03/28 15:27:48 INFO executor.Executor: Running task 1.0 in stage 1.0 (TID 3)
21/03/28 15:27:48 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
21/03/28 15:27:48 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
21/03/28 15:27:48 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
21/03/28 15:27:48 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
21/03/28 15:27:48 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
21/03/28 15:27:48 INFO output.FileOutputCommitter: File Output Committer Algorithm version is 1
21/03/28 15:27:48 INFO output.FileOutputCommitter: Saved output of task 'attempt_20210328152748_0001_m_000000_2' to hdfs://master:9000/output/_temporary/0/task_20210328152748_0001_m_000000
21/03/28 15:27:48 INFO mapred.SparkHadoopMapRedUtil: attempt_20210328152748_0001_m_000000_2: Committed
21/03/28 15:27:48 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 2). 1890 bytes result sent to driver
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 2) in 100 ms on localhost (executor driver) (1/2)
21/03/28 15:27:48 INFO output.FileOutputCommitter: Saved output of task 'attempt_20210328152748_0001_m_000001_3' to hdfs://master:9000/output/_temporary/0/task_20210328152748_0001_m_000001
21/03/28 15:27:48 INFO mapred.SparkHadoopMapRedUtil: attempt_20210328152748_0001_m_000001_3: Committed
21/03/28 15:27:48 INFO executor.Executor: Finished task 1.0 in stage 1.0 (TID 3). 1890 bytes result sent to driver
21/03/28 15:27:48 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 1.0 (TID 3) in 101 ms on localhost (executor driver) (2/2)
21/03/28 15:27:48 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
21/03/28 15:27:48 INFO scheduler.DAGScheduler: ResultStage 1 (saveAsTextFile at WordCount.scala:46) finished in 0.104 s
21/03/28 15:27:48 INFO scheduler.DAGScheduler: Job 0 finished: saveAsTextFile at WordCount.scala:46, took 0.548428 s
21/03/28 15:27:48 INFO server.ServerConnector: Stopped Spark@5471388b{HTTP/1.1}{0.0.0.0:4040}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5af28b27{/stages/stage/kill,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4a3e3e8b{/jobs/job/kill,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7876d598{/api,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@15a902e7{/,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5118388b{/static,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@664a9613{/executors/threadDump/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@314b8f2d{/executors/threadDump,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@285d851a{/executors/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7859e786{/executors,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@361c294e{/environment/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4a8a60bc{/environment,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3a1d593e{/storage/rdd/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@669513d8{/storage/rdd,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2f7a7219{/storage/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2cab9998{/storage,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@71c5b236{/stages/pool/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@403132fc{/stages/pool,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1cb3ec38{/stages/stage/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@61078690{/stages/stage,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7f69d591{/stages/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7c041b41{/stages,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5df417a7{/jobs/job/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2f162cc0{/jobs/job,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@9d157ff{/jobs/json,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@ce5a68e{/jobs,null,UNAVAILABLE,@Spark}
21/03/28 15:27:48 INFO ui.SparkUI: Stopped Spark web UI at http://172.23.4.221:4040
21/03/28 15:27:48 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/03/28 15:27:48 INFO memory.MemoryStore: MemoryStore cleared
21/03/28 15:27:48 INFO storage.BlockManager: BlockManager stopped
21/03/28 15:27:48 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
21/03/28 15:27:48 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/03/28 15:27:48 INFO spark.SparkContext: Successfully stopped SparkContext
21/03/28 15:27:48 INFO util.ShutdownHookManager: Shutdown hook called
21/03/28 15:27:48 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-366476ad-6384-4540-a513-ee84fe92743a
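Once the job finishes, the word counts are in /output on HDFS and can be inspected directly (same paths as above):

[hadoop@master spark-yarn]$ hadoop fs -cat /output/*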

 
