Driver-side log recorded after submitting the SparkTC example application with spark-submit in local mode.
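For context, SparkTC computes the transitive closure of a randomly generated edge set by repeatedly joining the edge RDD with itself until the edge count stops growing; the repeated "count at SparkTC.scala:71" jobs in the log below are that convergence check, and the "count at SparkTC.scala:62" / ":76" jobs are the initial and final counts. The following is a minimal sketch of that core loop, not the exact example source: the graph size, seed, and variable names are illustrative, and `sc` is assumed to be an existing SparkContext (as provided by spark-shell or built in the example's main).

// Sketch of the SparkTC transitive-closure loop (approximate; the real source is in Spark's examples module)
import scala.util.Random

val numEdges = 200
val numVertices = 100
val rand = new Random(42)

// Random directed edges, analogous to the example's graph generation (parallelize at line 50 in the log).
val edgeSeq = Seq.fill(numEdges)((rand.nextInt(numVertices), rand.nextInt(numVertices)))
var tc = sc.parallelize(edgeSeq, 2).cache()

// Edges keyed by destination so join() can extend paths by one hop (map at line 58).
val reversed = tc.map { case (from, to) => (to, from) }

var oldCount = 0L
var nextCount = tc.count()          // first count job (line 62)
do {
  oldCount = nextCount
  // (a -> k) joined with (k -> c) yields the new path (a -> c); dedupe and cache (distinct at line 70).
  tc = tc.union(tc.join(reversed).map { case (_, (c, a)) => (a, c) }).distinct().cache()
  nextCount = tc.count()            // per-iteration counts (count at line 71)
} while (nextCount != oldCount)

println(s"TC has ${tc.count()} edges.")  // final count (line 76)

Each do-while iteration adds one new shuffle stage pair in the log (register RDD as shuffle input, then a ShuffleMapStage plus a ResultStage for the count), which is why the job and stage IDs climb as shown below.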

// Log
[spark@node1 spark-2.4.6]$ spark-submit  --class org.apache.spark.examples.SparkTC --master local[4] examples/target/scala-2.11/jars/spark-examples_2.11-2.4.6.jar 1
21/11/11 16:20:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/11/11 16:20:54 INFO spark.SparkContext: Running Spark version 2.4.6
21/11/11 16:20:54 INFO spark.SparkContext: Submitted application: SparkTC
21/11/11 16:20:54 INFO spark.SecurityManager: Changing view acls to: spark
21/11/11 16:20:54 INFO spark.SecurityManager: Changing modify acls to: spark
21/11/11 16:20:54 INFO spark.SecurityManager: Changing view acls groups to: 
21/11/11 16:20:54 INFO spark.SecurityManager: Changing modify acls groups to: 
21/11/11 16:20:54 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(spark); groups with view permissions: Set(); users  with modify permissions: Set(spark); groups with modify permissions: Set()
21/11/11 16:20:55 INFO util.Utils: Successfully started service 'sparkDriver' on port 34028.
21/11/11 16:20:55 INFO spark.SparkEnv: Registering MapOutputTracker
21/11/11 16:20:55 INFO spark.SparkEnv: Registering BlockManagerMaster
21/11/11 16:20:55 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/11/11 16:20:55 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/11/11 16:20:55 INFO storage.DiskBlockManager: Created local directory at /home/spark/data/scratch/blockmgr-2761bf1d-c8f3-48f7-a4b2-4fe1ea90972f
21/11/11 16:20:55 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
21/11/11 16:20:55 INFO spark.SparkEnv: Registering OutputCommitCoordinator
21/11/11 16:20:55 INFO util.log: Logging initialized @1667ms
21/11/11 16:20:55 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
21/11/11 16:20:55 INFO server.Server: Started @1733ms
21/11/11 16:20:55 INFO server.AbstractConnector: Started ServerConnector@28c0b664{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
21/11/11 16:20:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fabf088{/jobs,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6f6a7463{/jobs/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bdaa23d{/jobs/job,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ca320ab{/jobs/job/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50d68830{/stages,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e53135d{/stages/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7674a051{/stages/stage,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@619bd14c{/stages/stage/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@323e8306{/stages/pool,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a23a01d{/stages/pool/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4acf72b6{/storage,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7561db12{/storage/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3301500b{/storage/rdd,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24b52d3e{/storage/rdd/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15deb1dc{/environment,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e9c413e{/environment/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@57a4d5ee{/executors,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5af5def9{/executors/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3a45c42a{/executors/threadDump,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@36dce7ed{/executors/threadDump/json,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@47a64f7d{/static,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@78e16155{/,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@54a3ab8f{/api,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50b0bc4c{/jobs/job/kill,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@c20be82{/stages/stage/kill,null,AVAILABLE,@Spark}
21/11/11 16:20:55 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://node1:4040
21/11/11 16:20:55 INFO spark.SparkContext: Added JAR file:/home/spark/spark-2.4.6/examples/target/scala-2.11/jars/spark-examples_2.11-2.4.6.jar at spark://node1:34028/jars/spark-examples_2.11-2.4.6.jar with timestamp 1636618855467
21/11/11 16:20:55 INFO executor.Executor: Starting executor ID driver on host localhost
21/11/11 16:20:55 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33235.
21/11/11 16:20:55 INFO netty.NettyBlockTransferService: Server created on node1:33235
21/11/11 16:20:55 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/11/11 16:20:55 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, node1, 33235, None)
21/11/11 16:20:55 INFO storage.BlockManagerMasterEndpoint: Registering block manager node1:33235 with 366.3 MB RAM, BlockManagerId(driver, node1, 33235, None)
21/11/11 16:20:55 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, node1, 33235, None)
21/11/11 16:20:55 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, node1, 33235, None)
21/11/11 16:20:55 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f14e5bf{/metrics/json,null,AVAILABLE,@Spark}
21/11/11 16:20:56 INFO scheduler.EventLoggingListener: Logging events to hdfs://node1:9000/spark/history/local-1636618855504
21/11/11 16:20:56 INFO spark.SparkContext: Starting job: count at SparkTC.scala:62
21/11/11 16:20:56 INFO scheduler.DAGScheduler: Got job 0 (count at SparkTC.scala:62) with 1 output partitions
21/11/11 16:20:56 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (count at SparkTC.scala:62)
21/11/11 16:20:56 INFO scheduler.DAGScheduler: Parents of final stage: List()
21/11/11 16:20:56 INFO scheduler.DAGScheduler: Missing parents: List()
21/11/11 16:20:56 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at SparkTC.scala:50), which has no missing parents
21/11/11 16:20:56 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1376.0 B, free 366.3 MB)
21/11/11 16:20:56 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 994.0 B, free 366.3 MB)
21/11/11 16:20:56 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on node1:33235 (size: 994.0 B, free: 366.3 MB)
21/11/11 16:20:56 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at SparkTC.scala:50) (first 15 tasks are for partitions Vector(0))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 11166 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
21/11/11 16:20:57 INFO executor.Executor: Fetching spark://node1:34028/jars/spark-examples_2.11-2.4.6.jar with timestamp 1636618855467
21/11/11 16:20:57 INFO client.TransportClientFactory: Successfully created connection to node1/114.212.82.49:34028 after 60 ms (0 ms spent in bootstraps)
21/11/11 16:20:57 INFO util.Utils: Fetching spark://node1:34028/jars/spark-examples_2.11-2.4.6.jar to /home/spark/data/scratch/spark-5f80c079-b023-42f5-925c-ace5c7dd12e0/userFiles-5ec402f7-d708-4d70-8562-b2cba5e342a3/fetchFileTemp2941940833623352070.tmp
21/11/11 16:20:57 INFO executor.Executor: Adding file:/home/spark/data/scratch/spark-5f80c079-b023-42f5-925c-ace5c7dd12e0/userFiles-5ec402f7-d708-4d70-8562-b2cba5e342a3/spark-examples_2.11-2.4.6.jar to class loader
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_0_0 stored as values in memory (estimated size 7.0 KB, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_0_0 in memory on node1:33235 (size: 7.0 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 832 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 368 ms on localhost (executor driver) (1/1)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ResultStage 0 (count at SparkTC.scala:62) finished in 0.537 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Job 0 finished: count at SparkTC.scala:62, took 0.626544 s
21/11/11 16:20:57 INFO spark.SparkContext: Starting job: count at SparkTC.scala:71
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 1 (map at SparkTC.scala:58) as input to shuffle 1
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 0 (parallelize at SparkTC.scala:50) as input to shuffle 0
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 7 (distinct at SparkTC.scala:70) as input to shuffle 2
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Got job 1 (count at SparkTC.scala:71) with 2 output partitions
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Final stage: ResultStage 4 (count at SparkTC.scala:71)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 3)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 3)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 1 (MapPartitionsRDD[1] at map at SparkTC.scala:58), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 2.7 KB, free 366.3 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1783.0 B, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on node1:33235 (size: 1783.0 B, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 1 (MapPartitionsRDD[1] at map at SparkTC.scala:58) (first 15 tasks are for partitions Vector(0))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 2 (ParallelCollectionRDD[0] at parallelize at SparkTC.scala:50), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.2 KB, free 366.3 MB)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 11155 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1443.0 B, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on node1:33235 (size: 1443.0 B, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 2 (ParallelCollectionRDD[0] at parallelize at SparkTC.scala:50) (first 15 tasks are for partitions Vector(0))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, executor driver, partition 0, PROCESS_LOCAL, 11155 bytes)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_0_0 locally
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 2.0 (TID 2)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_0_0 locally
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1017 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 2.0 (TID 2). 1017 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 34 ms on localhost (executor driver) (1/1)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 23 ms on localhost (executor driver) (1/1)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 1 (map at SparkTC.scala:58) finished in 0.058 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set(ShuffleMapStage 2)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ShuffleMapStage 3, ResultStage 4)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 2 (parallelize at SparkTC.scala:50) finished in 0.041 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ShuffleMapStage 3, ResultStage 4)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 3 (MapPartitionsRDD[7] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 4.8 KB, free 366.3 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 2.7 KB, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on node1:33235 (size: 2.7 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 3 (MapPartitionsRDD[7] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 3.0 with 2 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, executor driver, partition 0, PROCESS_LOCAL, 11264 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 3.0 (TID 4, localhost, executor driver, partition 1, PROCESS_LOCAL, 7823 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 3.0 (TID 4)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 3.0 (TID 3)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_0_0 locally
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 3.0 (TID 3). 1104 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 72 ms on localhost (executor driver) (1/2)
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 3.0 (TID 4). 1319 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 3.0 (TID 4) in 104 ms on localhost (executor driver) (2/2)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 3 (distinct at SparkTC.scala:70) finished in 0.127 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 4)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[9] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 3.6 KB, free 366.3 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 2.1 KB, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on node1:33235 (size: 2.1 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 4 (MapPartitionsRDD[9] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 4.0 with 2 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 4.0 (TID 5, localhost, executor driver, partition 0, ANY, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 4.0 (TID 6, localhost, executor driver, partition 1, ANY, 7662 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 4.0 (TID 5)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 4.0 (TID 6)
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_9_0 stored as values in memory (estimated size 10.7 KB, free 366.3 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_9_0 in memory on node1:33235 (size: 10.7 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 4.0 (TID 5). 1176 bytes result sent to driver
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_9_1 stored as values in memory (estimated size 11.6 KB, free 366.2 MB)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 4.0 (TID 5) in 30 ms on localhost (executor driver) (1/2)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_9_1 in memory on node1:33235 (size: 11.6 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 4.0 (TID 6). 1219 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 4.0 (TID 6) in 33 ms on localhost (executor driver) (2/2)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ResultStage 4 (count at SparkTC.scala:71) finished in 0.050 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Job 1 finished: count at SparkTC.scala:71, took 0.281312 s
21/11/11 16:20:57 INFO spark.SparkContext: Starting job: count at SparkTC.scala:71
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 9 (distinct at SparkTC.scala:70) as input to shuffle 3
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 1 (map at SparkTC.scala:58) as input to shuffle 4
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Registering RDD 15 (distinct at SparkTC.scala:70) as input to shuffle 5
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Got job 2 (count at SparkTC.scala:71) with 4 output partitions
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Final stage: ResultStage 11 (count at SparkTC.scala:71)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 10)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 10)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 8 (MapPartitionsRDD[9] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 3.6 KB, free 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 2.1 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on node1:33235 (size: 2.1 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 8 (MapPartitionsRDD[9] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 8.0 with 2 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 8.0 (TID 7, localhost, executor driver, partition 0, PROCESS_LOCAL, 7651 bytes)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 9 (MapPartitionsRDD[1] at map at SparkTC.scala:58), which has no missing parents
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 8.0 (TID 8, localhost, executor driver, partition 1, PROCESS_LOCAL, 7651 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 8.0 (TID 8)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 8.0 (TID 7)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_6 stored as values in memory (estimated size 2.7 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_9_1 locally
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 1786.0 B, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_9_0 locally
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on node1:33235 (size: 1786.0 B, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 9 (MapPartitionsRDD[1] at map at SparkTC.scala:58) (first 15 tasks are for partitions Vector(0))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 9.0 with 1 tasks
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 8.0 (TID 8). 975 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 9.0 (TID 9, localhost, executor driver, partition 0, PROCESS_LOCAL, 11155 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 9.0 (TID 9)
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 8.0 (TID 7). 975 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 8.0 (TID 8) in 12 ms on localhost (executor driver) (1/2)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 8.0 (TID 7) in 12 ms on localhost (executor driver) (2/2)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 8.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 8 (distinct at SparkTC.scala:70) finished in 0.024 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_0_0 locally
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set(ShuffleMapStage 9)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ShuffleMapStage 10, ResultStage 11)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 9.0 (TID 9). 975 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 9.0 (TID 9) in 9 ms on localhost (executor driver) (1/1)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 9.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 9 (map at SparkTC.scala:58) finished in 0.019 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ShuffleMapStage 10, ResultStage 11)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 10 (MapPartitionsRDD[15] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_7 stored as values in memory (estimated size 5.5 KB, free 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 3.0 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_7_piece0 in memory on node1:33235 (size: 3.0 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 7 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 4 missing tasks from ShuffleMapStage 10 (MapPartitionsRDD[15] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 10.0 with 4 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 10.0 (TID 10, localhost, executor driver, partition 0, PROCESS_LOCAL, 7760 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 10.0 (TID 11, localhost, executor driver, partition 1, PROCESS_LOCAL, 7760 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 10.0 (TID 12, localhost, executor driver, partition 2, PROCESS_LOCAL, 7823 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 3.0 in stage 10.0 (TID 13, localhost, executor driver, partition 3, PROCESS_LOCAL, 7823 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 10.0 (TID 10)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 10.0 (TID 11)
21/11/11 16:20:57 INFO executor.Executor: Running task 2.0 in stage 10.0 (TID 12)
21/11/11 16:20:57 INFO executor.Executor: Running task 3.0 in stage 10.0 (TID 13)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_9_0 locally
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_9_1 locally
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 2 non-empty blocks including 2 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 10.0 (TID 10). 1106 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 10.0 (TID 10) in 18 ms on localhost (executor driver) (1/4)
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 10.0 (TID 11). 1106 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 10.0 (TID 11) in 22 ms on localhost (executor driver) (2/4)
21/11/11 16:20:57 INFO executor.Executor: Finished task 2.0 in stage 10.0 (TID 12). 1321 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 3.0 in stage 10.0 (TID 13). 1364 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 10.0 (TID 13) in 52 ms on localhost (executor driver) (3/4)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 10.0 (TID 12) in 52 ms on localhost (executor driver) (4/4)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ShuffleMapStage 10 (distinct at SparkTC.scala:70) finished in 0.064 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
21/11/11 16:20:57 INFO scheduler.DAGScheduler: running: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 11)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: failed: Set()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ResultStage 11 (MapPartitionsRDD[17] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_8 stored as values in memory (estimated size 3.6 KB, free 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 2.1 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_8_piece0 in memory on node1:33235 (size: 2.1 KB, free: 366.3 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 4 missing tasks from ResultStage 11 (MapPartitionsRDD[17] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 11.0 with 4 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 11.0 (TID 14, localhost, executor driver, partition 0, ANY, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 11.0 (TID 15, localhost, executor driver, partition 1, ANY, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 11.0 (TID 16, localhost, executor driver, partition 2, ANY, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 3.0 in stage 11.0 (TID 17, localhost, executor driver, partition 3, ANY, 7662 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 11.0 (TID 14)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 11.0 (TID 15)
21/11/11 16:20:57 INFO executor.Executor: Running task 3.0 in stage 11.0 (TID 17)
21/11/11 16:20:57 INFO executor.Executor: Running task 2.0 in stage 11.0 (TID 16)
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 3 non-empty blocks including 3 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 3 non-empty blocks including 3 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 3 non-empty blocks including 3 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Getting 3 non-empty blocks including 3 local blocks and 0 remote blocks
21/11/11 16:20:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_17_3 stored as values in memory (estimated size 13.1 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_17_3 in memory on node1:33235 (size: 13.1 KB, free: 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_17_0 stored as values in memory (estimated size 11.5 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_17_0 in memory on node1:33235 (size: 11.5 KB, free: 366.2 MB)
21/11/11 16:20:57 INFO executor.Executor: Finished task 3.0 in stage 11.0 (TID 17). 1176 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 11.0 (TID 14). 1176 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 11.0 (TID 17) in 22 ms on localhost (executor driver) (1/4)
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_17_1 stored as values in memory (estimated size 13.3 KB, free 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block rdd_17_2 stored as values in memory (estimated size 12.4 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_17_1 in memory on node1:33235 (size: 13.3 KB, free: 366.2 MB)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 11.0 (TID 14) in 24 ms on localhost (executor driver) (2/4)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added rdd_17_2 in memory on node1:33235 (size: 12.4 KB, free: 366.2 MB)
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 11.0 (TID 15). 1176 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 11.0 (TID 15) in 26 ms on localhost (executor driver) (3/4)
21/11/11 16:20:57 INFO executor.Executor: Finished task 2.0 in stage 11.0 (TID 16). 1176 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 11.0 (TID 16) in 26 ms on localhost (executor driver) (4/4)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 11.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ResultStage 11 (count at SparkTC.scala:71) finished in 0.036 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Job 2 finished: count at SparkTC.scala:71, took 0.148658 s
21/11/11 16:20:57 INFO spark.SparkContext: Starting job: count at SparkTC.scala:76
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Got job 3 (count at SparkTC.scala:76) with 4 output partitions
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Final stage: ResultStage 18 (count at SparkTC.scala:76)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 17)
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Missing parents: List()
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting ResultStage 18 (MapPartitionsRDD[17] at distinct at SparkTC.scala:70), which has no missing parents
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_9 stored as values in memory (estimated size 3.6 KB, free 366.2 MB)
21/11/11 16:20:57 INFO memory.MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 2.1 KB, free 366.2 MB)
21/11/11 16:20:57 INFO storage.BlockManagerInfo: Added broadcast_9_piece0 in memory on node1:33235 (size: 2.1 KB, free: 366.2 MB)
21/11/11 16:20:57 INFO spark.SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:1163
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Submitting 4 missing tasks from ResultStage 18 (MapPartitionsRDD[17] at distinct at SparkTC.scala:70) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Adding task set 18.0 with 4 tasks
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 18.0 (TID 18, localhost, executor driver, partition 0, PROCESS_LOCAL, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 18.0 (TID 19, localhost, executor driver, partition 1, PROCESS_LOCAL, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 18.0 (TID 20, localhost, executor driver, partition 2, PROCESS_LOCAL, 7662 bytes)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Starting task 3.0 in stage 18.0 (TID 21, localhost, executor driver, partition 3, PROCESS_LOCAL, 7662 bytes)
21/11/11 16:20:57 INFO executor.Executor: Running task 0.0 in stage 18.0 (TID 18)
21/11/11 16:20:57 INFO executor.Executor: Running task 2.0 in stage 18.0 (TID 20)
21/11/11 16:20:57 INFO executor.Executor: Running task 1.0 in stage 18.0 (TID 19)
21/11/11 16:20:57 INFO executor.Executor: Running task 3.0 in stage 18.0 (TID 21)
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_17_2 locally
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_17_1 locally
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_17_3 locally
21/11/11 16:20:57 INFO storage.BlockManager: Found block rdd_17_0 locally
21/11/11 16:20:57 INFO executor.Executor: Finished task 3.0 in stage 18.0 (TID 21). 832 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 2.0 in stage 18.0 (TID 20). 832 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 1.0 in stage 18.0 (TID 19). 832 bytes result sent to driver
21/11/11 16:20:57 INFO executor.Executor: Finished task 0.0 in stage 18.0 (TID 18). 789 bytes result sent to driver
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 18.0 (TID 21) in 5 ms on localhost (executor driver) (1/4)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 18.0 (TID 20) in 5 ms on localhost (executor driver) (2/4)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 18.0 (TID 19) in 6 ms on localhost (executor driver) (3/4)
21/11/11 16:20:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 18.0 (TID 18) in 7 ms on localhost (executor driver) (4/4)
21/11/11 16:20:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 18.0, whose tasks have all completed, from pool 
21/11/11 16:20:57 INFO scheduler.DAGScheduler: ResultStage 18 (count at SparkTC.scala:76) finished in 0.014 s
21/11/11 16:20:57 INFO scheduler.DAGScheduler: Job 3 finished: count at SparkTC.scala:76, took 0.017208 s
TC has 1427 edges.
Our change for SparkTC.
21/11/11 16:20:57 INFO server.AbstractConnector: Stopped Spark@28c0b664{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
21/11/11 16:20:57 INFO ui.SparkUI: Stopped Spark web UI at http://node1:4040
21/11/11 16:20:58 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/11/11 16:20:58 INFO memory.MemoryStore: MemoryStore cleared
21/11/11 16:20:58 INFO storage.BlockManager: BlockManager stopped
21/11/11 16:20:58 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
21/11/11 16:20:58 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/11/11 16:20:58 INFO spark.SparkContext: Successfully stopped SparkContext
21/11/11 16:20:58 INFO util.ShutdownHookManager: Shutdown hook called
21/11/11 16:20:58 INFO util.ShutdownHookManager: Deleting directory /home/spark/data/scratch/spark-5f80c079-b023-42f5-925c-ace5c7dd12e0
21/11/11 16:20:58 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e7a44284-2c65-457b-b3f5-b52babc00082