spark-submit

The application is submitted to the standalone master at feng03:7077. The driver-side log of the run appears first; the executor's stderr log for the same application follows.

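The invocation below follows spark-submit's general form. The sketch uses placeholders for the class, master URL, and jar; the concrete values in the session that follows (SimpleApp, spark://feng03:7077, the simple-project jar) are what this particular run used:

```shell
# General form (standalone cluster; adjust host, port, and paths for your setup)
./bin/spark-submit \
  --class <main-class> \
  --master spark://<master-host>:7077 \
  <path-to-application-jar> [application arguments]
```
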
[jifeng@feng03 spark-1.4.0-bin-hadoop2.6]$ ./bin/spark-submit --class "SimpleApp" --master spark://feng03:7077 /home/jifeng/code/simple/target/scala-2.10/simple-project_2.10-1.0.jar
15/08/20 23:23:52 INFO spark.SparkContext: Running Spark version 1.4.0
15/08/20 23:23:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/20 23:23:53 INFO spark.SecurityManager: Changing view acls to: jifeng
15/08/20 23:23:53 INFO spark.SecurityManager: Changing modify acls to: jifeng
15/08/20 23:23:53 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)
15/08/20 23:23:54 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/08/20 23:23:54 INFO Remoting: Starting remoting
15/08/20 23:23:54 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.0.110:49085]
15/08/20 23:23:54 INFO util.Utils: Successfully started service 'sparkDriver' on port 49085.
15/08/20 23:23:54 INFO spark.SparkEnv: Registering MapOutputTracker
15/08/20 23:23:54 INFO spark.SparkEnv: Registering BlockManagerMaster
15/08/20 23:23:54 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-284c770f-3a73-4112-82b3-4eebfe1894d7/blockmgr-7375e0a6-5f95-460a-94ce-52ecd27102e3
15/08/20 23:23:54 INFO storage.MemoryStore: MemoryStore started with capacity 267.3 MB
15/08/20 23:23:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-284c770f-3a73-4112-82b3-4eebfe1894d7/httpd-ceafd23c-1cba-41fc-a7f6-0df2655f0125
15/08/20 23:23:55 INFO spark.HttpServer: Starting HTTP Server
15/08/20 23:23:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/20 23:23:55 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:47652
15/08/20 23:23:55 INFO util.Utils: Successfully started service 'HTTP file server' on port 47652.
15/08/20 23:23:55 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/08/20 23:23:55 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/08/20 23:23:55 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
15/08/20 23:23:55 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/08/20 23:23:55 INFO ui.SparkUI: Started SparkUI at http://192.168.0.110:4040
15/08/20 23:23:56 INFO spark.SparkContext: Added JAR file:/home/jifeng/code/simple/target/scala-2.10/simple-project_2.10-1.0.jar at http://192.168.0.110:47652/jars/simple-project_2.10-1.0.jar with timestamp 1440084236067
15/08/20 23:23:56 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@feng03:7077/user/Master...
15/08/20 23:23:57 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150820232357-0000
15/08/20 23:23:57 INFO client.AppClient$ClientActor: Executor added: app-20150820232357-0000/0 on worker-20150820232207-192.168.0.110-57239 (192.168.0.110:57239) with 1 cores
15/08/20 23:23:57 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20150820232357-0000/0 on hostPort 192.168.0.110:57239 with 1 cores, 512.0 MB RAM
15/08/20 23:23:57 INFO client.AppClient$ClientActor: Executor updated: app-20150820232357-0000/0 is now RUNNING
15/08/20 23:23:57 INFO client.AppClient$ClientActor: Executor updated: app-20150820232357-0000/0 is now LOADING
15/08/20 23:23:58 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43956.
15/08/20 23:23:58 INFO netty.NettyBlockTransferService: Server created on 43956
15/08/20 23:23:58 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/08/20 23:23:58 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.0.110:43956 with 267.3 MB RAM, BlockManagerId(driver, 192.168.0.110, 43956)
15/08/20 23:23:58 INFO storage.BlockManagerMaster: Registered BlockManager
15/08/20 23:23:58 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
15/08/20 23:24:01 INFO storage.MemoryStore: ensureFreeSpace(157784) called with curMem=0, maxMem=280248975
15/08/20 23:24:01 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 154.1 KB, free 267.1 MB)
15/08/20 23:24:01 INFO storage.MemoryStore: ensureFreeSpace(14651) called with curMem=157784, maxMem=280248975
15/08/20 23:24:01 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.3 KB, free 267.1 MB)
15/08/20 23:24:01 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.0.110:43956 (size: 14.3 KB, free: 267.3 MB)
15/08/20 23:24:01 INFO spark.SparkContext: Created broadcast 0 from textFile at SimpleApp.scala:11
15/08/20 23:24:05 INFO mapred.FileInputFormat: Total input paths to process : 1
15/08/20 23:24:05 INFO spark.SparkContext: Starting job: count at SimpleApp.scala:12
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Got job 0 (count at SimpleApp.scala:12) with 2 output partitions (allowLocal=false)
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Final stage: ResultStage 0(count at SimpleApp.scala:12)
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Parents of final stage: List()
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Missing parents: List()
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.scala:12), which has no missing parents
15/08/20 23:24:06 INFO storage.MemoryStore: ensureFreeSpace(3184) called with curMem=172435, maxMem=280248975
15/08/20 23:24:06 INFO storage.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.1 KB, free 267.1 MB)
15/08/20 23:24:06 INFO storage.MemoryStore: ensureFreeSpace(1888) called with curMem=175619, maxMem=280248975
15/08/20 23:24:06 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1888.0 B, free 267.1 MB)
15/08/20 23:24:06 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.0.110:43956 (size: 1888.0 B, free: 267.3 MB)
15/08/20 23:24:06 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
15/08/20 23:24:06 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.scala:12)
15/08/20 23:24:06 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
15/08/20 23:24:07 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.0.110:35296/user/Executor#-458247930]) with ID 0
15/08/20 23:24:07 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.0.110, PROCESS_LOCAL, 1486 bytes)
15/08/20 23:24:07 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.0.110:58984 with 267.3 MB RAM, BlockManagerId(0, 192.168.0.110, 58984)
15/08/20 23:24:09 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.0.110:58984 (size: 1888.0 B, free: 267.3 MB)
15/08/20 23:24:09 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.0.110:58984 (size: 14.3 KB, free: 267.3 MB)
15/08/20 23:24:11 INFO storage.BlockManagerInfo: Added rdd_1_0 in memory on 192.168.0.110:58984 (size: 6.1 KB, free: 267.2 MB)
15/08/20 23:24:11 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 192.168.0.110, PROCESS_LOCAL, 1486 bytes)
15/08/20 23:24:11 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 4904 ms on 192.168.0.110 (1/2)
15/08/20 23:24:12 INFO storage.BlockManagerInfo: Added rdd_1_1 in memory on 192.168.0.110:58984 (size: 5.3 KB, free: 267.2 MB)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: ResultStage 0 (count at SimpleApp.scala:12) finished in 5.661 s
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 159 ms on 192.168.0.110 (2/2)
15/08/20 23:24:12 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Job 0 finished: count at SimpleApp.scala:12, took 6.249939 s
15/08/20 23:24:12 INFO spark.SparkContext: Starting job: count at SimpleApp.scala:13
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Got job 1 (count at SimpleApp.scala:13) with 2 output partitions (allowLocal=false)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Final stage: ResultStage 1(count at SimpleApp.scala:13)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Parents of final stage: List()
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Missing parents: List()
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[3] at filter at SimpleApp.scala:13), which has no missing parents
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(3184) called with curMem=177507, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.1 KB, free 267.1 MB)
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(1888) called with curMem=180691, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1888.0 B, free 267.1 MB)
15/08/20 23:24:12 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.0.110:43956 (size: 1888.0 B, free: 267.2 MB)
15/08/20 23:24:12 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:874
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 1 (MapPartitionsRDD[3] at filter at SimpleApp.scala:13)
15/08/20 23:24:12 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 2 tasks
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 2, 192.168.0.110, PROCESS_LOCAL, 1486 bytes)
15/08/20 23:24:12 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.0.110:58984 (size: 1888.0 B, free: 267.2 MB)
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 1.0 (TID 3, 192.168.0.110, PROCESS_LOCAL, 1486 bytes)
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 2) in 224 ms on 192.168.0.110 (1/2)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: ResultStage 1 (count at SimpleApp.scala:13) finished in 0.246 s
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Job 1 finished: count at SimpleApp.scala:13, took 0.397593 s
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 1.0 (TID 3) in 101 ms on 192.168.0.110 (2/2)
15/08/20 23:24:12 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
Lines with a: 60, Lines with b: 29
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(89192) called with curMem=182579, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 87.1 KB, free 267.0 MB)
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(20038) called with curMem=271771, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 19.6 KB, free 267.0 MB)
15/08/20 23:24:12 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on 192.168.0.110:43956 (size: 19.6 KB, free: 267.2 MB)
15/08/20 23:24:12 INFO spark.SparkContext: Created broadcast 3 from textFile at SimpleApp.scala:16
15/08/20 23:24:12 INFO mapred.FileInputFormat: Total input paths to process : 1
15/08/20 23:24:12 INFO spark.SparkContext: Starting job: collect at SimpleApp.scala:18
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Registering RDD 7 (map at SimpleApp.scala:17)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Got job 2 (collect at SimpleApp.scala:18) with 2 output partitions (allowLocal=false)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Final stage: ResultStage 3(collect at SimpleApp.scala:18)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 2)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 2)
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 2 (MapPartitionsRDD[7] at map at SimpleApp.scala:17), which has no missing parents
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(3992) called with curMem=291809, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 3.9 KB, free 267.0 MB)
15/08/20 23:24:12 INFO storage.MemoryStore: ensureFreeSpace(2309) called with curMem=295801, maxMem=280248975
15/08/20 23:24:12 INFO storage.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 2.3 KB, free 267.0 MB)
15/08/20 23:24:12 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on 192.168.0.110:43956 (size: 2.3 KB, free: 267.2 MB)
15/08/20 23:24:12 INFO spark.SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:874
15/08/20 23:24:12 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 2 (MapPartitionsRDD[7] at map at SimpleApp.scala:17)
15/08/20 23:24:12 INFO scheduler.TaskSchedulerImpl: Adding task set 2.0 with 2 tasks
15/08/20 23:24:12 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 2.0 (TID 4, 192.168.0.110, PROCESS_LOCAL, 1475 bytes)
15/08/20 23:24:13 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on 192.168.0.110:58984 (size: 2.3 KB, free: 267.2 MB)
15/08/20 23:24:13 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on 192.168.0.110:58984 (size: 19.6 KB, free: 267.2 MB)
15/08/20 23:24:13 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 2.0 (TID 5, 192.168.0.110, PROCESS_LOCAL, 1475 bytes)
15/08/20 23:24:13 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 2.0 (TID 4) in 861 ms on 192.168.0.110 (1/2)
15/08/20 23:24:13 INFO scheduler.DAGScheduler: ShuffleMapStage 2 (map at SimpleApp.scala:17) finished in 0.901 s
15/08/20 23:24:13 INFO scheduler.DAGScheduler: looking for newly runnable stages
15/08/20 23:24:13 INFO scheduler.DAGScheduler: running: Set()
15/08/20 23:24:13 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 3)
15/08/20 23:24:13 INFO scheduler.DAGScheduler: failed: Set()
15/08/20 23:24:13 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 2.0 (TID 5) in 78 ms on 192.168.0.110 (2/2)
15/08/20 23:24:13 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool 
15/08/20 23:24:13 INFO scheduler.DAGScheduler: Missing parents for ResultStage 3: List()
15/08/20 23:24:13 INFO scheduler.DAGScheduler: Submitting ResultStage 3 (ShuffledRDD[8] at reduceByKey at SimpleApp.scala:17), which is now runnable
15/08/20 23:24:13 INFO storage.MemoryStore: ensureFreeSpace(2248) called with curMem=298110, maxMem=280248975
15/08/20 23:24:13 INFO storage.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 2.2 KB, free 267.0 MB)
15/08/20 23:24:13 INFO storage.MemoryStore: ensureFreeSpace(1365) called with curMem=300358, maxMem=280248975
15/08/20 23:24:13 INFO storage.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 1365.0 B, free 267.0 MB)
15/08/20 23:24:13 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on 192.168.0.110:43956 (size: 1365.0 B, free: 267.2 MB)
15/08/20 23:24:13 INFO spark.SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:874
15/08/20 23:24:13 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 3 (ShuffledRDD[8] at reduceByKey at SimpleApp.scala:17)
15/08/20 23:24:13 INFO scheduler.TaskSchedulerImpl: Adding task set 3.0 with 2 tasks
15/08/20 23:24:13 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 3.0 (TID 6, 192.168.0.110, PROCESS_LOCAL, 1234 bytes)
15/08/20 23:24:13 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on 192.168.0.110:58984 (size: 1365.0 B, free: 267.2 MB)
15/08/20 23:24:13 INFO spark.MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.0.110:35296
15/08/20 23:24:13 INFO spark.MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 151 bytes
15/08/20 23:24:14 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 3.0 (TID 7, 192.168.0.110, PROCESS_LOCAL, 1234 bytes)
15/08/20 23:24:14 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 3.0 (TID 6) in 192 ms on 192.168.0.110 (1/2)
15/08/20 23:24:14 INFO scheduler.DAGScheduler: ResultStage 3 (collect at SimpleApp.scala:18) finished in 0.225 s
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Job 2 finished: collect at SimpleApp.scala:18, took 1.277195 s
15/08/20 23:24:14 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 3.0 (TID 7) in 89 ms on 192.168.0.110 (2/2)
15/08/20 23:24:14 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool 
15/08/20 23:24:14 INFO Configuration.deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/08/20 23:24:14 INFO Configuration.deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/08/20 23:24:14 INFO Configuration.deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/08/20 23:24:14 INFO Configuration.deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/08/20 23:24:14 INFO Configuration.deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/08/20 23:24:14 INFO spark.SparkContext: Starting job: saveAsTextFile at SimpleApp.scala:19
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Got job 3 (saveAsTextFile at SimpleApp.scala:19) with 2 output partitions (allowLocal=false)
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Final stage: ResultStage 5(saveAsTextFile at SimpleApp.scala:19)
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 4)
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Missing parents: List()
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[9] at saveAsTextFile at SimpleApp.scala:19), which has no missing parents
15/08/20 23:24:14 INFO storage.MemoryStore: ensureFreeSpace(127920) called with curMem=301723, maxMem=280248975
15/08/20 23:24:14 INFO storage.MemoryStore: Block broadcast_6 stored as values in memory (estimated size 124.9 KB, free 266.9 MB)
15/08/20 23:24:14 INFO storage.MemoryStore: ensureFreeSpace(43232) called with curMem=429643, maxMem=280248975
15/08/20 23:24:14 INFO storage.MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 42.2 KB, free 266.8 MB)
15/08/20 23:24:14 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on 192.168.0.110:43956 (size: 42.2 KB, free: 267.2 MB)
15/08/20 23:24:14 INFO spark.SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:874
15/08/20 23:24:14 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 5 (MapPartitionsRDD[9] at saveAsTextFile at SimpleApp.scala:19)
15/08/20 23:24:14 INFO scheduler.TaskSchedulerImpl: Adding task set 5.0 with 2 tasks
15/08/20 23:24:14 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 5.0 (TID 8, 192.168.0.110, PROCESS_LOCAL, 1234 bytes)
15/08/20 23:24:14 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on 192.168.0.110:58984 (size: 42.2 KB, free: 267.2 MB)
15/08/20 23:24:15 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 5.0 (TID 9, 192.168.0.110, PROCESS_LOCAL, 1234 bytes)
15/08/20 23:24:15 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 5.0 (TID 8) in 964 ms on 192.168.0.110 (1/2)
15/08/20 23:24:16 INFO scheduler.DAGScheduler: ResultStage 5 (saveAsTextFile at SimpleApp.scala:19) finished in 1.351 s
15/08/20 23:24:16 INFO scheduler.DAGScheduler: Job 3 finished: saveAsTextFile at SimpleApp.scala:19, took 1.610664 s
15/08/20 23:24:16 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 5.0 (TID 9) in 427 ms on 192.168.0.110 (2/2)
15/08/20 23:24:16 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
15/08/20 23:24:16 INFO spark.SparkContext: Invoking stop() from shutdown hook
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/08/20 23:24:16 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/08/20 23:24:16 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.0.110:4040
15/08/20 23:24:16 INFO scheduler.DAGScheduler: Stopping DAGScheduler
15/08/20 23:24:16 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
15/08/20 23:24:16 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
15/08/20 23:24:16 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/08/20 23:24:16 INFO util.Utils: path = /tmp/spark-284c770f-3a73-4112-82b3-4eebfe1894d7/blockmgr-7375e0a6-5f95-460a-94ce-52ecd27102e3, already present as root for deletion.
15/08/20 23:24:16 INFO storage.MemoryStore: MemoryStore cleared
15/08/20 23:24:16 INFO storage.BlockManager: BlockManager stopped
15/08/20 23:24:16 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
15/08/20 23:24:16 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/08/20 23:24:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/08/20 23:24:16 INFO spark.SparkContext: Successfully stopped SparkContext
15/08/20 23:24:16 INFO util.Utils: Shutdown hook called
15/08/20 23:24:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/08/20 23:24:16 INFO util.Utils: Deleting directory /tmp/spark-284c770f-3a73-4112-82b3-4eebfe1894d7
15/08/20 23:24:16 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
[jifeng@feng03 spark-1.4.0-bin-hadoop2.6]$ 
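
The log above references SimpleApp.scala by line number: textFile at line 11, two filter/count jobs at lines 12-13, a second textFile at 16, map and reduceByKey at 17, collect at 18, and saveAsTextFile at 19. The source itself is not shown, so the following is a plausible reconstruction in the style of Spark's quick-start example; the input path matches the split seen in the executor log (README.md), but the output path, word-count logic, and variable names are assumptions:

```scala
// SimpleApp.scala -- hedged sketch matching the call sites in the log.
// Comments mark the line numbers the log refers to; paths other than the
// README.md input (visible in the executor log) are guesses.
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    val logFile = "file:///home/jifeng/code/spark-1.4.0/README.md"
    val logData = sc.textFile(logFile, 2).cache()                    // :11
    val numAs = logData.filter(line => line.contains("a")).count()   // :12
    val numBs = logData.filter(line => line.contains("b")).count()   // :13
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))

    val words = sc.textFile(logFile)                                 // :16
    val counts = words.flatMap(_.split(" "))
                      .map(w => (w, 1)).reduceByKey(_ + _)           // :17
    counts.collect().foreach(println)                                // :18
    counts.saveAsTextFile("file:///home/jifeng/code/wordcount-out")  // :19 (output path assumed)

    sc.stop()
  }
}
```

The two count jobs (Jobs 0 and 1) reuse the cached rdd_1 partitions — the executor log shows "Found block rdd_1_0 locally" for the second count — while the reduceByKey introduces the shuffle seen as ShuffleMapStage 2 / ResultStage 3.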

stderr log page for app-20150820232357-0000/0

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

15/08/20 23:23:59 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
15/08/20 23:24:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/20 23:24:01 INFO SecurityManager: Changing view acls to: jifeng
15/08/20 23:24:01 INFO SecurityManager: Changing modify acls to: jifeng
15/08/20 23:24:01 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)
15/08/20 23:24:03 INFO Slf4jLogger: Slf4jLogger started
15/08/20 23:24:03 INFO Remoting: Starting remoting
15/08/20 23:24:03 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@192.168.0.110:47879]
15/08/20 23:24:03 INFO Utils: Successfully started service 'driverPropsFetcher' on port 47879.
15/08/20 23:24:04 INFO SecurityManager: Changing view acls to: jifeng
15/08/20 23:24:04 INFO SecurityManager: Changing modify acls to: jifeng
15/08/20 23:24:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jifeng); users with modify permissions: Set(jifeng)
15/08/20 23:24:04 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/08/20 23:24:04 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/08/20 23:24:05 INFO Slf4jLogger: Slf4jLogger started
15/08/20 23:24:05 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
15/08/20 23:24:05 INFO Remoting: Starting remoting
15/08/20 23:24:05 INFO Utils: Successfully started service 'sparkExecutor' on port 35296.
15/08/20 23:24:05 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@192.168.0.110:35296]
15/08/20 23:24:05 INFO DiskBlockManager: Created local directory at /tmp/spark-6055212e-bb3d-4566-9dfc-28973c7b1c3f/executor-21c3a9be-c3d5-4dfa-9124-8215c2b5a891/blockmgr-29900663-d96c-4972-aaac-06b9c6489ab2
15/08/20 23:24:05 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/08/20 23:24:06 INFO CoarseGrainedExecutorBackend: Connecting to driver: akka.tcp://sparkDriver@192.168.0.110:49085/user/CoarseGrainedScheduler
15/08/20 23:24:06 INFO WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@192.168.0.110:57239/user/Worker
15/08/20 23:24:06 INFO WorkerWatcher: Successfully connected to akka.tcp://sparkWorker@192.168.0.110:57239/user/Worker
15/08/20 23:24:07 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
15/08/20 23:24:07 INFO Executor: Starting executor ID 0 on host 192.168.0.110
15/08/20 23:24:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 58984.
15/08/20 23:24:07 INFO NettyBlockTransferService: Server created on 58984
15/08/20 23:24:07 INFO BlockManagerMaster: Trying to register BlockManager
15/08/20 23:24:07 INFO BlockManagerMaster: Registered BlockManager
15/08/20 23:24:07 INFO CoarseGrainedExecutorBackend: Got assigned task 0
15/08/20 23:24:07 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/08/20 23:24:07 INFO Executor: Fetching http://192.168.0.110:47652/jars/simple-project_2.10-1.0.jar with timestamp 1440084236067
15/08/20 23:24:07 INFO Utils: Fetching http://192.168.0.110:47652/jars/simple-project_2.10-1.0.jar to /tmp/spark-6055212e-bb3d-4566-9dfc-28973c7b1c3f/executor-21c3a9be-c3d5-4dfa-9124-8215c2b5a891/fetchFileTemp8408119354178446390.tmp
15/08/20 23:24:07 INFO Utils: Copying /tmp/spark-6055212e-bb3d-4566-9dfc-28973c7b1c3f/executor-21c3a9be-c3d5-4dfa-9124-8215c2b5a891/-4227031201440084236067_cache to /home/jifeng/spark-1.4.0-bin-hadoop2.6/work/app-20150820232357-0000/0/./simple-project_2.10-1.0.jar
15/08/20 23:24:07 INFO Executor: Adding file:/home/jifeng/spark-1.4.0-bin-hadoop2.6/work/app-20150820232357-0000/0/./simple-project_2.10-1.0.jar to class loader
15/08/20 23:24:08 INFO TorrentBroadcast: Started reading broadcast variable 1
15/08/20 23:24:09 INFO MemoryStore: ensureFreeSpace(1888) called with curMem=0, maxMem=280248975
15/08/20 23:24:09 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1888.0 B, free 267.3 MB)
15/08/20 23:24:09 INFO TorrentBroadcast: Reading broadcast variable 1 took 1093 ms
15/08/20 23:24:09 INFO MemoryStore: ensureFreeSpace(3184) called with curMem=1888, maxMem=280248975
15/08/20 23:24:09 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.1 KB, free 267.3 MB)
15/08/20 23:24:09 INFO CacheManager: Partition rdd_1_0 not found, computing it
15/08/20 23:24:09 INFO HadoopRDD: Input split: file:/home/jifeng/code/spark-1.4.0/README.md:0+1812
15/08/20 23:24:09 INFO TorrentBroadcast: Started reading broadcast variable 0
15/08/20 23:24:09 INFO MemoryStore: ensureFreeSpace(14651) called with curMem=5072, maxMem=280248975
15/08/20 23:24:09 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.3 KB, free 267.2 MB)
15/08/20 23:24:09 INFO TorrentBroadcast: Reading broadcast variable 0 took 41 ms
15/08/20 23:24:09 INFO MemoryStore: ensureFreeSpace(222384) called with curMem=19723, maxMem=280248975
15/08/20 23:24:09 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 217.2 KB, free 267.0 MB)
15/08/20 23:24:11 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/08/20 23:24:11 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/08/20 23:24:11 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/08/20 23:24:11 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/08/20 23:24:11 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/08/20 23:24:11 INFO MemoryStore: ensureFreeSpace(6208) called with curMem=242107, maxMem=280248975
15/08/20 23:24:11 INFO MemoryStore: Block rdd_1_0 stored as values in memory (estimated size 6.1 KB, free 267.0 MB)
15/08/20 23:24:11 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2414 bytes result sent to driver
15/08/20 23:24:11 INFO CoarseGrainedExecutorBackend: Got assigned task 1
15/08/20 23:24:11 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
15/08/20 23:24:11 INFO CacheManager: Partition rdd_1_1 not found, computing it
15/08/20 23:24:11 INFO HadoopRDD: Input split: file:/home/jifeng/code/spark-1.4.0/README.md:1812+1812
15/08/20 23:24:12 INFO MemoryStore: ensureFreeSpace(5384) called with curMem=248315, maxMem=280248975
15/08/20 23:24:12 INFO MemoryStore: Block rdd_1_1 stored as values in memory (estimated size 5.3 KB, free 267.0 MB)
15/08/20 23:24:12 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 2414 bytes result sent to driver
15/08/20 23:24:12 INFO CoarseGrainedExecutorBackend: Got assigned task 2
15/08/20 23:24:12 INFO Executor: Running task 0.0 in stage 1.0 (TID 2)
15/08/20 23:24:12 INFO TorrentBroadcast: Started reading broadcast variable 2
15/08/20 23:24:12 INFO MemoryStore: ensureFreeSpace(1888) called with curMem=253699, maxMem=280248975
15/08/20 23:24:12 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1888.0 B, free 267.0 MB)
15/08/20 23:24:12 INFO TorrentBroadcast: Reading broadcast variable 2 took 51 ms
15/08/20 23:24:12 INFO MemoryStore: ensureFreeSpace(3184) called with curMem=255587, maxMem=280248975
15/08/20 23:24:12 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.1 KB, free 267.0 MB)
15/08/20 23:24:12 INFO BlockManager: Found block rdd_1_0 locally
15/08/20 23:24:12 INFO Executor: Finished task 0.0 in stage 1.0 (TID 2). 1834 bytes result sent to driver
15/08/20 23:24:12 INFO CoarseGrainedExecutorBackend: Got assigned task 3
15/08/20 23:24:12 INFO Executor: Running task 1.0 in stage 1.0 (TID 3)
15/08/20 23:24:12 INFO BlockManager: Found block rdd_1_1 locally
15/08/20 23:24:12 INFO Executor: Finished task 1.0 in stage 1.0 (TID 3). 1834 bytes result sent to driver
15/08/20 23:24:12 INFO CoarseGrainedExecutorBackend: Got assigned task 4
15/08/20 23:24:12 INFO Executor: Running task 0.0 in stage 2.0 (TID 4)
15/08/20 23:24:13 INFO TorrentBroadcast: Started reading broadcast variable 4
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(2309) called with curMem=258771, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 2.3 KB, free 267.0 MB)
15/08/20 23:24:13 INFO TorrentBroadcast: Reading broadcast variable 4 took 34 ms
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(3992) called with curMem=261080, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_4 stored as values in memory (estimated size 3.9 KB, free 267.0 MB)
15/08/20 23:24:13 INFO HadoopRDD: Input split: file:/home/jifeng/code/spark-1.4.0/README.md:0+1812
15/08/20 23:24:13 INFO TorrentBroadcast: Started reading broadcast variable 3
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(20038) called with curMem=265072, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 19.6 KB, free 267.0 MB)
15/08/20 23:24:13 INFO TorrentBroadcast: Reading broadcast variable 3 took 18 ms
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(342104) called with curMem=285110, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_3 stored as values in memory (estimated size 334.1 KB, free 266.7 MB)
15/08/20 23:24:13 INFO Executor: Finished task 0.0 in stage 2.0 (TID 4). 2005 bytes result sent to driver
15/08/20 23:24:13 INFO CoarseGrainedExecutorBackend: Got assigned task 5
15/08/20 23:24:13 INFO Executor: Running task 1.0 in stage 2.0 (TID 5)
15/08/20 23:24:13 INFO HadoopRDD: Input split: file:/home/jifeng/code/spark-1.4.0/README.md:1812+1812
15/08/20 23:24:13 INFO Executor: Finished task 1.0 in stage 2.0 (TID 5). 2005 bytes result sent to driver
15/08/20 23:24:13 INFO CoarseGrainedExecutorBackend: Got assigned task 6
15/08/20 23:24:13 INFO Executor: Running task 0.0 in stage 3.0 (TID 6)
15/08/20 23:24:13 INFO MapOutputTrackerWorker: Updating epoch to 1 and clearing cache
15/08/20 23:24:13 INFO TorrentBroadcast: Started reading broadcast variable 5
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(1365) called with curMem=627214, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 1365.0 B, free 266.7 MB)
15/08/20 23:24:13 INFO TorrentBroadcast: Reading broadcast variable 5 took 28 ms
15/08/20 23:24:13 INFO MemoryStore: ensureFreeSpace(2248) called with curMem=628579, maxMem=280248975
15/08/20 23:24:13 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 2.2 KB, free 266.7 MB)
15/08/20 23:24:13 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 0, fetching them
15/08/20 23:24:13 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = AkkaRpcEndpointRef(Actor[akka.tcp://sparkDriver@192.168.0.110:49085/user/MapOutputTracker#1391227610])
15/08/20 23:24:13 INFO MapOutputTrackerWorker: Got the output locations
15/08/20 23:24:14 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/08/20 23:24:14 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 6 ms
15/08/20 23:24:14 INFO Executor: Finished task 0.0 in stage 3.0 (TID 6). 4385 bytes result sent to driver
15/08/20 23:24:14 INFO CoarseGrainedExecutorBackend: Got assigned task 7
15/08/20 23:24:14 INFO Executor: Running task 1.0 in stage 3.0 (TID 7)
15/08/20 23:24:14 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/08/20 23:24:14 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 17 ms
15/08/20 23:24:14 INFO Executor: Finished task 1.0 in stage 3.0 (TID 7). 3920 bytes result sent to driver
15/08/20 23:24:14 INFO CoarseGrainedExecutorBackend: Got assigned task 8
15/08/20 23:24:14 INFO Executor: Running task 0.0 in stage 5.0 (TID 8)
15/08/20 23:24:14 INFO TorrentBroadcast: Started reading broadcast variable 6
15/08/20 23:24:14 INFO MemoryStore: ensureFreeSpace(43232) called with curMem=630827, maxMem=280248975
15/08/20 23:24:14 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 42.2 KB, free 266.6 MB)
15/08/20 23:24:14 INFO TorrentBroadcast: Reading broadcast variable 6 took 38 ms
15/08/20 23:24:14 INFO MemoryStore: ensureFreeSpace(127920) called with curMem=674059, maxMem=280248975
15/08/20 23:24:14 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 124.9 KB, free 266.5 MB)
15/08/20 23:24:15 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/08/20 23:24:15 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
15/08/20 23:24:15 INFO FileOutputCommitter: Saved output of task 'attempt_201508202324_0005_m_000000_8' to file:/home/jifeng/code/out/bb/_temporary/0/task_201508202324_0005_m_000000
15/08/20 23:24:15 INFO SparkHadoopMapRedUtil: attempt_201508202324_0005_m_000000_8: Committed
15/08/20 23:24:15 INFO Executor: Finished task 0.0 in stage 5.0 (TID 8). 1832 bytes result sent to driver
15/08/20 23:24:15 INFO CoarseGrainedExecutorBackend: Got assigned task 9
15/08/20 23:24:15 INFO Executor: Running task 1.0 in stage 5.0 (TID 9)
15/08/20 23:24:16 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 2 blocks
15/08/20 23:24:16 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
15/08/20 23:24:16 INFO FileOutputCommitter: Saved output of task 'attempt_201508202324_0005_m_000001_9' to file:/home/jifeng/code/out/bb/_temporary/0/task_201508202324_0005_m_000001
15/08/20 23:24:16 INFO SparkHadoopMapRedUtil: attempt_201508202324_0005_m_000001_9: Committed
15/08/20 23:24:16 INFO Executor: Finished task 1.0 in stage 5.0 (TID 9). 1832 bytes result sent to driver
15/08/20 23:24:16 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown
15/08/20 23:24:16 INFO MemoryStore: MemoryStore cleared
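
For reference, a driver program that would produce a log like the one above might look roughly like the sketch below. This is a hypothetical reconstruction, not the original source: the cached `textFile` over README.md, the two filter-counts (stages 0-1, reusing `rdd_1`), and a shuffle followed by `saveAsTextFile` into `out/bb` (the `reduceByKey` word count and FileOutputCommitter lines in stages 2-5) are all inferred from the log.

```scala
// Hypothetical SimpleApp, reconstructed from the log output above.
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/jifeng/code/spark-1.4.0/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Cached with 2 partitions -- matches the two input splits
    // (README.md:0+1812 and README.md:1812+1812) and block rdd_1_0/rdd_1_1.
    val logData = sc.textFile(logFile, 2).cache()

    // Stages 0 and 1: two jobs over the cached RDD; the second finds
    // the blocks locally ("Found block rdd_1_0 locally").
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))

    // Stages 2-5: a shuffle (MapOutputTracker / ShuffleBlockFetcherIterator
    // lines) followed by a save into out/bb (FileOutputCommitter lines).
    val counts = logData.flatMap(_.split(" ")).map(word => (word, 1))
                        .reduceByKey(_ + _)
    counts.saveAsTextFile("/home/jifeng/code/out/bb")

    sc.stop()
  }
}
```

Once the job finishes ("Driver commanded a shutdown"), the results land in `/home/jifeng/code/out/bb/part-00000` and `part-00001`, one per reduce task, as shown by the two commit attempts in the log.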
