Running Spark in local mode

[root@cdh1 scala-2.9.3]# cd 
[root@cdh1 ~]# cd $HADOOP_HOME
[root@cdh1 hadoop-2.6.0]# pwd
/user/local/hadoop-2.6.0
[root@cdh1 hadoop-2.6.0]# cd etc/hadoop/
[root@cdh1 hadoop]# pwd
/user/local/hadoop-2.6.0/etc/hadoop
[root@cdh1 hadoop]# bin/run-example SparkPi 10 --master local[2] 
bash: bin/run-example: No such file or directory
[root@cdh1 hadoop]# run-example SparkPi 10 --master local[2] 
bash: run-example: command not found
[root@cdh1 hadoop]# source /etc/profile
[root@cdh1 hadoop]# run-example SparkPi 10 --master local[2] 
16/06/17 09:18:17 INFO spark.SparkContext: Running Spark version 1.4.0
16/06/17 09:18:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/17 09:18:20 WARN util.Utils: Your hostname, cdh1 resolves to a loopback address: 127.0.0.1; using 192.168.0.103 instead (on interface eth3)
16/06/17 09:18:20 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address

16/06/17 09:18:20 INFO spark.SecurityManager: Changing view acls to: root
16/06/17 09:18:20 INFO spark.SecurityManager: Changing modify acls to: root
16/06/17 09:18:20 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/06/17 09:18:24 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/06/17 09:18:25 INFO Remoting: Starting remoting
16/06/17 09:18:26 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.0.103:38758]
16/06/17 09:18:26 INFO util.Utils: Successfully started service 'sparkDriver' on port 38758.
16/06/17 09:18:26 INFO spark.SparkEnv: Registering MapOutputTracker
16/06/17 09:18:26 INFO spark.SparkEnv: Registering BlockManagerMaster
16/06/17 09:18:27 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be/blockmgr-cf9766ee-d283-4c16-a64e-45ee1c3a2a15
16/06/17 09:18:27 INFO storage.MemoryStore: MemoryStore started with capacity 267.3 MB
16/06/17 09:18:27 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be/httpd-f5340b24-be66-4407-8cc3-b03b6854bb93
16/06/17 09:18:27 INFO spark.HttpServer: Starting HTTP Server
16/06/17 09:18:27 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/17 09:18:27 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:40595
16/06/17 09:18:27 INFO util.Utils: Successfully started service 'HTTP file server' on port 40595.
16/06/17 09:18:28 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/06/17 09:18:33 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/06/17 09:18:33 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/06/17 09:18:33 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/06/17 09:18:33 INFO ui.SparkUI: Started SparkUI at http://192.168.0.103:4040
16/06/17 09:18:42 INFO spark.SparkContext: Added JAR file:/user/local/spark-1.4.0-bin-hadoop2.6/lib/spark-examples-1.4.0-hadoop2.6.0.jar at http://192.168.0.103:40595/jars/spark-examples-1.4.0-hadoop2.6.0.jar with timestamp 1466180321955
16/06/17 09:18:42 INFO executor.Executor: Starting executor ID driver on host localhost
16/06/17 09:18:43 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34509.
16/06/17 09:18:43 INFO netty.NettyBlockTransferService: Server created on 34509
16/06/17 09:18:44 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/06/17 09:18:44 INFO storage.BlockManagerMasterEndpoint: Registering block manager localhost:34509 with 267.3 MB RAM, BlockManagerId(driver, localhost, 34509)
16/06/17 09:18:44 INFO storage.BlockManagerMaster: Registered BlockManager
16/06/17 09:18:46 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:35
16/06/17 09:18:46 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:35) with 10 output partitions (allowLocal=false)
16/06/17 09:18:46 INFO scheduler.DAGScheduler: Final stage: ResultStage 0(reduce at SparkPi.scala:35)
16/06/17 09:18:46 INFO scheduler.DAGScheduler: Parents of final stage: List()
16/06/17 09:18:46 INFO scheduler.DAGScheduler: Missing parents: List()
16/06/17 09:18:46 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:31), which has no missing parents
16/06/17 09:18:47 WARN util.SizeEstimator: Failed to check whether UseCompressedOops is set; assuming yes
16/06/17 09:18:47 INFO storage.MemoryStore: ensureFreeSpace(1888) called with curMem=0, maxMem=280248975
16/06/17 09:18:47 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1888.0 B, free 267.3 MB)
16/06/17 09:18:47 INFO storage.MemoryStore: ensureFreeSpace(1200) called with curMem=1888, maxMem=280248975
16/06/17 09:18:47 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1200.0 B, free 267.3 MB)
16/06/17 09:18:47 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:34509 (size: 1200.0 B, free: 267.3 MB)
16/06/17 09:18:47 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:874
16/06/17 09:18:48 INFO scheduler.DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:31)
16/06/17 09:18:48 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 10 tasks
16/06/17 09:18:48 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:18:48 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
16/06/17 09:18:48 INFO executor.Executor: Fetching http://192.168.0.103:40595/jars/spark-examples-1.4.0-hadoop2.6.0.jar with timestamp 1466180321955
16/06/17 09:18:48 INFO util.Utils: Fetching http://192.168.0.103:40595/jars/spark-examples-1.4.0-hadoop2.6.0.jar to /tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be/userFiles-b5232822-d080-43e9-a660-21c57227a5f5/fetchFileTemp4267591009933357699.tmp
16/06/17 09:19:03 INFO executor.Executor: Adding file:/tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be/userFiles-b5232822-d080-43e9-a660-21c57227a5f5/spark-examples-1.4.0-hadoop2.6.0.jar to class loader
16/06/17 09:19:04 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 736 bytes result sent to driver
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:04 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
16/06/17 09:19:04 INFO executor.Executor: Finished task 1.0 in stage 0.0 (TID 1). 736 bytes result sent to driver
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:04 INFO executor.Executor: Running task 2.0 in stage 0.0 (TID 2)
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 16077 ms on localhost (1/10)
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 195 ms on localhost (2/10)
16/06/17 09:19:04 INFO executor.Executor: Finished task 2.0 in stage 0.0 (TID 2). 736 bytes result sent to driver
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:04 INFO executor.Executor: Running task 3.0 in stage 0.0 (TID 3)
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 162 ms on localhost (3/10)
16/06/17 09:19:04 INFO executor.Executor: Finished task 3.0 in stage 0.0 (TID 3). 736 bytes result sent to driver
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:04 INFO executor.Executor: Running task 4.0 in stage 0.0 (TID 4)
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 469 ms on localhost (4/10)
16/06/17 09:19:04 INFO executor.Executor: Finished task 4.0 in stage 0.0 (TID 4). 736 bytes result sent to driver
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:04 INFO scheduler.TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 43 ms on localhost (5/10)
16/06/17 09:19:04 INFO executor.Executor: Running task 5.0 in stage 0.0 (TID 5)
16/06/17 09:19:05 INFO executor.Executor: Finished task 5.0 in stage 0.0 (TID 5). 736 bytes result sent to driver
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 250 ms on localhost (6/10)
16/06/17 09:19:05 INFO executor.Executor: Running task 6.0 in stage 0.0 (TID 6)
16/06/17 09:19:05 INFO executor.Executor: Finished task 6.0 in stage 0.0 (TID 6). 736 bytes result sent to driver
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:05 INFO executor.Executor: Running task 7.0 in stage 0.0 (TID 7)
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 134 ms on localhost (7/10)
16/06/17 09:19:05 INFO executor.Executor: Finished task 7.0 in stage 0.0 (TID 7). 736 bytes result sent to driver
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 38 ms on localhost (8/10)
16/06/17 09:19:05 INFO executor.Executor: Running task 8.0 in stage 0.0 (TID 8)
16/06/17 09:19:05 INFO executor.Executor: Finished task 8.0 in stage 0.0 (TID 8). 736 bytes result sent to driver
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, localhost, PROCESS_LOCAL, 1447 bytes)
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 50 ms on localhost (9/10)
16/06/17 09:19:05 INFO executor.Executor: Running task 9.0 in stage 0.0 (TID 9)
16/06/17 09:19:05 INFO executor.Executor: Finished task 9.0 in stage 0.0 (TID 9). 736 bytes result sent to driver
16/06/17 09:19:05 INFO scheduler.TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 99 ms on localhost (10/10)
16/06/17 09:19:05 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
16/06/17 09:19:05 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:35) finished in 17.272 s
16/06/17 09:19:05 INFO scheduler.DAGScheduler: Job 0 finished: reduce at SparkPi.scala:35, took 18.894726 s
Pi is roughly 3.142008
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/06/17 09:19:05 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/06/17 09:19:05 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.0.103:4040
16/06/17 09:19:05 INFO scheduler.DAGScheduler: Stopping DAGScheduler
16/06/17 09:19:05 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/06/17 09:19:05 INFO util.Utils: path = /tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be/blockmgr-cf9766ee-d283-4c16-a64e-45ee1c3a2a15, already present as root for deletion.
16/06/17 09:19:05 INFO storage.MemoryStore: MemoryStore cleared
16/06/17 09:19:05 INFO storage.BlockManager: BlockManager stopped
16/06/17 09:19:06 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/06/17 09:19:06 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/06/17 09:19:06 INFO spark.SparkContext: Successfully stopped SparkContext
16/06/17 09:19:06 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/06/17 09:19:06 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/06/17 09:19:06 INFO util.Utils: Shutdown hook called
16/06/17 09:19:06 INFO util.Utils: Deleting directory /tmp/spark-3b4a18d7-9cba-44c6-a12c-ef262a8299be
[root@cdh1 hadoop]# 
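The two failed attempts above came from running the command inside the Hadoop directory: `run-example` ships in Spark's `bin` directory, not Hadoop's, and sourcing `/etc/profile` (which presumably puts Spark's `bin` on `PATH` on this machine) is what made the third attempt work. A minimal sketch of the cleaner invocations, assuming the Spark install path and example jar name that appear in the log above:

```shell
# run-example is part of Spark, not Hadoop -- call it from Spark's home:
cd /user/local/spark-1.4.0-bin-hadoop2.6   # install path taken from the log above
./bin/run-example SparkPi 10

# Equivalent explicit submission; --master local[2] runs with two local worker threads:
./bin/spark-submit --master local[2] \
  --class org.apache.spark.examples.SparkPi \
  lib/spark-examples-1.4.0-hadoop2.6.0.jar 10
```

With `run-example`, arguments after the example name are handed to the example itself, so `spark-submit` is the unambiguous way to pin the master URL explicitly.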
Spark's local mode runs all of Spark's processes on a single machine, with no cluster resource manager of any kind. It is intended mainly for testing: it simulates distributed computation with multiple threads on one host (`local[2]` above asks for two worker threads). In local mode Spark can be run directly without starting Hadoop; only if the job needs to read or write HDFS must Hadoop/HDFS be started first. To verify the installation, run the bundled example from a terminal: 1. change into the Spark installation directory (here `/user/local/spark-1.4.0-bin-hadoop2.6`, as the log above shows); 2. run `./bin/run-example SparkPi`. The line "Pi is roughly 3.142008" in the output confirms that Spark is installed correctly and can run a job in local mode.
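SparkPi estimates π by Monte Carlo sampling: it throws random points into a square and counts how many fall inside the inscribed circle, so the fraction inside approaches π/4. Stripped of Spark, the same estimate fits in a few lines of awk (the sample count and seed below are arbitrary choices for illustration):

```shell
# Monte Carlo pi: the fraction of random points inside the unit circle approaches pi/4.
awk 'BEGIN {
  srand(1); n = 200000; inside = 0
  for (i = 0; i < n; i++) {
    x = 2 * rand() - 1; y = 2 * rand() - 1   # random point in the square [-1,1] x [-1,1]
    if (x * x + y * y <= 1) inside++         # inside the unit circle?
  }
  printf "Pi is roughly %f\n", 4 * inside / n
}'
```

SparkPi does exactly this, but splits the `n` samples across partitions (10 of them in the run above) and sums the per-partition counts with `reduce`.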