Notes on setting up standalone Spark

1. Turn off the firewall and disable SELinux

service iptables stop && chkconfig iptables off  
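The step heading also calls for disabling SELinux, which the command above does not cover. A minimal sketch, assuming the same CentOS 6-style system implied by service/chkconfig:

# Disable SELinux for the current session
setenforce 0
# Disable it permanently (takes effect after a reboot)
sed -i 's/^SELINUX=enforcing/SELINUX=disabled/' /etc/selinux/config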

2. Passwordless SSH login setup

There are plenty of walkthroughs online, so only the last step is shown here:

cat id_rsa.pub >> authorized_keys
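For reference, the full sequence usually looks like the sketch below (assuming the default ~/.ssh paths and an RSA key):

# Generate a key pair; press Enter through the prompts
ssh-keygen -t rsa
cd ~/.ssh
# Append the public key to authorized_keys (the step shown above)
cat id_rsa.pub >> authorized_keys
# sshd rejects authorized_keys files that are group/world-writable
chmod 600 authorized_keys
# Verify: this should log in without asking for a password
ssh localhost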

3. Download Scala, the JDK, and Spark

[root@spark1 opt]# ll
total 364104
drwxr-xr-x.  8 uucp  143      4096 Mar 18 04:03 jdk1.8.0_05
-rw-r--r--.  1 root root 159910666 Jun 24 22:56 jdk-8u5-linux-i586.gz
drwxrwxr-x.  6 2000 2000      4096 May 21 04:12 scala
-rw-r--r--.  1 root root  25685521 Jun 25 06:53 scala-2.11.1.tgz
drwxrwxr-x. 11 1000 1000      4096 Jun 25 08:32 spark
-rw-r--r--.  1 root root 187224108 Jun 24 22:53 spark-1.0.0-bin-hadoop2.tgz
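The scala and spark directories in the listing suggest the tarballs were unpacked and then renamed to short paths; a sketch of those steps (directory names assumed from the listing above):

cd /opt
tar -zxf jdk-8u5-linux-i586.gz           # unpacks to jdk1.8.0_05 (file is a gzipped tar)
tar -zxf scala-2.11.1.tgz                # unpacks to scala-2.11.1
tar -zxf spark-1.0.0-bin-hadoop2.tgz     # unpacks to spark-1.0.0-bin-hadoop2
# Rename to the short paths referenced in /etc/profile below
mv scala-2.11.1 scala
mv spark-1.0.0-bin-hadoop2 spark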

4. Configure /etc/profile

JAVA_HOME=/opt/jdk1.8.0_05
CLASSPATH=.:$JAVA_HOME/lib/tools.jar
PATH=$JAVA_HOME/bin:$PATH
export JAVA_HOME CLASSPATH PATH
export SCALA_HOME=/opt/scala
export SPARK_HOME=/opt/spark
export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
Run source /etc/profile to make the changes take effect.
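A quick sanity check that the new environment is active:

java -version      # should report 1.8.0_05
scala -version     # should report 2.11.1
echo $SPARK_HOME   # should print /opt/spark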

5. Configure the Spark config files

[root@spark1 spark]# cat  conf/spark-env.sh
export SPARK_MASTER_IP=192.168.43.188
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_MASTER_PORT=7074
export SPARK_WORKER_MEMORY=512m
export MASTER=spark://${SPARK_MASTER_IP}:${SPARK_MASTER_PORT}
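If conf/spark-env.sh does not exist yet, the Spark tarball ships a template to copy first:

cd /opt/spark/conf
cp spark-env.sh.template spark-env.sh
# then append the export lines shown above

Note that SPARK_MASTER_PORT is set to 7074 here; the standalone master defaults to 7077.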

[root@spark1 conf]# cat slaves 
# A Spark Worker will be started on each of the machines listed below.
192.168.43.188

6. Start the cluster

[root@spark1 ~]# cd /opt/spark/sbin/
[root@spark1 sbin]# ls
slaves.sh        spark-daemon.sh   spark-executor  start-history-server.sh  start-slave.sh   stop-all.sh             stop-master.sh
spark-config.sh  spark-daemons.sh  start-all.sh    start-master.sh          start-slaves.sh  stop-history-server.sh  stop-slaves.sh
[root@spark1 sbin]# ./stop-all.sh 
192.168.43.188: stopping org.apache.spark.deploy.worker.Worker
stopping org.apache.spark.deploy.master.Master
[root@spark1 sbin]# ./start-all.sh 
starting org.apache.spark.deploy.master.Master, logging to /opt/spark/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-spark1.ouyang.cn.out
192.168.43.188: starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-spark1.ouyang.cn.out
[root@spark1 sbin]# 
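To confirm both daemons are actually running, jps (bundled with the JDK) should list one Master and one Worker JVM:

jps
# expected process names: Master, Worker (plus Jps itself)
# if one is missing, check the log files printed by start-all.sh above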


7. Check the web UI

(The original post included screenshots of the Spark master web UI here.)
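The standalone master web UI listens on port 8080 by default, so with the settings above it should be reachable at http://192.168.43.188:8080. As a smoke test, the bundled SparkPi example can be run against the cluster (a sketch; in Spark 1.0 bin/run-example picks up the MASTER variable exported in spark-env.sh and otherwise falls back to local mode):

cd /opt/spark
./bin/run-example SparkPi 10
# a finished application should then show up in the web UI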