Spark Cluster Deployment


1. Scala Deployment

Unpack the Scala archive:

tar -zxvf scala-2.12.0.tgz

Configure the environment variables:

vi /etc/profile
export SCALA_HOME=/opt/cm/hadoop/scala-2.12.0
export PATH=$PATH:$SCALA_HOME/bin

Apply the changes:

source /etc/profile

Verify the installation:

scala -version
Scala code runner version 2.12.0 -- Copyright 2002-2016, LAMP/EPFL
Copy the Scala directory and /etc/profile to the other two nodes:

scp -r /opt/cm/hadoop/scala-2.12.0 root@192.168.50.236:/opt/cm/hadoop/
scp -r /opt/cm/hadoop/scala-2.12.0 root@192.168.50.237:/opt/cm/hadoop/
[root@hd01 hadoop]# scp -r /etc/profile root@192.168.50.236:/etc/
profile                                 100% 2363     2.3KB/s   00:00
[root@hd01 hadoop]# scp -r /etc/profile root@192.168.50.237:/etc/
profile
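
On hd02 and hd03 the copied profile still needs to be loaded. One quick way to apply it and check Scala from hd01 (a minimal sketch, assuming the same IPs as above and root SSH access) is:

ssh root@192.168.50.236 "source /etc/profile && scala -version"
ssh root@192.168.50.237 "source /etc/profile && scala -version"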

2. Spark Deployment
 

Unpack the Spark archive:

[root@hd01 hadoop]# tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz
[root@hd01 hadoop]# ll
total 758020
drwxr-xr-x. 10 1000 1000      4096 Aug 27 16:08 hadoop-2.9.0
-rw-r--r--.  1 root root 366744329 Aug 27 15:01 hadoop-2.9.0.tar.gz
drwxr-xr-x.  8   10  143      4096 Mar 15  2017 jdk1.8.0_131
-rw-r--r--.  1 root root 185540433 Aug 27 14:54 jdk-8u131-linux-x64.tar.gz
drwxrwxr-x.  6 1001 1001        46 Nov  8  2016 scala-2.12.0
-rw-r--r--.  1 root root  20177534 Aug 27 18:53 scala-2.12.0.tgz
drwxr-xr-x. 12  500  500      4096 Jul  1  2017 spark-2.2.0-bin-hadoop2.7
-rw-r--r--.  1 root root 203728858 Aug 27 18:58 spark-2.2.0-bin-hadoop2.7.tgz

Configure the environment variables for Spark:
vi /etc/profile

#spark
export SPARK_HOME=/opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin

Create working copies of the configuration templates:

[root@hd01 hadoop]# cd spark-2.2.0-bin-hadoop2.7/conf/
[root@hd01 conf]# cp spark-env.sh.template spark-env.sh
[root@hd01 conf]# cp log4j.properties.template log4j.properties
[root@hd01 conf]# cp slaves.template slaves


[root@hd01 conf]# vi spark-env.sh

#!/usr/bin/env bash

# Scala and JDK installation directories
export SCALA_HOME=/opt/cm/hadoop/scala-2.12.0
export JAVA_HOME=/opt/cm/hadoop/jdk1.8.0_131
# Memory each Worker may hand out to executors on its node
export SPARK_WORKER_MEMORY=1G
# Hadoop configuration directory, so Spark can pick up HDFS/YARN settings
export HADOOP_CONF_DIR=/opt/cm/hadoop/hadoop-2.9.0/etc/hadoop
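
Two more spark-env.sh settings are often added on a standalone cluster. They are not required for this walkthrough, and the values below are only placeholders to adjust to your machines:

# Host the standalone Master binds to (defaults to the local hostname)
export SPARK_MASTER_HOST=hd01
# Cores each Worker offers to executors (placeholder value)
export SPARK_WORKER_CORES=2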

The slaves file lists every host that should run a Worker:

[root@hd01 conf]# vi slaves

hd01
hd02
hd03
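
start-all.sh starts a Worker on every host listed in slaves over SSH, so hd01 needs passwordless SSH to each of them, including itself. If that was not already set up for the Hadoop cluster, a minimal sketch run on hd01 would be:

ssh-keygen -t rsa
ssh-copy-id root@hd01
ssh-copy-id root@hd02
ssh-copy-id root@hd03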


Distribute the Spark directory and /etc/profile to the other nodes:

scp -r /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7 root@192.168.50.237:/opt/cm/hadoop/
scp -r /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7 root@192.168.50.236:/opt/cm/hadoop/

[root@hd01 hadoop]# scp -r /etc/profile root@192.168.50.236:/etc/profile
profile                                 100% 2363     2.3KB/s   00:00
[root@hd01 hadoop]# scp -r /etc/profile root@192.168.50.237:/etc/profile   

Start the cluster from the master node:

[root@hd01 sbin]# /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.master.Master-1-hd01.out
hd02: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-hd02.out
hd01: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-hd01.out
hd03: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-hd03.out
[root@hd01 sbin]# jps
23840 Worker
23735 Master
23915 Jps
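
jps on hd01 shows both the Master and a Worker; hd02 and hd03 should each show a Worker. To further confirm the cluster is up, open the Master web UI at http://hd01:8080 (all three workers should be listed as ALIVE) and run the bundled SparkPi example against the standalone master. The example jar name below is the one shipped with spark-2.2.0-bin-hadoop2.7, but it is worth double-checking the exact file name under examples/jars:

/opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://hd01:7077 \
  /opt/cm/hadoop/spark-2.2.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.2.0.jar 10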

 
