Spark Cluster Installation and Deployment (Spark on YARN)

1. Prerequisites
A Hadoop 2.2 cluster is already deployed.

2. Download and Install Scala
2.1 Download Scala


2.2 Install Scala
mkdir -p /usr/local/myspark/scala
cd /opt
tar -zxvf scala-2.11.2.tgz
cp -r scala-2.11.2 /usr/local/myspark/scala/
echo 'export SCALA_HOME=/usr/local/myspark/scala/scala-2.11.2' >> /etc/profile
echo 'export PATH=$SCALA_HOME/bin:$PATH' >> /etc/profile
source /etc/profile
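A caveat on the `echo ... >> /etc/profile` lines: with double quotes the current shell expands `$SCALA_HOME` and `$PATH` *before* writing, and since `SCALA_HOME` is not yet set in the current shell, the `PATH` line would be written broken. Single quotes write the variables literally so they expand at login time. A safe-to-run sketch of the difference (using a temp file instead of `/etc/profile`):

```shell
# Append the exports with single quotes so the variable references are
# stored literally and expanded when the profile is sourced, not now.
profile=$(mktemp)
echo 'export SCALA_HOME=/usr/local/myspark/scala/scala-2.11.2' >> "$profile"
echo 'export PATH=$SCALA_HOME/bin:$PATH' >> "$profile"   # stays literal in the file
cat "$profile"
```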

3. Download Spark


4. Install Spark
mkdir -p /usr/local/myspark/spark
cd /opt
tar -zxvf spark-1.0.2-bin-hadoop2.tgz
cp -r spark-1.0.2-bin-hadoop2 /usr/local/myspark/spark/


5. Configuration
5.1 Configure spark-env.sh
cd /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/conf
cp spark-env.sh.template spark-env.sh
vi spark-env.sh

export JAVA_HOME=/usr/local/java/jdk1.7.0_25

export SCALA_HOME=/usr/local/myspark/scala/scala-2.11.2

export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.2.0/etc/hadoop

export SPARK_HOME=/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2
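Before starting Spark it is worth confirming that every path referenced in spark-env.sh actually exists on the node. A quick sanity-check sketch (paths are the ones used in this guide; adjust to your layout):

```shell
# Print "ok" or "missing" for each directory configured in spark-env.sh.
check() { [ -d "$1" ] && echo "ok: $1" || echo "missing: $1"; }
check /usr/local/java/jdk1.7.0_25
check /usr/local/myspark/scala/scala-2.11.2
check /usr/local/hadoop/hadoop-2.2.0/etc/hadoop
check /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2
```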





5.2 Edit spark-defaults.conf
cd /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/conf
cp spark-defaults.conf.template spark-defaults.conf
vi spark-defaults.conf

spark.master yarn-cluster
spark.eventLog.enabled true
spark.eventLog.dir hdfs://master:9000/sparkeventlog
spark.serializer org.apache.spark.serializer.KryoSerializer
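spark-defaults.conf is whitespace-separated key/value pairs; blank lines and `#` comments are ignored. A quick sketch for listing the effective settings from a copy of the file (sample pairs taken from this guide):

```shell
# Build a sample conf file, then strip comments/blanks and print key=value pairs.
conf=$(mktemp)
cat > "$conf" <<'EOF'
# comment line, ignored by Spark
spark.master yarn-cluster
spark.eventLog.enabled true
EOF
pairs=$(grep -Ev '^[[:space:]]*(#|$)' "$conf" | awk '{print $1 "=" $2}')
echo "$pairs"
```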



5.3 Edit slaves
cd /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/conf
vi slaves
master
master2
slave1
slave2
slave3
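The slaves file holds one worker hostname per line; the sbin scripts loop over it and ssh into each host to start a Worker. A minimal sketch of that pattern (echoing instead of running ssh, so it is safe to execute anywhere):

```shell
# Recreate the slaves list from this guide in a temp file and loop over it
# the way start-all.sh does, but only echo what would run on each host.
slaves=$(mktemp)
printf '%s\n' master master2 slave1 slave2 slave3 > "$slaves"
started=""
while read -r host; do
  echo "would ssh $host and run start-slave.sh"
  started="$started $host"
done < "$slaves"
```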




5.4 Edit log4j.properties
cd /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/conf
cp log4j.properties.template log4j.properties
vi log4j.properties

# Log everything to both the console and a rolling file
spark.log=/var/log
log4j.rootCategory=INFO, console,file
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n


log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=5MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
log4j.appender.file.File=${spark.log}/spark.log


# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
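With the rolling file appender above, log4j keeps spark.log plus up to MaxBackupIndex rotated copies, so the worst-case disk use is bounded. A quick back-of-the-envelope check with the values from this config:

```shell
# (MaxBackupIndex + 1) files of at most MaxFileSize each: (10 + 1) * 5MB.
max_mb=$(( (10 + 1) * 5 ))
echo "worst-case log disk usage: ${max_mb}MB"
```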




5.5 Edit /etc/profile
echo 'export SPARK_EXAMPLES_JAR=/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/lib/spark-examples-1.0.2-hadoop2.2.0.jar' >> /etc/profile
echo 'export SPARK_HOME=/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2' >> /etc/profile
echo 'export PATH=$SPARK_HOME/bin:$PATH' >> /etc/profile
source /etc/profile



6. Copy to the Other Nodes
Run the following on each of the other nodes:
mkdir -p /usr/local/myspark/scala
mkdir -p /usr/local/myspark/spark
scp -r 10.41.2.82:/usr/local/myspark/scala/scala-2.11.2 /usr/local/myspark/scala/
scp -r 10.41.2.82:/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2 /usr/local/myspark/spark/
echo 'export SCALA_HOME=/usr/local/myspark/scala/scala-2.11.2' >> /etc/profile
echo 'export PATH=$SCALA_HOME/bin:$PATH' >> /etc/profile
echo 'export SPARK_HOME=/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2' >> /etc/profile
echo 'export PATH=$SPARK_HOME/bin:$PATH' >> /etc/profile
source /etc/profile
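Alternatively, the copy can be driven from the master instead of being repeated on each node: loop over the worker hostnames and push the tree with scp. A dry-run sketch (hostnames are the ones used in this guide; drop the echo and ensure passwordless ssh to actually copy):

```shell
# Echo the scp command that would push the Spark tree to each worker.
count=0
for host in master2 slave1 slave2 slave3; do
  echo "scp -r /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2 $host:/usr/local/myspark/spark/"
  count=$((count + 1))
done
```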


7. Start and Stop
7.1 Start
Run on the master (10.41.2.82):
/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/sbin/start-all.sh




7.2 Stop
/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/sbin/stop-all.sh

8. Test
8.1 Web UI
http://10.41.2.82:8080
http://master:8080

8.2 Run the Demo
Execute the following command on 10.41.2.82:

/usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-cluster \
  --num-executors 3 \
  --driver-memory 4g \
  --executor-memory 2g \
  --executor-cores 1 \
  /usr/local/myspark/spark/spark-1.0.2-bin-hadoop2/lib/spark-examples-1.0.2-hadoop2.2.0.jar 10
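The flags above determine what the job asks YARN for. A quick sketch of the totals implied by these values (3 executors of 2g plus a 4g driver, 1 core per executor):

```shell
# Sum up the memory and executor cores requested by the spark-submit flags.
num_executors=3; executor_mem_g=2; driver_mem_g=4; executor_cores=1
total_mem_g=$(( num_executors * executor_mem_g + driver_mem_g ))
total_cores=$(( num_executors * executor_cores ))
echo "total memory requested: ${total_mem_g}g, executor cores: ${total_cores}"
```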

Check the result by visiting:
http://master:8088/proxy/application_1409622175934_0001/A





Click "logs".

The result is:
Pi is roughly 3.145044
