Spark job launch script memo

#!/bin/bash

# This script assumes you want to run your Spark program on a Spark cluster,
# with the Spark master running on server "myserver100".
# It is intended as a template:
# --------------------------------------------------------------------------------
# 1. You have installed the data-algorithms-book in /home/mp/data-algorithms-book (BOOK_HOME)
# 2. Spark 1.5.2 is installed at /usr/local/spark-1.5.2
# 3. You have built the source code and generated $DAB/dist/data_algorithms_book.jar
# 4. You have two input parameters, identified as P1 and P2
# 5. Modify the spark-submit parameters to match your environment
#    (the exports below use local paths that differ from this template)
# --------------------------------------------------------------------------------
#
export JAVA_HOME=/home/nianhua/soft/jdk1.8.0_45
# java is defined at $JAVA_HOME/bin/java
export BOOK_HOME=/data/spark/demo3
export SPARK_HOME=/home/nianhua/soft/spark-1.3.0-bin-hadoop2.4
export SPARK_MASTER=spark://tuijian-mnger.cando.site:7077
#export SPARK_JAR=$BOOK_HOME/lib/spark-assembly-1.5.2-hadoop2.6.0.jar
export APP_JAR=$BOOK_HOME/sparkwordcount.jar
#
# Build a comma-separated list of all dependent jars in OTHER_JARS
JARS=$(find $BOOK_HOME/lib -name '*.jar')
OTHER_JARS=""
for J in $JARS ; do
  OTHER_JARS=$J,$OTHER_JARS
done
# drop the trailing comma left by the loop
OTHER_JARS=${OTHER_JARS%,}
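The jar-list construction can be sanity-checked in isolation. A minimal sketch, assuming nothing but a temporary scratch directory (all paths here are illustrative, not part of the real script), builds the comma-separated list and strips any trailing comma before it would reach `--jars`:

```shell
# Build a comma-separated jar list from a scratch directory (illustrative only).
LIBDIR=$(mktemp -d)
touch "$LIBDIR/a.jar" "$LIBDIR/b.jar"

OTHER_JARS=""
for J in $(find "$LIBDIR" -name '*.jar' | sort); do
  OTHER_JARS=$J,$OTHER_JARS        # prepend each jar, comma-separated
done
OTHER_JARS=${OTHER_JARS%,}         # drop any trailing comma

echo "$OTHER_JARS"
rm -rf "$LIBDIR"
```

The `${OTHER_JARS%,}` parameter expansion removes a single trailing comma if one is present, so the value is safe to hand to `--jars`.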

#
echo $JAVA_HOME
P1=local
P2=1
DRIVER_CLASS_NAME=$1
# Note: --num-executors is a YARN option; with a standalone master, executor
# allocation is governed by --total-executor-cores / --executor-cores.
nohup $SPARK_HOME/bin/spark-submit \
  --class $DRIVER_CLASS_NAME \
  --master $SPARK_MASTER \
  --num-executors 10 \
  --driver-memory 2g \
  --executor-memory 5g \
  --total-executor-cores 12 \
  --executor-cores 10 \
  --driver-java-options "-Dspark.akka.frameSize=25" \
  --jars $OTHER_JARS \
  $APP_JAR $P1 $P2 &
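The script takes the fully qualified driver class as `$1` but never checks it, so an empty value would reach `--class` silently. A small guard like the following could fail fast instead; the class name `com.example.WordCountDriver` is purely illustrative:

```shell
# Guard sketch: refuse to submit without a driver class argument.
require_driver_class() {
  if [ -z "$1" ]; then
    echo "usage: $(basename "$0") <fully.qualified.DriverClass>" >&2
    return 1
  fi
  DRIVER_CLASS_NAME=$1
}

# Hypothetical invocation for illustration:
require_driver_class com.example.WordCountDriver && echo "submitting $DRIVER_CLASS_NAME"
```

Dropping this function above the `DRIVER_CLASS_NAME=$1` line and calling `require_driver_class "$1" || exit 1` keeps the rest of the script unchanged.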