1. Installing Spark Standalone on a Cluster
To install Spark in standalone mode, you simply place a compiled version of Spark on each node of the cluster.
For details, see the following blog posts:
http://blog.chinaunix.net/uid-29454152-id-5148300.html
http://blog.chinaunix.net/uid-29454152-id-5148347.html
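One way to place the same compiled build on every node is to script the copy. A minimal sketch follows; the archive name, the hostnames worker1/worker2, and the /opt paths are assumptions for illustration, not part of the original setup:

```shell
#!/bin/sh
# Sketch: push one compiled Spark archive to each node and unpack it.
# SPARK_TGZ is an assumed build-archive name; replace it with your own.
SPARK_TGZ="spark-1.4.0-bin-hadoop2.6.tgz"
PLAN=""
for host in worker1 worker2; do
  # Collect the commands as text so the plan can be reviewed before running;
  # pipe the result to 'sh' to actually copy and unpack.
  PLAN="${PLAN}scp ${SPARK_TGZ} ${host}:/opt/
ssh ${host} tar -xzf /opt/${SPARK_TGZ} -C /opt/
"
done
printf '%s' "$PLAN"
```

Printing the plan first makes it easy to check hostnames and paths before any file is copied.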
2. Starting a Cluster Manually
1) On the master
Command to start the Spark master:
sudo ./sbin/start-master.sh
The spark://HOST:PORT URL can be found in the master's web UI at:
http://localhost:8080
2) On each worker
Command to connect a worker to the master:
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
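The worker command above just needs the master URL filled in. A small sketch of composing it from named variables; the host and port values here are placeholders (7077 is the standalone master's default port; confirm the real URL in the web UI at http://localhost:8080):

```shell
#!/bin/sh
# Sketch: compose the spark://HOST:PORT URL a worker needs to join the master.
# MASTER_HOST is a placeholder; use your master's actual address.
MASTER_HOST="127.0.0.1"
MASTER_PORT=7077   # default port of the standalone master
MASTER_URL="spark://${MASTER_HOST}:${MASTER_PORT}"
# Build the full worker launch command; run it from the Spark directory.
WORKER_CMD="./bin/spark-class org.apache.spark.deploy.worker.Worker ${MASTER_URL}"
echo "$WORKER_CMD"
```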
3. Connecting an Application to the Cluster
Command to start an interactive application (spark-shell) against the cluster:
sudo ./bin/spark-shell --master spark://IP:PORT --total-executor-cores <numCores>
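With the placeholders filled in, the shell invocation looks like the sketch below; the URL and core count are example values, and capping --total-executor-cores keeps one interactive shell from claiming every core on the cluster:

```shell
#!/bin/sh
# Sketch: a concrete spark-shell invocation with the placeholders filled in.
# Both values are assumptions; substitute your own master URL and core budget.
MASTER_URL="spark://127.0.0.1:7077"
TOTAL_CORES=2   # limit how many cluster cores this shell may use
SHELL_CMD="./bin/spark-shell --master ${MASTER_URL} --total-executor-cores ${TOTAL_CORES}"
# Print the command; run it directly (from the Spark directory) to get a shell.
echo "$SHELL_CMD"
```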
4. Submitting a Jar
Run from the Spark installation directory:
./bin/spark-submit --class path.to.your.class [options] <app jar>
Example (standalone mode):
./bin/spark-submit \
  --class my.main.classname \
  --master spark://127.0.0.1:7077 \
  --executor-memory 2G \
  --total-executor-cores 4 \
  /home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar
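The example above can also be assembled from named variables, which makes each option easy to check before launching. A sketch under the same example values (class name and jar path mirror the example and are assumptions):

```shell
#!/bin/sh
# Sketch: build the spark-submit argument list from named variables.
# MAIN_CLASS, MASTER_URL, and APP_JAR are example values from the text above.
MAIN_CLASS="my.main.classname"
MASTER_URL="spark://127.0.0.1:7077"
APP_JAR="/home/warrior/IdeaProjects/sparkTest/out/artifacts/sparkTest_jar/sparkTest.jar"
# Put the full command into the positional parameters, one argument each.
set -- ./bin/spark-submit \
  --class "$MAIN_CLASS" \
  --master "$MASTER_URL" \
  --executor-memory 2G \
  --total-executor-cores 4 \
  "$APP_JAR"
# Print the assembled command; replace 'echo' with exec "$@" to run it.
echo "$@"
```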