Preparation
- Download Spark from http://spark.apache.org/downloads.html
- Download the latest Spark release pre-built "without Hadoop"; the advantage is that you are then free to pair it with whatever recent Hadoop version you like (see the download sketch after this list)
- Download Hadoop from https://hadoop.apache.org/releases.html
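A minimal sketch of fetching and unpacking both packages; the version numbers, the archive.apache.org mirror, and the /data/server install directory are assumptions for illustration, not taken from the original setup:

# download the "without hadoop" Spark build and a Hadoop release (versions are examples only)
wget https://archive.apache.org/dist/spark/spark-3.1.2/spark-3.1.2-bin-without-hadoop.tgz
wget https://archive.apache.org/dist/hadoop/common/hadoop-3.2.2/hadoop-3.2.2.tar.gz
# unpack both under /data/server (assumed install root, consistent with the JDK path used later)
sudo tar -zxf spark-3.1.2-bin-without-hadoop.tgz -C /data/server/
sudo tar -zxf hadoop-3.2.2.tar.gz -C /data/server/

Note that a "Hadoop free" Spark build does not ship Hadoop jars, so later on conf/spark-env.sh needs SPARK_DIST_CLASSPATH pointed at the local Hadoop installation (for example via the output of `hadoop classpath`).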
1. Basic environment configuration
[ec2-user@rcf-ai-datafeed-spark-prd-01 conf]$ cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost6 localhost6.localdomain6
10.16.5.162 rcf-ai-datafeed-spark-prd-01.wisers.com rcf-ai-datafeed-spark-prd-01  #master
10.16.5.177 rcf-ai-datafeed-spark-prd-02.wisers.com rcf-ai-datafeed-spark-prd-02  #slave
10.16.5.22  rcf-ai-datafeed-spark-prd-03.wisers.com rcf-ai-datafeed-spark-prd-03  #slave
10.16.5.243 rcf-ai-datafeed-spark-prd-04.wisers.com rcf-ai-datafeed-spark-prd-04  #slave
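Every node needs the same name resolution. A hypothetical way to push this hosts file from the master to the slaves, assuming passwordless ssh and sudo are already set up for ec2-user (neither is shown in the original):

# copy /etc/hosts to each slave and move it into place (assumed distribution step)
for h in rcf-ai-datafeed-spark-prd-02 rcf-ai-datafeed-spark-prd-03 rcf-ai-datafeed-spark-prd-04; do
    scp /etc/hosts ${h}:/tmp/hosts && ssh ${h} "sudo mv /tmp/hosts /etc/hosts"
done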
[ec2-user@rcf-ai-datafeed-spark-prd-01 conf]$ cat /etc/profile
#java config
export JAVA_HOME=/data/server/jdk
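An assumed verification step after editing /etc/profile (not part of the original listing): reload the profile in the current shell and confirm the JDK under /data/server/jdk is the one being picked up.

# apply the profile change and check the Java version
source /etc/profile
$JAVA_HOME/bin/java -version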