Environment:
Hadoop 2.6.0-cdh5.7.0
JDK 8+
Maven 3.5.4+
Scala 2.11
Note: Spark versions before 2.4.0 only need Maven 3.3.9+, and versions before 2.2.0 still support JDK 7.
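Before downloading the source, it is worth confirming that the build machine actually has these versions on the PATH. A minimal check (output will vary by host; the prompt below simply mirrors the one used throughout this walkthrough):
[hadoop@hdo001 softwore]$ java -version        # expect 1.8.x
[hadoop@hdo001 softwore]$ mvn -v               # expect Apache Maven 3.5.4 or newer
[hadoop@hdo001 softwore]$ hadoop version       # expect 2.6.0-cdh5.7.0
[hadoop@hdo001 softwore]$ scala -version       # expect 2.11.x; only relevant if a local Scala install is used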
1. Download and unpack the source
[hadoop@hdo001 softwore]$ wget https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0.tgz
[hadoop@hdo001 softwore]$ mv spark-2.4.0.tgz spark-2.4.0_source.tgz
[hadoop@hdo001 softwore]$ tar -zxvf spark-2.4.0_source.tgz -C ~/source/
[hadoop@hdo001 softwore]$ mv ~/source/spark-2.4.0 ~/source/spark-2.4.0_source
[hadoop@hdo001 softwore]$ cd ~/source/spark-2.4.0_source
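Optionally, verify that the tarball downloaded intact before building. This is a sketch run from the download directory; it assumes Apache still publishes a spark-2.4.0.tgz.sha512 checksum next to the tarball in the archive directory (check the directory listing if the URL does not resolve). Because the tarball was renamed above, compare the two hashes by eye rather than with sha512sum -c:
[hadoop@hdo001 softwore]$ wget https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0.tgz.sha512
[hadoop@hdo001 softwore]$ cat spark-2.4.0.tgz.sha512
[hadoop@hdo001 softwore]$ sha512sum spark-2.4.0_source.tgz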
2. Build
2.1 Modify the configuration files
(1) vim ./dev/make-distribution.sh
# comment out lines 128~146, i.e. the block below that resolves the version variables by calling Maven
#VERSION=$("$MVN" help:evaluate -Dexpression=project.version $@ 2>/dev/null\
# | grep -v "INFO"\
# | g