Note: starting with Spark 2.2, JDK 1.7 is no longer supported; everything below is built and run on JDK 1.8 with Scala 2.11 or later.
Build environment:
java version "1.8.0_131"
Apache Maven 3.3.9
Linux
On the Linux machine, run the following commands:
cd $spark_home/spark-2.2.0-rc4
./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 -Phive -Phive-thriftserver -Pmesos -Pyarn
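Before kicking off the build, it can help to give Maven extra heap, as the Spark build documentation suggests (make-distribution.sh sets this itself when the variable is unset, so exporting it explicitly is just a belt-and-suspenders step):

```shell
# Recommended Maven memory settings for building Spark (per the Spark
# "Building Spark" docs); harmless if make-distribution.sh overrides them.
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
```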
The build fails with the following error:
[error] javac: invalid source release: 1.8
[error] Usage: javac <options> <source files>
[error] use -help for a list of possible options
[error] Compile failed at Jun 13, 2017 4:37:31 PM [0.979s]
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default-cli) on project spark-tags_2.11: Execution default-cli of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
Solution:
Run the following commands to inspect the Java and Apache Maven environments:
java -version
which java
javac -version
which javac
mvn -version
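If `java` and `javac` report different major versions, one of them is still resolving to the old JDK. A small sketch for comparing the two (the helper name `major_of` and its parsing rules are mine, not from this post):

```shell
# Hypothetical helper: extract the major Java version from a
# "java -version"-style banner so java and javac can be compared.
major_of() {
  # 'java version "1.8.0_131"' -> 8 ; 'openjdk version "11.0.2"' -> 11
  echo "$1" | sed -n 's/.*"\([0-9][0-9]*\)\.\([0-9][0-9]*\).*/\1 \2/p' \
            | awk '{ print ($1 == 1) ? $2 : $1 }'
}

# Intended use (commented out to keep this sketch side-effect free):
#   major_of "$(java -version 2>&1 | head -n1)"
major_of 'java version "1.8.0_131"'   # -> 8
```

A mismatch here usually means PATH or JAVA_HOME was updated for one tool but not the other.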
The root cause: this server originally had JDK 1.7 installed. Although /etc/profile was later updated to point at JDK 1.8, the build had already been run once under 1.7, which left a zinc incremental-compile server running in the background against the old JDK.
Find the stale zinc process and kill it (substitute the PID reported by ps):
ps -ef | grep zinc
kill -9 <zinc-pid>
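The two steps above can be scripted. A sketch (the helper name `zinc_pids` is mine, not from this post) that pulls the PID column out of matching `ps -ef` lines:

```shell
# Hypothetical helper: print the PID column of ps -ef lines mentioning zinc.
# The [z] character-class trick keeps grep from matching its own process.
zinc_pids() {
  grep '[z]inc' | awk '{ print $2 }'
}

# Intended use (commented out so this sketch has no side effects):
#   ps -ef | zinc_pids | xargs -r kill -9
```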
Recompile; this time the build succeeds:
./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
Finally, the Apache JIRA ticket that tracks this issue and its resolution:
https://issues.apache.org/jira/browse/SPARK-21075