The previous post gave a quick introduction to Spark's Hello World. This one records how to get the Spark Java source code, import it into Eclipse, and build it with Maven.
1. Check out the source code from GitHub
$git clone git://github.com/perwendel/spark.git
2. Eclipsify the project
$cd spark
$mvn eclipse:eclipse -Dwtpversion=2.0
Now you can open Eclipse and import it as an existing project; Eclipse's Import Wizard should recognize spark as an Eclipse project.
3. Run the Maven build
$mvn clean install
If this step fails with:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] Failure executing javac, but could not parse the error:
javac: invalid target release: 1.7
Usage: javac <options> <source files>
use -help for a list of possible options
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.283s
[INFO] Finished at: Mon Apr 15 17:12:45 EDT 2013
[INFO] Final Memory: 5M/81M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) on project spark-core: Compilation failure
[ERROR] Failure executing javac, but could not parse the error:
[ERROR] javac: invalid target release: 1.7
[ERROR] Usage: javac <options> <source files>
[ERROR] use -help for a list of possible options
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
The reason is that spark requires Java JDK 1.7; if you don't have it, you can download it from Oracle's website. Verify your version with:
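The "invalid target release: 1.7" error comes from the compiler settings in the project's pom.xml: the build asks javac to emit Java 7 bytecode, which an older JDK cannot do. A sketch of what that configuration typically looks like (the exact plugin version and values in the real pom may differ):

```xml
<!-- maven-compiler-plugin pinned to Java 7 source and bytecode.
     A javac from JDK 6 or earlier cannot satisfy target 1.7 and
     fails with the error shown above. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.7</source>
    <target>1.7</target>
  </configuration>
</plugin>
```

Once a JDK 1.7 `javac` is on the PATH (or pointed to by JAVA_HOME), this compile step should succeed.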
$java -version
java version "1.7.0_17"
Java(TM) SE Runtime Environment (build 1.7.0_17-b02)
Java HotSpot(TM) 64-Bit Server VM (build 23.7-b01, mixed mode)
After that, run the build again:
$mvn clean install
It should download the dependencies, run compile, test, and package, and produce a spark jar.
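`mvn install` also puts that jar into your local Maven repository, so you can depend on it from your own project. A hedged sketch of the dependency block; the coordinates below are assumptions for illustration, so copy the real values from the `<groupId>`, `<artifactId>`, and `<version>` elements of the pom.xml in the spark checkout you just built:

```xml
<!-- Hypothetical coordinates: verify against the spark project's own pom.xml. -->
<dependency>
  <groupId>spark</groupId>
  <artifactId>spark-core</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
```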