First, install Maven.
Download it from the official Maven site; the file is named apache-maven-3.3.9-bin.zip.
After downloading, install it under /usr/local/maven:
First unpack the Maven archive:
sudo unzip ~/Downloads/apache-maven-3.3.9-bin.zip -d /usr/local
Then rename the apache-maven-3.3.9 directory to maven and give the hadoop user ownership of it:
sudo mv /usr/local/apache-maven-3.3.9 /usr/local/maven
sudo chown -R hadoop /usr/local/maven
Java application code:
First create a directory, sparkapp2, as the application's root directory:
mkdir -p ./sparkapp2/src/main/java
Then create a file named SimpleApp.java under ./sparkapp2/src/main/java and add the following code:
/*** SimpleApp.java ***/
import org.apache.spark.api.java.*;                  // Spark Java API classes
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "file:///usr/local/spark/README.md";
        // Initialize the Spark context in local mode.
        JavaSparkContext sc = new JavaSparkContext("local", "Simple App",
                "file:///usr/local/spark/", new String[]{"target/simple-project-1.0.jar"});
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("a"); }
        }).count();

        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("b"); }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        sc.stop();
    }
}
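The filter-and-count logic in SimpleApp can be tried out without Spark at all. The sketch below (an illustration added here, not part of the original tutorial) reproduces the same counting with plain java.util.stream over an in-memory list of stand-in lines:

```java
import java.util.List;

public class SimpleAppLocal {
    // Count how many lines contain the given substring -- the same
    // predicate the Spark filters above apply to each RDD element.
    static long countLinesWith(List<String> lines, String needle) {
        return lines.stream().filter(s -> s.contains(needle)).count();
    }

    public static void main(String[] args) {
        // Stand-in for the README lines the Spark job would read.
        List<String> lines = List.of("Apache Spark", "is a fast engine", "for big data");
        long numAs = countLinesWith(lines, "a");
        long numBs = countLinesWith(lines, "b");
        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        // prints "Lines with a: 3, lines with b: 1"
    }
}
```

The Spark version distributes exactly this predicate over the RDD's partitions; the stream version is just a single-JVM analogue.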
This program depends on the Spark Java API, so we need to compile and package it with Maven.
Create a file named pom.xml in ./sparkapp2 (vim ./sparkapp2/pom.xml) and add the following content,
which declares the standalone application's metadata and its dependency on Spark:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <repositories>
    <repository>
      <id>Akka repository</id>
      <url>http://repo.akka.io/releases</url>
    </repository>
  </repositories>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.2</version>
    </dependency>
  </dependencies>
</project>
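If Maven's default (very old) compiler source level causes compilation problems, the level can be pinned in the same pom.xml via the standard maven-compiler-plugin properties. This is an optional fragment not in the original tutorial, and the 1.7 level shown is only an assumption matching the Spark 1.6 era:

```xml
<properties>
  <!-- Hypothetical addition: pin the Java source/target level for the build -->
  <maven.compiler.source>1.7</maven.compiler.source>
  <maven.compiler.target>1.7</maven.compiler.target>
</properties>
```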
Package the Java program with Maven:
Change into the project's root directory (~/sparkapp2) and run:
/usr/local/maven/bin/mvn package
The first build may take a while because Maven downloads the declared dependencies; a successful build produces target/simple-project-1.0.jar.
Run the Java program
Here we use spark-submit to run the packaged Java program:
/usr/local/spark/bin/spark-submit --class "SimpleApp" ~/sparkapp2/target/simple-project-1.0.jar
Since spark-submit prints a lot of log output, piping through grep makes the result line easier to spot:
/usr/local/spark/bin/spark-submit --class "SimpleApp" ~/sparkapp2/target/simple-project-1.0.jar 2>&1 | grep "Lines with"