Local debugging:
Code:
package com.atguigu

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    // Create a SparkConf and set the application name; local[*] uses all local cores
    val conf = new SparkConf().setAppName("WC").setMaster("local[*]")
    // Create a SparkContext, the entry point for submitting a Spark app
    val sc = new SparkContext(conf)
    // Use sc to build an RDD, apply the transformations, and trigger the action
    sc.textFile(args(0))
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _, 1)
      .sortBy(_._2, false)
      .saveAsTextFile(args(1))
    sc.stop()
  }
}
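A note on running it: in IDEA, pass two program arguments, where args(0) is the input path and args(1) is the output directory; the output directory must not already exist, or saveAsTextFile will fail.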
If the following exception appears:
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
The cause is that local Spark debugging depends on a Hadoop environment; pointing the run environment at a local Hadoop installation resolves it.
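One way to supply that dependency is to set the Hadoop home before creating the SparkContext; a minimal sketch, where C:\hadoop is an assumed path whose bin\ folder contains winutils.exe:

// Assumed local path; its bin\ subdirectory must contain winutils.exe
System.setProperty("hadoop.home.dir", "C:\\hadoop")

Setting the HADOOP_HOME environment variable to the same directory works as well.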
Remote debugging:
Remote debugging through IDEA essentially uses IDEA as the Driver to submit the application. The configuration is as follows:
Modify the SparkConf: add the Jar that will ultimately be run and the Driver program's address, and set the Master URL to submit to:
val conf = new SparkConf().setAppName("WC")
.setMaster("spark://hadoop102:7077")
.setJars(List("E:\\SparkIDEA\\spark_test\\target\\WordCount.jar"))
Then set breakpoints and debug directly.
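Putting the pieces together, a minimal sketch of the remote-debug Driver; the jar path matches the one above, while spark.driver.host and its IP value are assumptions for illustration (it should be the address of the machine running IDEA, reachable from the cluster, so the Executors can connect back to the Driver):

package com.atguigu

import org.apache.spark.{SparkConf, SparkContext}

object WordCountRemote {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WC")
      .setMaster("spark://hadoop102:7077")
      // Jar built by mvn package that contains this class
      .setJars(List("E:\\SparkIDEA\\spark_test\\target\\WordCount.jar"))
      // Assumed IP of the machine running IDEA, reachable from the cluster
      .setIfMissing("spark.driver.host", "192.168.1.100")
    val sc = new SparkContext(conf)
    sc.textFile(args(0))
      .flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      // collect() brings results back to the Driver, where breakpoints hit in IDEA
      .collect()
      .foreach(println)
    sc.stop()
  }
}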
The Spark project's pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
<build>
    <finalName>WordCount</finalName>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>3.0.0</version>
            <configuration>
                <archive>
                    <manifest>
                        <!-- Change this to your own main class -->
                        <mainClass>com.atguigu.WordCount</mainClass>
                    </manifest>
                </archive>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
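With this pom, mvn package produces two artifacts under target/: WordCount.jar (the plain jar referenced by setJars above) and WordCount-jar-with-dependencies.jar (the fat jar built by the assembly plugin).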