IDEA version: IntelliJ IDEA Community Edition 2016.1.1 (64-bit)
Creating a Spark Project and pom.xml in IDEA
1. Creating the Spark project in IDEA
1.1 Launch the IDEA executable
1.2 Add the Scala plugin
1.3 Choose Create New Project, then select the scala-tools archetype
![Create New Project dialog](https://img-blog.csdn.net/20171109120748887?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbHVvbGlubGwxMjEy/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/SouthEast)
1.4 Enter the package name and project name
1.5 Then enter the project name
After the steps above, the Spark development environment is set up. Two problems can still appear, however; both are covered below.
2. pom.xml
```xml
<properties>
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.10.6</scala.version>
    <!-- The compat version is the binary-compatibility version (major.minor only),
         not the full Scala version; it must match the _2.10 artifact suffixes below. -->
    <scala.compat.version>2.10</scala.compat.version>
    <spark.version>1.6.3</spark.version>
    <hadoop.version>2.6.4</hadoop.version>
</properties>

<dependencies>
    <!-- Scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>

    <!-- Spark -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-flume_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>

    <!-- Hadoop -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>

    <!-- Testing and logging -->
    <dependency>
        <groupId>org.specs</groupId>
        <artifactId>specs</artifactId>
        <version>1.2.5</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>commons-logging</groupId>
        <artifactId>commons-logging</artifactId>
        <version>1.1.1</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.1</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.9</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
</dependencies>
```
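Before pulling Spark into your code, it is worth sanity-checking that the Scala SDK itself compiles and runs. The sketch below is a plain-Scala word count (no Spark on the classpath); the object and method names are illustrative, not from the original post. The same `flatMap`/`map`/aggregate pipeline carries over almost verbatim to Spark's RDD API, where the local `groupBy` step would become `reduceByKey(_ + _)`.

```scala
// Plain-Scala word count used as a smoke test for the Scala setup.
// On Spark the equivalent would be:
//   sc.textFile(path).flatMap(_.split("\\s+")).map((_, 1)).reduceByKey(_ + _)
object WordCountSmokeTest {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))       // split each line into words (RDD: flatMap)
      .filter(_.nonEmpty)             // drop empty tokens from repeated whitespace
      .map(w => (w.toLowerCase, 1))   // pair each word with a count of 1 (RDD: map)
      .groupBy(_._1)                  // local stand-in for Spark's reduceByKey
      .map { case (word, pairs) => word -> pairs.size }

  def main(args: Array[String]): Unit = {
    val counts = wordCount(Seq("Spark spark hadoop", "spark scala"))
    counts.foreach { case (word, n) => println(s"$word\t$n") }
  }
}
```

If this compiles and runs inside IDEA, the Scala SDK and the Maven setup are wired correctly, and any remaining failures are down to the Spark/Hadoop dependencies themselves.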
Problem 1: the project fails to compile with an error.
To resolve it, simply comment out (or delete) the two auto-generated classes under test/scala.
Problem 2: `Error:scalac: Error: object scala.runtime in compiler mirror not found.`

```
Error:scalac: error while loading <root>, zip file is empty
Error:scalac: Error: object scala.runtime in compiler mirror not found.
scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
    at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
    ...
```
I am not sure what causes this; it may be network-related — the "zip file is empty" message suggests a jar that was corrupted or only partially downloaded. My fix was to delete the old Maven dependencies and let Maven download them again.
Maven's default local repository is at C:\Users\<username>\.m2\repository.
Delete the repository folder and Maven will re-download everything on the next build. This requires a reliable connection; the full download is over 200 MB.
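The cleanup above can be done from a terminal. The snippet below only computes and prints the default repository path; the destructive delete and the re-download commands are shown as comments so nothing is wiped by accident. The path assumes the default location — if your `~/.m2/settings.xml` sets `<localRepository>`, use that path instead.

```shell
# Default local Maven repository (override, if any, lives in ~/.m2/settings.xml).
# On Windows the equivalent path is %USERPROFILE%\.m2\repository.
REPO="${HOME}/.m2/repository"
echo "Local repository: ${REPO}"

# To force Maven to re-download every dependency on the next build:
#   rm -rf "${REPO}"                                # Linux/macOS
#   rmdir /s /q "%USERPROFILE%\.m2\repository"      # Windows cmd
# Then re-resolve the dependencies declared in pom.xml without a full build:
#   mvn dependency:resolve
```

Rather than deleting the whole repository, you can also delete only the suspect artifact's folder (e.g. the `org/scala-lang` subtree) to avoid re-downloading everything.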