Configuring a Local Spark Development Environment for CDH 5.11.0 in IntelliJ IDEA

1. Create a Maven project in IDEA

Note the version number in the red box in the screenshot and set it to 1.6; this mainly avoids scala/junit version errors:

http://blog.csdn.net/u012551524/article/details/78967646

2. Click Next

Define the project's basic properties.

3. Click Next

Configure the Maven version, the repository (dependency) path, and the Maven settings file path.

4. Click Next and set the project location.

 

5. After the project is created, configure the pom file

Change the pom file contents to the following:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>spark.learning</groupId>
  <artifactId>spark</artifactId>
  <version>1.0-SNAPSHOT</version>
  <name>${project.artifactId}</name>
  <description>My wonderful Scala app</description>
  <inceptionYear>2015</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>

  <properties>
    <maven.compiler.source>1.6</maven.compiler.source>
    <maven.compiler.target>1.6</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.10.5</scala.version>
    <scala.compat.version>2.10</scala.compat.version>
    <spark.version>1.6.0-cdh5.11.0</spark.version>
    <spark.artifact>2.10</spark.artifact>
    <hbase.version>1.2.0-cdh5.11.0</hbase.version>
    <dependency.scope>compile</dependency.scope>
  </properties>

  <!-- Cloudera remote repository -->
  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>

  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <!-- Test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-core_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_${scala.compat.version}</artifactId>
      <version>2.2.4</version>
      <scope>test</scope>
    </dependency>

      <!-- CDH -->
      <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.0</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.6.0-cdh5.11.0</version>
        <exclusions>
          <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>*</artifactId>
          </exclusion>
          <exclusion>
            <groupId>javax.servlet.jsp</groupId>
            <artifactId>*</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0-cdh5.11.0</version>
        <exclusions>
          <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>*</artifactId>
          </exclusion>
          <exclusion>
            <groupId>javax.servlet.jsp</groupId>
            <artifactId>*</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>2.6.0-cdh5.11.0</version>
        <exclusions>
          <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>*</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>${hbase.version}</version>
        <exclusions>
          <exclusion>
            <groupId>*</groupId>
            <artifactId>javax.servlet</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
          <groupId>org.apache.hbase</groupId>
          <artifactId>hbase-common</artifactId>
          <version>${hbase.version}</version>
          <exclusions>
            <exclusion>
              <groupId>javax.servlet</groupId>
              <artifactId>*</artifactId>
            </exclusion>
            <exclusion>
              <groupId>javax.servlet.jsp</groupId>
              <artifactId>*</artifactId>
            </exclusion>
          </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-spark</artifactId>
        <version>${hbase.version}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-server</artifactId>
        <version>${hbase.version}</version>
        <exclusions>
          <exclusion>
            <groupId>*</groupId>
            <artifactId>javax.servlet</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${spark.artifact}</artifactId>
        <version>${spark.version}</version>
        <scope>${dependency.scope}</scope>
        <exclusions>
          <exclusion>
            <groupId>org.eclipse.jetty.orbit</groupId>
            <artifactId>javax.servlet</artifactId>
          </exclusion>
        </exclusions>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${spark.artifact}</artifactId>
        <version>${spark.version}</version>
        <scope>${dependency.scope}</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${spark.artifact}</artifactId>
        <version>${spark.version}</version>
        <scope>${dependency.scope}</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${spark.artifact}</artifactId>
        <version>${spark.version}</version>
        <scope>${dependency.scope}</scope>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_${spark.artifact}</artifactId>
        <version>${spark.version}</version>
        <scope>${dependency.scope}</scope>
      </dependency>

      <!-- Exclude (comment out) all transitive javax.servlet dependencies above, then re-download a single copy via this explicit dependency -->
      <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
        <version>3.0.1</version>
        <scope>${dependency.scope}</scope>
      </dependency>

  </dependencies>

  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <!-- see http://davidb.github.com/scala-maven-plugin -->
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.3.1</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <arg>-make:transitive</arg>
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have a classpath issue like NoClassDefFoundError, ... -->
          <!-- <useManifestOnlyJar>false</useManifestOnlyJar> -->
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
 

Notes on modifying the pom:

a) The CDH Spark dependency versions can be found in the Cloudera documentation:

https://www.cloudera.com/documentation/enterprise/release-notes/topics/cdh_vd_cdh5_maven_repo_511x.html#concept_xnx_9kn_yk

b) One point to note in the configuration above:

The javax.servlet packages pulled in transitively by the dependencies above must be excluded (commented out) to avoid version conflicts; Maven then re-downloads a single clean copy via the explicit javax.servlet-api dependency at the end.

The exclusion pattern is as follows:
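This is the same <exclusions> block already used above for the Hadoop and HBase dependencies; apply it to every dependency that drags in javax.servlet:

    <exclusions>
      <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>*</artifactId>
      </exclusion>
    </exclusions>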

If you skip the exclusions, you will hit a jar conflict such as:

"javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package

 

 

You can track down conflicting dependencies with the Maven Helper plugin.

Installing Maven Helper:

http://blog.csdn.net/difffate/article/details/76279005

Use the plugin to find the conflicting dependencies, then exclude them as shown above; a command-line alternative is sketched below:
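If you prefer the command line, standard Maven can print the dependency tree with conflict markers (this is plain Maven functionality, not specific to this project; the verbose flag is assumed to be supported by your maven-dependency-plugin version):

mvn dependency:tree -Dverbose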

 

      

6. Test the setup with a small demo:

import org.apache.spark.{SparkConf, SparkContext}

object Demo {

  def main(args: Array[String]) {
    System.setProperty("hadoop.home.dir", "C:\\hadoop-2.6.0-cdh5.11.0")

    // Set Spark's serialization
    System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

    // Initialize Spark
    val sparkConf = new SparkConf().setAppName("CountDemo").setMaster("local")
    val sc = new SparkContext(sparkConf)

    // Read a file
    val rdd = sc.textFile("C:\\1.txt")
    println("number of lines in the file: " + rdd.count())

    // Pair each line with itself and print
    val re = rdd.map(x => (x, x)).collect()
    re.foreach(x => println("num=" + x._1 + ",name=" + x._2))

    // Pair each line with a constant age
    val age = rdd.map(x => (x, "30"))
    age.collect().foreach(x => println("num=" + x._1 + ",age=" + x._2))

    // Join the two pair RDDs and print the result
    sc.makeRDD(re).join(age).collect().foreach(println)

    sc.stop()
  }
}
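For reference, assuming hypothetically that C:\1.txt contains the three lines a, b, c, the run would print roughly the following (the join output may come out in any order):

number of lines in the file: 3
num=a,name=a
num=b,name=b
num=c,name=c
num=a,age=30
num=b,age=30
num=c,age=30
(a,(a,30))
(b,(b,30))
(c,(c,30))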

 

Note on System.setProperty("hadoop.home.dir", "C:\\hadoop-2.6.0-cdh5.11.0"):

1. Any local Hadoop/Spark development requires a local Hadoop environment. On Windows it is enough to download the Hadoop tarball, unpack it, and point this property at the directory in code.

2. With step 1 alone, execution will still fail with:

java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

To fix this, download winutils.exe:

https://github.com/srccodes/hadoop-common-2.2.0-bin/blob/master/bin/winutils.exe
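winutils.exe must end up in the bin folder under the directory set as hadoop.home.dir, which is why the error above mentions null\bin\winutils.exe. A minimal sanity-check sketch, assuming the same path as in the demo above:

import java.nio.file.{Files, Paths}

object WinutilsCheck {
  def main(args: Array[String]): Unit = {
    // Must match the value passed to hadoop.home.dir
    val hadoopHome = "C:\\hadoop-2.6.0-cdh5.11.0"
    val winutils = Paths.get(hadoopHome, "bin", "winutils.exe")
    if (Files.exists(winutils))
      println(s"OK: found $winutils")
    else
      println(s"Missing: $winutils -- download winutils.exe and place it there")
  }
}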

Finally, it runs successfully:

 
