Spark Series (2): A Standalone App in Scala

Spark applications are built and run with sbt. Below is a walkthrough of the example from the official documentation.

1. Write the source code

Create a directory named simpleApp:

$mkdir simpleApp

You also need to create the nested source directories (for example, with `mkdir -p src/main/scala` inside simpleApp). The final structure looks like this:

$ cd simpleApp/
$ find .
.
./simple.sbt
./src
./src/main
./src/main/scala
./src/main/scala/simpleApp.scala

There are two files: simple.sbt, the sbt build definition, and simpleApp.scala, the application source. Their contents are as follows.

simple.sbt

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
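For reference, the `%%` operator tells sbt to append the Scala binary version to the artifact name, so the dependency above is equivalent to this explicit single-`%` form:

```scala
// Equivalent explicit form of the dependency above: with scalaVersion 2.10.3,
// %% resolves "spark-core" to the cross-built artifact "spark-core_2.10".
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.1"
```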


simpleApp.scala

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/jpan/Software/spark-0.9.1/README.md" // Should be some file on your system
    // SparkContext(master, appName, sparkHome, jars): "local" runs Spark
    // in-process; the jar list tells the workers where this app's classes live.
    val sc = new SparkContext("local", "Simple App", "/home/jpan/Software/spark-0.9.1/",
      List("target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
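To see what the two filter/count steps compute, here is a plain-Scala analogue of the same logic on an in-memory list of hypothetical sample lines; it needs no Spark and is useful for sanity-checking the counting logic:

```scala
// Plain-Scala analogue of the RDD logic above, on made-up sample lines.
val lines = List("apache spark", "big data", "simple app")
val numAs = lines.count(_.contains("a")) // lines containing the letter "a"
val numBs = lines.count(_.contains("b")) // lines containing the letter "b"
println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
```

The RDD version distributes exactly this computation: `filter` keeps matching lines and `count` tallies them, with `cache()` avoiding a re-read of the file for the second pass.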

2. Run the code

Change into the simpleApp directory and run:

$ ~/Software/spark-0.9.1/sbt/sbt package

This fails. The error message is:

awk: fatal: cannot open file `./project/build.properties' for reading (No such file or directory)
Attempting to fetch sbt
/home/jpan/Software/spark-0.9.1/sbt/sbt: line 35: /sbt/sbt-launch-.jar: No such file or directory
/home/jpan/Software/spark-0.9.1/sbt/sbt: line 35: /sbt/sbt-launch-.jar: No such file or directory
Our attempt to download sbt locally to /sbt/sbt-launch-.jar failed. Please install sbt manually from http://www.scala-sbt.org/

Reading the error message and inspecting the sbt wrapper script shows the problem: in version 0.9.1 the script uses relative paths, so when you invoke it from the simpleApp directory it cannot find the version file or the launcher jar. Edit the $SPARK_HOME/sbt/sbt file:

Change these two lines:

 SBT_VERSION=`awk -F "=" '/sbt\\.version/ {print $2}' ./project/build.properties`
 JAR=/sbt/sbt-launch-${SBT_VERSION}.jar
to:

 SBT_VERSION=`awk -F "=" '/sbt\\.version/ {print $2}' $SPARK_HOME/project/build.properties`
 JAR=$SPARK_HOME/sbt/sbt-launch-${SBT_VERSION}.jar

Of course, you first need to set $SPARK_HOME in your environment variables.

Run it again, and it succeeds:

jpan@jpan-Beijing:~/Software/simpleApp$ ~/Software/spark-0.9.1/sbt/sbt package
Launching sbt from /home/jpan/Software/spark-0.9.1/sbt/sbt-launch-0.12.4.jar
[info] Set current project to Simple Project (in build file:/home/jpan/Software/simpleApp/)
[info] Updating {file:/home/jpan/Software/simpleApp/}default-f9c2c0...
[info] Resolving com.codahale.metrics#metrics-graphite;3.0.0 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/jpan/Software/simpleApp/target/scala-2.10/classes...
[info] Packaging /home/jpan/Software/simpleApp/target/scala-2.10/simple-project_2.10-1.0.jar ...
[info] Done packaging.
[success] Total time: 14 s, completed May 8, 2014 5:59:34 PM

jpan@jpan-Beijing:~/Software/simpleApp$ ~/Software/spark-0.9.1/sbt/sbt run
Launching sbt from /home/jpan/Software/spark-0.9.1/sbt/sbt-launch-0.12.4.jar
[info] Set current project to Simple Project (in build file:/home/jpan/Software/simpleApp/)
[info] Running SimpleApp 
14/05/08 18:00:14 WARN util.Utils: Your hostname, jpan-Beijing resolves to a loopback address: 127.0.1.1; using 192.168.1.102 instead (on interface eth0)
14/05/08 18:00:14 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
14/05/08 18:00:16 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/05/08 18:00:16 INFO Remoting: Starting remoting
14/05/08 18:00:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@jpan-Beijing.local:40021]
14/05/08 18:00:16 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@jpan-Beijing.local:40021]
14/05/08 18:00:16 INFO spark.SparkEnv: Registering BlockManagerMaster
14/05/08 18:00:16 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140508180016-da5a
....................
Lines with a: 62, Lines with b: 35
14/05/08 18:00:19 INFO network.ConnectionManager: Selector thread was interrupted!
[success] Total time: 6 s, completed May 8, 2014 6:00:19 PM

Note: after discussing this with a Spark admin, the failure above is not a bug. The sbt/sbt script that ships with Spark is only meant for building Spark itself, not for running other projects. Other projects can simply use a standard sbt installation, i.e. run the following directly in the simpleApp directory:

$ sbt package
$ sbt run


