Scala version
scala-2.10.4
Note:
Environment setup kept failing earlier; the likely cause was using Scala 2.11.4. Spark's official site states explicitly that the prebuilt Spark 1.2.0 packages do not support Scala 2.11.4:
Note: Scala 2.11 users should download the Spark source package and build with Scala 2.11 support.
Spark version:
spark-1.2.0-bin-hadoop2.4.tgz
Configure the environment variables
export SCALA_HOME=/home/hadoop/spark1.2.0/scala-2.10.4
export PATH=$SCALA_HOME/bin:$PATH
export SPARK_HOME=/home/hadoop/spark1.2.0/spark-1.2.0-bin-hadoop2.4
export PATH=$SPARK_HOME/bin:$PATH
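Once these exports are in the shell profile and sourced, any JVM process launched from that shell can see them via `sys.env`. Below is a minimal sketch (the `EnvCheck` object and `missingVars` helper are my own names, not part of Spark) that reports which of the two variables are missing before you try to build paths from them:

```scala
object EnvCheck {
  // Returns the expected variables that are absent from the given environment.
  // The environment map is a parameter so the logic can be exercised in tests.
  def missingVars(env: Map[String, String]): Seq[String] =
    Seq("SCALA_HOME", "SPARK_HOME").filterNot(env.contains)

  def main(args: Array[String]): Unit = {
    val missing = missingVars(sys.env)
    if (missing.isEmpty) println("Spark environment looks OK")
    else missing.foreach(v => println(s"$v is not set"))
  }
}
```

Running this from IDEA is a quick sanity check that the IDE actually inherited the variables (IDEs launched from a desktop menu often do not).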
Setting up IntelliJ IDEA for Spark development
1. Download and install the Scala plugin
2. Create a non-SBT Scala project
3. Import the Spark jars from the extracted distribution:
spark-1.2.0-bin-hadoop2.4
4. Write the WordCount example
package spark.examples

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SparkWordCount {
  def main(args: Array[String]) {
    // Note the setMaster("local") call: it runs Spark in local mode
    // (as opposed to standalone cluster mode).
    val conf = new SparkConf().setAppName("SparkWordCount").setMaster("local")
    val sc = new SparkContext(conf)
    val rdd = sc.textFile("file:///home/hadoop/spark1.2.0/word.txt")
    rdd.flatMap(_.split(" "))
      .map((_, 1))                // (word, 1)
      .reduceByKey(_ + _)         // (word, count)
      .map(x => (x._2, x._1))     // swap to (count, word) so we can sort by count
      .sortByKey(false)           // descending
      .map(x => (x._2, x._1))     // swap back to (word, count)
      .saveAsTextFile("file:///home/hadoop/spark1.2.0/WordCountResult")
    sc.stop()
  }
}
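The double swap around `sortByKey` can be confusing: Spark 1.2 has no direct "sort by value", so the pair is flipped to (count, word), sorted by key, then flipped back. The same logic can be sketched on plain Scala collections with no Spark dependency (the `LocalWordCount` object below is my own illustration, not part of the project), which makes the shape of the data at each step easy to see:

```scala
object LocalWordCount {
  // Counts words and returns (word, count) pairs sorted by count, descending.
  def wordCount(lines: Seq[String]): Seq[(String, Int)] =
    lines
      .flatMap(_.split(" "))                          // split lines into words
      .map((_, 1))                                    // (word, 1)
      .groupBy(_._1)                                  // local analogue of reduceByKey
      .map { case (w, ps) => (w, ps.map(_._2).sum) }  // (word, count)
      .toSeq
      .sortBy(-_._2)                                  // sort by count, descending

  def main(args: Array[String]): Unit =
    wordCount(Seq("a b a", "b a")).foreach(println)
}
```

Here `groupBy` plus `sum` stands in for `reduceByKey`, and `sortBy(-_._2)` replaces the swap/`sortByKey`/swap dance, since local collections can sort by any projection directly.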
Console log:
15/01/14 22:06:34 WARN Utils: Your hostname, hadoop-Inspiron-3521 resolves to a loopback address: 127.0.1.1; using 192.168.0.111 instead (on interface eth1)
15/01/14 22:06:34 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/01/14 22:06:35 INFO SecurityManager: Changing view acls to: hadoop
15/01/14 22:06:35 INFO SecurityManager: Changing modify acls to: hadoop
15/01/14 22:06:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/01/14 22:06:36 INFO Slf4jLogger: Slf4jLogger started
15/01/14 22:06:36 INFO Remoting: Starting remoting
15/01/14 22:06:36 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@hadoop-Inspiron-3521.local:53624]
15/01/14 22:06:36 INFO Utils: Successfully started service 'sparkDriver' on port 53624.
15/01/14 22:06:36 INFO SparkEnv: Registering MapOutputTracker
15/01/14 22:06:36 INFO SparkEnv: Registering BlockManagerMaster
15/01/14 22:06:36 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20150114220636-4826
15/01/14 22:06:36 INFO MemoryStore: MemoryStore started with capacity 461.7 MB
15/01/14 22:06:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/14 22:06:37 INFO HttpFileServer: HTTP File server directory is /tmp/spark-19683393-0315-498c-9b72-9c6a13684f44
15/01/14 22:06:37 INFO HttpServer: Starting HTTP Server