Spark RDD Transformations in Detail --- Spark Study Notes 7

This post walks through Spark's RDD transformation operations, including filter, map, flatMap, union, groupByKey, reduceByKey, distinct, sortByKey, join, cogroup, and cartesian, explaining what each one does and how to use it through examples.

I've spent the last few days studying Spark RDD transformations and actions. These notes record what I learned, and I'm sharing them here.

1. Start spark-shell

SPARK_MASTER=local[4] ./spark-shell.sh
Welcome to
      ____              __  
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 0.8.1
      /_/                  


Using Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_20)
Initializing interpreter...
14/04/04 10:49:44 INFO server.Server: jetty-7.x.y-SNAPSHOT
14/04/04 10:49:44 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:5757
Creating SparkContext...
14/04/04 10:49:50 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
14/04/04 10:49:50 INFO spark.SparkEnv: Registering BlockManagerMaster
14/04/04 10:49:50 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140404104950-5dd2
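SPARK_MASTER=local[4] starts the shell in local mode with 4 worker threads, and the shell creates a SparkContext (sc) for us automatically. In a standalone program you would create the context yourself; a minimal sketch, assuming a local[4] master and a made-up application name:

import org.apache.spark.SparkContext

// Run locally with 4 threads; "RDD-Notes" is just a placeholder application name.
val sc = new SparkContext("local[4]", "RDD-Notes")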

2. We'll use the CHANGES.txt and README.txt files in the Spark root directory as examples.

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@5849b49d

scala> val changes = sc.textFile("CHANGES.txt")
14/04/04 10:51:39 INFO storage.MemoryStore: ensureFreeSpace(44905) called with curMem=0, maxMem=339585269
14/04/04 10:51:39 INFO storage.MemoryStore: Block broadcast_0 stored as values to memory (estimated size 43.9 KB, free 323.8 MB)
changes: org.apache.spark.rdd.RDD[String] = MappedRDD[1] at textFile at <console>:12

scala> changes foreach println
14/04/04 10:52:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/04/04 10:52:03 WARN snappy.LoadSnappy: Snappy native library not loaded
14/04/04 10:52:03 INFO mapred.FileInputFormat: Total input paths to process : 1
14/04/04 10:52:03 INFO spark.SparkContext: Starting job: foreach at <console>:15
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Got job 0 (foreach at <console>:15) with 1 output partitions (allowLocal=false)
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Final stage: Stage 0 (foreach at <console>:15)
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Parents of final stage: List()
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Missing parents: List()
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Submitting Stage 0 (MappedRDD[1] at textFile at <console>:12), which has no missing parents
14/04/04 10:52:03 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from Stage 0 (MappedRDD[1] at textFile at <console>:12)
14/04/04 10:52:03 INFO local.LocalTaskSetManager: Size of task 0 is 1664 bytes
14/04/04 10:52:03 INFO executor.Executor: Running task ID 0
14/04/04 10:52:03 INFO storage.BlockManager: Found block broadcast_0 locally
14/04/04 10:52:03 INFO rdd.HadoopRDD: Input split: file:/app/home/hadoop/shengli/spark-0.8.1-incubating-bin-hadoop1/CHANGES.txt:0+65951
Spark Change Log

Release 0.8.1-incubating

  d03589d Mon Dec 9 23:10:00 2013 -0800
  M
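foreach is an action, so it triggers the job and prints each line of CHANGES.txt. Before going through the individual transformations in detail, here is a minimal sketch of a few of them applied to the changes RDD above; the "Merge" predicate and the word-count pipeline are only illustrative assumptions:

// filter: keep only the lines containing "Merge" (illustrative predicate)
val merges = changes.filter(line => line.contains("Merge"))

// flatMap + map + reduceByKey: a word count over the change log
val counts = changes.flatMap(line => line.split(" "))
                    .map(word => (word, 1))
                    .reduceByKey(_ + _)

// Actions such as count and take trigger the actual computation
merges.count()
counts.take(10) foreach println

Note that reduceByKey only works on RDDs of key-value pairs, which is why map first turns each word into a (word, 1) pair.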