12. The randomSplit operation
Use randomSplit to split an RDD containing the numbers 1 to 10 into 3 RDDs.
scala> val rddData1 = sc.parallelize(1 to 10,3)
rddData1: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[28] at parallelize at <console>:24
scala> val splitRDD = rddData1.randomSplit(Array(1,4,5))
splitRDD: Array[org.apache.spark.rdd.RDD[Int]] = Array(MapPartitionsRDD[29] at randomSplit at <console>:26, MapPartitionsRDD[30] at randomSplit at <console>:26, MapPartitionsRDD[31] at randomSplit at <console>:26)
scala> splitRDD(0).collect
res9: Array[Int] = Array(3)
scala> splitRDD(1).collect
res10: Array[Int] = Array(2, 7)
scala> splitRDD(2).collect
res12: Array[Int] = Array(1, 4, 5, 6, 8, 9, 10)
Notes:
val splitRDD = rddData1.randomSplit(Array(1,4,5)): splits rddData1 into 3 RDDs in a 1:4:5 weight ratio. The returned splitRDD is an Array holding the 3 resulting RDDs.
The weights do not have to sum to 1: Spark normalizes them internally, so Array(1,4,5) behaves the same as Array(0.1,0.4,0.5). The assignment of each element is random, however, so on a small RDD like this the actual sizes can deviate noticeably from the requested ratio (here 1, 2, and 7 elements rather than exactly 1, 4, and 5).
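Because the split is random, rerunning the example above can produce different groupings. randomSplit also accepts an optional seed parameter that makes the split reproducible. A minimal sketch (the seed value 7L is an arbitrary choice for illustration):

```scala
val rddData1 = sc.parallelize(1 to 10, 3)

// Weights are normalized internally, so Array(0.1, 0.4, 0.5)
// requests the same 1:4:5 ratio as Array(1, 4, 5).
// Passing a fixed seed makes the split deterministic across runs.
val Array(small, medium, large) =
  rddData1.randomSplit(Array(0.1, 0.4, 0.5), seed = 7L)

small.collect()   // roughly 10% of the elements
medium.collect()  // roughly 40%
large.collect()   // roughly 50%
```

A fixed seed is useful, for example, when splitting a dataset into training and test sets that must stay identical between experiments.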