1. Data format
1 2 3
1 4 5
4 5 6
4 7 8
7 8 9
10 11 12
10 13 14
10 1 2
1 100 100
10 11 2
10 11 2
1 2 5
4 7 6
2. Program
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("Mode")
conf.setMaster("local[3]")
val sc = new SparkContext(conf)
val data = sc.textFile("/home/i.txt") // read the test data
val dataMap = data.map(_.split("\t"))       // columns are tab-separated
  .map(f => f.map(_.toDouble))              // parse every field as Double
  .map(f => ("k" + f(0), f(1)))             // key on the first column
// dataMap: RDD[(String, Double)]
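The snippet above stops after building `dataMap`. Judging by the app name `"Mode"`, the intended next step is presumably to compute the mode (most frequent value) for each key. The continuation below is a sketch under that assumption; `ModeExample`, the `mode` helper, and the use of `groupByKey` (acceptable for small test data) are my additions, not part of the original program.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ModeExample {
  // Mode of a collection: the value with the highest occurrence count.
  // Plain Scala, so it can be tested without a Spark cluster.
  def mode(xs: Iterable[Double]): Double =
    xs.groupBy(identity).maxBy(_._2.size)._1

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Mode").setMaster("local[3]")
    val sc = new SparkContext(conf)
    val data = sc.textFile("/home/i.txt")
    val dataMap = data.map(_.split("\t"))
      .map(_.map(_.toDouble))
      .map(f => ("k" + f(0), f(1)))
    // Group the values of each key, then reduce each group to its mode.
    val modes = dataMap.groupByKey().mapValues(mode) // RDD[(String, Double)]
    modes.collect().foreach(println)
    sc.stop()
  }
}
```

For larger data sets, `reduceByKey` on `((key, value), count)` pairs would avoid shuffling whole value groups, but `groupByKey` keeps the sketch closest to the structure of `dataMap` above.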