1:
conf.set("spark.kryoserializer.buffer.max", "100m")
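For context, a minimal SparkConf sketch with Kryo enabled (the app name is a placeholder); raising spark.kryoserializer.buffer.max is the usual fix when Kryo reports a buffer overflow on large records:

```scala
import org.apache.spark.SparkConf

// Minimal sketch, assuming a standalone app; "app" is a placeholder name.
val conf = new SparkConf()
  .setAppName("app")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Raise the max Kryo buffer; the default (64m) can overflow on large objects.
  .set("spark.kryoserializer.buffer.max", "100m")
```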
2:
testDataRdd.map(p => Person(p._1, p._2, sameModel.predict(p._3).toInt)).toDF()
Written this way, the job easily fails with java.lang.OutOfMemoryError: Java heap space, most likely because the model is referenced inside the map closure and gets serialized into every task. Rewrite it to predict on the whole RDD at once, which lets MLlib broadcast the model a single time:
sameModel.predict(testDataRdd.map(_._3)).map(_.toInt).zip(testDataRdd)
  .map(p => (p._2._1, p._2._2, p._1)).toDF("imei", "model", "k")
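An alternative sketch of the same idea: broadcasting the model explicitly before mapping achieves a similar effect to the RDD-based predict path, shipping the model once per executor instead of once per task closure (sc, sameModel, and testDataRdd are assumed from the surrounding code):

```scala
// Hypothetical variant: broadcast the model once, then predict per record.
val bcModel = sc.broadcast(sameModel)
testDataRdd.map { p =>
  (p._1, p._2, bcModel.value.predict(p._3).toInt)
}.toDF("imei", "model", "k")
```

This keeps the per-record mapping style of the original code while avoiding the repeated closure serialization that the batch-predict rewrite also avoids.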
3:
Error: Can not set final scala.collection.mutable.ListBuffer field org.apache.spark.mllib.fpm.FPTree$Summary.nodes to scala.collection.mutable.ArrayBuffer
import scala.collection.mutable.{ArrayBuffer, ListBuffer}

val sparkConf = new SparkConf()
// Option 1: fall back to the Java serializer
//sparkConf.set("spark.serializer", "org.apache.spark.serializer.JavaSerializer")
// Option 2: keep Kryo and register the collection classes involved
sparkConf.registerKryoClasses(Array(classOf[ArrayBuffer[String]], classOf[ListBuffer[String]]))