1. Running a Spark job
The job fails with: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
The message points at insufficient resources; checking the logs shows the real cause is memory:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 466092032 must be at least 471859200.
Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
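The 471859200 threshold in the error is not arbitrary. A minimal sketch of where it likely comes from, assuming the Spark 2.x UnifiedMemoryManager behavior (300 MB reserved for the system, and a usable heap of at least 1.5x that reserve):

```scala
// Sketch of the check behind the error, assuming Spark 2.x's
// UnifiedMemoryManager: 300 MB is reserved, and the usable JVM heap
// must be at least 1.5x the reserve, i.e. 471859200 bytes = 450 MB.
object MemoryCheckSketch {
  val ReservedSystemMemoryBytes: Long = 300L * 1024 * 1024              // 314572800
  val MinSystemMemoryBytes: Long = (ReservedSystemMemoryBytes * 1.5).toLong // 471859200

  // Mimics the validation that produced the log line above.
  def validate(systemMemory: Long): Unit =
    require(systemMemory >= MinSystemMemoryBytes,
      s"System memory $systemMemory must be at least $MinSystemMemoryBytes.")

  def main(args: Array[String]): Unit = {
    println(MinSystemMemoryBytes) // prints 471859200
    // The 466092032 bytes (~444 MB) reported in the log fall short,
    // so validate(466092032L) would throw IllegalArgumentException.
  }
}
```

This is why the log asks for a larger heap via `--driver-memory` or `spark.driver.memory`.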
2. Code
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}
object ActionDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ArraySparkTestDemo").setMaster("spark://192.168.139.169:7077")
    val context = new SparkContext(conf)
    val rdd1: RDD[Int] = context.parallelize(List(1, 2, 3, 4, 5, 6, 4))
    println(rdd1.collect().toList)        // materialize the whole RDD on the driver
    println(rdd1.count())                 // number of elements
    val rdd2: Array[Int] = rdd1.top(3)    // largest 3 elements, descending
    println(rdd2.toBuffer)
    val rdd3: Array[Int] = rdd1.take(3)   // first 3 elements in partition order
    println(rdd3.toBuffer)
    context.stop()
  }
}
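Since top(3) and take(3) look similar, here is a minimal plain-Scala sketch of what the two actions above return, mimicking the RDD semantics on a local List (an illustration, not Spark itself, so no cluster is needed):

```scala
// Plain-Scala sketch of the two actions used above:
// top(n)  -> the n largest elements, sorted descending
// take(n) -> the first n elements in order
object TopVsTakeSketch {
  def main(args: Array[String]): Unit = {
    val data = List(1, 2, 3, 4, 5, 6, 4)
    val top3  = data.sorted(Ordering[Int].reverse).take(3) // List(6, 5, 4)
    val take3 = data.take(3)                               // List(1, 2, 3)
    println(top3)
    println(take3)
  }
}
```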
3. The submit command needs adjusting: tune the memory setting. It must not exceed what each worker actually has; check per-worker memory in the Spark master web UI.
spark-submit --master spark://192.168.139.169:7077 --name ArraySparkTestDemo --class core.ActionDemo --executor-memory 499m --total-executor-cores 2 spark-test-0.0.1-SNAPSHOT.jar
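Since the log specifically asks for more driver heap, the same submit can also raise it as the error message suggests. A sketch (the 512m value is illustrative, not from the original command):

```shell
# Illustrative variant: raise the driver heap per the log's hint.
# 512m is an example value; equivalently, pass --conf spark.driver.memory=512m.
spark-submit \
  --master spark://192.168.139.169:7077 \
  --name ArraySparkTestDemo \
  --class core.ActionDemo \
  --driver-memory 512m \
  --executor-memory 499m \
  --total-executor-cores 2 \
  spark-test-0.0.1-SNAPSHOT.jar
```

Note that a 499m heap typically leaves the JVM with slightly less usable memory than the 471859200-byte (450 MB) minimum once survivor-space overhead is subtracted, which matches the 466092032 figure in the log; if the error persists, a value a bit above 500m should clear it.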