apache.spark.SparkException: Job aborted due to stage failure: Serialized task 32:5 was 204136673 bytes, which exceeds max allowed: spark.rpc.message.maxSize (134217728 bytes). Consider increasing spark.rpc.message.maxSize or using broadcast variables for large values.
This error is easy to hit when calling sc.parallelize(data, slices) with a large `data` object: the driver serializes the partition data into each task, so the serialized task can exceed the RPC message size limit.
Fix: increase spark.rpc.message.maxSize, which defaults to 128 MB.
Add this flag when submitting the job (the value is specified in MB): --conf spark.rpc.message.maxSize=512
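For context, a full spark-submit invocation with this setting might look like the following; the main class and JAR name are hypothetical placeholders, not from the original post:

```shell
# Raise the RPC message limit from the 128 MB default to 512 MB.
# Note: spark.rpc.message.maxSize is given in MB, not bytes.
spark-submit \
  --conf spark.rpc.message.maxSize=512 \
  --class com.example.MyApp \
  my-app.jar
```

The same setting can also be placed in spark-defaults.conf if it should apply to every job.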
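Alternatively, as the error message itself suggests, when the oversized payload is a large driver-side object that tasks merely read (rather than the RDD's own elements), ship it with a broadcast variable instead of letting it be captured in the task closure. A minimal Scala sketch, assuming an existing SparkContext `sc`, an RDD `rdd` of keys, and a hypothetical large map `bigLookup`:

```scala
// Hypothetical large driver-side lookup table (assumption, not from the post).
val bigLookup: Map[String, Int] = Map("a" -> 1, "b" -> 2)

// Problematic: bigLookup is serialized into every task closure,
// inflating each task toward the spark.rpc.message.maxSize limit.
// rdd.map(k => bigLookup.getOrElse(k, 0))

// Better: broadcast once; executors fetch the value out-of-band
// via the block manager, keeping the serialized tasks small.
val bcLookup = sc.broadcast(bigLookup)
val result = rdd.map(k => bcLookup.value.getOrElse(k, 0))
```

This does not help when the RDD elements themselves are huge (as with a very large `data` passed to sc.parallelize); in that case raising spark.rpc.message.maxSize, or loading the data on the executors instead of parallelizing it from the driver, is the way out.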