pyspark --queue default \
--driver-memory 10G \
--executor-memory 10G \
--executor-cores 6 \
--conf spark.kryoserializer.buffer.max=256m \
--conf spark.kryoserializer.buffer=64m \
--conf spark.driver.maxResultSize=4096m \
--conf spark.executor.memoryOverhead=2048m \
--conf spark.driver.memoryOverhead=2048m
When querying data through the Spark SQL Thrift JDBC interface, the following error is reported:
Exception in thread "main" java.sql.SQLException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3107 in stage 308.0 failed 4 times, most recent failure: Lost task 3107.3 in stage 308.0 (TID 620318, XXX): org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 1572864, required: 3236381
Serialization trace:
values (org.apache.spark.sql.catalyst.expressions.GenericInternalRow). To avoid this, increase spark.kryoserializer.buffer.max value.
at org.ap
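One likely cause: the `--conf` settings above were passed to an interactive pyspark shell, but the Thrift JDBC server is a separate long-running process, so it never sees them. A sketch of a possible fix, assuming a standard Spark distribution layout with `$SPARK_HOME` set (restart the Thrift server itself with the larger Kryo buffer):

```shell
# The error reports "required: 3236381" bytes (~3.1 MB), so any
# spark.kryoserializer.buffer.max comfortably above that should do;
# 256m leaves plenty of headroom.
$SPARK_HOME/sbin/stop-thriftserver.sh
$SPARK_HOME/sbin/start-thriftserver.sh \
  --conf spark.kryoserializer.buffer.max=256m \
  --conf spark.kryoserializer.buffer=64m
```

`start-thriftserver.sh` accepts the same `--conf` options as `spark-submit`, so executor memory, cores, and overhead settings can be passed here in the same way if they are also needed by JDBC queries.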