Maximum was set to 100 partitions per node, number of dynamic partitions on this node: 101
Cause:
When inserting data with Hive dynamic partitioning, the job fails with a maximum-partitions error: by default each node may create at most 100 dynamic partitions, the current job exceeded that value, so the limits must be raised manually.
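For context, a minimal sketch of the kind of statement that hits this limit; the table and column names (dt_events, raw_events, event_date) are hypothetical placeholders, not from the original job:
-- Hypothetical dynamic-partition insert; it fails with Error 20004 as soon as
-- one node has to create more than 100 distinct event_date partitions
INSERT OVERWRITE TABLE dt_events PARTITION (event_date)
SELECT user_id, event_name, event_date  -- the dynamic partition column goes last in the SELECT
FROM raw_events;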
[2023-03-23 10:15:28] [42000][3] Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed due to task failures: [Error 20004]: Fatal error occurred when node tried to create too many dynamic partitions. The maximum number of dynamic partitions is controlled by hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode. Maximum was set to 100 partitions per node, number of dynamic partitions on this node: 101
Solution:
-- Raise the overall and per-node maximum number of dynamic partitions:
hive > set hive.exec.max.dynamic.partitions=100000;
hive > set hive.exec.max.dynamic.partitions.pernode=100000;
-- Additionally, enable dynamic partitioning and switch the partition mode from strict to nonstrict:
hive > set hive.exec.dynamic.partition = true;
hive > set hive.exec.dynamic.partition.mode = nonstrict;
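Note that set only applies to the current session; for a permanent change, the same properties belong in hive-site.xml. Putting the pieces together, a hedged end-to-end session sketch using the same hypothetical table names as above:
hive > set hive.exec.dynamic.partition=true;
hive > set hive.exec.dynamic.partition.mode=nonstrict;
hive > set hive.exec.max.dynamic.partitions=100000;
hive > set hive.exec.max.dynamic.partitions.pernode=100000;
hive > INSERT OVERWRITE TABLE dt_events PARTITION (event_date)
     > SELECT user_id, event_name, event_date FROM raw_events;
Keep in mind that hive.exec.max.dynamic.partitions caps the whole job while hive.exec.max.dynamic.partitions.pernode caps each node, so the job-wide value should be at least as large as the per-node one.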
Just my personal notes and sharing; support and encouragement are appreciated.
If this helped you, feel free to leave a like or a comment!
If you have a better suggestion or approach, please share it in the comments!