Hive insert into a partitioned table fails with a dynamic-partition error.
The dynamic-partition limit defaults to 1000, but three years of daily partitions is 1096, which exceeds it.
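The 1096 count in the error matches three calendar years of daily partitions. A quick check (assuming one partition per day and an illustrative range such as 2019-01-01 through 2021-12-31, which includes the 2020 leap day):

```python
from datetime import date

# Number of daily partitions across three calendar years (range is illustrative)
n_days = (date(2022, 1, 1) - date(2019, 1, 1)).days
print(n_days)  # 1096 = 365 + 366 + 365 (2020 is a leap year)
```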
[INFO] 2022-05-10 10:35:39.767 - [taskAppId=TASK-977-43977-61297]:[112] - -> client token: N/A
diagnostics: User class threw exception: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1096, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1096.;
ApplicationMaster host: 10.251.xx.xx
ApplicationMaster RPC port: 0
queue: root.ads_sense_rep
start time: 1652118301901
final status: FAILED
tracking URL:
user: use_sense_rw
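As the exception message suggests, raising hive.exec.max.dynamic.partitions (and typically the per-node limit as well) before running the insert should resolve this. A sketch of the session-level settings, with 2000 chosen here only as an assumed headroom value above the 1096 needed:

```sql
-- Raise dynamic-partition limits for this session (values are illustrative)
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.exec.max.dynamic.partitions=2000;          -- total per job, default 1000
SET hive.exec.max.dynamic.partitions.pernode=1000;  -- per node, default 100
```

Since the job runs through Spark (the error is wrapped in org.apache.spark.sql.AnalysisException), these SET statements would need to be issued in the same Spark SQL session that performs the insert.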