While processing large amounts of data with Hive, the job failed with the following exceptions:

[Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.
......
org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException: No lease on /tmp/hive-maintain/hive_2012-11-28_22-39-43_810_1689858262130334284/_task_tmp.-ext-10002/part=33436268/_tmp.000004_0 File does not exist.
Holder DFSClient_attempt_201211250925_9859_m_000004_0 does not have any open files.

Cause:
The query created more dynamic partitions than Hive allows. The limits are controlled by the hive.exec.max.dynamic.partitions and hive.exec.max.dynamic.partitions.pernode parameters, which need to be raised.

Solution:
SET hive.exec.max.dynamic.partitions=100000;
SET hive.exec.max.dynamic.partitions.pernode=100000;

Summary:
The number of partitions Hive will create is not unlimited; the maximum can be raised through these parameters.
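As a sketch of how these settings are typically applied, the SET statements would be issued in the same session, before the dynamic-partition insert that hits the limit. The table and column names below are hypothetical, not from the original post:

```sql
-- Enable non-strict dynamic partitioning and raise the partition limits
-- before running a highly-partitioned insert.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.exec.max.dynamic.partitions=100000;         -- total for the whole job
SET hive.exec.max.dynamic.partitions.pernode=100000; -- per mapper/reducer task

-- Hypothetical example: each distinct value of dt becomes one partition,
-- so both limits must exceed the number of distinct dt values produced.
INSERT OVERWRITE TABLE events_partitioned PARTITION (dt)
SELECT user_id, action, dt
FROM events_raw;
```

Note that hive.exec.max.dynamic.partitions.pernode is checked per task, so the [Fatal Error] above can still appear even when the job-wide total is within hive.exec.max.dynamic.partitions.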