Background:
I recently used MapReduce for offline data cleansing. The map phase does simple filtering: records that carry latitude/longitude coordinates are sent to the reduce side, and records without coordinates are dropped. The reduce side assembles the data, concatenates each record into a string according to the business model, and writes the result to HDFS, where Hive reads it as an external table for downstream processing and analysis.
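The map-side filter described above can be sketched in plain Java, outside of the Hadoop API, to make the logic concrete. The record layout here (tab-delimited fields with latitude at index 3 and longitude at index 4) is an assumption for illustration; the real job's schema is not shown in this write-up.

```java
// Sketch of the map-side filter: keep records with a usable lat/lng pair,
// drop everything else. Field positions and delimiter are assumed, not
// taken from the original job.
public class CoordinateFilter {

    // True when the record carries parseable latitude and longitude fields.
    public static boolean hasCoordinates(String record) {
        String[] fields = record.split("\t", -1);
        if (fields.length < 5) {
            return false;
        }
        return isNumeric(fields[3]) && isNumeric(fields[4]);
    }

    private static boolean isNumeric(String s) {
        if (s == null || s.isEmpty()) {
            return false;
        }
        try {
            Double.parseDouble(s);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }
}
```

Inside a real `Mapper`, the `map()` method would call `hasCoordinates()` on each input line and only emit the records that pass, so reducers never see coordinate-free rows.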
Problem:
The batch totals about 2 TB. Unsurprisingly, the first MapReduce run crashed, and each retry failed at roughly 61% of the map phase.
Troubleshooting:
The logs confirmed an out-of-memory error: java.lang.OutOfMemoryError: GC overhead limit exceeded.
Exception in thread thread_name: java.lang.OutOfMemoryError: GC overhead limit exceeded
Cause: The detail message "GC overhead limit exceeded" indicates that the garbage collector is running all the time and the Java program is making very slow progress. After a garbage collection, if the Java process is spending more than approximately 98% of its time doing garbage collection and is recovering less than 2% of the heap, and has been doing so for the last 5 (compile-time constant) consecutive garbage collections, then a java.lang.OutOfMemoryError is thrown. This exception is typically thrown because the amount of live data barely fits into the Java heap, leaving little free space for new allocations.
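Given that diagnosis, one common first mitigation (an assumption here, not the fix described in this write-up) is simply to give each map task more memory through the standard Hadoop parameters, either in mapred-site.xml or per job. The 4096/3276 values below are illustrative; a typical rule of thumb is to set the JVM heap (-Xmx) to roughly 80% of the container size so there is headroom for non-heap memory.

```xml
<!-- Config sketch: raise per-map-task memory. Values are illustrative,
     not taken from the original job. -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3276m</value>
</property>
```

That said, if the map task is accumulating objects (for example, buffering records in a collection instead of streaming them out), raising the heap only delays the failure, so the code path that allocates during the map phase is still worth inspecting.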