Spark: One Log File per Job Run
Version info:
spark-2.4.3
hadoop-2.6.4
A few days ago I was working on the log output of Spark's local mode: each time a Spark job runs, its logs should go into a dedicated log file for that run. This post records the implementation and the pitfalls I hit along the way.
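The core of "one log file per run" is generating a unique file name when the application starts, typically by stamping a base name with the start time. A minimal standalone sketch of that idea (the class and method names here are illustrative, not the author's final code):

```java
import java.io.File;
import java.text.SimpleDateFormat;
import java.util.Date;

public class RunLogFileName {

    // Build a unique log-file name for this run,
    // e.g. /tmp/logs/spark-20190601-120000-123.log
    static String forRun(String dir, String prefix) {
        String ts = new SimpleDateFormat("yyyyMMdd-HHmmss-SSS").format(new Date());
        return dir + File.separator + prefix + "-" + ts + ".log";
    }

    public static void main(String[] args) {
        System.out.println(forRun("/tmp/logs", "spark"));
    }
}
```

Because the timestamp (down to milliseconds) is computed once at startup, every run of the JVM gets its own file name; the custom appender below applies the same idea inside log4j.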
First, define a custom FileAppender, as follows:
package com.demo.util;

import org.apache.log4j.FileAppender;
import org.apache.log4j.Layout;
import org.apache.log4j.spi.ErrorCode;

import java.io.File;
import java.io.IOException;

/**
 * This is a customized log4j appender, which will create a new file for every
 * run of the application.
 */
public class NewLogForEachRunFileAppender extends FileAppender {

    private String homeDir = "E:\\tmp\\logs";
    private String fmiLogDir = "";

    public NewLogForEachRunFileAppender() {
    }

    public NewLogForEachRunFileAppender(Layout layout, String filename,
            boolean appe