In the conf directory under your Spark installation path (e.g. /<your-spark-path>/spark1.6.1_hadoop2.6/conf), run

cp log4j.properties.template log4j.properties

then open log4j.properties. The template looks like this:
#Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
#Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
Change the second line,

log4j.rootCategory=INFO, console

to

log4j.rootCategory=WARN, console
Restart Spark and open spark-shell again: the console no longer shows INFO-level output.
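The steps above can be sketched as a short script. SPARK_HOME is an assumed placeholder for your installation path; so that the sketch is self-contained, it stubs out the template with just the one line that matters here, and it uses GNU sed's in-place edit.

```shell
# SPARK_HOME is a placeholder; for a self-contained demo we create a conf
# directory containing a stub template with the relevant line.
SPARK_HOME=${SPARK_HOME:-$(mktemp -d)}
mkdir -p "$SPARK_HOME/conf"
printf 'log4j.rootCategory=INFO, console\n' > "$SPARK_HOME/conf/log4j.properties.template"

cd "$SPARK_HOME/conf"
# Create an editable copy of the bundled template.
cp log4j.properties.template log4j.properties
# Lower the root logger from INFO to WARN in place (GNU sed).
sed -i 's/^log4j\.rootCategory=INFO, console$/log4j.rootCategory=WARN, console/' log4j.properties
# Confirm the change took effect.
grep '^log4j\.rootCategory' log4j.properties
```

Alternatively, on Spark 1.4 and later you can change the level for a single session, without editing any file, by calling sc.setLogLevel("WARN") inside spark-shell.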