- log4j.properties (local Spark logging configuration)
Create a resources directory under D:\spark\spark-core\src\main, then copy the following into it as log4j.properties:
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
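Once the file above is saved as src/main/resources/log4j.properties, Spark picks it up from the classpath at startup. Each line is a plain Java key=value property; a quick dependency-free sketch of how such a file is parsed (the Log4jCheck class and its embedded fragment are illustrative, not part of the original):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class Log4jCheck {
    // Parses a log4j.properties fragment and returns the root logger level.
    // In "log4j.rootCategory=WARN, console" the level is the token
    // before the appender list.
    static String rootLevel(String conf) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(conf));
        return props.getProperty("log4j.rootCategory").split(",")[0].trim();
    }

    public static void main(String[] args) throws IOException {
        String conf = "log4j.rootCategory=WARN, console\n"
                + "log4j.logger.org.apache.parquet=ERROR\n";
        System.out.println(rootLevel(conf)); // prints WARN
    }
}
```

This is why a typo such as a stray space in the key (not the value) silently disables the setting: Properties simply stores an unrecognized key that log4j never reads.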
- pom.xml (spark-core module dependencies)

<parent>
    <artifactId>spark</artifactId>
    <groupId>org.example</groupId>
    <version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>

<artifactId>spark-core</artifactId>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>3.0.0</version>
    </dependency>
    <dependency>
        <groupId>joda-time</groupId>
        <artifactId>joda-time</artifactId> <!-- Joda-Time (Java date/time library) -->
        <version>2.9.7</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId> <!-- MySQL JDBC driver -->
        <version>5.1.44</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId> <!-- Spark SQL -->
        <version>3.0.0</version>
    </dependency>
</dependencies>
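With spark-core_2.12 on the classpath, the classic first program is a word count over an RDD (textFile, then flatMap, mapToPair, reduceByKey). The same transformation chain can be sketched dependency-free with plain Java streams, which is handy for checking the expected result on a small input before running it on a cluster (WordCountSketch and its sample input are illustrative, not from the original):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Mirrors the RDD chain: flatMap(split) -> filter -> count per key,
    // where groupingBy + counting plays the role of reduceByKey(_ + _).
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(Function.identity(),
                                               Collectors.counting()));
    }

    public static void main(String[] args) {
        // counts: hello -> 2, spark -> 1, world -> 1 (map order unspecified)
        System.out.println(wordCount(Arrays.asList("hello spark", "hello world")));
    }
}
```

On Spark the same logic distributes across partitions because counting per key is associative; the stream version only demonstrates the per-record transformations.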