Spark: print only job results without log noise (log4j.properties), plus pom.xml configuration

  1. log4j.properties (local Spark configuration)
    Create a resources directory under D:\spark\spark-core\src\main, then copy the following into a log4j.properties file inside it:

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
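
With this file on the classpath, the root logger is capped at WARN, so a job's printed results are no longer buried in INFO output. A minimal sketch to verify it locally (the LogCheck object name and local[*] master are illustrative, not from the original post):

import org.apache.spark.{SparkConf, SparkContext}

object LogCheck {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside the IDE using all available cores
    val conf = new SparkConf().setMaster("local[*]").setAppName("LogCheck")
    val sc = new SparkContext(conf)
    val doubled = sc.parallelize(Seq(1, 2, 3, 4)).map(_ * 2).collect()
    doubled.foreach(println) // with rootCategory=WARN, only these lines appear on stdout
    sc.stop()
  }
}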

  2. pom.xml (spark-core module)
    Declare the Spark, joda-time, and MySQL dependencies in the module's pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>spark</artifactId>
        <groupId>org.example</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>

    <artifactId>spark-core</artifactId>

    <dependencies>
        <!-- Spark core -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>

        <!-- joda-time: Java date/time handling -->
        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.9.7</version>
        </dependency>

        <!-- MySQL JDBC driver -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.44</version>
        </dependency>

        <!-- Spark SQL -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.0.0</version>
        </dependency>
    </dependencies>
</project>
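
As a quick check that the spark-sql and mysql-connector dependencies resolve, here is a minimal JDBC read sketch; the URL, table, and credentials are placeholders, not values from the original post:

import org.apache.spark.sql.SparkSession

object JdbcReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("JdbcReadSketch")
      .getOrCreate()

    // Placeholder connection details; replace with your own database.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/test")
      .option("driver", "com.mysql.jdbc.Driver") // driver class for connector 5.1.x
      .option("dbtable", "user")
      .option("user", "root")
      .option("password", "root")
      .load()

    df.show() // prints the table contents; log noise stays suppressed
    spark.stop()
  }
}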