Flink on YARN: A TaskManager/JobManager Log Collection Solution

Background

With Flink on YARN, TaskManager logs are written locally on the cluster node (the NodeManager/DataNode machine) where the TaskManager container runs. When a Flink job throws an exception and produces error logs, we need to know about it immediately so the business is not affected. The job logs therefore have to be collected in real time so that an alert fires the moment an error log appears, giving genuinely real-time alerting rather than alerting only after the Flink job has already gone down. Besides making it easier to locate problems from the logs when a job fails, collecting the logs also enables real-time alerting, log search, and log display.

Flink 1.7.2 / 1.10

1. Modify the $FLINK_HOME/conf/log4j.properties configuration file as follows:

################################################################################
#  Licensed to the Apache Software Foundation (ASF) under one
#  or more contributor license agreements.  See the NOTICE file
#  distributed with this work for additional information
#  regarding copyright ownership.  The ASF licenses this file
#  to you under the Apache License, Version 2.0 (the
#  "License"); you may not use this file except in compliance
#  with the License.  You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
#  Unless required by applicable law or agreed to in writing, software
#  distributed under the License is distributed on an "AS IS" BASIS,
#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#  See the License for the specific language governing permissions and
# limitations under the License.
################################################################################

# This affects logging for both user code and Flink
log4j.rootLogger=INFO, file,kafka

# Uncomment this if you want to _only_ change Flink's logging
#log4j.logger.org.apache.flink=INFO

# The following lines keep the log level of common libraries/connectors on
# log level INFO. The root logger does not override this. You have to manually
# change the log levels here.
log4j.logger.akka=INFO
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache.zookeeper=INFO

# Log all infos in the given file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.file=${log.file}
log4j.appender.file.append=false
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n

# Suppress the irrelevant (wrong) warnings from the Netty channel handler
log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, file

# Ship every log event to Kafka in addition to the local file, so logs from all
# JobManagers/TaskManagers can be collected, searched and alerted on in real time
log4j.appender.kafka=org.apache.kafka.log4jappender.KafkaLog4jAppender
# Kafka broker list and the topic that receives the job logs
log4j.appender.kafka.brokerList=CentOS:9092
log4j.appender.kafka.topic=flink_job_logs
log4j.appender.kafka.compressionType=none
log4j.appender.kafka.requiredNumAcks=0
log4j.appender.kafka.syncSend=true
# Emit each event as one JSON record; ${log.file} identifies which container/log file it came from
log4j.appender.kafka.layout=org.apache.log4j.PatternLayout
log4j.appender.kafka.layout.ConversionPattern={"time":"%d{yyyy-MM-dd HH:mm:ss}","level":"%p","thread":"%t","source":"${log.file}","message":"%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m"}
log4j.appender.kafka.level=INFO
# Keep the Kafka client's own logging moderate so it does not flood the appender
log4j.logger.kafka=INFO
log4j.logger.org.apache.kafka=WARN
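
With this configuration, every log event produced by the JobManager and TaskManagers is also published to the flink_job_logs topic as a JSON line. A quick way to verify delivery, and to prototype the real-time error alerting described in the background, is to tail the topic with Kafka's console consumer and filter for ERROR entries. This is only a minimal sketch; the Kafka installation path below is an assumption and should be adjusted to your environment:

[root@centos ~]# /usr/kafka_2.11-2.2.0/bin/kafka-console-consumer.sh \
    --bootstrap-server CentOS:9092 \
    --topic flink_job_logs \
    --from-beginning | grep '"level":"ERROR"'

In production you would point a dedicated consumer (for example another Flink job or Logstash) at this topic to parse the JSON records and push ERROR events to an alerting channel; the console consumer is only meant to confirm the pipeline works.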

2. Add the following jar files to the $FLINK_HOME/lib directory: kafka-clients-x.x.x.jar and kafka-log4j-appender-x.x.x.jar (see the download sketch after the listing below if they are missing).

[root@centos ~]# ls -l /usr/flink-1.7.2/lib/
total 94504
-rw-r--r--. …
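
If the two jars are not already present in the listing above, they can be downloaded from Maven Central and dropped into the lib directory. The 2.2.0 version below is only an illustration; pick the version that matches the Kafka brokers you are writing to:

[root@centos ~]# cd /usr/flink-1.7.2/lib/
[root@centos lib]# wget https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients/2.2.0/kafka-clients-2.2.0.jar
[root@centos lib]# wget https://repo1.maven.org/maven2/org/apache/kafka/kafka-log4j-appender/2.2.0/kafka-log4j-appender-2.2.0.jar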