1. Starting Flume fails with "Class path contains multiple SLF4J bindings"
Starting Flume produces the following error output:
Info: Including Hadoop libraries found via (/opt/module/hadoop-3.1.3/bin/hadoop) for HDFS access
Info: Including Hive libraries found via (/opt/module/hive) for Hive access
+ exec /opt/module/jdk1.8.0_212/bin/java -Xmx20m -cp 'conf/:/opt/module/flume/lib/*:/opt/module/hadoop-3.1.3/share/hadoop/common/lib/*:/opt/module/hadoop-3.1.3/share/hadoop/common/*:/opt/module/hadoop-3.1.3/share/hadoop/hdfs:/opt/module/hadoop-3.1.3/share/hadoop/hdfs/lib/*:/opt/module/hadoop-3.1.3/share/hadoop/hdfs/*:/opt/module/hadoop-3.1.3/share/hadoop/mapreduce/lib/*:/opt/module/hadoop-3.1.3/share/hadoop/mapreduce/*:/opt/module/hadoop-3.1.3/share/hadoop/yarn:/opt/module/hadoop-3.1.3/share/hadoop/yarn/lib/*:/opt/module/hadoop-3.1.3/share/hadoop/yarn/*:/opt/module/hadoop-3.1.3/etc/hadoop:/opt/module/tez/*:/opt/module/tez/lib/*:/opt/module/tez/hadoop-shim-0.10.1-SNAPSHOT.jar:/opt/module/tez/hadoop-shim-2.8-0.10.1-SNAPSHOT.jar:/opt/module/tez/lib:/opt/module/tez/LICENSE:/opt/module/tez/LICENSE-BSD-3clause:/opt/module/tez/LICENSE-CDDLv1.1-GPLv2_withCPE:/opt/module/tez/LICENSE-MIT:/opt/module/tez/LICENSE-SIL_OpenFontLicense-v1.1:/opt/module/tez/NOTICE:/opt/module/tez/tez-api-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-build-tools-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-common-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-dag-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-examples-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-history-parser-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-javadoc-tools-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-job-analyzer-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-mapreduce-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-protobuf-history-plugin-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-runtime-internals-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-runtime-library-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-tests-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-yarn-timeline-cache-plugin-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-yarn-timeline-history-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-yarn-timeline-history-with-acls-0.10.1-SNAPSHOT.jar:/opt/module/tez/tez-yarn-timeline-history-with-fs-0.10.1-SNAPSHOT.jar:/opt/module/tez/lib/async-http-client-1.9.40.jar:/opt/module/tez/lib/commons-cli-1.2.jar:/opt/module/tez/lib/commons-codec-1.11.jar:/opt/module/tez/lib/commons-collections4-4.1.jar:/opt/module/tez/lib/commons-io-2.4.jar:/opt/module/tez/lib/commons-lang-2.6.jar:/opt/module/tez/lib/commons-math3-3.1.1.jar:/opt/module/tez/lib/guava-27.0-jre.jar:/opt/module/tez/lib/hadoop-hdfs-client-3.1.3.jar:/opt/module/tez/lib/hadoop-mapreduce-client-common-3.1.3.jar:/opt/module/tez/lib/hadoop-mapreduce-client-core-3.1.3.jar:/opt/module/tez/lib/hadoop-yarn-server-timeline-pluginstorage-3.1.3.jar:/opt/module/tez/lib/javax.servlet-api-3.1.0.jar:/opt/module/tez/lib/jersey-client-1.19.jar:/opt/module/tez/lib/jersey-json-1.19.jar:/opt/module/tez/lib/jettison-1.3.4.jar:/opt/module/tez/lib/jsr305-3.0.0.jar:/opt/module/tez/lib/metrics-core-3.1.0.jar:/opt/module/tez/lib/protobuf-java-2.5.0.jar:/opt/module/tez/lib/RoaringBitmap-0.5.21.jar:/opt/module/tez/lib/slf4j-api-1.7.10.jar:/opt/module/hive/lib/*' -Djava.library.path=:/opt/module/hadoop-3.1.3/lib/native org.apache.flume.node.Application --name a3 --conf-file job/flume-dir-hdfs.conf
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/module/flume/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/module/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2021-10-09 18:50:26,699 ERROR [main] node.Application (Application.java:main(374)) - A fatal error occurred while running. Exception follows.
org.apache.commons.cli.ParseException: The specified configuration file does not exist: /opt/module/flume/datas/job/flume-dir-hdfs.conf
at org.apache.flume.node.Application.main(Application.java:342)
Solution
The SLF4J warning appears because Flume and Hadoop each ship their own slf4j-log4j12 binding on the combined classpath. Delete the copy under Flume's lib directory:
rm /opt/module/flume/lib/slf4j-log4j12-1.7.25.jar
Note that the fatal ParseException in the log is a separate problem: --conf-file job/flume-dir-hdfs.conf is resolved relative to the current working directory, so either run the command from the directory containing job/ or pass an absolute path to the configuration file.
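Before deleting anything, it can help to confirm which binding jars actually exist. The sketch below builds a throwaway directory tree standing in for /opt/module/flume/lib and Hadoop's share/hadoop/common/lib so it is runnable anywhere; in practice, point SEARCH_DIRS at the real directories on your machine.

```shell
# Stand-in layout for /opt/module/flume/lib and Hadoop's common/lib;
# replace SEARCH_DIRS with the real paths on a live installation.
demo="$(mktemp -d)"
mkdir -p "$demo/flume/lib" "$demo/hadoop/share/hadoop/common/lib"
touch "$demo/flume/lib/slf4j-log4j12-1.7.25.jar" \
      "$demo/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar"

SEARCH_DIRS="$demo/flume/lib $demo/hadoop/share/hadoop/common/lib"

# List every slf4j-log4j12 binding jar (word-splitting on SEARCH_DIRS is
# intentional). More than one match means a conflict, and the copy under
# Flume's lib is the one to remove.
find $SEARCH_DIRS -maxdepth 1 -name 'slf4j-log4j12-*.jar'
count="$(find $SEARCH_DIRS -maxdepth 1 -name 'slf4j-log4j12-*.jar' | wc -l | tr -d ' ')"
echo "found $count binding jar(s)"
```

With the demo layout above, both copies are reported, mirroring the two "Found binding in" lines in the SLF4J output.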
2. Flume fails to write its output directory on HDFS
The error is as follows:
2021-10-09 19:54:21,288 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] hdfs.BucketWriter (BucketWriter.java:open(246)) - Creating hdfs://hadoop102:8020/flume/20211009/19/logs-.1633780369893.tmp
2021-10-09 19:54:21,302 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] hdfs.HDFSEventSink (HDFSEventSink.java:process(454)) - HDFS IO error
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create file/flume/20211009/19/logs-.1633780369893.tmp. Name node is in safe mode.
The reported blocks 200 needs additional 2 blocks to reach the threshold 0.9990 of total blocks 203.
The minimum number of live datanodes is not required. Safe mode will be turned off automatically once the thresholds have been reached. NamenodeHostName:hadoop102
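The figures in the message are consistent with the NameNode's rule of staying in safe mode until the number of reported blocks reaches floor(totalBlocks × threshold). A quick sanity check of the numbers from the log, assuming that truncating formula:

```shell
# Numbers taken from the log above: 203 total blocks, 200 reported,
# default safe-mode threshold 0.999.
total=203
reported=200
threshold=0.999

# Blocks required before the NameNode leaves safe mode (integer truncation).
needed="$(awk -v t="$total" -v p="$threshold" 'BEGIN { printf "%d", t * p }')"
echo "blocks needed: $needed"                    # 202
echo "additional blocks: $((needed - reported))" # 2, matching the log
```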
Solution
HDFS is in safe mode; take the NameNode out of it:
hdfs dfsadmin -safemode leave
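Before forcing the NameNode out, it is worth checking the current state, or simply waiting for the thresholds to be met instead. A guarded sketch (it assumes the hdfs CLI is on PATH and configured for the cluster, and no-ops where hdfs is absent):

```shell
# Inspect safe mode, then leave it; "-safemode wait" would instead block
# until the block-report thresholds are reached.
if command -v hdfs >/dev/null 2>&1; then
  state="$(hdfs dfsadmin -safemode get)"   # e.g. "Safe mode is ON"
  echo "$state"
  hdfs dfsadmin -safemode leave
else
  state="hdfs CLI not found; skipping"
  echo "$state"
fi
```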
There is also a way to disable safe mode permanently: lower the safe-mode threshold in the HDFS configuration.
In hdfs-site.xml, set the safe-mode threshold property, whose default is 0.999f, to 0; the lower the value, the fewer reported blocks the NameNode requires before leaving safe mode. (In Hadoop 3.x this key is named dfs.namenode.safemode.threshold-pct; the dfs.safemode.threshold.pct spelling shown below is the deprecated alias, which is still honored.)
<property>
<name>dfs.safemode.threshold.pct</name>
<value>0</value>
<description>
Specifies the percentage of blocks that should satisfy
the minimal replication requirement defined by dfs.replication.min.
Values less than or equal to 0 mean not to wait for any particular
percentage of blocks before exiting safemode.
Values greater than 1 will make safe mode permanent.
</description>
</property>