2020-08-23

Flume-ng problem: [INFO - org.apache.flume.source.ExecSource$ExecRunnable.run(ExecSource.java:372)] Command [tail –F /export/servers/kafka_2.11-1.0.0/agent.log] exited with 1

Flume is hooked up to Kafka: an exec source runs tail -F filepath to monitor a file on the Linux system and sends each new line to a Kafka topic, but the tail -F command exits. Asking for help.
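
For reference, this is the behavior the exec source depends on: tail -F stays running indefinitely and prints every line appended to the file. A quick way to confirm that outside of Flume (a minimal sketch, using the same file path as in the configuration below):

# start a follower in the background; a healthy tail -F blocks and never exits on its own
tail -F /export/servers/kafka_2.11-1.0.0/agent.log &
# append a line; the backgrounded tail should echo it immediately
echo "test line" >> /export/servers/kafka_2.11-1.0.0/agent.log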

Checking the Flume configuration

vim log-kafka.properties
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Describe the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail –F /export/servers/kafka_2.11-1.0.0/agent.log
a1.sources.r1.interceptors = i1 
a1.sources.r1.interceptors.i1.type = regex_filter 
# Regex for the log-line prefix to keep
a1.sources.r1.interceptors.i1.regex = .+MOVIE_RATING_PREFIX.+

# Describe the sink
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink 
a1.sinks.k1.kafka.topic = log
a1.sinks.k1.kafka.bootstrap.servers = 192.168.52.100:9092,192.168.52.110:9092,192.168.52.120:9092
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.flumeBatchSize = 1

# Each channel's type is defined. 
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
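
To rule out the Kafka side, the topic can also be read directly with the console consumer that ships with Kafka, bypassing Flume entirely (a sketch, run from the kafka_2.11-1.0.0 install directory shown later; any broker in the list will do):

# consume the "log" topic from the beginning
bin/kafka-console-consumer.sh --bootstrap-server 192.168.52.100:9092 --topic log --from-beginning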

Starting Flume

bin/flume-ng agent -c ./conf -f ./conf/log-kafka.properties -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /export/servers/apache-flume-1.6.0-cdh5.14.0-bin/conf/flume-env.sh
Info: Including Hadoop libraries found via (/export/servers/hadoop-2.6.0-cdh5.14.0/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/export/servers/hbase/bin/hbase) for HBASE access
Info: Including Hive libraries found via (/export/servers/hive-1.1.0-cdh5.14.0) for Hive access
+ exec /export/servers/jdk1.8.0_141/bin/java -Xmx20m -Dflume.root.logger=INFO,console -cp '/export/servers/apache-flume-1.6.0-cdh5.14.0-bin/conf:/export/servers/apache-flume-1.6.0-cdh5.14.0-bin/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/etc/hadoop:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/yarn/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/yarn/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/mapreduce/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/mapreduce/*:/export/servers/hadoop-2.6.0-cdh5.14.0/contrib/capacity-scheduler/*.jar:/export/servers/hbase/conf:/export/servers/jdk1.8.0_141/lib/tools.jar:/export/servers/hbase:/export/servers/hbase/lib/activation-1.1.jar:/export/servers/hbase/lib/antisamy-1.4.3.jar:/export/servers/hbase/lib/aopalliance-1.0.jar:/export/servers/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/export/servers/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/export/servers/hbase/lib/api-asn1-api-1.0.0-M20.jar:/export/servers/hbase/lib/api-util-1.0.0-M20.jar:/export/servers/hbase/lib/asm-3.1.jar:/export/servers/hbase/lib/avro-1.7.4.jar:/export/servers/hbase/lib/batik-css-1.7.jar:/export/servers/hbase/lib/batik-ext-1.7.jar:/export/servers/hbase/lib/batik-util-1.7.jar:/export/servers/hbase/lib/bsh-core-2.0b4.jar:/export/servers/hbase/lib/commons-beanutils-1.7.0.jar:/export/servers/hbase/lib/commons-beanutils-core-1.7.0.jar:/export/servers/hbase/lib/commons-cli-1.2.jar:/export/servers/hbase/lib/commons-codec-1.9.jar:/export/servers/hbase/lib/commons-collections-3.2.2.jar:/export/servers/hbase/lib/commons-compress-1.4.1.jar:/export/servers/hbase/lib/commons-configuration-1.6.jar:/export/servers/hbase/lib/commons-daemon-1.0.13.jar:/export/servers/hbase/lib/commons-digester-1.8.jar:/export/servers/hbase/lib/commons-el-1.0.jar:/export/servers/hbase/lib/commons-fileupload-1.2.jar:/export/servers/hbase/lib/commons-httpclient-3.1.jar:/export/servers/hbase/lib/commons-io-2.4.jar:/export/servers/hbase/lib/commons-lang-2.6.jar:/export/servers/hbase/lib/commons-logging-1.2.jar:/export/servers/hbase/lib/commons-math-2.2.jar:/export/servers/hbase/lib/commons-math3-3.1.1.jar:/export/servers/hbase/lib/commons-net-3.1.jar:/export/servers/hbase/lib/disruptor-3.3.0.jar:/export/servers/hbase/lib/esapi-2.1.0.jar:/export/servers/hbase/lib/findbugs-annotations-1.3.9-1.jar:/export/servers/hbase/lib/guava-12.0.1.jar:/export/servers/hbase/lib/guice-3.0.jar:/export/servers/hbase/lib/guice-servlet-3.0.jar:/export/servers/hbase/lib/hadoop-annotations-2.5.1.jar:/export/servers/hbase/lib/hadoop-auth-2.5.1.jar:/export/servers/hbase/lib/hadoop-client-2.5.1.jar:/export/servers/hbase/lib/hadoop-common-2.5.1.jar:/export/servers/hbase/lib/hadoop-hdfs-2.5.1.jar:/export/servers/hbase/lib/hadoop-mapreduce-client-app-2.5.1.jar:/export/servers/hbase/lib/hadoop-mapreduce-client-common-2.5.1.jar:/export/servers/hbase/lib/hadoop-mapreduce-client-core-2.5.1.jar:/export/servers/hbase/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/export/servers/hbase/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/export/servers/hbase/lib/hadoop-yarn-api-2.5.1.jar:/export/servers/hbase/lib/hadoop-yarn-client-2.5.1.jar:/export/servers/hbase/lib/hadoop-yarn-common-2.5.1.jar:/export/servers/hbase/lib/hadoop-yarn-server-common-2.5.1.jar:/export/servers/hbase/lib/hbase-annotations-1.2.1.jar:/export/servers/hbase/lib/hbase-annotations-1.2.1-tests.jar:/export/servers/hbase/lib/hbase-client-1.2.1.jar:/export/servers/hbase/lib/hbase-common-1.2.1.jar:/export/servers/hbase/lib/hbase-common-1.2.1-tests.jar:/export/servers/hbase/lib/hbase-examples-1.2.1.jar:/export/servers/hbase/lib/hbase-external-blockcache-1.2.1.jar:/export/servers/hbase/lib/hbase-hadoop2-compat-1.2.1.jar:/export/servers/hbase/lib/hbase-hadoop-compat-1.2.1.jar:/export/servers/hbase/lib/hbase-it-1.2.1.jar:/export/servers/hbase/lib/hbase-it-1.2.1-tests.jar:/export/servers/hbase/lib/hbase-prefix-tree-1.2.1.jar:/export/servers/hbase/lib/hbase-procedure-1.2.1.jar:/export/servers/hbase/lib/hbase-protocol-1.2.1.jar:/export/servers/hbase/lib/hbase-resource-bundle-1.2.1.jar:/export/servers/hbase/lib/hbase-rest-1.2.1.jar:/export/servers/hbase/lib/hbase-server-1.2.1.jar:/export/servers/hbase/lib/hbase-server-1.2.1-tests.jar:/export/servers/hbase/lib/hbase-shell-1.2.1.jar:/export/servers/hbase/lib/hbase-thrift-1.2.1.jar:/export/servers/hbase/lib/htrace-core-3.1.0-incubating.jar:/export/servers/hbase/lib/httpclient-4.2.5.jar:/export/servers/hbase/lib/httpcore-4.4.1.jar:/export/servers/hbase/lib/jackson-core-asl-1.9.13.jar:/export/servers/hbase/lib/jackson-jaxrs-1.9.13.jar:/export/servers/hbase/lib/jackson-mapper-asl-1.9.13.jar:/export/servers/hbase/lib/jackson-xc-1.9.13.jar:/export/servers/hbase/lib/jamon-runtime-2.4.1.jar:/export/servers/hbase/lib/jasper-compiler-5.5.23.jar:/export/servers/hbase/lib/jasper-runtime-5.5.23.jar:/export/servers/hbase/lib/javax.inject-1.jar:/export/servers/hbase/lib/java-xmlbuilder-0.4.jar:/export/servers/hbase/lib/jaxb-api-2.2.2.jar:/export/servers/hbase/lib/jaxb-impl-2.2.3-1.jar:/export/servers/hbase/lib/jcodings-1.0.8.jar:/export/servers/hbase/lib/jersey-client-1.9.jar:/export/servers/hbase/lib/jersey-core-1.9.jar:/export/servers/hbase/lib/jersey-guice-1.9.jar:/export/servers/hbase/lib/jersey-json-1.9.jar:/export/servers/hbase/lib/jersey-server-1.9.jar:/export/servers/hbase/lib/jets3t-0.9.0.jar:/export/servers/hbase/lib/jettison-1.3.3.jar:/export/servers/hbase/lib/jetty-6.1.26.jar:/export/servers/hbase/lib/jetty-sslengine-6.1.26.jar:/export/servers/hbase/lib/jetty-util-6.1.26.jar:/export/servers/hbase/lib/joni-2.1.2.jar:/export/servers/hbase/lib/jruby-complete-1.6.8.jar:/export/servers/hbase/lib/jsch-0.1.42.jar:/export/servers/hbase/lib/jsp-2.1-6.1.14.jar:/export/servers/hbase/lib/jsp-api-2.1-6.1.14.jar:/export/servers/hbase/lib/jsr305-1.3.9.jar:/export/servers/hbase/lib/junit-4.12.jar:/export/servers/hbase/lib/leveldbjni-all-1.8.jar:/export/servers/hbase/lib/libthrift-0.9.3.jar:/export/servers/hbase/lib/log4j-1.2.17.jar:/export/servers/hbase/lib/metrics-core-2.2.0.jar:/export/servers/hbase/lib/nekohtml-1.9.12.jar:/export/servers/hbase/lib/netty-all-4.0.23.Final.jar:/export/servers/hbase/lib/paranamer-2.3.jar:/export/servers/hbase/lib/phoenix-4.8.2-HBase-1.2-server.jar:/export/servers/hbase/lib/phoenix-core-4.8.2-HBase-1.2.jar:/export/servers/hbase/lib/protobuf-java-2.5.0.jar:/export/servers/hbase/lib/servlet-api-2.5-6.1.14.jar:/export/servers/hbase/lib/servlet-api-2.5.jar:/export/servers/hbase/lib/slf4j-api-1.7.7.jar:/export/servers/hbase/lib/slf4j-log4j12-1.7.5.jar:/export/servers/hbase/lib/snappy-java-1.0.4.1.jar:/export/servers/hbase/lib/spymemcached-2.11.6.jar:/export/servers/hbase/lib/xalan-2.7.0.jar:/export/servers/hbase/lib/xml-apis-1.3.03.jar:/export/servers/hbase/lib/xml-apis-ext-1.3.04.jar:/export/servers/hbase/lib/xmlenc-0.52.jar:/export/servers/hbase/lib/xom-1.2.5.jar:/export/servers/hbase/lib/xz-1.0.jar:/export/servers/hbase/lib/zookeeper-3.4.6.jar:/export/servers/hadoop-2.6.0-cdh5.14.0/etc/hadoop:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/hdfs/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/yarn/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/yarn/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/mapreduce/lib/*:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/mapreduce/*:/export/servers/hadoop-2.6.0-cdh5.14.0/contrib/capacity-scheduler/*.jar:/export/servers/hbase/conf:/export/servers/hive-1.1.0-cdh5.14.0/lib/*' -Djava.library.path=:/export/servers/hadoop-2.6.0-cdh5.14.0/lib/native:/export/servers/hadoop-2.6.0-cdh5.14.0/lib/native org.apache.flume.node.Application -f ./conf/log-kafka.properties -n a1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/export/servers/apache-flume-1.6.0-cdh5.14.0-bin/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/export/servers/hadoop-2.6.0-cdh5.14.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/export/servers/hbase/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2020-08-23 18:20:56,790 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2020-08-23 18:20:56,830 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:./conf/log-kafka.properties
2020-08-23 18:20:56,897 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,904 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,905 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2020-08-23 18:20:56,909 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,909 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,909 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,909 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2020-08-23 18:20:56,988 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2020-08-23 18:20:56,989 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2020-08-23 18:20:57,028 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2020-08-23 18:20:57,052 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2020-08-23 18:20:57,059 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type exec
2020-08-23 18:20:57,235 (conf-file-poller-0) [INFO - org.apache.flume.interceptor.RegexFilteringInterceptor$Builder.build(RegexFilteringInterceptor.java:159)] Creating RegexFilteringInterceptor: regex=.+MOVIE_RATING_PREFIX.+,excludeEvents=false
2020-08-23 18:20:57,248 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: org.apache.flume.sink.kafka.KafkaSink
2020-08-23 18:20:57,286 (conf-file-poller-0) [INFO - org.apache.flume.sink.kafka.KafkaSink.configure(KafkaSink.java:314)] Using the static topic log. This may be overridden by event headers
2020-08-23 18:20:57,326 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2020-08-23 18:20:57,406 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:org.apache.flume.source.ExecSource{name:r1,state:IDLE} }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@3222ea1b counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2020-08-23 18:20:57,407 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2020-08-23 18:20:57,848 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2020-08-23 18:20:57,852 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2020-08-23 18:20:57,857 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2020-08-23 18:20:57,866 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2020-08-23 18:20:57,869 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.source.ExecSource.start(ExecSource.java:168)] Exec source starting with command: tail –F /export/servers/kafka_2.11-1.0.0/agent.log
2020-08-23 18:20:57,902 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2020-08-23 18:20:57,902 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started
2020-08-23 18:20:58,184 (lifecycleSupervisor-1-1) [INFO - org.apache.kafka.common.config.AbstractConfig.logAll(AbstractConfig.java:196)] ProducerConfig values: 
        acks = 1
        batch.size = 16384
        block.on.buffer.full = false
        bootstrap.servers = [192.168.52.100:9092, 192.168.52.110:9092, 192.168.52.120:9092]
        buffer.memory = 33554432
        client.id = 
        compression.type = none
        connections.max.idle.ms = 540000
        interceptor.classes = null
        key.serializer = class org.apache.kafka.common.serialization.StringSerializer
        linger.ms = 0
        max.block.ms = 60000
        max.in.flight.requests.per.connection = 5
        max.request.size = 1048576
        metadata.fetch.timeout.ms = 60000
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.sample.window.ms = 30000
        partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
        receive.buffer.bytes = 32768
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retries = 0
        retry.backoff.ms = 100
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT
        send.buffer.bytes = 131072
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = null
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        timeout.ms = 30000
        value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer

2020-08-23 18:20:58,985 (pool-3-thread-1) [INFO - org.apache.flume.source.ExecSource$ExecRunnable.run(ExecSource.java:372)] Command [tail –F /export/servers/kafka_2.11-1.0.0/agent.log] exited with 1
2020-08-23 18:20:59,069 (lifecycleSupervisor-1-1) [INFO - org.apache.kafka.common.utils.AppInfoParser$AppInfo.<init>(AppInfoParser.java:83)] Kafka version : 0.10.2-kafka-2.2.0
2020-08-23 18:20:59,070 (lifecycleSupervisor-1-1) [INFO - org.apache.kafka.common.utils.AppInfoParser$AppInfo.<init>(AppInfoParser.java:84)] Kafka commitId : unknown
2020-08-23 18:20:59,078 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SINK, name: k1: Successfully registered new MBean.
2020-08-23 18:20:59,079 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SINK, name: k1 started

The failing line in the output: [INFO - org.apache.flume.source.ExecSource$ExecRunnable.run(ExecSource.java:372)] Command [tail –F /export/servers/kafka_2.11-1.0.0/agent.log] exited with 1
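
Part of what makes this hard to debug: by default the exec source discards the command's stderr and does not restart a command that dies, so the Flume log shows nothing but the exit code. The standard exec-source options below (a sketch; the throttle value is only an example) would surface the underlying error message:

# copy the command's stderr into the Flume log instead of discarding it
a1.sources.r1.logStdErr = true
# re-run the command when it exits, waiting 10 s between attempts
a1.sources.r1.restart = true
a1.sources.r1.restartThrottle = 10000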

But the file does exist on the Linux system:

[root@node03 kafka_2.11-1.0.0]# ll
total 21508
-rw-r--r--  1 root root        9 Aug 23 16:47 agent.log
drwxr-xr-x  3 root root     4096 Jul 26 22:17 bin
drwxr-xr-x  2 root root     4096 Aug  7 11:40 config
drwxr-xr-x  2 root root     4096 Sep 14  2018 libs
-rw-r--r--  1 root root    28824 Sep 14  2018 LICENSE
drwxr-xr-x 30 root root    36864 Aug 23 18:40 logs
-rw-------  1 root root 21919179 Aug 23 18:36 nohup.out
-rw-r--r--  1 root root      336 Sep 14  2018 NOTICE
drwxr-xr-x  2 root root     4096 Sep 14  2018 site-docs
[root@node03 kafka_2.11-1.0.0]# pwd
/export/servers/kafka_2.11-1.0.0
[root@node03 kafka_2.11-1.0.0]# cat agent.log 
11111111
[root@node03 kafka_2.11-1.0.0]# 
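
Also worth noting: the only line in agent.log is 11111111, which does not match the interceptor regex .+MOVIE_RATING_PREFIX.+, so even a correctly running tail would have it dropped before the Kafka sink. A test line must contain the prefix with at least one character on each side (the payload here is made up purely for illustration):

# append a line that actually passes the regex_filter interceptor
echo "2020-08-23 18:40:00 MOVIE_RATING_PREFIX 1|1193|4.0|1564412038" >> /export/servers/kafka_2.11-1.0.0/agent.log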

Symptom: when new lines are appended to the monitored file, nothing arrives in Kafka. But if lines are appended to the file and Flume is then restarted, Kafka does receive the data.

Guess: the tail -F command is only executed once, when Flume starts, rather than continuously following the file.
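
One way to test this guess is to copy the command string verbatim out of log-kafka.properties (do not retype it) and run it in a shell:

# pasted exactly as it appears in the config
tail –F /export/servers/kafka_2.11-1.0.0/agent.log
echo $?
# a working tail -F blocks forever; if this returns 1 immediately, the command string
# itself is broken. One classic cause: a dash copied from a rich-text document is the
# Unicode en dash "–" (U+2013) rather than ASCII "-", so tail treats "–F" as a file
# name, dumps the existing lines once without following, and exits with status 1.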

Any help would be greatly appreciated. Thanks!
