Unable to load sink type: org.apache.flume.sink.kafka.kafkaSink

While testing Flume together with Kafka, starting the Flume agent failed with the following error:

[walker001@walker001 conf]$  flume-ng agent --name a1 --conf conf --conf-file $FLUME_HOME/conf/flume-conf_kafka.properties -Dflume.root.logger=INFO,console
Info: Including Hadoop libraries found via (/home/walker001/app/hadoop-2.8.2/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/home/walker001/app/hbase-1.2.6.1/bin/hbase) for HBASE access
Info: Including Hive libraries found via (/home/walker001/app/apache-hive-2.3.6-bin) for Hive access
+ exec /home/walker001/app/jdk1.8.0_144/bin/java -Xmx20m -Dflume.root.logger=INFO,console -cp 'conf:/home/walker001/app/apache-flume-1.8.0-bin/lib/*:/home/walker001/app/hadoop-2.8.2/etc/hadoop:/home/walker001/app/hadoop-2.8.2/share/hadoop/common/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/common/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/yarn/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/yarn/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/mapreduce/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/mapreduce/*:/home/walker001/app/hadoop-2.8.2/contrib/capacity-scheduler/*.jar:/home/walker001/app/hbase-1.2.6.1/conf:/home/walker001/app/jdk1.8.0_144/lib/tools.jar:/home/walker001/app/hbase-1.2.6.1:/home/walker001/app/hbase-1.2.6.1/lib/activation-1.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/aopalliance-1.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/apacheds-i18n-2.0.0-M15.jar:/home/walker001/app/hbase-1.2.6.1/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/walker001/app/hbase-1.2.6.1/lib/api-asn1-api-1.0.0-M20.jar:/home/walker001/app/hbase-1.2.6.1/lib/api-util-1.0.0-M20.jar:/home/walker001/app/hbase-1.2.6.1/lib/asm-3.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/avro-1.7.4.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-beanutils-1.7.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-beanutils-core-1.8.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-cli-1.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-codec-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-collections-3.2.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-compress-1.4.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-configuration-1.6.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-daemon-1.0.13.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-digester-1.8.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-el-1.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-httpclient-3.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-io-2.4.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-lang-2.6.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-logging-1.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-math-2.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-math3-3.1.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/commons-net-3.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/disruptor-3.3.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/findbugs-annotations-1.3.9-1.jar:/home/walker001/app/hbase-1.2.6.1/lib/guava-12.0.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/guice-3.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/guice-servlet-3.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-annotations-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-auth-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-client-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-common-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-hdfs-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-mapreduce-client-app-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-mapreduce-client-common-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-mapreduce-client-core-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-yarn-api-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-yarn-client-2
.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-yarn-common-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hadoop-yarn-server-common-2.5.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-annotations-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-annotations-1.2.6.1-tests.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-client-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-common-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-common-1.2.6.1-tests.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-examples-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-external-blockcache-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-hadoop2-compat-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-hadoop-compat-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-it-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-it-1.2.6.1-tests.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-prefix-tree-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-procedure-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-protocol-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-resource-bundle-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-rest-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-server-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-server-1.2.6.1-tests.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-shell-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/hbase-thrift-1.2.6.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/htrace-core-3.1.0-incubating.jar:/home/walker001/app/hbase-1.2.6.1/lib/httpclient-4.2.5.jar:/home/walker001/app/hbase-1.2.6.1/lib/httpcore-4.4.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/jackson-core-asl-1.9.13.jar:/home/walker001/app/hbase-1.2.6.1/lib/jackson-jaxrs-1.9.13.jar:/home/walker001/app/hbase-1.2.6.1/lib/jackson-mapper-asl-1.9.13.jar:/home/walker001/app/hbase-1.2.6.1/lib/jackson-xc-1.9.13.jar:/home/walker001/app/hbase-1.2.6.1/lib/jamon-runtime-2.4.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/jasper-compiler-5.5.23.jar:/home/walker001/app/hbase-1.2.6.1/lib/jasper-runtime-5.5.23.jar:/home/walker001/app/hbase-1.2.6.1/lib/javax.inject-1.jar:/home/walker001/app/hbase-1.2.6.1/lib/java-xmlbuilder-0.4.jar:/home/walker001/app/hbase-1.2.6.1/lib/jaxb-api-2.2.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/jaxb-impl-2.2.3-1.jar:/home/walker001/app/hbase-1.2.6.1/lib/jcodings-1.0.8.jar:/home/walker001/app/hbase-1.2.6.1/lib/jersey-client-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/jersey-core-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/jersey-guice-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/jersey-json-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/jersey-server-1.9.jar:/home/walker001/app/hbase-1.2.6.1/lib/jets3t-0.9.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/jettison-1.3.3.jar:/home/walker001/app/hbase-1.2.6.1/lib/jetty-6.1.26.jar:/home/walker001/app/hbase-1.2.6.1/lib/jetty-sslengine-6.1.26.jar:/home/walker001/app/hbase-1.2.6.1/lib/jetty-util-6.1.26.jar:/home/walker001/app/hbase-1.2.6.1/lib/joni-2.1.2.jar:/home/walker001/app/hbase-1.2.6.1/lib/jruby-complete-1.6.8.jar:/home/walker001/app/hbase-1.2.6.1/lib/jsch-0.1.42.jar:/home/walker001/app/hbase-1.2.6.1/lib/jsp-2.1-6.1.14.jar:/home/walker001/app/hbase-1.2.6.1/lib/jsp-api-2.1-6.1.14.jar:/home/walker001/app/hbase-1.2.6.1/lib/junit-4.12.jar:/home/walker001/app/hbase-1.2.6.1/lib/leveldbjni-all-1.8.jar:/home/walker001/app/hbase-1.2.6.1/lib/libthrift-0.9.3.jar:/home/walker001/app/hbase-1.2.6.1/lib/log4j-1.2.17.jar:/home/walker001/app/hbase-1.2.6.1/lib/metri
cs-core-2.2.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/netty-all-4.0.23.Final.jar:/home/walker001/app/hbase-1.2.6.1/lib/paranamer-2.3.jar:/home/walker001/app/hbase-1.2.6.1/lib/protobuf-java-2.5.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/servlet-api-2.5-6.1.14.jar:/home/walker001/app/hbase-1.2.6.1/lib/servlet-api-2.5.jar:/home/walker001/app/hbase-1.2.6.1/lib/slf4j-api-1.7.7.jar:/home/walker001/app/hbase-1.2.6.1/lib/slf4j-log4j12-1.7.5.jar:/home/walker001/app/hbase-1.2.6.1/lib/snappy-java-1.0.4.1.jar:/home/walker001/app/hbase-1.2.6.1/lib/spymemcached-2.11.6.jar:/home/walker001/app/hbase-1.2.6.1/lib/xmlenc-0.52.jar:/home/walker001/app/hbase-1.2.6.1/lib/xz-1.0.jar:/home/walker001/app/hbase-1.2.6.1/lib/zookeeper-3.4.6.jar:/home/walker001/app/hadoop-2.8.2/etc/hadoop:/home/walker001/app/hadoop-2.8.2/share/hadoop/common/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/common/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/hdfs/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/yarn/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/yarn/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/mapreduce/lib/*:/home/walker001/app/hadoop-2.8.2/share/hadoop/mapreduce/*:/home/walker001/app/hadoop-2.8.2/contrib/capacity-scheduler/*.jar:/home/walker001/app/hbase-1.2.6.1/conf:/home/walker001/app/apache-hive-2.3.6-bin/lib/*' -Djava.library.path=:/home/walker001/app/hadoop-2.8.2/lib/native:/home/walker001/app/hadoop-2.8.2/lib/native org.apache.flume.node.Application --name a1 --conf-file /home/walker001/app/apache-flume-1.8.0-bin/conf/flume-conf_kafka.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/walker001/app/apache-flume-1.8.0-bin/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/walker001/app/hadoop-2.8.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/walker001/app/hbase-1.2.6.1/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/walker001/app/apache-hive-2.3.6-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
20/05/01 12:43:30 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
20/05/01 12:43:30 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:/home/walker001/app/apache-flume-1.8.0-bin/conf/flume-conf_kafka.properties
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Added sinks: k1 Agent: a1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:43:30 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [a1]
20/05/01 12:43:30 INFO node.AbstractConfigurationProvider: Creating channels
20/05/01 12:43:30 INFO channel.DefaultChannelFactory: Creating instance of channel c1 type memory
20/05/01 12:43:30 INFO node.AbstractConfigurationProvider: Created channel c1
20/05/01 12:43:30 INFO source.DefaultSourceFactory: Creating instance of source r1, type exec
20/05/01 12:43:30 INFO sink.DefaultSinkFactory: Creating instance of sink: k1, type: org.apache.flume.sink.kafka.kafkaSink
20/05/01 12:43:30 ERROR node.PollingPropertiesFileConfigurationProvider: Failed to load configuration data. Exception follows.
org.apache.flume.FlumeException: Unable to load sink type: org.apache.flume.sink.kafka.kafkaSink, class: org.apache.flume.sink.kafka.kafkaSink
        at org.apache.flume.sink.DefaultSinkFactory.getClass(DefaultSinkFactory.java:70)
        at org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:43)
        at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:408)
        at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:102)
        at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:141)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.flume.sink.kafka.kafkaSink
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.flume.sink.DefaultSinkFactory.getClass(DefaultSinkFactory.java:68)
        ... 11 more

The configuration file:

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Source configuration
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/walker001/zwk/data.log


# Channel configuration
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Sink configuration
a1.sinks.k1.type = org.apache.flume.sink.kafka.kafkaSink
a1.sinks.k1.kafka.bootstrap.servers = walker001:9092,walker002:9092,walker003:9092

# Target topic name
a1.sinks.k1.kafka.topic = topictest
# Serializer class
a1.sinks.k1.serializer.class = kafka.serializer.StringEncoder


# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

The cause: in a1.sinks.k1.type = org.apache.flume.sink.kafka.kafkaSink, the class name kafkaSink should be KafkaSink. Fully qualified class names are case-sensitive, so the leading letter of the class name must be capitalized; with the lowercase spelling, the classloader cannot find the class and throws the ClassNotFoundException shown above.
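For reference, the corrected sink type line (the only change required):

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink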

After fixing the capitalization, the agent starts successfully:

[walker001@walker001 conf]$  flume-ng agent --name a1 --conf conf --conf-file $FLUME_HOME/conf/flume-conf_kafka.properties -Dflume.root.logger=INFO,console
20/05/01 12:45:59 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
20/05/01 12:45:59 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:/home/walker001/app/apache-flume-1.8.0-bin/conf/flume-conf_kafka.properties
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Added sinks: k1 Agent: a1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Processing:k1
20/05/01 12:45:59 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [a1]
20/05/01 12:45:59 INFO node.AbstractConfigurationProvider: Creating channels
20/05/01 12:45:59 INFO channel.DefaultChannelFactory: Creating instance of channel c1 type memory
20/05/01 12:45:59 INFO node.AbstractConfigurationProvider: Created channel c1
20/05/01 12:45:59 INFO source.DefaultSourceFactory: Creating instance of source r1, type exec
20/05/01 12:45:59 INFO sink.DefaultSinkFactory: Creating instance of sink: k1, type: org.apache.flume.sink.kafka.KafkaSink
20/05/01 12:45:59 WARN kafka.KafkaSink: serializer.class is deprecated. Flume now uses the latest Kafka producer which implements a different interface for serializers. Please use the parameter kafka.producer.value.serializer
20/05/01 12:45:59 INFO kafka.KafkaSink: Using the static topic topictest. This may be overridden by event headers
20/05/01 12:45:59 INFO node.AbstractConfigurationProvider: Channel c1 connected to [r1, k1]
20/05/01 12:45:59 INFO node.Application: Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:org.apache.flume.source.ExecSource{name:r1,state:IDLE} }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@664adb06 counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
20/05/01 12:45:59 INFO node.Application: Starting Channel c1
20/05/01 12:45:59 INFO node.Application: Waiting for channel: c1 to start. Sleeping for 500 ms
20/05/01 12:45:59 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
20/05/01 12:45:59 INFO instrumentation.MonitoredCounterGroup: Component type: CHANNEL, name: c1 started
20/05/01 12:46:00 INFO node.Application: Starting Sink k1
20/05/01 12:46:00 INFO node.Application: Starting Source r1
20/05/01 12:46:00 INFO source.ExecSource: Exec source starting with command: tail -F /home/walker001/zwk/data.log
20/05/01 12:46:00 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
20/05/01 12:46:00 INFO instrumentation.MonitoredCounterGroup: Component type: SOURCE, name: r1 started
20/05/01 12:46:00 INFO producer.ProducerConfig: ProducerConfig values: 
        compression.type = none
        metric.reporters = []
        metadata.max.age.ms = 300000
        metadata.fetch.timeout.ms = 60000
        reconnect.backoff.ms = 50
        sasl.kerberos.ticket.renew.window.factor = 0.8
        bootstrap.servers = [walker001:9092, walker002:9092, walker003:9092]
        retry.backoff.ms = 100
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        buffer.memory = 33554432
        timeout.ms = 30000
        key.serializer = class org.apache.kafka.common.serialization.StringSerializer
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        ssl.keystore.type = JKS
        ssl.trustmanager.algorithm = PKIX
        block.on.buffer.full = false
        ssl.key.password = null
        max.block.ms = 60000
        sasl.kerberos.min.time.before.relogin = 60000
        connections.max.idle.ms = 540000
        ssl.truststore.password = null
        max.in.flight.requests.per.connection = 5
        metrics.num.samples = 2
        client.id = 
        ssl.endpoint.identification.algorithm = null
        ssl.protocol = TLS
        request.timeout.ms = 30000
        ssl.provider = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        acks = 1
        batch.size = 16384
        ssl.keystore.location = null
        receive.buffer.bytes = 32768
        ssl.cipher.suites = null
        ssl.truststore.type = JKS
        security.protocol = PLAINTEXT
        retries = 0
        max.request.size = 1048576
        value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
        ssl.truststore.location = null
        ssl.keystore.password = null
        ssl.keymanager.algorithm = SunX509
        metrics.sample.window.ms = 30000
        partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
        send.buffer.bytes = 131072
        linger.ms = 0

20/05/01 12:46:00 INFO utils.AppInfoParser: Kafka version : 0.9.0.1
20/05/01 12:46:00 INFO utils.AppInfoParser: Kafka commitId : 23c69d62a0cabf06
20/05/01 12:46:00 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SINK, name: k1: Successfully registered new MBean.
20/05/01 12:46:00 INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: k1 started
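One detail worth tidying up in the now-working config: the WARN line above notes that serializer.class is deprecated, because Flume 1.8's Kafka sink uses the new Kafka producer interface. The warning is harmless here — the ProducerConfig dump shows the sink already set up its own key and value serializers — so the deprecated a1.sinks.k1.serializer.class line can simply be removed; if a custom value serializer is ever needed, pass it through the kafka.producer.value.serializer parameter the log suggests instead.

With the sink loading correctly, a quick end-to-end check is to append a line to the tailed file and read it back from the topic. A minimal sketch (assuming a Kafka distribution recent enough for the console consumer to accept --bootstrap-server; 0.9-era consumers expect --zookeeper instead):

# On the Flume host: append a test record to the tailed file
echo "hello kafka" >> /home/walker001/zwk/data.log

# On any broker host: consume from the topic to confirm delivery
kafka-console-consumer.sh --bootstrap-server walker001:9092 --topic topictest --from-beginning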
