Installing Flume on Windows to Upload Files from a Local Directory to HDFS (Kerberos): A Summary of Problems

Environment

Local system: Windows 10
CDH: 6.1.0
Hadoop: 3.0
Flume: 1.8.0
JDK: 1.8
Security: Kerberos
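
For reference, the agent discussed throughout is assumed to look roughly like the sketch below: a spooling directory source watching a local Windows folder, a memory channel, and an HDFS sink named sink1. Only the tier1.sinks.sink1 naming and the Kerberos sink properties are confirmed by the snippets later in this article; the source and channel names, the spool directory, and the HDFS path (pieced together from hostnames and directories in the logs) are illustrative placeholders.

tier1.sources = src1
tier1.channels = ch1
tier1.sinks = sink1

tier1.sources.src1.type = spooldir
tier1.sources.src1.spoolDir = D:/flume/spool
tier1.sources.src1.channels = ch1

tier1.channels.ch1.type = memory
tier1.channels.ch1.capacity = 10000
tier1.channels.ch1.transactionCapacity = 1000

tier1.sinks.sink1.type = hdfs
tier1.sinks.sink1.channel = ch1
tier1.sinks.sink1.hdfs.path = hdfs://hostname01:8020/user/flume/biqas
tier1.sinks.sink1.hdfs.fileType = DataStream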

Problem 1:

Class not found: com/ctc/wstx/io/InputBootstrapper
2020-09-02 11:02:22,640 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:447)] process failed
java.lang.NoClassDefFoundError: com/ctc/wstx/io/InputBootstrapper
at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:224)
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:541)
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:401)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.ctc.wstx.io.InputBootstrapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more
Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.NoClassDefFoundError: com/ctc/wstx/io/InputBootstrapper
at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:224)
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:541)
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:401)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.ctc.wstx.io.InputBootstrapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 6 more

Solution:

Take the following jars from the Hadoop installation on the cluster and put them into Flume's lib directory:
woodstox-core-5.0.3.jar
hadoop-hdfs-3.0.0-cdh6.1.0.jar
commons-configuration2-2.1.jar
hadoop-common-3.0.0-cdh6.1.0.jar
commons-io-2.4.jar
hadoop-auth-3.0.0-cdh6.1.0.jar

Problem 2:

2020-09-02 14:34:01,915 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:447)] process failed
java.lang.NoClassDefFoundError: org/codehaus/stax2/XMLInputFactory2
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.hadoop.conf.Configuration.(Configuration.java:321)
at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:224)
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:541)
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:401)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.codehaus.stax2.XMLInputFactory2
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 19 more

Solution: add the following jar to Flume's lib directory as well:

stax2-api-3.1.4.jar

Problem 3:

2020-09-02 14:52:40,947 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:447)] process failed
java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3283)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3337)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3305)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:476)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:260)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more

Solution: add the following jar to Flume's lib directory:

htrace-core4-4.2.0-incubating.jar

Problem 4:

2020-09-02 14:55:10,550 (hdfs-k3-call-runner-0) [WARN - org.apache.hadoop.util.NativeCodeLoader.(NativeCodeLoader.java:60)] Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
2020-09-02 14:55:10,996 (SinkRunner-PollingRunner-DefaultSinkProcessor) [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:443)] HDFS IO error
org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "hdfs"
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3266)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3286)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:123)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3337)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3305)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:476)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:260)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Solution: for a Flume installation on Windows, all of the hadoop-hdfs* jars need to be copied into Flume's lib directory:

hadoop-hdfs-3.0.0-cdh6.1.0.jar
hadoop-hdfs-3.0.0-cdh6.1.0-tests.jar
hadoop-hdfs-client-3.0.0-cdh6.1.0.jar
hadoop-hdfs-client-3.0.0-cdh6.1.0-tests.jar
hadoop-hdfs-httpfs-3.0.0-cdh6.1.0.jar
hadoop-hdfs-native-client-3.0.0-cdh6.1.0.jar
hadoop-hdfs-native-client-3.0.0-cdh6.1.0-tests.jar
hadoop-hdfs-nfs-3.0.0-cdh6.1.0.jar

Problem 5:

2020-09-02 15:12:25,698 (SinkRunner-PollingRunner-DefaultSinkProcessor) [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:443)] HDFS IO error
org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:281)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1176)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1155)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1093)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:463)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:460)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:474)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:401)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1103)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1083)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:972)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:960)
at org.apache.flume.sink.hdfs.HDFSDataStream.doOpen(HDFSDataStream.java:81)
at org.apache.flume.sink.hdfs.HDFSDataStream.open(HDFSDataStream.java:108)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:262)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
at org.apache.hadoop.ipc.Client.call(Client.java:1445)
at org.apache.hadoop.ipc.Client.call(Client.java:1355)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy12.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:349)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy13.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276)
... 23 more

Solution: configure the Kerberos principal and keytab on the HDFS sink; a fuller sink sketch follows after these two lines:

tier1.sinks.sink1.hdfs.kerberosKeytab= D:/…/hdfs.keytab
tier1.sinks.sink1.hdfs.kerberosPrincipal= hdfs@AUTO.COM
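
Putting the pieces together, a fuller sketch of the Kerberos-enabled sink section might look like this (the keytab location, HDFS path, and roll settings are illustrative assumptions; only the two kerberos* properties above and the principal hdfs@AUTO.COM come from this setup):

tier1.sinks.sink1.type = hdfs
tier1.sinks.sink1.channel = ch1
tier1.sinks.sink1.hdfs.path = hdfs://hostname01:8020/user/flume/biqas
tier1.sinks.sink1.hdfs.fileType = DataStream
tier1.sinks.sink1.hdfs.rollInterval = 60
tier1.sinks.sink1.hdfs.rollCount = 0
tier1.sinks.sink1.hdfs.rollSize = 134217728
tier1.sinks.sink1.hdfs.kerberosPrincipal = hdfs@AUTO.COM
tier1.sinks.sink1.hdfs.kerberosKeytab = D:/tools/apache-flume-1.8.0-bin/conf/hdfs.keytab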

Problem 6:

java.lang.IllegalArgumentException: Can't get Kerberos realm
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:364)
at org.apache.flume.auth.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:143)
at org.apache.flume.auth.FlumeAuthenticationUtil.getAuthenticator(FlumeAuthenticationUtil.java:68)
at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:256)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:411)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:102)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:141)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
... 16 more
Caused by: KrbException: Cannot locate default realm
at sun.security.krb5.Config.getDefaultRealm(Config.java:1029)
... 22 more
Solution:
Configure krb5.conf. Since Flume here is deployed on the Windows side, add the krb5.conf path to flume-env.ps1 under the conf directory of the Flume installation:
$JAVA_OPTS="$JAVA_OPTS -Djava.security.krb5.conf=D:\tools\apache-flume-1.8.0-bin\conf\krb5.conf"
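
That path has to point at a valid krb5.conf. A minimal sketch is shown below, assuming the realm AUTO.COM from the principal used earlier; the KDC and admin-server hostnames are placeholders that must match the cluster's actual KDC (on a CDH cluster it is usually simpler to copy /etc/krb5.conf from any cluster node rather than write one by hand):

[libdefaults]
  default_realm = AUTO.COM
  dns_lookup_realm = false
  dns_lookup_kdc = false

[realms]
  AUTO.COM = {
    kdc = kdc-host.auto.com
    admin_server = kdc-host.auto.com
  }

[domain_realm]
  .auto.com = AUTO.COM
  auto.com = AUTO.COM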

Problem 7:
2020-09-02 19:18:55,867 (SinkRunner-PollingRunner-DefaultSinkProcessor) [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:443)] HDFS IO error
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException; Host Details : local host is: "LAPTOP-NOJK9QO5/169.254.232.127"; destination host is: "hostname01":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:808)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1503)
at org.apache.hadoop.ipc.Client.call(Client.java:1445)
at org.apache.hadoop.ipc.Client.call(Client.java:1355)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy12.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:349)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy13.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1176)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1155)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1093)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:463)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:460)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:474)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:401)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1103)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1083)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:972)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:960)
at org.apache.flume.sink.hdfs.HDFSDataStream.doOpen(HDFSDataStream.java:81)
at org.apache.flume.sink.hdfs.HDFSDataStream.open(HDFSDataStream.java:108)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:262)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.flume.auth.UGIExecutor.execute(UGIExecutor.java:46)
at org.apache.flume.auth.KerberosAuthenticator.execute(KerberosAuthenticator.java:64)
at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Couldn't set up IO streams: java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:861)
at org.apache.hadoop.ipc.Client$Connection.access$3600(Client.java:410)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1560)
at org.apache.hadoop.ipc.Client.call(Client.java:1391)
... 43 more
Caused by: java.lang.NoClassDefFoundError: com/google/re2j/PatternSyntaxException
at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:311)
at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:234)
at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:160)
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:410)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:799)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:795)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:795)
... 46 more
Caused by: java.lang.ClassNotFoundException: com.google.re2j.PatternSyntaxException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 58 more

Solution: add the missing dependency from the Hadoop jars:

re2j-1.0.jar

Problem 8: recorded for reference; the root cause has not been found yet. Uploading files locally on the cluster works without errors, and it is not a memory issue. My suspicion is that the Kerberos deployment on the cluster keeps the nodes from communicating with each other during authentication.

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/flume/biqas/FlumeData.1599096698303.tmp could only be written to 0 of the 1 minReplication nodes. There are 3 datanode(s) running and 3 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2095)
at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:294)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2671)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:872)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:550)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675)

    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499)
    at org.apache.hadoop.ipc.Client.call(Client.java:1445)
    at org.apache.hadoop.ipc.Client.call(Client.java:1355)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
    at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:497)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy13.addBlock(Unknown Source)
    at org.apache.hadoop.hdfs.DFSOutputStream.addBlock(DFSOutputStream.java:1083)
    at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1865)
    at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1668)
    at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)

2020-09-03 09:32:54,764 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.AbstractHDFSWriter.isUnderReplicated(AbstractHDFSWriter.java:99)] Unexpected error while checking replication factor
java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flume.sink.hdfs.AbstractHDFSWriter.getNumCurrentReplicas(AbstractHDFSWriter.java:166)
at org.apache.flume.sink.hdfs.AbstractHDFSWriter.isUnderReplicated(AbstractHDFSWriter.java:85)
at org.apache.flume.sink.hdfs.BucketWriter.shouldRotate(BucketWriter.java:610)
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:545)
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:401)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:67)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
at java.lang.Thread.run(Thread.java:748)

Windows ⇒ Flume ⇒ HDFS approach:

Shelved, because Problem 8 was never resolved.

Windows ⇒ Flume ⇒ Flume ⇒ HDFS approach:

Data was successfully transferred from the Windows side into the cluster's HDFS by forwarding it through a Flume-to-Flume port; a sketch of this two-agent layout is shown below.
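
A sketch of the two-agent layout, assuming the hop between the two agents is done over Avro (the text above only says the data was forwarded through a Flume port; the agent names, the port, and the bind address are illustrative):

# Windows-side agent: keep the spooldir source and channel, replace the HDFS sink with an Avro sink
tier1.sinks.sink1.type = avro
tier1.sinks.sink1.channel = ch1
tier1.sinks.sink1.hostname = hostname01
tier1.sinks.sink1.port = 4545

# Cluster-side agent: Avro source -> memory channel -> HDFS sink (Kerberos settings as in Problem 5)
agent2.sources = avroSrc
agent2.channels = ch2
agent2.sinks = hdfsSink

agent2.sources.avroSrc.type = avro
agent2.sources.avroSrc.bind = 0.0.0.0
agent2.sources.avroSrc.port = 4545
agent2.sources.avroSrc.channels = ch2

agent2.channels.ch2.type = memory

agent2.sinks.hdfsSink.type = hdfs
agent2.sinks.hdfsSink.channel = ch2
agent2.sinks.hdfsSink.hdfs.path = hdfs://hostname01:8020/user/flume/biqas
agent2.sinks.hdfsSink.hdfs.fileType = DataStream
agent2.sinks.hdfsSink.hdfs.kerberosPrincipal = hdfs@AUTO.COM
agent2.sinks.hdfsSink.hdfs.kerberosKeytab = /path/to/hdfs.keytab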
