Notes from a Flink-to-Elasticsearch 6 Version-Conflict Pitfall

Recently, while packaging and running a Flink job that inserts data into Elasticsearch, I ran into a series of version-conflict problems.

Flink's own Elasticsearch connector (the 6.x connector) is built against Elasticsearch 6.3.1, while our cluster runs 6.5.4.
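A first step in a situation like this is to see exactly which Elasticsearch client version the connector pulls into your build. A minimal Maven sketch — the artifact name and version below are inferred from the jar names in the stack traces later in this post (`flink-connector-elasticsearch6_2.12-1.13.0.jar`); adjust them to your own build:

```
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch6_2.12</artifactId>
    <version>1.13.0</version>
</dependency>
```

Then `mvn dependency:tree -Dincludes=org.elasticsearch` (a standard maven-dependency-plugin filter) lists every resolved Elasticsearch artifact and its version, which makes a 6.3.1-vs-6.5.4 mismatch visible before deploying.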

At runtime the job reported the following exception:

```
2021-06-25 15:46:05,827 WARN  org.apache.flink.runtime.taskmanager.Task [] - Window(TumblingEventTimeWindows(8639913600000), ContinuousEventTimeTrigger, AggregateFunction$...) -> Sink: Unnamed (1/3)#22 (86c2d30effb…) switched from RUNNING to FAILED.
java.lang.AbstractMethodError: Method org/apache/flink/streaming/connectors/elasticsearch6/Elasticsearch6ApiCallBridge.createBulkProcessorBuilder(Ljava/lang/AutoCloseable;Lorg/apache/flink/elasticsearch6/shaded/org/elasticsearch/action/bulk/BulkProcessor$Listener;)Lorg/apache/flink/elasticsearch6/shaded/org/elasticsearch/action/bulk/BulkProcessor$Builder; is abstract
    at org.apache.flink.streaming.connectors.elasticsearch6.Elasticsearch6ApiCallBridge.createBulkProcessorBuilder(Elasticsearch6ApiCallBridge.java)
    at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.buildBulkProcessor(ElasticsearchSinkBase.java:379)
    at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.open(ElasticsearchSinkBase.java:319)
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34)
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
    at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:46)
    at org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:437)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:574)
    at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.call(StreamTaskActionExecutor.java:55)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:554)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:756)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563)
    at java.lang.Thread.run(Thread.java:745)
```

After inspecting the dependency versions, I decided to download the 6.3.1 Elasticsearch jars myself.


After adding those jars, running the job throws the following exception:

```
java.lang.AbstractMethodError: Method org/apache/flink/streaming/connectors/elasticsearch6/Elasticsearch6ApiCallBridge.createBulkProcessorBuilder(Ljava/lang/AutoCloseable;Lorg/apache/flink/elasticsearch6/shaded/org/elasticsearch/action/bulk/BulkProcessor$Listener;)Lorg/apache/flink/elasticsearch6/shaded/org/elasticsearch/action/bulk/BulkProcessor$Builder; is abstract
    at org.apache.flink.streaming.connectors.elasticsearch6.Elasticsearch6ApiCallBridge.createBulkProcessorBuilder(Elasticsearch6ApiCallBridge.java) ~[flink-connector-elasticsearch6_2.12-1.13.0.jar:1.13.0]
    at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.buildBulkProcessor(ElasticsearchSinkBase.java:379) ~[flink-sql-connector-elasticsearch6_2.11-1.13.0.jar:1.13.0]
    at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.open(ElasticsearchSinkBase.java:319) ~[flink-sql-connector-elasticsearch6_2.11-1.13.0.jar:1.13.0]
    at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:34) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:46) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.runtime.tasks.OperatorChain.initializeStateAndOpenOperators(OperatorChain.java:437) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:574) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.call(StreamTaskActionExecutor.java:55) ~[realtime_vod_active.jar:?]
    at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:554) ~[realtime_vod_active.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:756) ~[realtime_vod_active.jar:?]
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563) ~[realtime_vod_active.jar:?]
    at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_121]
```

Look at the jar names in the trace: `createBulkProcessorBuilder` is resolved from `flink-connector-elasticsearch6_2.12-1.13.0.jar`, but `ElasticsearchSinkBase` is loaded from `flink-sql-connector-elasticsearch6_2.11-1.13.0.jar`. The job is incorrectly picking up classes from the Flink SQL connector, so classes from two different connector jars are being mixed.
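Duplicate copies of the same classes on the classpath are exactly what Flink's classloading options govern, so they are worth knowing about before shuffling jars around. A flink-conf.yaml fragment as a sketch — both option names come from Flink's standard configuration reference, and the pattern value shown is purely illustrative, not the fix we actually applied:

```
# child-first (the default) lets classes in the job jar shadow those in lib/;
# specific packages can be forced to resolve from lib/ first instead:
classloader.resolve-order: child-first
classloader.parent-first-patterns.additional: org.apache.flink.streaming.connectors.elasticsearch
```

In our case, removing the conflicting jar was the simpler and more reliable fix.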

Removing the flink-sql-connector-elasticsearch6 jar from the cluster's lib directory resolves this conflict.
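When two jars both provide the same classes, it is often unclear which copy actually wins. A small diagnostic sketch that prints the jar a class was loaded from — `WhichJar` is a made-up helper, not part of Flink, and the Flink class name in the comment is just an example of what you would pass on the cluster:

```java
// Prints which jar (or directory) a class was loaded from, to spot classpath clashes.
public class WhichJar {

    public static String locate(Class<?> cls) {
        java.security.ProtectionDomain pd = cls.getProtectionDomain();
        if (pd == null || pd.getCodeSource() == null) {
            // JDK core classes report no code source
            return "(bootstrap/platform class loader)";
        }
        return pd.getCodeSource().getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On the cluster you would pass a suspect class, e.g.
        //   org.apache.flink.streaming.connectors.elasticsearch6.Elasticsearch6ApiCallBridge
        String name = args.length > 0 ? args[0] : WhichJar.class.getName();
        System.out.println(name + " -> " + locate(Class.forName(name)));
    }
}
```

Running this with the user classloader active on each suspect class makes it obvious whether the DataStream connector or the SQL connector jar is serving a given class.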

Running the job again produced a new error:

```
java.lang.NoClassDefFoundError: org/apache/flink/streaming/connectors/elasticsearch/ActionRequestFailureHandler
    at com.guttv.test.AgDayHyperLogLogUv.main(AgDayHyperLogLogUv.java:198)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
    at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
    at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
    at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.elasticsearch.ActionRequestFailureHandler
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 17 more
```

This shows that a class from the flink-elasticsearch connector can no longer be found at all, which makes sense: we just removed the jar that provided it. The fix is to rebuild the project so the ES connector dependency is bundled into the job jar itself, keep the old connector jar out of the lib directory, and resubmit the job.
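The "bundle the connector into the job jar" step can be sketched with the standard maven-shade-plugin. The plugin coordinates and the `shade` goal are the plugin's usual minimal setup; the version shown is illustrative:

```
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals><goal>shade</goal></goals>
        </execution>
    </executions>
</plugin>
```

Mark the core Flink dependencies (e.g. `flink-streaming-java`) as `provided` so the cluster's own copies are used, and leave the connector at the default `compile` scope so only it ends up inside the fat jar.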

You can use the Elasticsearch Sink provided by Flink to write data into Elasticsearch. The steps are as follows (the code below is a cleaned-up version of a Flink 1.12 / Elasticsearch 7 example; the original's incomplete version number and port 920 have been corrected to 1.12.0 and the default ES HTTP port 9200, and the unused type/cluster parameters dropped since ES 7 has no mapping types):

1. Add the Elasticsearch Sink dependency:

```
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-elasticsearch7_2.12</artifactId>
    <version>1.12.0</version>
</dependency>
```

2. Create the Elasticsearch Sink:

```
import org.apache.flink.api.common.functions.RuntimeContext;
import org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkFunction;
import org.apache.flink.streaming.connectors.elasticsearch.RequestIndexer;
import org.apache.flink.streaming.connectors.elasticsearch7.ElasticsearchSink;
import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.Requests;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class MyElasticsearchSink {

    public static ElasticsearchSink<Map<String, Object>> createSink(String indexName, String[] hosts) {
        List<HttpHost> httpHosts = new ArrayList<>();
        for (String host : hosts) {
            httpHosts.add(new HttpHost(host, 9200, "http")); // default ES HTTP port
        }

        ElasticsearchSink.Builder<Map<String, Object>> builder = new ElasticsearchSink.Builder<>(
                httpHosts,
                new ElasticsearchSinkFunction<Map<String, Object>>() {
                    @Override
                    public void process(Map<String, Object> element, RuntimeContext ctx, RequestIndexer indexer) {
                        indexer.add(createIndexRequest(element));
                    }

                    private IndexRequest createIndexRequest(Map<String, Object> element) {
                        // ES 7 has no mapping types, so there is no .type(...) call
                        return Requests.indexRequest()
                                .index(indexName)
                                .source(element);
                    }
                });

        builder.setBulkFlushMaxActions(100);
        builder.setBulkFlushInterval(100);
        builder.setRestClientFactory(restClientBuilder ->
                restClientBuilder.setHttpClientConfigCallback(httpClientBuilder ->
                        httpClientBuilder.setMaxConnTotal(200).setMaxConnPerRoute(100)));
        return builder.build();
    }
}
```

3. Use the sink to write a stream into Elasticsearch:

```
DataStream<Map<String, Object>> dataStream = ...;
dataStream.addSink(MyElasticsearchSink.createSink("my_index", new String[]{"localhost"}));
```
