Problems encountered syncing data from MySQL to Kafka to Elasticsearch with Flink CDC

2021-01-12 01:12:15,551 INFO  org.apache.kafka.clients.consumer.internals.SubscriptionState [] - [Consumer clientId=consumer-10, groupId=null] Resetting offset for partition mysql2es_t_bank_question-1 to offset 0.
2021-01-12 01:12:15,924 ERROR org.apache.flink.streaming.connectors.elasticsearch.util.NoOpFailureHandler [] - Failed Elasticsearch item request: Elasticsearch exception [type=version_conflict_engine_exception, reason=[article][appeoDKPQVu1459-qs_5ffbe1d4581a9_R2uAHCvJ]: version conflict, required seqNo [1797790], primary term [1]. current document has seqNo [1797850] and primary term [1]]
org.elasticsearch.ElasticsearchException: Elasticsearch exception [type=version_conflict_engine_exception, reason=[article][appeoDKPQVu1459-qs_5ffbe1d4581a9_R2uAHCvJ]: version conflict, required seqNo [1797790], primary term [1]. current document has seqNo [1797850] and primary term [1]]
	at org.elasticsearch.ElasticsearchException.innerFromXContent(ElasticsearchException.java:510) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.ElasticsearchException.fromXContent(ElasticsearchException.java:421) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkItemResponse.fromXContent(BulkItemResponse.java:135) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkResponse.fromXContent(BulkResponse.java:198) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:653) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.lambda$performRequestAsyncAndParseEntity$3(RestHighLevelClient.java:549) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient$1.onSuccess(RestHighLevelClient.java:580) [blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onSuccess(RestClient.java:621) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:375) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:366) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:119) [blob_p-e3fd8ced1f52c7574af952e2e6da0df8df08eb82-42012784b37d9afbb4aa86747f0bee29:4.4.6]
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:177) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:436) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:326) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]
2021-01-12 01:12:15,930 ERROR org.apache.flink.streaming.runtime.tasks.StreamTask          [] - Error during disposal of stream operator.
java.lang.RuntimeException: An error occurred in ElasticsearchSink.
	at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.checkErrorAndRethrow(ElasticsearchSinkBase.java:380) ~[blob_p-92dd38faf9d380ca7874d869811112ae8a909875-3d17eda2919dfe5743b86aeefa0b3479:1.12.0]
	at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSinkBase.close(ElasticsearchSinkBase.java:342) ~[blob_p-92dd38faf9d380ca7874d869811112ae8a909875-3d17eda2919dfe5743b86aeefa0b3479:1.12.0]
	at org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:117) ~[flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:729) [flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.streaming.runtime.tasks.StreamTask.cleanUpInvoke(StreamTask.java:645) [flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:549) [flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:721) [flink-dist_2.11-1.11.2.jar:1.11.2]
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546) [flink-dist_2.11-1.11.2.jar:1.11.2]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]
Caused by: org.elasticsearch.ElasticsearchException: Elasticsearch exception [type=version_conflict_engine_exception, reason=[article][appeoDKPQVu1459-qs_5ffbe1d4581a9_R2uAHCvJ]: version conflict, required seqNo [1797790], primary term [1]. current document has seqNo [1797850] and primary term [1]]
	at org.elasticsearch.ElasticsearchException.innerFromXContent(ElasticsearchException.java:510) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.ElasticsearchException.fromXContent(ElasticsearchException.java:421) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkItemResponse.fromXContent(BulkItemResponse.java:135) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkResponse.fromXContent(BulkResponse.java:198) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:653) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.lambda$performRequestAsyncAndParseEntity$3(RestHighLevelClient.java:549) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient$1.onSuccess(RestHighLevelClient.java:580) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onSuccess(RestClient.java:621) ~[blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:375) ~[blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:366) ~[blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:119) ~[blob_p-e3fd8ced1f52c7574af952e2e6da0df8df08eb82-42012784b37d9afbb4aa86747f0bee29:4.4.6]
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:177) ~[blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:436) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:326) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81) ~[blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39) ~[blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588) ~[blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	... 1 more
2021-01-12 01:12:15,931 ERROR org.apache.flink.streaming.connectors.elasticsearch.util.NoOpFailureHandler [] - Failed Elasticsearch item request: Elasticsearch exception [type=version_conflict_engine_exception, reason=[article][appCqCo13HI5200-qs_5f9f9500c4912_cy7bT0cH0]: version conflict, required seqNo [1797803], primary term [1]. current document has seqNo [1797805] and primary term [1]]
org.elasticsearch.ElasticsearchException: Elasticsearch exception [type=version_conflict_engine_exception, reason=[article][appCqCo13HI5200-qs_5f9f9500c4912_cy7bT0cH0]: version conflict, required seqNo [1797803], primary term [1]. current document has seqNo [1797805] and primary term [1]]
	at org.elasticsearch.ElasticsearchException.innerFromXContent(ElasticsearchException.java:510) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.ElasticsearchException.fromXContent(ElasticsearchException.java:421) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkItemResponse.fromXContent(BulkItemResponse.java:135) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.action.bulk.BulkResponse.fromXContent(BulkResponse.java:198) ~[blob_p-21f0d8527b99290da28a61b1488a989434dbefef-99cc4e80db37d6b4bc1f923477e436d0:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:653) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient.lambda$performRequestAsyncAndParseEntity$3(RestHighLevelClient.java:549) ~[blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestHighLevelClient$1.onSuccess(RestHighLevelClient.java:580) [blob_p-bc54ce7e13c1299ef100ba41e2b2254e5609270e-f1a544d31c47c4e06bbe9e8ca8a6f6d2:6.3.1]
	at org.elasticsearch.client.RestClient$FailureTrackingResponseListener.onSuccess(RestClient.java:621) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:375) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.elasticsearch.client.RestClient$1.completed(RestClient.java:366) [blob_p-99de036a2cd99dbecec1cc84f5d0e19032e74fa7-84c9ba8cf52dbddab768066dcef86f8d:6.3.1]
	at org.apache.http.concurrent.BasicFuture.completed(BasicFuture.java:119) [blob_p-e3fd8ced1f52c7574af952e2e6da0df8df08eb82-42012784b37d9afbb4aa86747f0bee29:4.4.6]
	at org.apache.http.impl.nio.client.DefaultClientExchangeHandlerImpl.responseCompleted(DefaultClientExchangeHandlerImpl.java:177) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.processResponse(HttpAsyncRequestExecutor.java:436) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.nio.protocol.HttpAsyncRequestExecutor.inputReady(HttpAsyncRequestExecutor.java:326) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.DefaultNHttpClientConnection.consumeInput(DefaultNHttpClientConnection.java:265) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:81) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.client.InternalIODispatch.onInputReady(InternalIODispatch.java:39) [blob_p-95aa3e6fb520191a0970a73cf09f62948ee614be-76eeb883350993edca8cd9a6eb494d0b:4.1.2]
	at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:114) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:162) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:337) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:315) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:276) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:104) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:588) [blob_p-f4be009e7505f6ceddf21e7960c759f413f15056-d173e56a892e118bf8d2e2d49d6d8f3a:4.4.5]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]
2021-01-12 01:12:15,932 WARN  org.apache.flink.runtime.taskmanager.Task                    [] - Source: TableSourceScan(table=[[hive, rtime_db_ods, ods_t_bank_question]], fields=[app_id, id, summary, title, audio_urls, question_type, score, correct_answer, analysis, disorder_match, state, question_library_id, question_from, weight, is_material_question, created_at, updated_at]) -> Sink: Sink(table=[hive.rtime_db_ads.ads_t_bank_question], fields=[app_id, id, summary, title, audio_urls, question_type, score, correct_answer, analysis, disorder_match, state, question_library_id, question_from, weight, is_material_question, created_at, updated_at]) (3/3) (1e8fc64c2e908821f95d8938c8ecf5fc) switched from RUNNING to FAILED.

The Kafka topic has 3 partitions, and the job loading Kafka into Elasticsearch was given a parallelism of 3. The ES table declares a primary key, so rows are written to ES as updates (upsert mode). Shortly after the job started, it failed with the error above.
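For reference, an Elasticsearch sink table of this shape can be declared roughly as below. This is a minimal sketch with illustrative column names and host address, not the job's actual DDL; declaring a PRIMARY KEY is what puts the connector into upsert mode, with the key serving as the document id:

```
CREATE TABLE ads_t_bank_question (
  app_id STRING,
  id STRING,
  title STRING,
  PRIMARY KEY (app_id, id) NOT ENFORCED  -- triggers upsert-mode writes
) WITH (
  'connector' = 'elasticsearch-6',
  'hosts' = 'http://localhost:9200',
  'index' = 'article',
  'document-type' = '_doc'
);
```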

After computing with Flink SQL CDC, sinking to Elasticsearch with multiple parallel sink tasks is problematic: updates for the same document id can be processed concurrently by different subtasks, and Elasticsearch rejects the stale write with a version_conflict_engine_exception. Setting the parallelism to 1 resolved the exception; for now I have not found another fix.
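A minimal sketch of the workaround in the SQL client (the exact SET syntax varies between Flink versions; `parallelism.default` is the standard configuration key):

```
-- Force the whole job, and therefore the ES sink, onto a single subtask:
SET parallelism.default = 1;

-- Later Flink versions also accept a per-sink option in the sink table's
-- WITH clause, which avoids throttling the upstream operators
-- (check whether your connector version supports it):
--   'sink.parallelism' = '1'
```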

The general flow for syncing MySQL to Kafka with Flink CDC:

1. Configure the MySQL source: connect to MySQL with the MySQL CDC connector and set the connection parameters (hostname, database, username, password, etc.). Note that the plain JDBC connector only reads a one-off snapshot; capturing ongoing changes requires the `mysql-cdc` connector from the flink-cdc-connectors project.
2. Configure the Kafka sink: connect to Kafka with the Kafka connector and set the topic, broker addresses, and related parameters.
3. Create the Flink CDC job with Flink SQL or the Table API, wiring the source to the sink.
4. Launch the job with the Flink command-line tool or Web UI to start syncing MySQL data into Kafka.

Concretely:

1. Download and install Flink from the official site.

2. Define the MySQL source table:

```
CREATE TABLE my_table (
  id INT,
  name STRING,
  age INT,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'root',
  'password' = 'password',
  'database-name' = 'test',
  'table-name' = 'my_table'
);
```

3. Define the Kafka sink table. Because a CDC source emits a changelog (inserts, updates, and deletes), the sink must be able to accept one, e.g. `upsert-kafka`:

```
CREATE TABLE my_topic (
  id INT,
  name STRING,
  age INT,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'my_topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);
```

4. Wire source to sink:

```
INSERT INTO my_topic SELECT * FROM my_table;
```

5. Launch the job, either from the SQL client or, for a packaged Table API job, with the CLI:

```
./bin/flink run -c com.example.MyCDCJob /path/to/my/cdc/job.jar
```

With these steps the MySQL-to-Kafka sync is in place. Parameters such as job parallelism and buffer sizes can be adjusted to fit actual requirements.
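To sanity-check that change events are actually arriving in Kafka, the stock console consumer can tail the topic. This assumes a local broker and the `my_topic` name used in the examples above:

```
# Read the topic from the beginning and print each change event:
./bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic my_topic \
  --from-beginning
```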
