1.1 Introduction: "Putting the Pieces Together" explained from the official documentation (blogger recommended)

Without further ado, straight to the good stuff!

Everything here comes from the official documentation:

http://kafka.apache.org/documentation/

Putting the Pieces Together

  This combination of messaging, storage, and stream processing may seem unusual but it is essential to Kafka's role as a streaming platform.

  A distributed file system like HDFS allows storing static files for batch processing. Effectively a system like this allows storing and processing historical data from the past.

  A traditional enterprise messaging system allows processing future messages that will arrive after you subscribe. Applications built in this way process future data as it arrives.
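To make this "subscribe and process only future data" pattern concrete, here is a minimal sketch using the standard Kafka consumer API; the broker address, group id, and topic name are placeholder assumptions. With auto.offset.reset=latest, a brand-new consumer group only sees records published after it subscribes, which is the traditional-messaging behavior described above.

// Minimal sketch: process only data that arrives after subscribing.
// Broker address, group id, and topic name are illustrative assumptions.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class FutureOnlyConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker
        props.put("group.id", "future-only-demo");           // assumed group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "latest");             // a new group starts at the end of the log

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events"));   // assumed topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}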

 

 

  Kafka combines both of these capabilities, and the combination is critical both for Kafka usage as a platform for streaming applications as well as for streaming data pipelines.

  By combining storage and low-latency subscriptions, streaming applications can treat both past and future data the same way. That is, a single application can process historical, stored data, but rather than ending when it reaches the last record, it can keep processing as future data arrives. This is a generalized notion of stream processing that subsumes batch processing as well as message-driven applications.
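As a rough illustration of treating past and future data the same way, the sketch below (assuming a topic named "events" with a single partition 0) assigns the partition directly, seeks to the beginning of the log, replays everything that is stored, and then simply keeps polling. There is no "end of history" branch: once the last stored record has been read, the next poll() just returns whatever has been appended since.

// Minimal sketch: replay stored history, then continue with live data in the same loop.
// Topic name, partition number, group id, and broker address are illustrative assumptions.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class PastAndFutureConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker
        props.put("group.id", "replay-demo");                 // assumed group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition tp = new TopicPartition("events", 0);       // assumed single partition
            consumer.assign(Collections.singletonList(tp));
            consumer.seekToBeginning(Collections.singletonList(tp));   // start with historical data
            while (true) {
                // The same loop first replays old records, then keeps returning new ones as they arrive.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}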

 

 

 

  Likewise for streaming data pipelines, the combination of subscription to real-time events makes it possible to use Kafka for very low-latency pipelines; but the ability to store data reliably makes it possible to use it for critical data where the delivery of data must be guaranteed, or for integration with offline systems that load data only periodically or may go down for extended periods of time for maintenance. The stream processing facilities make it possible to transform data as it arrives.
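The "stream processing facilities" mentioned here are exposed through Kafka's Streams API. The sketch below shows the transform-as-it-arrives idea: read from one topic, transform each value, and write the result to another topic. The topic names, application id, and broker address are illustrative assumptions.

// Minimal Kafka Streams sketch: transform records as they arrive.
// Topic names, application id, and broker address are illustrative assumptions.
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class TransformPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "transform-pipeline-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("raw-events");   // assumed input topic
        // Each record is transformed as it flows through; the same code handles data already
        // stored in the input topic and data that arrives later.
        raw.mapValues(value -> value.trim().toUpperCase())
           .to("clean-events");                                       // assumed output topic

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));   // clean shutdown
    }
}

Because the input topic is durably stored, the same topology can also be started later against data written while it was offline, which matches the pipeline scenario described above.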

 

 

  For more information on the guarantees, APIs, and capabilities Kafka provides, see the rest of the documentation.

 
