Big Data Processing Flow
MapReduce: input -> map/reduce -> output
Storm: input -> Spout/Bolt -> output
Spark: input -> transformation/action -> output
Flink: input -> transformation/sink -> output
DataSet and DataStream
Both are immutable collections of data: once created, elements cannot be added or removed; transformations always produce new DataSets/DataStreams.
Batch processing: DataSet
Stream processing: DataStream
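A minimal sketch of how the two are obtained (Java API; the element values and class name are illustrative):

```java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DataSetVsDataStream {
    public static void main(String[] args) throws Exception {
        // Batch: a DataSet is created from an ExecutionEnvironment
        ExecutionEnvironment batchEnv = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<String> batch = batchEnv.fromElements("a", "b", "c");
        batch.print(); // DataSet#print() triggers execution by itself

        // Streaming: a DataStream is created from a StreamExecutionEnvironment
        StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> stream = streamEnv.fromElements("a", "b", "c");
        stream.print();
        streamEnv.execute("DataStream example"); // streaming jobs need an explicit execute()
    }
}
```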
Anatomy of a Flink Program
- Obtain an execution environment,
- Load/create the initial data,
- Specify transformations on this data,
- Specify where to put the results of your computations,
- Trigger the program execution
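Putting the five steps together, a minimal word-count style sketch with the DataStream Java API might look like this (class name, sample sentences, and job name are illustrative):

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class AnatomyExample {
    public static void main(String[] args) throws Exception {
        // 1. Obtain an execution environment
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // 2. Load/create the initial data
        DataStream<String> lines = env.fromElements("to be", "or not to be");

        // 3. Specify transformations on this data
        DataStream<Tuple2<String, Integer>> counts = lines
            .flatMap(new FlatMapFunction<String, Tuple2<String, Integer>>() {
                @Override
                public void flatMap(String line, Collector<Tuple2<String, Integer>> out) {
                    for (String word : line.split(" ")) {
                        out.collect(new Tuple2<>(word, 1));
                    }
                }
            })
            .keyBy(value -> value.f0)
            .sum(1);

        // 4. Specify where to put the results of the computation
        counts.print();

        // 5. Trigger the program execution
        env.execute("Anatomy of a Flink Program");
    }
}
```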
Lazy Evaluation
All Flink programs are executed lazily: when the program's main method is executed, the data loading and transformations do not happen directly. Rather, each operation is created and added to the program's plan. The operations are actually executed when the execution is explicitly triggered by an execute() call on the execution environment. Whether the program is executed locally or on a cluster depends on the type of execution environment.
The lazy evaluation lets you construct sophisticated programs that Flink executes as one holistically planned unit.
Simply put, lazy evaluation fits a pipelined style of execution: because the whole plan is known before anything runs, Flink can optimize the intermediate steps, which yields a large overall performance gain.
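A small sketch of this behavior (class and job names are illustrative, assuming the DataStream Java API): the map and print calls below only add operators to the plan; nothing runs until execute() is called.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class LazyEvaluationExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // These calls only build the program's plan; no data flows yet.
        DataStream<Integer> doubled = env.fromElements(1, 2, 3)
            .map(i -> i * 2);
        doubled.print();

        System.out.println("Plan built, nothing executed yet.");

        // Only now is the whole plan optimized and executed as one unit.
        env.execute("Lazy Evaluation Example");
    }
}
```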