Window Operations on Event Time

Aggregation over a sliding event-time window (Spark 3.0.0).
Complete demo:

package structured_streaming

import java.sql.Timestamp
import java.util.Date

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.window

object WordCount2 {
  def main(args: Array[String]): Unit = {
    System.setProperty("HADOOP_USER_NAME", "hdfs")
    val spark = SparkSession.builder()
      .appName("WordCount")
      .master("local")
      .getOrCreate()

    // Read lines from a socket source (feed input with, for example, nc -lk 9998)
    val socket = spark.readStream.format("socket")
      .option("host", "127.0.0.1")
      .option("port", "9998")
      .load()
    socket.isStreaming // true: this DataFrame is backed by a streaming source

    import spark.implicits._

    // Attach the current time as the event timestamp and split each line on spaces
    val words = socket.map(row => (new Timestamp(new Date().getTime), row.getString(0).split(" ")))
    val df = words.toDF("timestamp", "word")

    // Count words per 10-minute window that slides every 5 minutes
    val windowedCounts = df.groupBy(
      window($"timestamp", "10 minutes", "5 minutes"),
      $"word"
    ).count()

    // Print the complete aggregation result to the console after each trigger
    val query = windowedCounts.writeStream
      .outputMode("complete")
      .format("console")
      .start()
    query.awaitTermination()
  }
}
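Note that the mapping step above timestamps each incoming line and keeps the Array[String] produced by split(" ") as the word column, which is why the result rows below show values like [r] and [aa]. A minimal variant (a sketch, not part of the original demo) flattens each line into individual words, so that word is a plain string and each word is counted separately:

// Sketch: flatten each line into (timestamp, word) pairs so "word" is a plain String
val words = socket.as[String].flatMap { line =>
  val ts = new Timestamp(new Date().getTime)
  line.split(" ").map(word => (ts, word))
}
val df = words.toDF("timestamp", "word")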

Output:

+--------------------+----+-----+
|              window|word|count|
+--------------------+----+-----+
|[2020-08-17 16:55...| [r]|    1|
|[2020-08-17 16:50...| [e]|    2|
|[2020-08-17 16:50...|[aa]|    1|
|[2020-08-17 16:50...|[55]|    1|
|[2020-08-17 17:00...| [r]|    1|
|[2020-08-17 16:45...|[aa]|    1|
|[2020-08-17 16:45...|[55]|    1|
|[2020-08-17 16:55...| [e]|    2|
+--------------------+----+-----+
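Because the window is 10 minutes long and slides every 5 minutes, the windows overlap and every event falls into two of them, which is why each word above is counted in two window rows (for example, [aa] appears in both the 16:45 and 16:50 windows). If the truncated window column is hard to read, one option (a sketch, not from the original demo) is to project the window bounds into separate columns before writing to the console:

// Sketch: expose window.start / window.end as plain columns for readability
val readable = windowedCounts.select(
  $"window.start".as("window_start"),
  $"window.end".as("window_end"),
  $"word",
  $"count"
)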
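In complete output mode the engine keeps every window it has ever seen, so state grows without bound. For real event-time data, the usual companion to windowing is a watermark, which tells the engine how late data may arrive before old window state can be dropped. A minimal sketch, assuming the same df as in the demo (not part of the original post):

// Sketch: bound state with a 10-minute watermark on the event-time column
val watermarkedCounts = df
  .withWatermark("timestamp", "10 minutes")
  .groupBy(
    window($"timestamp", "10 minutes", "5 minutes"),
    $"word"
  ).count()

// Watermarking only drops state in update/append output modes, not in complete mode
val lateDataQuery = watermarkedCounts.writeStream
  .outputMode("update")
  .format("console")
  .start()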