Problems encountered when testing Structured Streaming with Kafka locally in IDEA



The code is as follows:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

def main(args: Array[String]): Unit = {

  val spark = SparkSession
    .builder()
    .master("local[*]")
    .appName("demoPro")
    //.config("spark.debug.maxToStringFields", "200")
    .getOrCreate()

  import spark.implicits._

  // Read the topic from the earliest offset.
  val lines: DataFrame = spark
    .readStream.format("kafka")
    .option("kafka.bootstrap.servers", "10.*.*.1:9092,10.*.*.2:9092,10.*.*.3:9092")
    .option("startingOffsets", "earliest")
    .option("subscribe", "zyftest")
    .load()

  // Keep only insert/update/delete change events and print them to the console.
  val query = lines
    .selectExpr(
      "CAST(topic AS STRING) as topic",
      "CAST(offset AS STRING) as offset",
      "CAST(value AS STRING) as value")
    .filter($"value".contains("\"op\":\"ins\"") ||
            $"value".contains("\"op\":\"upd\"") ||
            $"value".contains("\"op\":\"del\""))
    .as[(String, String, String)]
    .writeStream
    .outputMode("append")
    .format("console")
    .start()

  query.awaitTermination()
}
```
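The filter above keeps only CDC-style insert/update/delete events. The predicate logic can be sanity-checked in plain Scala, outside Spark; the sample JSON messages below are made up for illustration:

```scala
// The three change-event markers the streaming query filters on.
val ops = Seq("\"op\":\"ins\"", "\"op\":\"upd\"", "\"op\":\"del\"")

// True if a Kafka message value contains any marker, mirroring the
// contains(...) || contains(...) filter in the query.
def isChangeEvent(value: String): Boolean = ops.exists(value.contains)

// Hypothetical sample messages:
println(isChangeEvent("""{"op":"ins","table":"t1"}""")) // true
println(isChangeEvent("""{"op":"sel","table":"t1"}""")) // false
```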

A simple job that just pulls data from Kafka, yet it kept failing with:

java.lang.NoClassDefFoundError:
org/apache/spark/sql/sources/v2/reader/SupportsScanUnsafeRow

A fix found online, removing the `<scope>provided</scope>` tags from pom.xml, did not work either. In the end the problem really was in the pom.xml dependencies. My working pom.xml entries are below; every one of them is required.


```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
```
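A `NoClassDefFoundError` like this usually means the resolved jars disagree: `SupportsScanUnsafeRow` belongs to an older DataSourceV2 API, so a `spark-sql-kafka` jar built against a different Spark version than `spark-sql` ends up referencing classes that are no longer on the classpath; aligning every artifact to the same version (2.4.0 above) resolves it. A quick way to probe whether a given class is actually available at runtime (the class names below are only examples):

```scala
// Returns true if the fully-qualified class name can be loaded
// from the current classpath.
def onClasspath(fqcn: String): Boolean =
  try { Class.forName(fqcn); true }
  catch { case _: ClassNotFoundException => false }

// scala.Option is always available; the second name is deliberately bogus.
println(onClasspath("scala.Option"))            // true
println(onClasspath("com.example.NoSuchClass")) // false
```

Running this with the class name from the stack trace confirms whether the jar that should provide it was actually pulled in.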

The solution was inspired by:

> https://www.it1352.com/1935721.html
