Kafka Receiver error: java.lang.IllegalArgumentException: requirement failed: No output operations registered

20/04/15 21:46:25 WARN KafkaUtils: overriding receive.buffer.bytes to 65536 see KAFKA-3135
org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@108531c2
20/04/15 21:46:25 ERROR StreamingContext: Error starting the context, marking it as stopped
java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
	at scala.Predef$.require(Predef.scala:224)
	at org.apache.spark.streaming.DStreamGraph.validate(DStreamGraph.scala:168)
	at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:513)
	at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:573)
	at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:572)
	at com.kmai.demo02.rua04$.main(rua04.scala:42)
	at com.kmai.demo02.rua04.main(rua04.scala)
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: No output operations registered, so nothing to execute
	at scala.Predef$.require(Predef.scala:224)
	at org.apache.spark.streaming.DStreamGraph.validate(DStreamGraph.scala:168)
	at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:513)
	at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:573)
	at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:572)
	at com.kmai.demo02.rua04$.main(rua04.scala:42)
	at com.kmai.demo02.rua04.main(rua04.scala)
20/04/15 21:46:25 INFO SparkContext: Invoking stop() from shutdown hook
20/04/15 21:46:25 INFO SparkUI: Stopped Spark web UI at http://192.168.100.6:4040
20/04/15 21:46:25 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/15 21:46:25 INFO MemoryStore: MemoryStore cleared
20/04/15 21:46:25 INFO BlockManager: BlockManager stopped
20/04/15 21:46:25 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/15 21:46:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/15 21:46:25 INFO SparkContext: Successfully stopped SparkContext
20/04/15 21:46:25 INFO ShutdownHookManager: Shutdown hook called
20/04/15 21:46:25 INFO ShutdownHookManager: Deleting directory C:\Users\kami\AppData\Local\Temp\spark-15c105c7-9e3b-4098-bfed-381c8b1efb7e

Cause: the code registers no Output Operation on the DStream, or it uses the updateStateByKey operator without setting a checkpoint directory. Either way, no action is triggered on the DStream, so StreamingContext.start() fails with this error.
Fix: register at least one of the following output operations on the DStream:

print()
foreachRDD()
saveAsObjectFiles()
saveAsTextFiles()
saveAsHadoopFiles()
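A minimal sketch of the fix, assuming the spark-streaming-kafka-0-10 integration; the broker address, topic name, group id, and checkpoint path below are placeholders, not values from the original job. The key point is the `print()` call, which registers an output operation so `ssc.start()` no longer throws:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object Rua04Fixed {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("rua04").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))
    // Required when using stateful operators such as updateStateByKey
    ssc.checkpoint("./checkpoint")   // placeholder path

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",   // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "demo02",                    // placeholder group id
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent,
      Subscribe[String, String](Seq("test"), kafkaParams))  // placeholder topic

    // Without at least one output operation, ssc.start() throws
    // "No output operations registered, so nothing to execute".
    stream.map(_.value()).print()   // print() registers an output operation

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Any of the other operations listed above (foreachRDD, saveAsTextFiles, ...) would serve the same purpose; what matters is that at least one output operation exists in the DStream graph before start() is called.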