Spark Streaming + Kafka: "SparkException: java.nio.channels.ClosedChannelException"

When running a Spark Streaming + Kafka job, the driver can fail at startup with a java.nio ClosedChannelException.

This happens when the port in the broker-list is wrong, or when the Kafka broker is not actually running. The error looks like this:

Exception in thread "main" org.apache.spark.SparkException: java.nio.channels.ClosedChannelException
	at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
	at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
	at scala.util.Either.fold(Either.scala:97)
	at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
	at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
	at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
	at com.xxx.spark.main.xxx.createStream(xxx.scala:223)
	at com.xxx.spark.main.xxx.createStreamingContext(xxx.scala:72)
	at com.xxx.spark.main.xxx$$anonfun$getOrCreateStreamingContext$1.apply(xxx.scala:47)
	at com.xxx.spark.main.xxx$$anonfun$getOrCreateStreamingContext$1.apply(xxx.scala:47)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
	at com.xxx.spark.main.xxx.getOrCreateStreamingContext(xxx.scala:47)
	at com.xxx.spark.main.xxx$.main(xxx.scala:34)
	at com.xxx.spark.main.xxx.main(xxx.scala)

Fix: verify that the Kafka broker's host and port are correct and that the service is listening on that port. If a stale Kafka process is already running, kill it (kill <pid>) and restart the broker.
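Before digging into the Spark side, it helps to confirm that the broker address in your broker-list is actually reachable over TCP. A minimal sketch (the host and port 9092 are assumptions; substitute the values from your own broker-list):

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures alike.
        return False

if __name__ == "__main__":
    # "localhost" and 9092 (Kafka's default port) are assumptions;
    # use the exact host:port pairs from your broker-list setting.
    print(broker_reachable("localhost", 9092))
```

If this returns False for any broker in the list, the ClosedChannelException is expected: fix the port or start the broker before retrying the Spark job.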

For a one-command Kafka start/stop script, see: https://blog.csdn.net/qq_43412289/article/details/100633902
