RejectedExecutionException


Open question: when running a Spark job from SecureCRT, a RejectedExecutionException sometimes appears, yet the job still completes successfully. The cause is unclear.
Exception log:

17/05/27 10:09:04 INFO YarnClientSchedulerBackend: Stopped
17/05/27 10:09:04 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/05/27 10:09:04 INFO BlockManagerInfo: Removed broadcast_10_piece0 on 192.168.16.100:43093 in memory (size: 641.8 KB, free: 2.5 GB)
17/05/27 10:09:04 ERROR LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerBlockUpdated(BlockUpdatedInfo(BlockManagerId(driver, 192.168.16.100, 43093),broadcast_10_piece0,StorageLevel(1 replicas),0,0))
17/05/27 10:09:04 INFO MemoryStore: MemoryStore cleared
17/05/27 10:09:04 INFO BlockManager: BlockManager stopped
17/05/27 10:09:04 INFO BlockManagerMaster: BlockManagerMaster stopped
17/05/27 10:09:04 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/05/27 10:09:04 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /192.168.16.107:58618 is closed
17/05/27 10:09:04 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /192.168.16.108:38069 is closed
17/05/27 10:09:04 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /192.168.16.106:47530 is closed
    java.util.concurrent.RejectedExecutionException: Task scala.concurrent.impl.CallbackRunnable@371b8864 rejected from java.util.concurrent.ThreadPoolExecutor@7097797f[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 113]
            at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
            at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
            at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
            at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
            at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
            at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
            at scala.concurrent.Promise$class.complete(Promise.scala:55)
            at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
            at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:326)
            at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:326)
            at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
            at org.spark_project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
            at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
            at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
            at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
            at scala.concurrent.Promise$class.complete(Promise.scala:55)
            at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
            at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
            at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
            at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
            at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:63)
            at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:78)
            at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
            at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
            at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
            at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:54)
            at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
            at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:106)
            at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
            at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
            at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
            at scala.concurrent.Promise$class.tryFailure(Promise.scala:112)
            at scala.concurrent.impl.Promise$DefaultPromise.tryFailure(Promise.scala:153)
            at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onFailure$1(NettyRpcEnv.scala:205)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$2.apply(NettyRpcEnv.scala:228)
            at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$2.apply(NettyRpcEnv.scala:228)
            at org.apache.spark.rpc.netty.RpcOutboxMessage.onFailure(Outbox.scala:75)
            at org.apache.spark.network.client.TransportResponseHandler.failOutstandingRequests(TransportResponseHandler.java:110)
            at org.apache.spark.network.client.TransportResponseHandler.channelInactive(TransportResponseHandler.java:128)
            at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:109)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
            at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
            at io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:257)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
            at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
            at io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
            at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
            at io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:208)
            at io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:194)
            at io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:828)
            at io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:621)
            at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328)
            at io.netty.util.concurrent.SingleThreadEventExecutor.confirmShutdown(SingleThreadEventExecutor.java:627)
            at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:362)
            at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
            at java.lang.Thread.run(Thread.java:745)
17/05/27 10:09:04 INFO SparkContext: Successfully stopped SparkContext
17/05/27 10:09:04 INFO ShutdownHookManager: Shutdown hook called
17/05/27 10:09:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-cb2d68b5-ffb8-4dcb-9fa9-cf07b3b13a7c
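The pool state in the trace, [Terminated, pool size = 0, active threads = 0, ...], suggests what likely happened: the job had already finished and SparkContext.stop() had shut down the RPC callback thread pool, but a connection-closed failure callback still tried to run on it, and ThreadPoolExecutor's default AbortPolicy rejects any task submitted after shutdown. That would explain why the exception appears even though the job reports success. A minimal sketch of the same rejection with a plain executor (the class and method names here are illustrative, not from Spark):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.RejectedExecutionException;

public class RejectAfterShutdown {

    // Returns true if submitting a task to an already-shut-down pool
    // is rejected, mirroring the RejectedExecutionException in the log.
    static boolean submitAfterShutdown() {
        ExecutorService pool = Executors.newFixedThreadPool(1);
        pool.shutdown(); // no new tasks accepted; idle pool moves to Terminated
        try {
            pool.execute(() -> { });
            return false;
        } catch (RejectedExecutionException e) {
            // Same exception type as in the Spark log: ThreadPoolExecutor's
            // default AbortPolicy throws for tasks submitted after shutdown.
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("rejected = " + submitAfterShutdown());
    }
}
```

Since the rejection only fires during teardown, after all results are already committed, it is plausibly harmless noise rather than a job failure.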