executor.Executor: Managed memory leak detected; size = 37247642 bytes, TID = 5

https://stackoverflow.com/questions/34359211/debugging-managed-memory-leak-detected-in-spark-1-6-0

https://stackoverflow.com/questions/33518992/spark-executor-managed-memory-leak-detected

 
I saw this exception while running Spark Streaming on EMR:

16/04/14 13:49:10 WARN memory.TaskMemoryManager: leak 32.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@34158d5f
16/04/14 13:49:10 ERROR executor.Executor: Managed memory leak detected; size = 33816576 bytes, TID = 2942915
16/04/14 13:49:10 ERROR executor.Executor: Exception in task 22.0 in stage 35684.0 (TID 2942915)
java.lang.OutOfMemoryError: Unable to acquire 262144 bytes of memory, got 220032

– Nipun Apr 20 '16 at 9:27
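
When upgrading is not immediately possible, the knobs usually reached for on Spark 1.6.x after an "Unable to acquire N bytes of memory" failure are executor memory and partition counts. The Scala sketch below only illustrates the relevant (real) configuration keys; the application name, input source, and concrete values are assumptions, not settings taken from this thread.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingMemoryTuning {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("streaming-memory-tuning")       // hypothetical app name
      .set("spark.executor.memory", "4g")          // more executor heap overall
      .set("spark.memory.fraction", "0.6")         // unified execution + storage share (Spark 1.6+)
      .set("spark.sql.shuffle.partitions", "400")  // smaller partitions mean smaller per-task hash maps
      .set("spark.default.parallelism", "400")     // same idea for plain RDD shuffles

    val ssc = new StreamingContext(conf, Seconds(10))      // batch interval is illustrative
    val lines = ssc.socketTextStream("localhost", 9999)    // hypothetical input source

    // A per-batch aggregation; the real job on EMR is not shown in the comment above.
    lines.map(word => (word, 1L)).reduceByKey(_ + _).print()

    ssc.start()
    ssc.awaitTermination()
  }
}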
   
Were you able to resolve it? I am facing a similar memory-leak issue in Spark 1.6.2. – Anchika Agarwal Mar 24 at 18:43
   
I think in my case it was SPARK-14560. It is fixed in Spark 2.0.0 and we are using 2.1.0 now, so all is good. – Daniel Darabos Mar 25 at 20:10
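
Since that fix only applies from Spark 2.0.0 onward, it is worth confirming which release a job actually runs on before deciding whether the warning can be traced back to it. A minimal Scala sketch, assuming nothing beyond a standard SparkContext (the app name is hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Hypothetical app name; any existing SparkContext reports its version the same way.
    val sc = new SparkContext(new SparkConf().setAppName("version-check"))

    // sc.version returns the running Spark release, e.g. "1.6.0" or "2.1.0".
    println(s"Running Spark ${sc.version}")

    // Per the comment above, the fix landed in 2.0.0, so earlier releases may still
    // log the "Managed memory leak detected" warning. A simple lexicographic check
    // is enough to tell 1.x from 2.x.
    if (sc.version < "2.0.0") {
      println("Pre-2.0.0 release: the warning discussed in this thread may still appear.")
    }

    sc.stop()
  }
}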
   
Do all these comments mean that, if I cannot upgrade to the 'good' version yet because of company IT readiness, I can safely ignore the error? I am currently using Spark 1.6.0 and get errors like this when a DataFrame does 'distinct' and then 'head'. – Minnie Shi Aug 8 at 9:55
   
Well, if ignoring it is your only option, I suggest you ignore it :). It may lead to out-of-memory issues, since it is a warning about memory mis-accounting. – Daniel Darabos Aug 8 at 10:31
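
For the DataFrame case described above (a 'distinct' followed by 'head' on Spark 1.6.0), the shape of the job is roughly the sketch below. The column names and data are made up; the point is only where the warning tends to appear, since distinct is executed as an aggregation over all columns and each task builds an in-memory map for it.

import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}

object DistinctHeadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("distinct-head-sketch"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Made-up data; in the reported case this was a real DataFrame.
    val df = sc.parallelize(1 to 1000000)
               .map(i => (i % 1000, s"value-$i"))
               .toDF("key", "value")

    // distinct runs as an aggregation; on 1.6.x the "Managed memory leak detected"
    // warning can be logged here if a task's aggregation memory is not released
    // before the task finishes, as the comments above describe.
    val firstRow = df.distinct().head()
    println(firstRow)

    sc.stop()
  }
}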

Reposted from: https://www.cnblogs.com/rocky-AGE-24/p/7512071.html
