I think this is a bug, and perhaps I have found the workaround: call "del data.f".
for i in xrange(10000000):
    data = np.load('tmp.npz')
    del data.f
    data.close()  # avoid the "too many files are open" error
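Why does "del data.f" help? The leak comes from a reference cycle, and breaking the cycle lets plain reference counting free the objects immediately. The sketch below reproduces the pattern with two hypothetical stub classes (Npz and Bag, stand-ins for numpy's NpzFile and BagObj; they are not the real numpy code). With the cycle collector disabled, the pair survives `del` unless the cycle is broken first:

```python
import gc
import weakref

class Bag:
    def __init__(self, obj):
        self._obj = obj          # back-reference: completes the cycle

class Npz:
    def __init__(self):
        self.f = Bag(self)       # forward reference: Npz -> Bag -> Npz
    def __del__(self):
        pass                     # on Python 2, __del__ is what traps the cycle

gc.disable()  # rely on pure reference counting for this demo

# Without breaking the cycle, the pair survives `del data`:
data = Npz()
alive = weakref.ref(data)
del data
assert alive() is not None       # still held alive through the cycle

# Breaking the cycle first lets refcounting free it at once:
data2 = Npz()
alive2 = weakref.ref(data2)
del data2.f                      # drops Bag, which drops its back-reference
del data2
assert alive2() is None          # object really freed
```

This is exactly why "del data.f" works: it removes the only reference to the BagObj, whose destruction in turn releases the back-reference to the NpzFile.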
Here is how I found this memory leak. You can reproduce it with the following code:
import numpy as np
import gc

# here comes the overflow:
for i in xrange(10000):
    data = np.load('tmp.npz')
    data.close()  # avoid the "too many files are open" error

# count the live objects tracked by the garbage collector, by type name
d = dict()
for o in gc.get_objects():
    name = type(o).__name__
    if name not in d:
        d[name] = 1
    else:
        d[name] += 1

items = d.items()
items.sort(key=lambda x: x[1])
for key, value in items:
    print key, value
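The counting loop above is an object census: group everything the collector tracks by type name and look for counts that grow with the number of iterations. On Python 3 the same census can be written more compactly with collections.Counter (a sketch; `object_census` is a name I made up):

```python
import gc
from collections import Counter

def object_census():
    """Count the objects currently tracked by the cycle collector, by type name."""
    return Counter(type(o).__name__ for o in gc.get_objects())

# print the ten most common object types
for name, count in object_census().most_common(10):
    print(name, count)
```

Run it before and after the suspect loop and compare the two Counters; any type whose count grows linearly with the loop is a leak candidate.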
After running the test program, the dict holds a count of every object type found by gc.get_objects(). This is the output:
...
wrapper_descriptor 1382
function 2330
tuple 9117
BagObj 10000
NpzFile 10000
list 20288
dict 21001
From the result we can see that the BagObj and NpzFile instances are the problem: all 10000 of them are still alive. Here is the relevant numpy source:
class NpzFile(object):
    def __init__(self, fid, own_fid=False):
        ...
        self.zip = _zip
        self.f = BagObj(self)
        if own_fid:
            self.fid = fid
        else:
            self.fid = None

    def close(self):
        """
        Close the file.
        """
        if self.zip is not None:
            self.zip.close()
            self.zip = None
        if self.fid is not None:
            self.fid.close()
            self.fid = None

    def __del__(self):
        self.close()


class BagObj(object):
    def __init__(self, obj):
        self._obj = obj

    def __getattribute__(self, key):
        try:
            return object.__getattribute__(self, '_obj')[key]
        except KeyError:
            raise AttributeError, key
NpzFile has a __del__() method, NpzFile.f is a BagObj, and BagObj._obj refers back to the NpzFile: this is a reference cycle. Because one object in the cycle defines __del__, Python 2's garbage collector refuses to collect it (the cycle ends up in gc.garbage instead), so the NpzFile and BagObj instances can never be freed. Here is the explanation in the Python documentation: http://docs.python.org/library/gc.html#gc.garbage
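You can watch the collector find such a cycle yourself. On Python 2, __del__ cycles pile up in gc.garbage automatically; on Python 3.4+ they are collected (PEP 442), but gc.set_debug(gc.DEBUG_SAVEALL) makes the collector save everything it finds into gc.garbage so it can be inspected. A sketch (Cyclic is a made-up class for the demonstration):

```python
import gc

class Cyclic:
    def __init__(self):
        self.self_ref = self    # a one-object reference cycle
    def __del__(self):
        pass                    # the finalizer that trapped cycles on Python 2

gc.collect()                         # clear out pre-existing garbage
gc.set_debug(gc.DEBUG_SAVEALL)       # save collected objects in gc.garbage

c = Cyclic()
del c                                # unreachable, but refcounting can't free it
gc.collect()                         # the cycle collector finds it

names = [type(o).__name__ for o in gc.garbage]
assert 'Cyclic' in names             # the cycle was indeed caught

gc.set_debug(0)
del gc.garbage[:]                    # clean up after the demo
```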
So, to break the reference cycle, you need to call "del data.f".