There are two possible approaches.
1: Lazy loading with yield; sample code:

    def read_in_chunks(file_object, chunk_size=1024):
        """Lazy function (generator) to read a file piece by piece.
        Default chunk size: 1k."""
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

    f = open('really_big_file.dat')
    for piece in read_in_chunks(f):
        process_data(piece)
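As a quick sanity check, the generator can be exercised against a hypothetical in-memory file (io.StringIO standing in for really_big_file.dat); it yields the contents in fixed-size pieces, with a shorter final piece:

```python
import io

def read_in_chunks(file_object, chunk_size=1024):
    """Lazy generator that reads a file piece by piece."""
    while True:
        data = file_object.read(chunk_size)
        if not data:  # empty string signals EOF
            break
        yield data

# Hypothetical stand-in for a real file on disk
f = io.StringIO("a" * 2500)
chunks = list(read_in_chunks(f, chunk_size=1024))
print([len(c) for c in chunks])  # → [1024, 1024, 452]
```

Because the chunks are produced one at a time, only chunk_size characters are held in memory at once, regardless of the total file size.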
2: Use iter with a helper function:

    f = open('really_big_file.dat')

    def read1k():
        return f.read(1024)

    for piece in iter(read1k, ''):
        process_data(piece)
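The helper function can also be replaced by functools.partial, since the two-argument form iter(callable, sentinel) simply calls the callable until it returns the sentinel (the empty string at EOF in text mode; for a file opened in binary mode the sentinel would be b''). A minimal sketch, again using a hypothetical io.StringIO in place of a real file:

```python
import functools
import io

# Hypothetical in-memory file; with a real file use open('really_big_file.dat')
f = io.StringIO("x" * 3000)

# Calls f.read(1024) repeatedly until it returns '' (EOF)
sizes = [len(piece) for piece in iter(functools.partial(f.read, 1024), '')]
print(sizes)  # → [1024, 1024, 952]
```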
The first approach is recommended.