I've received some compressed files; unzipped they come to 30GB+, and they were compressed on Windows. I tried to set up a system on an EC2 instance to unzip them, but I keep running out of space on the device (IOError: [Errno 28] No space left on device). My unzip script is below:

    import re

    import boto3
    from boto3.s3.transfer import S3Transfer
    from zipfile import ZipFile  # avoid shadowing the built-in zip()
    import ec2metadata

    s3 = boto3.client('s3')
    transfer = S3Transfer(s3)

    def get_info():
        # Parse "key=value" pairs out of the instance user-data
        userdata = re.findall(r"\=(.*?) ", ec2metadata.get('user-data'))
        global dump_bucket, bucket, key
        dump_bucket = userdata[0]
        bucket = userdata[1]
        key = userdata[2]
        return dump_bucket, bucket, key

    def unzipper(origin_bucket, origin_file, dest_bucket):
        s3.download_file(origin_bucket, origin_file, '/tmp/file.zip')
        zfile = ZipFile('/tmp/file.zip')
        for filename in zfile.namelist():
            # zfile.read() loads the entire member into memory at once
            data = zfile.read(filename)
            with open('/tmp/' + filename, 'wb') as f:
                f.write(data)
            # was namelist[0], which overwrote the same S3 key every iteration
            transfer.upload_file('/tmp/' + filename, dest_bucket, filename)

    def main():
        get_info()
        unzipper(bucket, key, dump_bucket)

    main()
Is there a better way to handle the unzipping? I've tried streaming, but because of how the files were originally compressed, that most likely won't work.
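For context, by "streaming" I mean extracting each member in fixed-size chunks rather than calling zfile.read(), which loads the whole member into memory at once. A minimal standard-library sketch of that idea (the helper name and chunk size are my own, not from my actual script):

```python
import os
import shutil
import zipfile

def stream_extract(zip_path, dest_dir, chunk_size=1024 * 1024):
    """Extract every member of zip_path into dest_dir in chunk_size
    pieces, so peak memory stays bounded regardless of member size."""
    extracted = []
    with zipfile.ZipFile(zip_path) as zf:
        for info in zf.infolist():
            if info.is_dir():
                continue  # skip directory entries
            target = os.path.join(dest_dir, os.path.basename(info.filename))
            # zf.open() gives a file-like object; copyfileobj copies it
            # chunk by chunk instead of reading it all at once
            with zf.open(info) as src, open(target, 'wb') as dst:
                shutil.copyfileobj(src, dst, chunk_size)
            extracted.append(target)
    return extracted
```

Each extracted file could then be uploaded with transfer.upload_file() and deleted immediately afterwards, so /tmp never holds more than one member at a time.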