I am extracting XML data from 465 web pages, parsing it with a pandas DataFrame in Python, and storing the results in ".csv" files. After about 30 minutes the program saves the "200.csv" file and terminates on its own; the command line shows "Killed". However, when I run the same program separately for the first 200 pages and for the remaining 265 pages, it works fine. I have searched online thoroughly and found no suitable answer. Can anyone tell me what the cause is?

for i in list:
    addr = str(url + i + '?&$format=json')
    response = requests.get(addr, auth=(self.user_, self.pass_))
    # print(response.content)
    json_data = response.json()
    if 'd' in json_data:
        df = json_normalize(json_data['d']['results'])
        paginate = 'true'
        while paginate == 'true':
            if '__next' in json_data['d']:
                addr_next = json_data['d']['__next']
                response = requests.get(addr_next, auth=(self.user_, self.pass_))
                json_data = response.json()
                df = df.append(json_normalize(json_data['d']['results']))
            else:
                paginate = 'false'
    try:
        if not df.empty:
            storage = '/usr/share/airflow/documents/output/' + i + '_output.csv'
            df.to_csv(storage, sep=',', encoding='utf-8-sig')
        else:
            pass
    except:
        pass
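For context on the symptom: on Linux, a bare "Killed" message usually means the kernel's OOM killer terminated the process for using too much memory. One likely contributor (an assumption, not confirmed from the source) is that `DataFrame.append` copies the entire accumulated frame on every call, so memory use grows much faster than the data itself across many pages. A minimal sketch of a lower-memory pattern, using toy stand-in data rather than the real paginated JSON:

```python
import pandas as pd

# Stand-in for the per-page JSON results (hypothetical data).
pages = [{'id': n, 'value': n * 2} for n in range(3)]

# Collect each page as a small DataFrame in a plain Python list
# (list.append is cheap and does not copy existing data), then
# concatenate once at the end instead of calling DataFrame.append
# inside the loop.
chunks = []
for page in pages:
    chunks.append(pd.DataFrame([page]))

df = pd.concat(chunks, ignore_index=True)
```

Note that `DataFrame.append` was deprecated and later removed in pandas 2.0 in favor of `pd.concat`, so this pattern is also the forward-compatible one.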
Thanks in advance!