I have two scripts, scraper.py and db_control.py. In scraper.py I have something like this:
...
def scrap(category, field, pages, search, use_proxy, proxy_file):
    ...
    loop = asyncio.get_event_loop()
    to_do = [get_pages(url, params, conngen) for url in urls]
    wait_coro = asyncio.wait(to_do)
    res, _ = loop.run_until_complete(wait_coro)
    ...
    loop.close()
    return [x.result() for x in res]
...
And in db_control.py:
from scraper import scrap
...
while new < 15:
    data = scrap(category, proxy_file)
    ...
...
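Reduced to a self-contained sketch (with a hypothetical coroutine `work` standing in for the real `get_pages`, since only snippets are shown), the two scripts together do roughly this:

```python
import asyncio

async def work():
    # stand-in for the real get_pages coroutine (hypothetical)
    return 42

def scrap_once():
    # same pattern as in scraper.py: grab the thread's loop, run, close it
    loop = asyncio.get_event_loop()
    result = loop.run_until_complete(work())
    loop.close()
    return result

asyncio.set_event_loop(asyncio.new_event_loop())  # make sure a loop exists
first = scrap_once()       # first call: works
try:
    scrap_once()           # second call: get_event_loop() returns the already-closed loop
    failed = False
except RuntimeError as exc:
    failed = True
    msg = str(exc)         # "Event loop is closed"
```

The second call fails because `asyncio.get_event_loop()` hands back the same per-thread loop that the first call already closed.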
In theory, the scraper should be restarted an unknown number of times until enough data has been gathered. But when new is not immediately > 15, this error occurs:
  File "/usr/lib/python3.4/asyncio/base_events.py", line 293, in run_until_complete
    self._check_closed()
  File "/usr/lib/python3.4/asyncio/base_events.py", line 265, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
But if I run scrap() only once, the script works fine. So I suspect there is some problem with re-creating the loop via loop = asyncio.get_event_loop(); I tried this but nothing changed. How can I solve this? Of course these are only snippets of my code; if you think the problem could be elsewhere, the full code is available here.
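One way around this is to build a fresh event loop on every call instead of reusing and closing the thread's shared loop. A minimal sketch, again with a hypothetical `work` coroutine in place of the real `get_pages`:

```python
import asyncio

async def work():
    # stand-in for the real get_pages coroutine (hypothetical)
    return 42

def scrap():
    # create a fresh loop per call instead of closing the shared one
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        return loop.run_until_complete(work())
    finally:
        loop.close()

results = [scrap() for _ in range(3)]  # can now be called repeatedly
```

On Python 3.7+ this "fresh loop per call" behaviour is exactly what `asyncio.run()` provides; on 3.4, the `new_event_loop()`/`set_event_loop()` pair shown above is the equivalent.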