Background
I needed to write a crawler to scrape some data, so I wrote one in Python. Because of the well-known GIL (Global Interpreter Lock), Python's multithreading is not very efficient, so I decided to implement it with coroutines instead. While writing a demo to test this, I ran into a problem: using await to wait for a response from requests has no effect.
Test code
import asyncio
import requests


async def hello1(url):
    print('55555555555555555555')
    # requests is a synchronous library: requests.get() blocks until the whole
    # response has arrived, and the Response it returns is not an asynchronous
    # context manager, so this "async with" does not behave as hoped.
    async with requests.get(url) as resp:
        print(url, resp.status_code)
    print('666666666666')


async def hello(n, url):
    print("Coroutine " + str(n) + " started")
    await hello1(url)
    print("Coroutine " + str(n) + " finished")


if __name__ == "__main__":
    tasks = []
    url = 'http://localhost:8080/TBIMPSWEB/drugPrice/query.tran?REQ_MESSAGE={}'
    for i in range(0, 3):
        tasks.append(hello(i, url))
    loop = asyncio.get_event_loop()
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()
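For comparison, here is roughly what the same test looks like against an HTTP client that actually hands back awaitables, for example aiohttp. This is just a sketch for contrast, assuming aiohttp is installed; it is not the code I originally ran, and it hits the same local URL as the demo above.

import asyncio
import aiohttp


async def hello1(session, url):
    # session.get() really is asynchronous: while the response is in flight,
    # control returns to the event loop so the other coroutines can run.
    async with session.get(url) as resp:
        print(url, resp.status)  # aiohttp exposes .status, not .status_code


async def hello(n, session, url):
    print("Coroutine " + str(n) + " started")
    await hello1(session, url)
    print("Coroutine " + str(n) + " finished")


async def main():
    url = 'http://localhost:8080/TBIMPSWEB/drugPrice/query.tran?REQ_MESSAGE={}'
    # One shared session for all three requests, run concurrently.
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(hello(i, session, url) for i in range(0, 3)))


if __name__ == "__main__":
    asyncio.run(main())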