Problem description:
My Scrapy crawler worked fine until I added a proxy IP in a DOWNLOADER_MIDDLEWARES middleware; after that, every request failed with the following traceback:
Traceback (most recent call last):
File "F:\scrapy\venv\lib\site-packages\twisted\internet\defer.py", line 1416, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "F:\scrapy\venv\lib\site-packages\twisted\python\failure.py", line 491, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "F:\scrapy\venv\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
defer.returnValue((yield download_func(request=request,spider=spider)))
File "F:\scrapy\venv\lib\site-packages\scrapy\utils\defer.py", line 45, in mustbe_deferred
result = f(*args, **kw)
File "F:\scrapy\venv\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 65, in download_request
return handler.download_request(request, spider)
File "F:\scrapy\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 67, in download_request
return agent.download_request(request)
File "F:\scrapy\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 331, in download_request
method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
File "F:\scrapy\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 252, in request
proxyEndpoint = self._getEndpoint(self._proxyURI)
File "F:\scrapy\venv\lib\site-packages\twisted\web\client.py", line 1635, in _getEndpoint
return self._endpointFactory.endpointForURI(uri)
File "F:\scrapy\venv\lib\site-packages\twisted\web\client.py", line 1513, in endpointForURI
raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,))
twisted.web.error.SchemeNotSupported: Unsupported scheme: b''
2019-04-24 14:54:40 [scrapy.core.engine] INFO: Closing spider (finished)
Cause:
The middleware set a proxy IP, but the address was missing the "http://" scheme prefix, so Twisted could not determine the proxy endpoint's scheme (hence `SchemeNotSupported: Unsupported scheme: b''`).
Solution:
Prefix the proxy address with "http://".
Working code:
def process_request(self, request, spider):
    # The scheme prefix is required; a bare "ip:port" triggers SchemeNotSupported
    proxy = "http://115.213.174.158:4223"
    print("Using proxy:", proxy)
    request.meta['proxy'] = proxy
With this change the code runs successfully.