While running a Scrapy spider, the following error occurred:
Traceback (most recent call last):
  File "E:\project\venv\lib\site-packages\twisted\internet\defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "E:\project\venv\lib\site-packages\twisted\python\failure.py", line 491, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "E:\project\venv\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "E:\project\venv\lib\site-packages\scrapy\utils\defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "E:\project\venv\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 65, in download_request
    return handler.download_request(request, spider)
  File "E:\project\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 67, in download_request
    return agent.download_request(request)
  File "E:\project\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 331, in download_request
    method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "E:\project\venv\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 252, in request
    proxyEndpoint = self._getEndpoint(self._proxyURI)
  File "E:\project\venv\lib\site-packages\twisted\web\client.py", line 1635, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "E:\project\venv\lib\site-packages\twisted\web\client.py", line 1513, in endpointForURI
    raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,))
twisted.web.error.SchemeNotSupported: Unsupported scheme: b''
Investigation traced the error to the proxy setup in a downloader middleware:
import random

class IpProxyDownloadMiddleware(object):
    # A hard-coded pool of proxy addresses -- note the missing scheme prefix.
    PROXIES = ['110.52.235.131:9999', '110.52.235.249:9999', '112.17.38.141:3128']

    def process_request(self, request, spider):
        proxy = random.choice(self.PROXIES)
        request.meta['proxy'] = proxy  # e.g. '110.52.235.131:9999', no scheme
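The traceback makes sense once you look at how that bare host:port string is parsed. Twisted builds the proxy endpoint from a parsed URI, and without a scheme prefix the parser returns an empty scheme, which endpointForURI then rejects. A quick illustration using Python's standard urllib.parse, which Twisted's URI handling relies on:

from urllib.parse import urlsplit

# With no scheme prefix, the parser finds none: scheme comes back empty,
# matching the b'' in the SchemeNotSupported error above.
print(urlsplit('110.52.235.131:9999').scheme)         # -> ''
print(urlsplit('http://110.52.235.131:9999').scheme)  # -> 'http'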
The fix: when setting a proxy in the downloader middleware, the proxy URL must include the scheme, "http://" or "https://", as follows:
import random

class IpProxyDownloadMiddleware(object):
    # Same pool, still without schemes; the prefix is added when the proxy is set.
    PROXIES = ['110.52.235.131:9999', '110.52.235.249:9999', '112.17.38.141:3128']

    def process_request(self, request, spider):
        proxy = random.choice(self.PROXIES)
        # Prepend the scheme so Twisted can build an endpoint for the proxy URI.
        request.meta['proxy'] = 'http://' + proxy
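For completeness, the middleware only runs if it is registered in the project's settings.py. A minimal sketch, assuming the class lives in a module named myproject.middlewares (the module path here is hypothetical; adjust it to your project):

# settings.py -- 'myproject.middlewares' is an assumed module path
DOWNLOADER_MIDDLEWARES = {
    'myproject.middlewares.IpProxyDownloadMiddleware': 543,
}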
With that change, the problem was solved.