Python Scrapy: twisted.web.error.SchemeNotSupported: Unsupported scheme: b'' and how to fix it

Problem: in a Scrapy downloader middleware, trying to route requests through a proxy makes the crawl fail with the following error:

2018-12-26 00:39:30 [scrapy.core.scraper] ERROR: Error downloading <GET http://httpbin.org/get/>
Traceback (most recent call last):
  File "e:\anaconda3\lib\site-packages\twisted\internet\defer.py", line 1416, in _inlineCallbacks
    result = result.throwExceptionIntoGenerator(g)
  File "e:\anaconda3\lib\site-packages\twisted\python\failure.py", line 491, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "e:\anaconda3\lib\site-packages\scrapy\utils\defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 65, in download_request
    return handler.download_request(request, spider)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 67, in download_request
    return agent.download_request(request)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 331, in download_request
    method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "e:\anaconda3\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 252, in request
    proxyEndpoint = self._getEndpoint(self._proxyURI)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1635, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "e:\anaconda3\lib\site-packages\twisted\web\client.py", line 1513, in endpointForURI
    raise SchemeNotSupported("Unsupported scheme: %r" % (uri.scheme,))
twisted.web.error.SchemeNotSupported: Unsupported scheme: b''

The middleware code was as follows:

import logging


class ProxyMiddleware(object):
    logger = logging.getLogger(__name__)

    def process_request(self, request, spider):
        self.logger.debug("Using Proxy")
        # Proxy address without a scheme prefix: this is what triggers the error
        request.meta['proxy'] = '119.101.112.28:9999'
        return None

    def process_response(self, request, response, spider):
        # Overwrite the status code (a simple way to verify the middleware ran)
        response.status = 202
        return response

Solution:

The proxy value was missing the http:// scheme. Scrapy hands request.meta['proxy'] to Twisted as a URI; with no scheme present it is parsed as b'', so endpointForURI raises SchemeNotSupported, which is exactly what the traceback shows.

Correct code:

import logging


class ProxyMiddleware(object):
    logger = logging.getLogger(__name__)

    def process_request(self, request, spider):
        self.logger.debug("Using Proxy")
        # The http:// scheme prefix is required so the proxy URI can be parsed
        request.meta['proxy'] = 'http://119.101.112.28:9999'
        return None

    def process_response(self, request, response, spider):
        # Overwrite the status code (a simple way to verify the middleware ran)
        response.status = 202
        return response

After this change the spider runs successfully and the problem is solved.
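For the proxy middleware to take effect it also has to be enabled in the project settings. A minimal sketch, assuming the class lives in a module named middlewares.py inside a project package called myproject (both names are placeholders, adjust the dotted path to your own project):

```python
# settings.py (sketch; 'myproject.middlewares' is a placeholder module path)
DOWNLOADER_MIDDLEWARES = {
    # Register the custom proxy middleware; 543 is the conventional priority
    # used in the Scrapy project template.
    'myproject.middlewares.ProxyMiddleware': 543,
}
```

A defensive variant is to prepend the scheme inside process_request when it is missing, e.g. `if not proxy.startswith(('http://', 'https://')): proxy = 'http://' + proxy`, so that a bare host:port taken from a proxy list can never reach Twisted without a scheme.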

 

A request timeout can have many causes, such as network problems or a slow server response. When sending POST requests with Scrapy, the following approaches can help:

1. Increase the timeout: set the DOWNLOAD_TIMEOUT parameter in Scrapy's settings.py to give requests more time, for example:

```
DOWNLOAD_TIMEOUT = 20
```

2. Use RetryMiddleware: Scrapy's RetryMiddleware retries failed requests automatically, and the number of retries and the response codes that trigger a retry can be configured. Add the following to settings.py:

```
RETRY_TIMES = 3
RETRY_HTTP_CODES = [500, 502, 503, 504, 400, 403, 404, 408]
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90,
    'scrapy_proxies.RandomProxy': 100,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
}
```

3. Use proxies: proxies can work around transient network failures; the scrapy_proxies library provides this. Add the following to settings.py:

```
PROXY_LIST = '/path/to/proxy/list.txt'
PROXY_MODE = 0
RANDOM_UA_PER_PROXY = True
```

Here PROXY_LIST is the path to the proxy IP list file, PROXY_MODE selects how proxies are picked (0 = random, 1 = sequential), and RANDOM_UA_PER_PROXY controls whether a random User-Agent is used with each proxy.

4. Use the requests library: if POST requests sent through Scrapy still time out, the requests library can be used instead. The scrapy-requests plugin integrates it with Scrapy; see https://github.com/scrapy-plugins/scrapy-requests for details.
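Beyond the project-wide settings above, Scrapy also lets a spider set the timeout per request and catch the failure with an errback. A minimal sketch under those assumptions (the spider name and target URL are placeholders):

```python
import scrapy
from twisted.internet.error import TimeoutError


class TimeoutDemoSpider(scrapy.Spider):
    name = 'timeout_demo'  # placeholder spider name

    def start_requests(self):
        yield scrapy.Request(
            'http://httpbin.org/get',        # placeholder URL
            meta={'download_timeout': 20},   # per-request timeout in seconds
            callback=self.parse,
            errback=self.on_error,           # invoked when the download fails
        )

    def parse(self, response):
        self.logger.info("Got %s with status %s", response.url, response.status)

    def on_error(self, failure):
        # Check whether the failure wraps a download timeout
        if failure.check(TimeoutError):
            self.logger.warning("Request timed out: %s", failure.request.url)
```

This keeps one slow endpoint from stalling the whole crawl while still logging exactly which request timed out.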