Troubleshooting note: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.

Preface:

While learning how to save images with the Scrapy framework, I needed to install the Pillow library, but the install kept failing with this error:

(venv) E:\Scrapy\scrapy02>pip install pillow
Collecting pillow
  Downloading Pillow-9.4.0-cp37-cp37m-win_amd64.whl (2.5 MB)
     ---- ----------------------------------- 0.3/2.5 MB 3.0 kB/s eta 0:12:19
ERROR: Exception:
Traceback (most recent call last):
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 438, in _error_catcher
    yield
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 561, in read
    data = self._fp_read(amt) if not fp_closed else b""
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 527, in _fp_read
    return self._fp.read(amt) if amt is not None else self._fp.read()
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\cachecontrol\filewrapper.py", line 90, in read
    data = self.__fp.read(amt)
  File "C:\Python\lib\http\client.py", line 447, in read
    n = self.readinto(b)
  File "C:\Python\lib\http\client.py", line 491, in readinto
    n = self.fp.readinto(b)
  File "C:\Python\lib\socket.py", line 589, in readinto
    return self._sock.recv_into(b)
  File "C:\Python\lib\ssl.py", line 1049, in recv_into
    return self.read(nbytes, buffer)
  File "C:\Python\lib\ssl.py", line 908, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\cli\base_command.py", line 160, in exc_logging_wrapper
    status = run_func(*args)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\cli\req_command.py", line 247, in wrapper
    return func(self, options, args)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\commands\install.py", line 420, in run
    reqs, check_supported_wheels=not options.target_dir
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\resolver.py", line 93, in resolve
    collected.requirements, max_rounds=try_to_avoid_resolution_too_deep
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 481, in resolve
    state = resolution.resolve(requirements, max_rounds=max_rounds)
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 348, in resolve
    self._add_to_criteria(self.state.criteria, r, parent=None)
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\resolvelib\resolvers.py", line 172, in _add_to_criteria
    if not criterion.candidates:
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\resolvelib\structs.py", line 151, in __bool__
    return bool(self._sequence)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 155, in __bool__
    return any(self)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 143, in <genexpr>
    return (c for c in iterator if id(c) not in self._incompatible_ids)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\found_candidates.py", line 47, in _iter_built
    candidate = func()
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\factory.py", line 211, in _make_candidate_from_link
    version=version,
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 303, in __init__
    version=version,
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 162, in __init__
    self.dist = self._prepare()
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 231, in _prepare
    dist = self._prepare_distribution()
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\resolution\resolvelib\candidates.py", line 308, in _prepare_distribution
    return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\operations\prepare.py", line 491, in prepare_linked_requirement
    return self._prepare_linked_requirement(req, parallel_builds)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\operations\prepare.py", line 542, in _prepare_linked_requirement
    hashes,
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\operations\prepare.py", line 170, in unpack_url
    hashes=hashes,
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\operations\prepare.py", line 107, in get_http_url
    from_path, content_type = download(link, temp_dir.path)
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\network\download.py", line 147, in __call__
    for chunk in chunks:
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\cli\progress_bars.py", line 53, in _rich_progress_bar
    for chunk in iterable:
  File "e:\scrapy\venv\lib\site-packages\pip\_internal\network\utils.py", line 87, in response_chunks
    decode_content=False,
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 622, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 587, in read
    raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
  File "C:\Python\lib\contextlib.py", line 130, in __exit__
    self.gen.throw(type, value, traceback)
  File "e:\scrapy\venv\lib\site-packages\pip\_vendor\urllib3\response.py", line 443, in _error_catcher
    raise ReadTimeoutError(self._pool, None, "Read timed out.")
pip._vendor.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Read timed out.

To summarize, there are three main fixes:

Before anything else: update pip.
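An outdated pip sometimes handles slow connections poorly, so upgrading it first is cheap insurance. A minimal sketch, run inside the same virtualenv shown in the log above (the commands are identical on Windows and Unix):

```shell
# Show the current pip version, then upgrade pip in the active environment.
# Invoking pip as "python -m pip" lets pip replace its own files on Windows.
python -m pip --version
python -m pip install --upgrade pip
```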

1. Switch to a different (more stable) network.

2. Raise pip's download timeout with a command such as pip --default-timeout=1000 install -U Scrapy (substitute whichever package you need). Note that pip's default socket timeout is 15 seconds; here it is raised to 1000.
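Instead of passing the flag on every invocation, the longer timeout can be stored in pip's user-level configuration. A sketch, assuming pip >= 10 (which provides the pip config subcommand):

```shell
# Persist a 1000-second socket timeout in the user config file;
# equivalent to passing --default-timeout=1000 on every install.
pip config set global.timeout 1000

# Confirm the stored value.
pip config get global.timeout
```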

3. Switch to a PyPI mirror for the download; here the Douban mirror is used.

Command: pip install <package-name> -i http://pypi.douban.com/simple --trusted-host pypi.douban.com (the --trusted-host flag is required because this mirror URL uses plain HTTP).
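The mirror can likewise be made the default, so that plain pip install commands use it from then on. A sketch; the Douban URL is the one from the post and may change or move to HTTPS over time:

```shell
# Make the Douban mirror the default package index and trust its
# plain-HTTP host, so -i and --trusted-host are no longer needed.
pip config set global.index-url http://pypi.douban.com/simple
pip config set global.trusted-host pypi.douban.com

# Confirm the stored index URL.
pip config get global.index-url
```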

Reference:

Fixing the pip._vendor.urllib3.exceptions.ReadTimeoutError error (CharlesLC's blog, CSDN)
