Installing Scrapy

**A note before we start**
I downloaded the package files and installed them myself. It's a somewhat clumsy approach, but it works; other methods are introduced further down.

I. pip install + package files

Package download link
Before installing Scrapy, first install wheel, lxml, and Twisted:

PS C:\WINDOWS\system32> pip install wheel
Requirement already satisfied: wheel in c:\users\46731\appdata\local\programs\python\python37\lib\site-packages (0.34.2)
PS C:\WINDOWS\system32> pip install lxml
Requirement already satisfied: lxml in c:\users\46731\appdata\local\programs\python\python37\lib\site-packages (4.5.0)
PS C:\WINDOWS\system32> pip install Twisted
Requirement already satisfied: Twisted in c:\users\46731\appdata\local\programs\python\python37\lib\site-packages (19.10.0)
PS C:\WINDOWS\system32> pip install Scrapy
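
(By the way, pip accepts several package names at once, so the four commands above can also be collapsed into a single line:)

pip install wheel lxml Twisted Scrapy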

Use pip install to check for / install each of these libraries.
If everything installs successfully on the first try, congratulations!!!
You can type scrapy in cmd to check whether the installation worked.

PS C:\WINDOWS\system32> scrapy
Scrapy 1.8.0 - no active project

Usage:
  scrapy <command> [options] [args]

Available commands:
  bench         Run quick benchmark test
  fetch         Fetch a URL using the Scrapy downloader
  genspider     Generate new spider using pre-defined templates
  runspider     Run a self-contained spider (without creating a project)
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

  [ more ]      More commands available when run from project directory

Use "scrapy <command> -h" to see more info about a command

II. Problems encountered

1. The install errors out and prompts you to upgrade pip

You are using pip version 9.0.1, however version 19.1.1 is available.
You should consider upgrading via the 'python -m pip install --upgrade pip' command.

Solution: this is most likely the PyPI server timing out, so upgrade pip through a domestic mirror instead:

python -m pip install --upgrade pip -i https://pypi.douban.com/simple
2. A dependency fails to download

The download of a dependency is interrupted and a wall of red errors appears:

Collecting pyasn1-modules
  Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
     |██                              | 10 kB 2.2 kB/s eta 0:01:06ERROR: Exception:
Traceback (most recent call last):
  File "c:\users\46731\appdata\local\programs\python\python37\lib\site-packages\pip\_vendor\urllib3\response.py", line 425, in _error_catcher
    yield
 .
 .
 .
Solution:

Go to https://pypi.org/ (the package download link mentioned above), download the missing dependency, and install it.
In this case that meant downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl,
installing it with pip install pyasn1_modules-0.2.8-py2.py3-none-any.whl, and then continuing with the Scrapy install:

 E:\迅雷下载> pip install .\pyasn1_modules-0.2.8-py2.py3-none-any.whl
Processing e:\迅雷下载\pyasn1_modules-0.2.8-py2.py3-none-any.whl
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in c:\users\46731\appdata\local\programs\python\python37\lib\site-packages (from pyasn1-modules==0.2.8) (0.4.8)
Installing collected packages: pyasn1-modules
Successfully installed pyasn1-modules-0.2.8
PS E:\迅雷下载> pip install .\Scrapy-1.8.0-py2.py3-none-any.whl
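
If several dependencies keep failing this way, another option (just a sketch, not part of the walkthrough above; the .\wheels folder name is arbitrary) is to pre-download every wheel in one pass from a mirror and then install offline from that folder:

pip download Scrapy -d .\wheels -i https://pypi.tuna.tsinghua.edu.cn/simple
pip install Scrapy --no-index --find-links .\wheels
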
PS: Scrapy has quite a few dependencies, and network problems and the like can make any of them fail to download. In short, whichever package won't download, download and install it manually. If you're interested, you can also try changing the pip source in your environment, i.e. where packages are downloaded from.

Some pip mirrors in China:
  Aliyun: http://mirrors.aliyun.com/pypi/simple/
  University of Science and Technology of China (USTC): https://pypi.mirrors.ustc.edu.cn/simple/
  Douban: http://pypi.douban.com/simple/
  Tsinghua University: https://pypi.tuna.tsinghua.edu.cn/simple/
How to switch the source:
For temporary use, add the -i option to the pip command to specify the index, e.g.:
pip install scrapy -i https://pypi.tuna.tsinghua.edu.cn/simple
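
To make the switch permanent instead of passing -i every time (pip 10+ supports this; the Tsinghua index here is just the example from above), you can write the index URL into pip's config:

pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple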

III. Other installation methods

I've tried both of these; they work.
1. Using PyCharm

PyCharm is a feature-rich IDE.
Download link
Open PyCharm -> File -> Settings -> Project: untitled, and install the Scrapy package from the project's interpreter settings.
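Under the hood this GUI route just runs pip against the project's interpreter. Equivalently (a rough sketch, assuming PyCharm's built-in Terminal is bound to that same interpreter/virtualenv), you can run the install from the Terminal:

pip install scrapy -i https://pypi.tuna.tsinghua.edu.cn/simple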

2. Using Anaconda

Download link
Launch the Anaconda Prompt.
Run conda install scrapy and enter y when asked to proceed:

(base) PS C:\Users\46731> conda install scrapy
Collecting package metadata (current_repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: E:\Anaconda3

  added / updated specs:
    - scrapy


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    conda-4.8.2                |           py37_0         2.8 MB
    twisted-19.10.0            |   py37he774522_0         4.1 MB
    ------------------------------------------------------------
                                           Total:         7.0 MB

The following NEW packages will be INSTALLED:

  automat            pkgs/main/noarch::automat-0.8.0-py_0
  bcrypt             pkgs/main/win-64::bcrypt-3.1.7-py37he774522_0
  constantly         pkgs/main/win-64::constantly-15.1.0-py37h28b3542_0
  cssselect          pkgs/main/noarch::cssselect-1.1.0-py_0
  hyperlink          pkgs/main/noarch::hyperlink-19.0.0-py_0
  incremental        pkgs/main/win-64::incremental-17.5.0-py37_0
  parsel             pkgs/main/win-64::parsel-1.5.2-py37_0
  pyasn1             pkgs/main/noarch::pyasn1-0.4.8-py_0
  pyasn1-modules     pkgs/main/noarch::pyasn1-modules-0.2.7-py_0
  pydispatcher       pkgs/main/win-64::pydispatcher-2.0.5-py37_1
  pyhamcrest         pkgs/main/win-64::pyhamcrest-1.9.0-py37_2
  pytest-runner      pkgs/main/noarch::pytest-runner-5.2-py_0
  queuelib           pkgs/main/win-64::queuelib-1.5.0-py37_0
  scrapy             pkgs/main/win-64::scrapy-1.6.0-py37_0
  service_identity   pkgs/main/win-64::service_identity-18.1.0-py37h28b3542_0
  twisted            pkgs/main/win-64::twisted-19.10.0-py37he774522_0
  w3lib              pkgs/main/noarch::w3lib-1.21.0-py_0
  zope               pkgs/main/win-64::zope-1.0-py37_1
  zope.interface     pkgs/main/win-64::zope.interface-4.7.1-py37he774522_0

The following packages will be UPDATED:

  conda                                       4.7.12-py37_0 --> 4.8.2-py37_0


Proceed ([y]/n)? y

Type y here and wait for the installation to finish.
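
Note that this conda channel installs scrapy 1.6.0 (see the package plan above), while the pip route installed 1.8.0. To confirm the conda install worked, you can run a quick check from the same prompt:

(base) PS C:\Users\46731> scrapy version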