[Scrapy] ModuleNotFoundError: No module named 'scrapy.contrib'

Under Python 3.6 with Scrapy 1.6.0, starting a spider fails with ModuleNotFoundError: No module named 'scrapy.contrib'. The cause is that this module was removed in Scrapy 1.6.0. The fix is to roll back to a Scrapy 1.5.x release.
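
A quick way to confirm the cause is to try the import directly. Below is a minimal sketch (it assumes nothing beyond Scrapy being installed): on Scrapy 1.6.0 the import fails with the same error, while on 1.5.x it still works and at most emits a deprecation warning.

# check_contrib.py - minimal reproduction (illustrative sketch)
import scrapy

print("Scrapy version:", scrapy.__version__)

try:
    import scrapy.contrib  # deprecated since Scrapy 1.0, removed in Scrapy 1.6.0
    print("scrapy.contrib is still importable (Scrapy < 1.6)")
except ModuleNotFoundError as exc:
    print(exc)  # -> No module named 'scrapy.contrib'

To roll back, pin the version below 1.6, e.g. pip install "scrapy<1.6", which installs the latest 1.5.x release where the scrapy.contrib aliases still exist.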

Environment:

python-3.6
scrapy-1.6.0

Error when starting the spider:

Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 10, in <module>
    sys.exit(execute())
  File "/usr/local/lib/python3.6/dist-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/crawler.py", line 171, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/usr/local/lib/python3.6/dist-packages/scrapy/crawler.py"