Error when running a Scrapy spider from the command line

While working through the final Scrapy section of the 莫烦Python (MorvanZhou) web-scraping tutorial, I ran scrapy runspider try24.py -o res.json in the terminal and got:

Fatal error in launcher: Unable to create process using
'"d:\bld\scrapy_1572360424769_h_env\python.exe"
"G:\Anaconda3\Scripts\scrapy.exe" runspider try24.py -o res.json'

Workaround

The launcher error means Windows could not start the interpreter that the scrapy.exe wrapper points to: the path d:\bld\scrapy_1572360424769_h_env\python.exe is baked into the launcher (apparently a stale path left over from the conda package build) and does not exist on this machine. The fix is to bypass the launcher and call Scrapy through the interpreter itself:

python -m scrapy runspider try24.py -o res.json

The -m switch is documented as "-m mod : run library module as a script (terminates option list)", i.e. Python locates the scrapy package on sys.path and runs it as a script, so the broken scrapy.exe launcher is never touched.
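
To see why this works, here is a minimal sketch (not from the original post) of what the -m switch does internally: Python uses the standard-library runpy module to find the scrapy package and execute its __main__ submodule, the same code path that scrapy.exe would eventually reach. It assumes Scrapy is importable in the current environment.

import runpy
import sys

# Simulate "python -m scrapy version": Scrapy's command line reads its
# arguments from sys.argv, so set them before running the module.
sys.argv = ["scrapy", "version"]

# run_module() locates the scrapy package on sys.path and executes
# scrapy/__main__.py as the top-level script; Scrapy then runs the
# "version" command and exits.
runpy.run_module("scrapy", run_name="__main__", alter_sys=True)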

Final code:

import scrapy


class MofanSpider(scrapy.Spider):
    name = "mofan"
    start_urls = [
        'https://morvanzhou.github.io/',
    ]
    # unseen = set()
    # seen = set()      # we don't need these two as scrapy will deal with them automatically

    def parse(self, response):
        yield {     # return some results
            'title': response.css('h1::text').extract_first(default='Missing').strip().replace('"', ""),  # strip stray double quotes from the title
            'url': response.url,
        }

        urls = response.css('a::attr(href)').re(r'^/.+?/$')     # find all sub urls
        for url in urls:
            yield response.follow(url, callback=self.parse)     # it will filter duplication automatically


# lastly, run this in terminal
# python -m scrapy runspider try24.py -o res.json
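
As a quick sanity check (not part of the tutorial), the exported file can be read back with the standard json module. With the JSON feed exporter, -o res.json should contain one JSON array of the yielded items; note that -o appends on repeated runs, so delete res.json before rerunning the spider.

import json

# Load the items exported by "python -m scrapy runspider try24.py -o res.json".
with open("res.json", encoding="utf-8") as f:
    items = json.load(f)   # a list of {'title': ..., 'url': ...} dicts

print(len(items), "pages scraped")
for item in items[:5]:
    print(item["title"], "->", item["url"])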