Preface
The scrapyd-client command is mainly used to deploy an egg to Scrapyd, list all projects, list all spiders of a project, run a specified spider, and so on.
(weibo) >scrapyd-client -h
usage: scrapyd-client [-h] [-t TARGET] {deploy,projects,schedule,spiders} ...

A command line interface for Scrapyd.

positional arguments:
  {deploy,projects,schedule,spiders}

optional arguments:
  -h, --help            show this help message and exit
  -t TARGET, --target TARGET
                        Specifies the Scrapyd's API base URL.
-t sets the Scrapyd API base URL, e.g. http://127.0.0.1:6800/. By default it is read from the deploy section of scrapy.cfg; if it cannot be found there, target defaults to http://127.0.0.1:6800 and project defaults to None.
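For reference, a minimal scrapy.cfg with a deploy section that scrapyd-client can read the target (and project) from; the URL and project name here are placeholders for your own setup:

```ini
[settings]
default = weibo.settings

[deploy]
url = http://127.0.0.1:6800/
project = weibo
```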
1. deploy
scrapyd-client --target=http://127.0.0.1:6800 deploy
Under the hood this uses the scrapyd-deploy command; for details, see the scrapyd-deploy command-line usage in scrapyd-client.
2. projects
# Corresponding URL
http://127.0.0.1:6800/listprojects.json
scrapyd-client -t http://10.10.10.62:6800 projects
Lists all projects.
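What the projects subcommand does against the JSON API can be sketched as follows; this is a minimal illustration using only the stdlib, and the helper names are mine, not part of scrapyd-client:

```python
import json
from urllib.parse import urljoin

def listprojects_url(target):
    # Build the listprojects.json endpoint from the Scrapyd base URL
    return urljoin(target, "listprojects.json")

def parse_projects(body):
    # Scrapyd replies with {"status": "ok", "projects": [...]}
    data = json.loads(body)
    if data.get("status") != "ok":
        raise RuntimeError(data.get("message", "Scrapyd error"))
    return data["projects"]

print(listprojects_url("http://127.0.0.1:6800/"))
# An actual run would GET that URL; here we parse a sample response body
print(parse_projects('{"status": "ok", "projects": ["weibo"]}'))
```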
3. spiders
# Corresponding URL
http://127.0.0.1:6800/listspiders.json?project=weibo
# Specify the project
-p, --project
# In the output, prefix each spider name with its project name
-v, --verbose
scrapyd-client -t http://10.10.10.62:6800 spiders --project=weibo --verbose
Lists all spiders under the specified project.
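The query URL and the --verbose prefixing described above can be sketched like this; again a stdlib-only illustration with hypothetical helper names:

```python
from urllib.parse import urljoin, urlencode

def listspiders_url(target, project):
    # listspiders.json takes the project as a query parameter
    return urljoin(target, "listspiders.json") + "?" + urlencode({"project": project})

def format_spiders(project, spiders, verbose=False):
    # With --verbose, each spider name is prefixed with its project name
    return [f"{project} {name}" if verbose else name for name in spiders]

print(listspiders_url("http://10.10.10.62:6800/", "weibo"))
print(format_spiders("weibo", ["weibo_file", "weibo_user"], verbose=True))
```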
4. schedule
# Corresponding URL
POST http://127.0.0.1:6800/schedule.json
{'project': 'weibo', 'spider': 'weibo_file', ...}
scrapyd-client -t http://10.10.10.62:6800 schedule --project=weibo weibo_file
# weibo_file is the spider's name
scrapyd-client -t http://10.10.10.62:6800 schedule --project=weibo weibo_file --arg a=b c=d
The --arg option passes additional key=value pairs to the scheduled job; multiple pairs can be given. This runs the specified spider.
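How the --arg pairs are merged into the POST body for schedule.json can be sketched as below; a minimal illustration, with the function name being my own, not scrapyd-client's:

```python
def schedule_payload(project, spider, args=None):
    # Body for POST schedule.json; each --arg key=value pair is merged in
    payload = {"project": project, "spider": spider}
    for pair in (args or []):
        key, _, value = pair.partition("=")
        payload[key] = value
    return payload

print(schedule_payload("weibo", "weibo_file", ["a=b", "c=d"]))
```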