Method 1: scrapy.Request
import scrapy
from urllib.parse import urlencode  # Python 3; in Python 2 this was urllib.urlencode

form_data = {'f1': '1', 'f2': '100'}
yield scrapy.Request(url,
                     headers=self.headers,
                     body=urlencode(form_data),  # serialize the dict into the request body
                     method="POST",
                     callback=self.parse)
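Since Method 1 builds the POST body by hand, it helps to see what urlencode actually produces; a minimal standalone check (the header shown is what a server expects for this body format and must be set yourself when using a raw body):

```python
from urllib.parse import urlencode

form_data = {'f1': '1', 'f2': '100'}
body = urlencode(form_data)  # dict -> "key=value&key=value" string
print(body)  # f1=1&f2=100

# With a raw body, the Content-Type header is not set for you:
headers = {'Content-Type': 'application/x-www-form-urlencoded'}
```
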
Method 2: scrapy.FormRequest
Plain GET requests are easily made with scrapy.Request. When simulating a form or Ajax POST submission, Request still works (you can pass parameters via body), but it is less convenient than its subclass FormRequest, which takes a formdata argument dedicated to form fields and whose default method is already POST.
def start_requests(self):
    form_data = {'f1': '1', 'f2': '100'}  # form data as a dict; note that numbers must also be quoted as strings, or an error is raised
    yield scrapy.FormRequest(url, formdata=form_data)  # callback and other arguments can also be passed
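As the comment above notes, formdata values must be strings. When your data starts out numeric, one way to avoid the error is a small coercion helper; a minimal sketch (to_form_data is a hypothetical name, not part of Scrapy):

```python
def to_form_data(data):
    """Coerce every value to str, since FormRequest's formdata expects strings."""
    return {k: str(v) for k, v in data.items()}

print(to_form_data({'f1': 1, 'f2': 100}))  # {'f1': '1', 'f2': '100'}
```
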
References:
http://scrapy-chs.readthedocs.io/zh_CN/0.24/topics/request-response.html?highlight=post#formrequest-objects
https://stackoverflow.com/questions/39012902/scrapy-making-request-with-post-method
Method 3: scrapy.http.FormRequest
There are other import paths as well, such as scrapy.http.FormRequest (the same class that scrapy.FormRequest re-exports), though it feels less convenient than the approaches above:
return [scrapy.http.FormRequest(
    self.myurl,
    formdata={'f1': '123', 'f2': '456'},
    callback=self.parse)]
from scrapy.item import Item, Field
from scrapy.http import FormRequest
from scrapy import Spider  # BaseSpider was renamed to Spider in Scrapy 1.0

class DeltaItem(Item):
    title = Field()
    link = Field()
    desc = Field()

class DmozSpider(Spider):
    name = "delta"
    allowed_domains = ["delta.com"]
    start_urls = ["http://www.delta.com"]

    def parse(self, response):
        yield FormRequest.from_response(response,
                                        formname='flightSearchForm',
                                        formdata={'departureCity[0]': 'JFK',
                                                  'destinationCity[0]': 'SFO',
                                                  'departureDate[0]': '07.20.2013',
                                                  'departureDate[1]': '07.28.2013'},
                                        callback=self.parse1)

    def parse1(self, response):
        print(response.status)  # Python 3 print(); the original used Python 2's print statement
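What makes from_response convenient is that it pre-populates fields (e.g. hidden tokens) from the page's form and only overrides the keys you pass in formdata. A rough standalone sketch of that merging behavior, using only the standard library (this is an illustration of the idea, not Scrapy's actual implementation):

```python
from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    """Collect name/value pairs from <input> tags, like a form pre-fill would."""
    def __init__(self):
        super().__init__()
        self.fields = {}

    def handle_starttag(self, tag, attrs):
        if tag == 'input':
            a = dict(attrs)
            if 'name' in a:
                self.fields[a['name']] = a.get('value') or ''

html = '''<form name="flightSearchForm">
  <input type="hidden" name="token" value="abc123">
  <input type="text" name="departureCity[0]" value="">
</form>'''

parser = FormFieldParser()
parser.feed(html)

# formdata overrides win; fields you did not mention (the hidden token) are kept
merged = {**parser.fields, **{'departureCity[0]': 'JFK'}}
print(merged)  # {'token': 'abc123', 'departureCity[0]': 'JFK'}
```
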
Reference: http://www.smipple.net/snippet/fruityworld/scrapy%20post%20request