Original · Scraping Autohome (汽车之家)

import scrapy
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule
from BWM3.items import Bwm3Item

class Bwm3GtSpider(CrawlSpider):
    name = 'bwm3_gt'
    allowed_domains = ['car.autohome.com.cn']
    start_urls = ['https://ca…
2020-07-20 12:16:11
193
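The preview above is cut off before the spider's rules, but the heart of a CrawlSpider is link extraction against an `allow` pattern. As a minimal stdlib sketch of what `LinkExtractor(allow=...)` does (the URL pattern and HTML fragment below are invented for illustration, not taken from the post):

```python
import re
from html.parser import HTMLParser

class AllowLinkExtractor(HTMLParser):
    """Collect href values matching an allow pattern, roughly
    what scrapy's LinkExtractor(allow=...) does for a CrawlSpider."""
    def __init__(self, allow):
        super().__init__()
        self.allow = re.compile(allow)
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and self.allow.search(value):
                self.links.append(value)

# a made-up fragment standing in for a car-listing page
html = """
<a href="/spec/2317/">BMW 3 series</a>
<a href="/news/2020/">news</a>
<a href="/spec/4451/">BMW 5 series</a>
"""
extractor = AllowLinkExtractor(allow=r"/spec/\d+/")
extractor.feed(html)
print(extractor.links)  # only the two /spec/... links survive
```

In the real spider, each extracted link would feed a `Rule(LinkExtractor(...), callback=...)` instead of a plain list.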
Original · Scraping Douyu's Ajax API

import scrapy
import json
from douyu.items import DouyuItem

class DySpider(scrapy.Spider):
    name = 'dy'
    allowed_domains = ['douyu.com']
    base_url = 'http://capi.douyucdn.cn/api/v1/getVerticalRoom?limit=20&offset='
    # current page offset
    offset = 0
    # starting URL to crawl
    start_url…
2020-07-19 14:44:51
108
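The spider above pages through a JSON endpoint by appending an increasing `offset` to `base_url`. A self-contained sketch of that pagination loop, with the HTTP request replaced by a fake `fetch_page` and the response field names assumed rather than taken from the real Douyu API:

```python
import json

PAGE_SIZE = 20  # matches limit=20 in the URL above

def fetch_page(offset):
    """Stand-in for requesting base_url + str(offset). Returns a JSON
    string shaped like {"error": 0, "data": [...]} (assumed shape)."""
    rooms = [{"room_name": f"room {offset + i}"} for i in range(PAGE_SIZE)]
    if offset >= 3 * PAGE_SIZE:  # pretend the API runs dry after 3 pages
        rooms = []
    return json.dumps({"error": 0, "data": rooms})

def crawl_all():
    offset, names = 0, []
    while True:
        data = json.loads(fetch_page(offset))["data"]
        if not data:            # empty page: stop, like the spider closing
            break
        names.extend(room["room_name"] for room in data)
        offset += PAGE_SIZE     # advance to the next page
    return names

print(len(crawl_all()))  # 60 rooms across the 3 non-empty pages
```

In the scrapy version, the `offset += PAGE_SIZE` step would become `yield scrapy.Request(self.base_url + str(self.offset), callback=self.parse)`.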
Original · start script

from scrapy import cmdline
# replace spider_name with your spider's name
cmdline.execute(["scrapy", "crawl", "spider_name"])
2020-07-19 14:43:59
391
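The argv list passed to `cmdline.execute` is just the usual shell command split into tokens, so it can also be built from the familiar command string:

```python
import shlex

# shlex.split turns the shell command into the argv list that
# cmdline.execute expects, equivalent to writing the list by hand
argv = shlex.split("scrapy crawl spider_name")
print(argv)  # ['scrapy', 'crawl', 'spider_name']
# from scrapy import cmdline; cmdline.execute(argv)
```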
原创 scrapy爬取80小说
-- coding: utf-8 -- import scrapy from Novels1.items import Novels1Item class Novels80Spider(scrapy.Spider): name = ‘novels80’ allowed_domains = [‘txt80.com’] start_urls = [‘http://txt80.com/dushi/’] def parse(self, response): # divs = response.xpath('
2020-07-19 14:43:14
157
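The `parse` method above is truncated right at an XPath over the page's divs. A sketch of that extraction step using the stdlib's limited XPath support (the markup, class name, and item fields below are assumptions, since the real page structure is cut off):

```python
import xml.etree.ElementTree as ET

# a made-up fragment standing in for the novel-list page
page = """
<root>
  <div class="book"><a href="/n/1.html">Novel One</a></div>
  <div class="book"><a href="/n/2.html">Novel Two</a></div>
</root>
"""

def parse(tree):
    # roughly response.xpath('//div[@class="book"]') in scrapy,
    # yielding one item dict per matching div
    for div in tree.findall('.//div[@class="book"]'):
        link = div.find('a')
        yield {"title": link.text, "url": link.get("href")}

items = list(parse(ET.fromstring(page)))
print(items[0])  # {'title': 'Novel One', 'url': '/n/1.html'}
```

In the spider itself, the yielded dicts would be `Novels1Item` instances so the pipeline can process them.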
原创 scrapy爬取当当
import scrapy 上一个文件夹的item的DangdangItem from …items import DangdangItem class DdSpider(scrapy.Spider): name = ‘dd’ allowed_domains = [‘dangdang.com’] start_urls = [‘http://search.dangdang.com/?key=python’] def parse(self, response): # 使用xpath获取包含所有书籍信息的
2020-07-19 14:40:04
123
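`from ..items import DangdangItem` climbs one package level, from `spiders/` up to the project root where `items.py` sits. The truncated `parse` would then fill one item per search result; a runnable sketch with a plain-dict stand-in for the scrapy Item (field names here are assumptions, and the rows are pre-extracted instead of coming from `response.xpath`):

```python
class DangdangItem(dict):
    """Stand-in for the scrapy.Item defined in ../items.py."""
    pass

def parse(rows):
    """Yield one item per result row; in the real spider each row
    would be a selector from response.xpath over the result list."""
    for row in rows:
        item = DangdangItem()
        item["title"] = row["title"]
        item["price"] = row["price"]
        yield item

rows = [{"title": "Python Crash Course", "price": "59.00"},
        {"title": "Fluent Python", "price": "89.00"}]
for item in parse(rows):
    print(item["title"], item["price"])
```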
原创 爬取广西防疫信息
import requests import time from lxml import etree import re 9 12 5 7 headers = { ‘User-Agent’ : ‘Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36’ } 获取到五月份数据的url def get_May_info(url): resp
2020-07-19 14:37:26
136
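The snippet imports `re` alongside `lxml`, which suggests the case counts are pulled out of the bulletin text with regular expressions. A sketch of that step against a made-up line in the style of such a bulletin (the exact wording of the real notices is not in the preview):

```python
import re

# invented bulletin text in the usual notice style
text = "5月7日广西新增确诊病例0例，新增疑似病例2例。"

# capture the number sandwiched between the case keyword and 例
confirmed = re.search(r"新增确诊病例(\d+)例", text)
suspected = re.search(r"新增疑似病例(\d+)例", text)
print(confirmed.group(1), suspected.group(1))  # 0 2
```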
原创 python连接mongo
from pymongo import MongoClient class A: def init(self): # 连接mongodb self.client = MongoClient() # 确定连接的数据库 self.db = self.client[‘application’] def addOne(self): obj = {'name':'特斯拉'} return self.db.car.insert_one(obj) def addMany(self): obj =
2020-07-19 14:36:37
153
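The class above wraps pymongo's `insert_one`/`insert_many` on the `car` collection. The same shape can be exercised without a running MongoDB by injecting the collection, with a tiny in-memory fake implementing just the two methods used (this fake is a test stand-in, not part of pymongo):

```python
class CarRepo:
    """Same structure as class A above, but the collection is
    injected so the logic runs without a MongoDB server."""
    def __init__(self, collection):
        self.car = collection

    def add_one(self, name):
        return self.car.insert_one({"name": name})

    def add_many(self, names):
        return self.car.insert_many([{"name": n} for n in names])

class FakeCollection:
    """In-memory stand-in for db.car with pymongo-like methods."""
    def __init__(self):
        self.docs = []
    def insert_one(self, doc):
        self.docs.append(doc)
        return doc
    def insert_many(self, docs):
        self.docs.extend(docs)
        return docs

repo = CarRepo(FakeCollection())
repo.add_one("特斯拉")
repo.add_many(["BMW", "Audi"])
print(len(repo.car.docs))  # 3
```

Swapping `FakeCollection()` for `MongoClient()['application'].car` gives back the original behavior.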
原创 xpath练习
‘’’ 2020.5.28作业 xpath写法 ‘’’ import requests from lxml import etree import socket import time import random from pymongo import MongoClient 白嫖的代理ip proxies = { ‘https’ : ‘58.218.201.74:2874’ } 设置代理 def getUserAgent(): “”" :return: 随机返回一个浏览器请求头的User-Agent值 “
2020-07-19 14:35:45
310
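The truncated `getUserAgent` above promises a random browser User-Agent; the usual implementation is `random.choice` over a pool of UA strings. A minimal sketch (the pool below contains example strings, not the ones from the original post):

```python
import random

# a small pool of example desktop User-Agent strings
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/83.0.4103.61 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/13.1.1 Safari/605.1.15",
]

def get_user_agent():
    """Return a random User-Agent value, like getUserAgent above."""
    return random.choice(USER_AGENTS)

headers = {"User-Agent": get_user_agent()}
print(headers["User-Agent"] in USER_AGENTS)  # True
```

Rotating the User-Agent per request, together with the proxy dict above, makes the scraper look less like a single repeating client.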