Scrapy: Writing to MySQL Asynchronously

This post shows how to write scraped data to a MySQL database asynchronously from the Scrapy crawler framework under Python 3, covering the setup in pipelines.py and settings.py.

Python 3: asynchronous writes to MySQL

(Really missing Shundian zake noodles...)

pipelines.py

# pipelines.py

import os

from pymysql import cursors
# adbapi is Twisted's asynchronous database API; it wraps a blocking
# DB-API driver (here pymysql) in a thread pool
from twisted.enterprise import adbapi

from .settings import MY_SETTINGS
# Bloomfilter is a project-local deduplication helper (not shown in this
# post); adjust the import path to wherever it lives in your project
from .bloomfilter import Bloomfilter


class SaveToMysqlAsynPipeline(object):
    # Read the connection settings from the project configuration
    # (this pipeline reads the module-level MY_SETTINGS dict directly
    # rather than the settings object Scrapy passes in)
    @classmethod
    def from_settings(cls, settings):
        asyn_mysql_settings = MY_SETTINGS
        # Return rows as dicts instead of tuples
        asyn_mysql_settings['cursorclass'] = cursors.DictCursor
        dbpool = adbapi.ConnectionPool("pymysql", **asyn_mysql_settings)
        return cls(dbpool)

    def __init__(self, dbpool):
        self.dbpool = dbpool
        # Restore the Bloom filter from disk if a previous run saved one
        if os.path.exists("job.state"):
            bloom = Bloomfilter("job.state")
        else:
            bloom = Bloomfilter(1000000)
        self.bloom = bloom

    def process_item(self, item, spider):
        # runInteraction runs the insert in a pool thread, so it does not
        # block the Twisted reactor; table and field names below are
        # placeholders to adapt to your item
        query = self.dbpool.runInteraction(self.insert_item, item)
        query.addErrback(self.handle_error, item)
        return item

    def insert_item(self, cursor, item):
        sql = "INSERT INTO jobs (title, url) VALUES (%s, %s)"
        cursor.execute(sql, (item['title'], item['url']))

    def handle_error(self, failure, item):
        print(failure, item)
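The pipeline above imports a `MY_SETTINGS` dict from settings.py and forwards it to `adbapi.ConnectionPool`, so its keys must match `pymysql.connect()` parameters. A minimal sketch of that dict, plus the pipeline registration (host, credentials, database name, and the `myproject` package name are placeholders):

```python
# settings.py (sketch; host, credentials, and db name are placeholders)

# Keyword arguments forwarded as adbapi.ConnectionPool("pymysql", **MY_SETTINGS),
# so the keys must match pymysql.connect() parameters
MY_SETTINGS = {
    'host': '127.0.0.1',
    'port': 3306,
    'user': 'root',
    'password': 'password',
    'db': 'scrapy_db',
    'charset': 'utf8mb4',
}

# Register the pipeline so Scrapy actually runs it
ITEM_PIPELINES = {
    'myproject.pipelines.SaveToMysqlAsynPipeline': 300,
}
```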
Scrapy's asynchronous Item Pipeline mechanism also makes it straightforward to store data in MySQL using aiomysql instead of twisted/adbapi. The steps are:

1. Install the asynchronous MySQL library aiomysql: `pip install aiomysql`

2. Configure the MySQL connection in settings.py (aiomysql runs on asyncio, so the asyncio Twisted reactor must be enabled as well):

```
TWISTED_REACTOR = 'twisted.internet.asyncioreactor.AsyncioSelectorReactor'

MYSQL_HOST = 'localhost'
MYSQL_PORT = 3306
MYSQL_USER = 'root'
MYSQL_PASSWORD = 'password'
MYSQL_DBNAME = 'database_name'
```

3. Create an aiomysql connection pool in the pipeline. Scrapy calls `from_crawler` synchronously, so the pool cannot be created there with `await`; instead it is created in `open_spider`, which may be a coroutine in Scrapy 2.x:

```
import aiomysql


class MySQLPipeline(object):
    def __init__(self, mysql_host, mysql_port, mysql_user,
                 mysql_password, mysql_dbname):
        self.mysql_host = mysql_host
        self.mysql_port = mysql_port
        self.mysql_user = mysql_user
        self.mysql_password = mysql_password
        self.mysql_dbname = mysql_dbname
        self.pool = None

    @classmethod
    def from_crawler(cls, crawler):
        return cls(
            crawler.settings.get('MYSQL_HOST', 'localhost'),
            crawler.settings.getint('MYSQL_PORT', 3306),
            crawler.settings.get('MYSQL_USER', 'root'),
            crawler.settings.get('MYSQL_PASSWORD', 'password'),
            crawler.settings.get('MYSQL_DBNAME', 'database_name'),
        )

    async def open_spider(self, spider):
        self.pool = await aiomysql.create_pool(
            host=self.mysql_host,
            port=self.mysql_port,
            user=self.mysql_user,
            password=self.mysql_password,
            db=self.mysql_dbname,
            charset='utf8mb4',
            autocommit=True,
            maxsize=10,
            minsize=1,
        )

    async def process_item(self, item, spider):
        async with self.pool.acquire() as conn:
            async with conn.cursor() as cur:
                sql = "INSERT INTO table_name (field1, field2) VALUES (%s, %s)"
                await cur.execute(sql, (item['field1'], item['field2']))
        return item

    async def close_spider(self, spider):
        self.pool.close()
        await self.pool.wait_closed()
```

4. Enable MySQLPipeline in settings.py:

```
ITEM_PIPELINES = {
    'myproject.pipelines.MySQLPipeline': 300,
}
```
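Both pipelines build parameterized INSERT statements and pass the values separately, which lets the driver (pymysql or aiomysql) handle escaping. Since Scrapy items are dict-like, the SQL can be generated from the item's own keys. A small sketch of that idea (`build_insert` and the `jobs` table are illustrative, not part of either pipeline above):

```python
def build_insert(table, item):
    """Build a parameterized INSERT for a dict-like item.

    Values are returned separately as parameters, so the DB driver
    escapes them and SQL injection via item values is avoided.
    """
    columns = ", ".join(item.keys())
    placeholders = ", ".join(["%s"] * len(item))
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    return sql, tuple(item.values())


sql, args = build_insert("jobs", {"title": "engineer", "url": "http://example.com"})
print(sql)   # INSERT INTO jobs (title, url) VALUES (%s, %s)
print(args)  # ('engineer', 'http://example.com')
```

The resulting `sql` and `args` pair plugs directly into `cursor.execute(sql, args)` in either pipeline.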