Optimizing a 笔趣阁 (Biquge) Scraper

This post looks at Python scraper optimizations for the 笔趣阁 (Biquge) novel site, aimed at improving crawl speed and stability. Based on an analysis of the site structure, the request strategy is tuned to avoid frequent 403 errors, and the data parsing is handled more efficiently.
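The code below still calls requests.get() directly. The 403 avoidance mentioned in the summary usually comes down to sending a browser-like User-Agent, reusing one Session, pacing requests and retrying on failure. Here is a minimal sketch of such a fetch helper; the fetch_page name, the header string and the timing constants are illustrative assumptions, not code from the original post:

import time
import requests

# Assumed helper, not part of the original scraper: a hardened fetch with a
# browser-like User-Agent, a shared Session, throttling and simple retries.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36",
}
session = requests.Session()
session.headers.update(HEADERS)

def fetch_page(url, retries=3, delay=1.0):
    """Return a page as gbk-decoded text, backing off when the site answers 403."""
    for attempt in range(retries):
        try:
            resp = session.get(url, timeout=10)
            if resp.status_code == 200:
                return resp.content.decode("gbk", errors="ignore")
            time.sleep(delay * (attempt + 1))   # 403 or other non-200: wait longer, then retry
        except requests.RequestException:
            time.sleep(delay * (attempt + 1))
    return ""
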

import requests
import time
from bs4 import BeautifulSoup
import os
from multiprocessing.dummy import Pool as ThreadPool
from multiprocessing import Pool
from threading import Thread
import pandas as pd
from pandas import DataFrame, Series
import numpy as np
class MyThread(Thread):
    def __init__(self, url, urls3):
        super().__init__()
        self.url = url
        self.urls3 = urls3

    def run(self):
        # Fetch one index page; 笔趣阁 pages are gbk-encoded
        res = requests.get(self.url).content.decode('gbk')
        soup = BeautifulSoup(res, "html.parser")
        # Left-column blocks
        contents = soup.find_all("div", attrs={"class": "l"})
        # Popular novels
        contents2 = soup.find_all("div", attrs={"class": "r"})
        # Fantasy, xianxia and urban romance novels
        contents3 = soup.find_all("div", attrs={"class": "novelslist"})
        # Recently updated novels
        contents4 = soup.find_all("div", attrs={"id": "newscontent"})
        for i, content in enumerate(contents):
            dts = content.find_all("dt")
            for dt in dts:
                try:
                    self.urls3.append(dt.a.get("href"))
                except Exception:
                    # Some <dt> entries carry no link; log which block failed
                    print(i)
        for c in contents2:
            lis = c.find_all("li")
            for li in lis:
                self.urls3.append(li.a.get("href"))
        for c in contents3:
            dts = c.find_all("dt")
            lis = c.find_all("li")
            for dt in dts:
                self.urls3.append(dt.a.get("href"))
            for li in lis:
                self.urls3.append(li.a.get("href"))
        for c in contents4:
            lis = c.find_all("li")
            for li in lis:
                self.urls3.append(li.a.get("href"))

    def result(self):
        return self.urls3


class MyThread1(Thread):
    def __init__(self, zzz, i):
        super().__init__()
        self.zzz = zzz
        self.i = i
        self.contents = ""
        self.urls = []  # assumed initialisation
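A minimal sketch of how MyThread can be driven: start one thread per index page, wait for them, then read the collected links. The index-page URL and the shared urls3 list below are assumptions for illustration, not values from the original post.

if __name__ == "__main__":
    urls3 = []                                     # shared list the worker threads append links into
    index_pages = ["https://www.biquge.example/"]  # placeholder; substitute the real 笔趣阁 address
    threads = [MyThread(u, urls3) for u in index_pages]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(len(urls3), "novel links collected")

Because every thread appends into the same Python list, and list.append is atomic under CPython's GIL, no extra locking is needed for this step.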
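The imports also bring in multiprocessing.dummy's ThreadPool, which points at downloading the collected pages in parallel. A minimal sketch of that step, reusing the hedged fetch_page helper and the urls3 list from the sketches above (the pool size and this usage are assumptions, not taken from the post):

from multiprocessing.dummy import Pool as ThreadPool

pool = ThreadPool(8)                   # a modest pool keeps the request rate low enough to avoid 403s
pages = pool.map(fetch_page, urls3)    # download every collected novel page in parallel
pool.close()
pool.join()

multiprocessing.dummy runs the pool on threads rather than processes, which suits an I/O-bound download loop like this one.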