Randomly switching proxies to request data in Python

Use free proxy IPs to make the requests. For how the proxy IPs were collected, see the earlier post "爬取站大爷的免费ip代理" (scraping the free IP proxies from 站大爷).

# coding=utf-8
import random

import requests

url = 'http://linuxdba.ltd'

# Free proxies collected earlier (see the post referenced above). Because the target
# URL is plain HTTP, only the 'http' entry of each dict is actually used here; the
# 'https' entries with an https:// scheme would only work against proxies that
# themselves accept TLS connections.
proxies_list = [{'http': 'http://59.55.161.88:3256', 'https': 'https://59.55.161.88:3256'},
                {'http': 'http://103.37.141.69:80', 'https': 'https://103.37.141.69:80'},
                {'http': 'http://27.191.60.168:3256', 'https': 'https://27.191.60.168:3256'},
                {'http': 'http://124.205.153.36:80', 'https': 'https://124.205.153.36:80'},
                {'http': 'http://139.224.18.116:80', 'https': 'https://139.224.18.116:80'},
                {'http': 'http://60.191.11.241:3128', 'https': 'https://60.191.11.241:3128'},
                {'http': 'http://120.194.55.139:6969', 'https': 'https://120.194.55.139:6969'},
                {'http': 'http://175.7.199.222:3256', 'https': 'https://175.7.199.222:3256'},
                {'http': 'http://27.191.60.5:3256', 'https': 'https://27.191.60.5:3256'},
                {'http': 'http://121.4.36.93:8888', 'https': 'https://121.4.36.93:8888'}]

# A mobile Chrome User-Agent so the requests look like ordinary browser traffic.
header = 'Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.107 Mobile Safari/537.36'

# Make 50 requests, picking a proxy at random each time. You could also check the
# proxies for validity first and drop the dead ones before entering the loop
# (see the sketch after the code).
for _ in range(50):
    try:
        proxies = random.choice(proxies_list)
        print(proxies)
        res = requests.get(url, headers={"User-Agent": header}, proxies=proxies, timeout=2)
        print(res.status_code)
        print(res.request.headers)
    except Exception as e:
        print(e)
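
As noted in the loop comment, it is worth weeding out dead proxies before the request loop. Below is a minimal pre-check sketch; the filter_alive helper, the reuse of the target URL as the test endpoint, and the 3-second timeout are illustrative assumptions, not part of the original script.

# Minimal proxy pre-check sketch. filter_alive, the test URL and the timeout are
# illustrative assumptions, not part of the original script.
import requests

def filter_alive(proxy_dicts, test_url='http://linuxdba.ltd', timeout=3):
    """Keep only the proxies that can complete a GET against test_url."""
    alive = []
    for proxies in proxy_dicts:
        try:
            requests.get(test_url, proxies=proxies, timeout=timeout)
            alive.append(proxies)
        except requests.RequestException:
            pass  # unreachable or dead proxy: drop it
    return alive

# Usage: proxies_list = filter_alive(proxies_list)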

Result: the target domain's Tomcat access log shows the requests coming in from the different proxy IPs.
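
If you do not have access to the target server's logs, a quick alternative check is to request an IP-echo endpoint through the proxy and compare the reported address with the proxy's IP. The sketch below uses httpbin.org/ip purely as an example endpoint (any "what is my IP" service works); it is not part of the original post.

# Verify the outgoing IP through a randomly chosen proxy. httpbin.org/ip is an
# example endpoint (assumption), returning JSON like {"origin": "59.55.161.88"}.
import random
import requests

proxies = random.choice(proxies_list)
try:
    res = requests.get('http://httpbin.org/ip', proxies=proxies, timeout=5)
    print(proxies, '->', res.json()['origin'])  # should show the proxy's IP, not your own
except Exception as e:
    print('proxy check failed:', e)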

 

