Web Scraping Series: How to Find the Cookie and User-Agent

Just do as shown in the screenshot below. The panel may be blank when you first open it; press F5 to refresh and the requests will appear. The User-Agent is also listed at the very bottom of that same panel!
[Screenshot: request headers panel showing the Cookie and User-Agent values]

Below is a basic scraping example written in VBA, which sets the Host, Referer, User-Agent, and Cookie request headers:

```
Sub webScraping()
    Dim xmlHttp As Object
    Dim htmlDoc As Object
    Dim url As String

    ' Target URL (example.com is a placeholder)
    url = "https://example.com"

    ' Create the XMLHttpRequest object
    Set xmlHttp = CreateObject("MSXML2.XMLHTTP")

    ' Prepare a synchronous GET request
    xmlHttp.Open "GET", url, False

    ' General request headers
    xmlHttp.setRequestHeader "Content-Type", "text/plain;charset=UTF-8"
    xmlHttp.setRequestHeader "Connection", "keep-alive"
    xmlHttp.setRequestHeader "Accept-Language", "en-US,en;q=0.9"
    xmlHttp.setRequestHeader "Accept-Encoding", "gzip, deflate, br"
    xmlHttp.setRequestHeader "Cache-Control", "max-age=0"
    xmlHttp.setRequestHeader "Upgrade-Insecure-Requests", "1"
    xmlHttp.setRequestHeader "Pragma", "no-cache"
    xmlHttp.setRequestHeader "DNT", "1"
    xmlHttp.setRequestHeader "Sec-Fetch-Site", "none"
    xmlHttp.setRequestHeader "Sec-Fetch-Mode", "navigate"
    xmlHttp.setRequestHeader "Sec-Fetch-User", "?1"
    xmlHttp.setRequestHeader "Sec-Fetch-Dest", "document"
    xmlHttp.setRequestHeader "Sec-Ch-Ua", """Google Chrome"";v=""93"", "" Not;A Brand"";v=""99"", ""Chromium"";v=""93"""
    xmlHttp.setRequestHeader "Sec-Ch-Ua-Mobile", "?0"

    ' Headers copied from the browser: Host, Referer, User-Agent, Cookie
    xmlHttp.setRequestHeader "Host", "example.com"
    xmlHttp.setRequestHeader "Referer", "https://www.google.com/"
    xmlHttp.setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3"
    xmlHttp.setRequestHeader "Cookie", "SESSIONID=1234567890abcdef"

    ' Send the request and wait for the response
    xmlHttp.send

    ' Load the response text into an HTML document for parsing
    Set htmlDoc = CreateObject("HTMLfile")
    htmlDoc.body.innerHTML = xmlHttp.responseText

    ' Print the result to the Immediate window
    Debug.Print htmlDoc.body.innerHTML
End Sub
```

Note that the sample code above is for reference only; the actual request header values need to be adjusted to match your own situation. Also check whether the site allows crawler access, otherwise you may trigger its anti-scraping mechanisms.
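If you want to confirm that the Cookie and User-Agent you copied from the browser are actually being sent, a quick sanity check is to point the same kind of request at https://httpbin.org/headers, a public echo service that returns the headers it received as JSON. This is only a minimal sketch; the header values below are placeholders, not real credentials.

```
Sub checkHeaders()
    ' Minimal sketch: ask httpbin.org to echo back the headers it received.
    ' The Cookie and User-Agent values are placeholders -- replace them with
    ' the ones copied from your own browser.
    Dim xmlHttp As Object
    Set xmlHttp = CreateObject("MSXML2.XMLHTTP")

    xmlHttp.Open "GET", "https://httpbin.org/headers", False
    xmlHttp.setRequestHeader "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3"
    xmlHttp.setRequestHeader "Cookie", "SESSIONID=1234567890abcdef"
    xmlHttp.send

    ' The response is a JSON object listing the headers the server saw
    Debug.Print xmlHttp.responseText
End Sub
```

If the echoed headers differ from what you set, the underlying HTTP stack may be overriding them, and you will need to adjust the request accordingly.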