Using the crawler framework gathertool
- Repository: https://github.com/mangenotwork/gathertool
- Install: go get github.com/mangenotwork/gathertool
- Intro: a lightweight framework for crawlers, API testing, and load testing, aimed at speeding up Go development for those scenarios.
- Documentation: https://380949.baklib-free.com/
Steps to obtain the cookie
- Request https://weibo.com and watch F12 → Network: the page loads https://passport.weibo.com/js/visitor/mini_original.js?v=20161116, and the JS in its main function reveals the URL and parameters used to fetch the tid.
- Fetch the tid via https://passport.weibo.com/visitor/genvisitor?cb=gen_callback&fp={"os":"1","browser":"Chrome70,0,3538,25","fonts":"undefined","screenInfo":"1920*1080*24","plugins":""}
- F12 → Network then shows https://passport.weibo.com/visitor/visitor?a=incarnate&t=hWJuIDjcJt14rjdJBmelhsQ0ReEl6ATZZnf2EQbrBQM=&w=3&c&cb=restore_back&from=weibo returning SUB and SUBP; these two are the cookies we need.
Implementing the cookie steps
- Get the tid
```go
getTidUrl := "https://passport.weibo.com/visitor/genvisitor?cb=gen_callback&fp={\"os\":\"1\",\"browser\":\"Chrome70,0,3538,25\",\"fonts\":\"undefined\",\"screenInfo\":\"1920*1080*24\",\"plugins\":\"\"}"
ctx, _ := gt.Get(getTidUrl)
ctx.Do()
log.Println(string(ctx.RespBody))
```
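The genvisitor response is JSONP: the JSON payload carrying `tid` and `new_tid` is wrapped in a `gen_callback(...)` call. Below is a minimal stdlib-only sketch of extracting both; the sample payload is illustrative, not a live capture, and the exact field layout is an assumption based on the wrapper seen above.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// genResp models only the fields we need from the gen_callback payload.
// The exact shape is an assumption, not confirmed against the live API.
type genResp struct {
	Data struct {
		Tid    string `json:"tid"`
		NewTid bool   `json:"new_tid"`
	} `json:"data"`
}

// parseTid strips the JSONP wrapper and decodes the JSON inside it.
func parseTid(body string) (tid string, newTid bool, err error) {
	start := strings.Index(body, "(")
	end := strings.LastIndex(body, ")")
	if start < 0 || end < start {
		return "", false, fmt.Errorf("no JSONP wrapper found")
	}
	var r genResp
	if err := json.Unmarshal([]byte(body[start+1:end]), &r); err != nil {
		return "", false, err
	}
	return r.Data.Tid, r.Data.NewTid, nil
}

func main() {
	// Illustrative sample response, not captured from a real request.
	sample := `window.gen_callback && gen_callback({"retcode":20000000,"msg":"succ","data":{"tid":"abc/def=","new_tid":true}});`
	tid, newTid, err := parseTid(sample)
	fmt.Println(tid, newTid, err)
}
```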
- Use the tid to get SUB and SUBP
Parameters:
a : fixed, incarnate
t : the tid obtained in the previous step
w : 3 if the previous response had new_tid = true, otherwise 2
cb : fixed, cross_domain (the request captured above uses restore_back)
from : fixed, weibo
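Because the tid is base64-like, it can contain characters such as `=`, `/`, and `+` that must be URL-encoded. A stdlib sketch of assembling the visitor URL from the parameters above (`buildVisitorURL` is a hypothetical helper name; `url.Values.Encode` emits parameters alphabetically, which differs from the captured URL's order but is accepted by servers):

```go
package main

import (
	"fmt"
	"net/url"
)

// buildVisitorURL assembles the incarnate request URL, URL-encoding the tid.
// w is "3" when the genvisitor response reported new_tid = true, else "2".
func buildVisitorURL(tid string, newTid bool) string {
	w := "2"
	if newTid {
		w = "3"
	}
	q := url.Values{}
	q.Set("a", "incarnate")
	q.Set("t", tid)
	q.Set("w", w)
	q.Set("c", "")
	q.Set("cb", "restore_back")
	q.Set("from", "weibo")
	return "https://passport.weibo.com/visitor/visitor?" + q.Encode()
}

func main() {
	// Hypothetical short tid, for illustration only.
	fmt.Println(buildVisitorURL("ab/c=", true))
}
```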
```go
getSubUrl := "https://passport.weibo.com/visitor/visitor?a=incarnate&t=h2b7xQtQwqk2cEjMgH/0AaWYvpijlgCCAs3qDzj2W58=&w=3&c&cb=restore_back&from=weibo"
ctx, _ := gt.Get(getSubUrl)
ctx.Do()
log.Println(ctx.Resp)
log.Println(ctx.Resp.Cookies())
log.Println(string(ctx.RespBody))
```
As the output shows, both the response body and the cookies set by the server contain SUB and SUBP.
Crawling Weibo with the cookie
```go
func case1() {
	/*
		category values:
		0      Hot (热门)
		1760   Headlines (头条)
		99991  Rankings (榜单)
		10011  Funny (搞笑)
		7      Society (社会)
		12     Fashion (时尚)
		10018  Movies (电影)
		10007  Beauties (美女)
		3      Sports (体育)
		10005  Anime (动漫)
	*/
	url := "https://weibo.com/a/aj/transform/loadingmoreunlogin?ajwvr=6&category=0&page=2&lefnav=0&cursor=&__rnd=" + gt.Timestamp()
	ctx, _ := gt.Get(url, gt.SucceedFunc(succed))
	ctx.Req.AddCookie(&http.Cookie{Name: "SUBP", Value: "0033WrSXqPxfM72-Ws9jqgMF55529P9D9WWENAjmKyIZz1AWjDi68mRw", HttpOnly: true})
	ctx.Req.AddCookie(&http.Cookie{Name: "SUB", Value: "_2AkMXxWiSf8NxqwFRmPoWz2nlbop1zwvEieKhmZlJJRMxHRl-yT9jqlAItRB6PEVGfTP09XmsX_7CR2H1OUv6b-f-1bJl", HttpOnly: true})
	ctx.Do()
}
```
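For readers not using gathertool, the same request can be built with plain `net/http`: create the request, then attach both cookies with `AddCookie`, which folds them into a single `Cookie` header. `newWeiboReq` and the cookie values here are illustrative, not from the source.

```go
package main

import (
	"fmt"
	"net/http"
)

// newWeiboReq builds the feed request and attaches the visitor cookies,
// mirroring what case1 does through gathertool's ctx.Req.
func newWeiboReq(sub, subp string) (*http.Request, error) {
	url := "https://weibo.com/a/aj/transform/loadingmoreunlogin?ajwvr=6&category=0&page=2&lefnav=0&cursor="
	req, err := http.NewRequest("GET", url, nil)
	if err != nil {
		return nil, err
	}
	req.AddCookie(&http.Cookie{Name: "SUB", Value: sub})
	req.AddCookie(&http.Cookie{Name: "SUBP", Value: subp})
	return req, nil
}

func main() {
	req, err := newWeiboReq("subValue", "subpValue")
	if err != nil {
		panic(err)
	}
	// Both cookies are carried in one Cookie header.
	fmt.Println(req.Header.Get("Cookie"))
}
```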
```go
func succed(ctx *gt.Context) {
	// html := gt.ConvertByte2String(ctx.RespBody, gt.GB2312)
	htmlBody, err := gt.UnescapeUnicode(ctx.RespBody)
	if err != nil {
		log.Println(err)
		return
	}
	// The payload is JSON-escaped HTML; strip the escape characters.
	html := string(htmlBody)
	html = strings.Replace(html, "\\r", "", -1)
	html = strings.Replace(html, "\\n", "", -1)
	html = strings.Replace(html, "\\", "", -1)
	dom, err := gt.NewGoquery(html)
	if err != nil {
		log.Println(err)
		return
	}
	dom.Find("div[action-type=feed_list_item]").Each(func(i int, div *goquery.Selection) {
		divHtml, err := div.Html()
		if err != nil {
			log.Println(err)
			return
		}
		log.Println("\n\n\n\n ******************************* \n ")
		dataList := gt.RegHtmlA(divHtml)
		for _, v := range dataList {
			if strings.Contains(v, `class="subinfo_face`) {
				log.Println("[avatar] : ", v)
				src := gt.RegHtmlSrc(v)
				log.Println(src)
			}
			if strings.Contains(v, `class="subinfo S_txt2`) {
				log.Println("[nickname] : ", v)
				href := gt.RegHtmlHref(v)
				log.Println(href)
				name := gt.RegHtmlSpan(v)
				log.Println(name)
			}
		}
		log.Println("\n===================================\n\n")
	})
}
```
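The `gt.RegHtmlSrc` / `gt.RegHtmlHref` helpers used above are regex-based attribute extractors. A stdlib sketch in the same spirit, using naive regexes that are fine for this feed's simple markup but not for HTML in general (the helper name `firstMatch` and the sample snippet are made up for illustration):

```go
package main

import (
	"fmt"
	"regexp"
)

// Naive attribute regexes, in the spirit of gathertool's RegHtmlSrc and
// RegHtmlHref helpers.
var (
	srcRe  = regexp.MustCompile(`src="([^"]+)"`)
	hrefRe = regexp.MustCompile(`href="([^"]+)"`)
)

// firstMatch returns the first capture group of re in s, or "".
func firstMatch(re *regexp.Regexp, s string) string {
	if m := re.FindStringSubmatch(s); len(m) == 2 {
		return m[1]
	}
	return ""
}

func main() {
	// Hypothetical snippet shaped like one feed item's avatar anchor.
	a := `<a class="subinfo_face" href="/u/123"><img src="https://example.com/face.jpg"></a>`
	fmt.Println(firstMatch(srcRe, a))  // the avatar image URL
	fmt.Println(firstMatch(hrefRe, a)) // the profile link
}
```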
Complete code
https://github.com/mangenotwork/gathertool/tree/main/_examples/weibo