Python Crawlers: Using a Proxy (Anti-Scraping)

from urllib import request
import random
import ssl

# Skip HTTPS certificate verification (many free proxies present
# invalid or self-signed certificates)
ssl._create_default_https_context = ssl._create_unverified_context

# Proxy list (sample addresses from the original post)
proxy_list = [
    {'https': '175.5.44.34:808'},
    {'https': '122.72.18.35:80'},
    {'https': '122.72.18.34:80'},
    {'https': '120.26.14.14:3128'},
    {'http': '118.212.137.135:31288'},
]
# Pick one proxy at random
proxy = random.choice(proxy_list)
# Handler that routes requests through the chosen proxy
httpProxy = request.ProxyHandler(proxy)

# Handler with no proxy (direct connection)
nullProxy = request.ProxyHandler()

# Switch: decide at runtime whether to use the proxy
switch = input('Use a proxy? y/n: ')
if switch == 'y':
    opener = request.build_opener(httpProxy)  # requests leave via the proxy IP, not our own
else:
    opener = request.build_opener(nullProxy)

base_url = "https://www.lagou.com/jobs/positionAjax.json?px=default&gx=%E5%85%A8%E8%81%8C&city=%E6%9D%AD%E5%B7%9E&needAddtionalResult=false&isSchoolJob=1"

# Fetch the Lagou job-listing API through whichever opener was chosen
req = request.Request(base_url)
response = opener.open(req)
content = response.read().decode('utf-8')
print(content)
/Library/Frameworks/Python.framework/Versions/3.6/bin/python3.6 /Users/apple/PycharmProjects/stage4/spider/2018_3_12/05daili_demo.py
Use a proxy? y/n: y
{"success":false,"msg":"您操作太频繁,请稍后再访问","clientIp":"122.72.18.34"}

Process finished with exit code 0

The `clientIp` field shows that Lagou saw the proxy's address (122.72.18.34) instead of the local IP, so the proxy took effect. The `msg` ("You are operating too frequently, please try again later") means this particular proxy has itself been rate-limited by the site's anti-scraping checks.
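Checking `clientIp` in the response is one way to confirm the proxy took effect; another is to hit an IP-echo service such as httpbin.org/ip (assumed reachable here) and compare the reported address. A minimal sketch; `proxied_ip` and `parse_origin` are illustrative names, not part of the original script:

```python
import json
from urllib import request


def parse_origin(payload):
    """httpbin.org/ip replies with JSON like b'{"origin": "1.2.3.4"}'."""
    return json.loads(payload.decode('utf-8'))['origin']


def proxied_ip(proxy, echo_url='http://httpbin.org/ip', timeout=5):
    """Return the IP address the echo service sees when traffic goes through `proxy`."""
    opener = request.build_opener(request.ProxyHandler(proxy))
    with opener.open(echo_url, timeout=timeout) as resp:
        return parse_origin(resp.read())
```

If `proxied_ip(proxy)` returns the proxy's address rather than your own, the handler is wired up correctly.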
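The rate-limit response above is the usual failure mode for a single free proxy. A natural extension of the script is to rotate: try each proxy in random order until one answers. A minimal sketch along those lines; `fetch_with_rotation` is a name invented here, and real code would want logging and per-proxy cooldowns:

```python
import random
from urllib import request, error


def fetch_with_rotation(url, proxies, timeout=5):
    """Try each proxy in random order; return the first successful response body."""
    last_err = None
    for proxy in random.sample(proxies, len(proxies)):
        opener = request.build_opener(request.ProxyHandler(proxy))
        try:
            with opener.open(url, timeout=timeout) as resp:
                return resp.read().decode('utf-8')
        except (error.URLError, OSError) as e:
            last_err = e  # proxy dead or blocked; move on to the next one
    raise RuntimeError('all %d proxies failed' % len(proxies)) from last_err
```

Called as `fetch_with_rotation(base_url, proxy_list)`, this replaces the single `opener.open(req)` call and only gives up after every proxy in the pool has failed.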


Copyright notice: https://blog.csdn.net/zbrj12345/article/details/80320117