[python] Crawler 3: Scraping All Posts from Yishu's Blog

#! /usr/bin/env python
#coding=utf-8
from urllib import urlopen
import os
import time

url = [''] * 350
page = 1
link = 1  # running count of links found

# the script writes downloaded pages into yishu/, so make sure it exists
if not os.path.isdir('yishu'):
    os.mkdir('yishu')

while page <= 4:
    arti = urlopen('http://blog.sina.com.cn/s/articlelist_1227636382_0_' + str(page) + '.html').read()
    i = 0
    title = arti.find(r'<a title=')
    href = arti.find(r'href=', title)
    html = arti.find(r'.html', href)
    while title != -1 and href != -1 and html != -1 and i < 40:
        url[i] = arti[href + 6:html + 5]
        print link, ' ', url[i]
        # advance to the next <a title= ... href= ... .html triple
        title = arti.find(r'<a title=', html)
        href = arti.find(r'href=', title)
        html = arti.find(r'.html', href)

        # download the article and save it under the last part of its URL
        content = urlopen(url[i]).read()
        filename = url[i][-26:]
        print ' ', filename
        open(r'yishu/' + filename, 'w+').write(content)
        print 'downloading', url[i]

        i = i + 1
        link = link + 1  # how many links have been found so far
        time.sleep(1)  # be polite to the server
    else:
        print page, 'find end'
    page = page + 1
else:
    print 'all find'
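The chained `find()` calls above are fragile: one missing `href=` throws every later offset off. A minimal sketch of the same link extraction done with a regular expression instead; the `sample` markup and the `extract_links` helper are assumptions modeled on the `<a title= ... href="....html"` pattern the crawler scans for, not Sina's actual page source:

```python
import re

# Assumed sample of the markup being scanned, shaped after the offsets
# used above: <a title="..." href="....html">
sample = ('<a title="post one" href="http://blog.sina.com.cn/s/blog_1.html">'
          '<a title="post two" href="http://blog.sina.com.cn/s/blog_2.html">')

def extract_links(html):
    """Return every href ending in .html inside an <a title=...> tag."""
    return re.findall(r'<a title=[^>]*?href="([^"]+?\.html)"', html)

print(extract_links(sample))
```

One `findall` replaces the whole inner bookkeeping loop, and there is no 40-link cap or manual index to maintain.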
