Using BeautifulSoup for Web Page Parsing in Python

BeautifulSoup is a third-party library for extracting data from HTML and XML documents.
Download: http://www.crummy.com/software/BeautifulSoup/

Installing BeautifulSoup on Linux:

  1. Unpack the archive and enter the beautifulsoup4-4.3.0 directory;
  2. Run python setup.py install (or sudo python setup.py install) on the command line;
  3. Run python to start the Python interpreter;
  4. Type from bs4 import BeautifulSoup to check that the installation succeeded, as shown in the snippet below.
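
A minimal verification sketch (assuming the install succeeded; bs4.__version__ reports the installed version):

import bs4
from bs4 import BeautifulSoup
print(bs4.__version__)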
'''
Created on 2016-4-15
https://www.crummy.com/software/BeautifulSoup/bs4/doc/index.html#beautiful-soup-4-2-0
@author: developer
'''

html_doc = """
<html><head><title>The Dormouse's story</title></head>
<body>
<p class="title"><b>The Dormouse's story</b></p>

<p class="story">Once upon a time there were three little sisters; and their names were
<a href="http://example.com/elsie" class="sister" id="link1">Elsie</a>,
<a href="http://example.com/lacie" class="sister" id="link2">Lacie</a> and
<a href="http://example.com/tillie" class="sister" id="link3">Tillie</a>;
and they lived at the bottom of a well.</p>

<p class="story">...</p>
"""
from bs4 import BeautifulSoup
import re

# Create the BeautifulSoup object from the HTML string
'''
html_doc                   # the HTML document string
'html.parser'              # the HTML parser to use
from_encoding='utf-8'      # the document encoding (only applies to bytes input)
'''
# html_doc is already a str, so from_encoding would be ignored and is left out
# here; the sketch below shows the bytes case where it applies.
soup = BeautifulSoup(html_doc, 'html.parser')
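# A minimal sketch of the bytes case (assumption: encoding html_doc to UTF-8 bytes
# stands in for raw bytes read from a file or an HTTP response):
raw_bytes = html_doc.encode('utf-8')
soup_from_bytes = BeautifulSoup(raw_bytes, 'html.parser', from_encoding='utf-8')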
########################################
print("获取所有链接")
#查找所有标签为a的节点
links = soup.find_all('a')
#获取查找到的节点的标签名称 node.name
#获取查找到的a节点的href属性 node['href']
#获取查找到的a节点的链接文字  node.get_text()
for link in links:
    print(link.name, link['href'], link.get_text())
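# find_all also accepts attribute filters; a minimal sketch against the same
# document (class_ is used because class is a Python keyword):
for sister in soup.find_all('a', class_='sister'):
    print(sister['id'], sister.get_text())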
#########################################
print("获取Lacie的链接")
link_node = soup.find('a', href="http://example.com/lacie")
print(link_node.name, link_node['href'], link_node.get_text())
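# The same node can also be looked up by its id attribute; a minimal sketch:
link_node_by_id = soup.find('a', id='link2')
print(link_node_by_id['href'], link_node_by_id.get_text())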

#########################################
print("正则匹配")
link_node = soup.find('a', href=re.compile(r"ill"))
print(link_node.name, link_node['href'], link_node.get_text())

#########################################
print("获取p段落文字")
p_node = soup.find('p', class_="title")      ##class为python中的关键字,所以加了下划线,避免冲突
print(p_node.name,  p_node.get_text())
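# CSS selectors are an equivalent query style; a minimal sketch using soup.select:
for b_node in soup.select('p.title b'):
    print(b_node.get_text())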


