Python Web Scraping in Practice, Lesson 6: BeautifulSoup Basic Operations (2)

Example 1: select an element by id

# -*- coding: UTF-8 -*-
from bs4 import BeautifulSoup
html_sample = ' \
<html> \
<body> \
<h1 id = "title">Hello World</h1> \
<a href ="#" class="link">this is link1</a> \
<a href = "# link2" Class = "link">This is link2</a> \
</body> \
</html>'




soup = BeautifulSoup(html_sample, 'html.parser')
# Use select to find all elements with id "title" (prefix the id with #); an id is normally unique on a page
alink = soup.select('#title')

print(alink)

Output:

[<h1 id="title">Hello World</h1>]
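A small extension (not part of the original lesson): select() always returns a list, so to work with the single matched element you index into it first; its text content is then available through the .text attribute. This sketch assumes the soup object built in the example above.

# Sketch: assumes the soup built from html_sample above
title = soup.select('#title')[0]   # select() returns a list; take the first (and only) match
print(title.text)                  # prints: Hello World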

Example 2: select elements by class

# -*- coding: UTF-8 -*-
from bs4 import BeautifulSoup
html_sample = ' \
<html> \
<body> \
<h1 id = "title">Hello World</h1> \
<a href ="#" class="link">this is link1</a> \
<a href = "# link2" Class = "link">This is link2</a> \
</body> \
</html>'




soup = BeautifulSoup(html_sample, 'html.parser')
# Use select to find all elements with class "link" (prefix the class name with a dot); a class is usually shared by several elements
for link in soup.select('.link'):
    print(link)


Output:

<a class="link" href="#">this is link1</a>

<a class="link" href="# link2">This is link2</a>


Example 3: select all elements with a given tag name

# -*- coding: UTF-8 -*-
from bs4 import BeautifulSoup
html_sample = ' \
<html> \
<body> \
<h1 id = "title">Hello World</h1> \
<a href ="#" class="link">this is link1</a> \
<a href = "# link2" Class = "link">This is link2</a> \
</body> \
</html>'




soup = BeautifulSoup(html_sample, 'html.parser')
# Use select with a plain tag name to find all <a> elements
alinks = soup.select('a')
for link in alinks:
    print(link)

Output:

<a class="link" href="#">this is link1</a>

<a class="link" href="# link2">This is link2</a>

Example 4: extract the href attribute of every <a> tag

# -*- coding: UTF-8 -*-
from bs4 import BeautifulSoup
html_sample = ' \
<html> \
<body> \
<h1 id = "title">Hello World</h1> \
<a href ="#" class="link">this is link1</a> \
<a href = "# link2" Class = "link">This is link2</a> \
</body> \
</html>'




soup = BeautifulSoup(html_sample, 'html.parser')
# Use select to find all <a> tags, then read each tag's href link
alinks = soup.select('a')
for link in alinks:
    print(link['href'])

Output:

#
# link2
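One more note (assuming the soup above): indexing with link['href'] raises a KeyError if a tag lacks that attribute, while link.get('href') returns None instead, which is safer when scraping real pages. A minimal sketch:

# Sketch: print each link's text together with its href, tolerating missing attributes
for link in soup.select('a'):
    print(link.text, link.get('href'))   # .get() returns None if href is missing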
