You will often see odd foreign spiders in your site logs crawling your pages aggressively, and a robots.txt file does nothing to stop them. The straightforward fix is to block these junk spiders outright.
How to block junk spiders that ignore robots.txt
Method 1: Block the spider's IPs
Blocking by IP is ideal when you can, but these spiders usually crawl from more than one IP, so on its own this method is often not very effective. For the cases where you have pinned down the addresses, a minimal sketch follows.
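If your access log does reveal the offending addresses, the nginx sketch below shows the idea; the IPs here are placeholders from the documentation ranges, not real spider addresses, so substitute the ones you actually see.
# Hypothetical example: deny known spider IPs inside the server block
# (192.0.2.10 and 198.51.100.0/24 are placeholder documentation addresses;
# replace them with addresses taken from your own access log)
deny 192.0.2.10;
deny 198.51.100.0/24;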
Method 2: Block by User-Agent in the nginx server block
# Block several spiders at once by User-Agent
# (the names below are only sample substrings; baiduspider and googlebot
# belong to legitimate search engines, so list the bots you actually want gone)
if ($http_user_agent ~* (baiduspider|googlebot|bing|sogou|yahoo)) {
    return 503;
}
# Block a single spider
if ($http_user_agent ~* baiduspider) {
    return 503;
}
This approach works well; Method 2 is the recommended one.
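After reloading nginx you can verify the rule by spoofing the User-Agent with curl; example.com below is a placeholder for your own domain, and the response should come back as 503 Service Unavailable.
# Reload the configuration, then request the site as a blocked spider
nginx -s reload
curl -I -A "baiduspider" https://example.com/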
Blocking spiders in Apache: copy the code below into your .htaccess file
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Block spiders by User-Agent, but leave robots.txt itself reachable
    RewriteCond %{HTTP_USER_AGENT} "SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
    RewriteRule !(^robots\.txt$) - [F]
</IfModule>
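These rules only take effect if mod_rewrite is loaded and AllowOverride permits .htaccess files in that directory. As a sketch, assuming a Debian/Ubuntu apache2 layout, enabling the module and checking the block looks like this; example.com is again a placeholder, and a blocked User-Agent should receive 403 Forbidden.
# Enable mod_rewrite and reload Apache (Debian/Ubuntu layout assumed)
a2enmod rewrite
systemctl reload apache2
# A blocked User-Agent should now get 403 Forbidden
curl -I -A "SemrushBot" https://example.com/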
Blocking spiders in IIS: add the rule below to web.config
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block spider">
          <!-- Match every URL except robots.txt -->
          <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
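Note that this rule depends on the IIS URL Rewrite module being installed, and AbortRequest ends the connection without sending an HTTP response, so a blocked client sees a dropped connection rather than an error page. A quick check, with example.com standing in for your site:
# A blocked User-Agent should see the connection closed with no response
# (curl typically reports "Empty reply from server")
curl -I -A "SemrushBot" http://example.com/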