I recently noticed that one of my sites had become very slow. The Nginx logs showed a large number of requests from junk crawlers such as Easou (宜搜). These crawlers ignore robots rules, put real load on the server, and bring no traffic to the site, so I collected the various methods of blocking junk spiders found around the web and am recording them here for future reference.
Apache
1. Blocking via the .htaccess file
Edit the .htaccess file in the site's root directory and add one of the following snippets (any one of the variants below will do):
Variant (1) — note that the regex must be quoted, because several of the user-agent names contain spaces:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "(^$|FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|DotBot|Ezooms)" [NC]
RewriteRule ^(.*)$ - [F]
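Before deploying the rule, you can sanity-check the blacklist regex locally with grep. This is a quick sketch: the pattern below is a shortened copy of the alternation above, and the sample user agents are made up for illustration; in practice, paste in the full pattern from your .htaccess.

```shell
#!/bin/sh
# Shortened copy of the blacklist pattern from the RewriteCond above;
# substitute the full alternation from your .htaccess in practice.
PATTERN='(^$|AhrefsBot|MJ12bot|EasouSpider|Python-urllib|ApacheBench)'

# Sample user agents: one junk crawler, one normal browser.
# Prints one "blocked:"/"allowed:" line per user agent.
for ua in 'Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)' \
          'Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/115.0'; do
    if printf '%s' "$ua" | grep -Eiq "$PATTERN"; then
        echo "blocked: $ua"
    else
        echo "allowed: $ua"
    fi
done
```

Once the rule is live, you can also test it end to end with e.g. `curl -I -A "AhrefsBot" http://your-site/` (the -A flag sets the User-Agent header) and check that the response is 403 Forbidden.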
Variant (2) — again, the regex must be quoted because of the embedded spaces:
SetEnvIfNoCase ^User-Agent$ ".*(FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|DotBot|Ezooms)" BADBOT
Order Allow,Deny
Allow from all
Deny from env=BADBOT
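A caveat: Order/Allow/Deny is Apache 2.2 syntax. On Apache 2.4 and later (unless the mod_access_compat compatibility module is loaded), the equivalent access control for variant (2) is written with Require directives, roughly:

```apache
# Apache 2.4+ equivalent of the Order/Allow/Deny lines above
<RequireAll>
    Require all granted
    Require not env BADBOT
</RequireAll>
```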
Variant (3)
If either of the two variants above causes a 500 error, use the following instead:
SetEnvIfNoCase User-Agent "^$" bad_bot
SetEnvIfNoCase User-Agent "^FeedDemon" bad_bot
SetEnvIfNoCase User-Agent "^Indy Library" bad_bot
SetEnvIfNoCase User-Agent "^Alexa Toolbar" bad_bot
(continue with one SetEnvIfNoCase line for each user agent in the list from variant (1))
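For variant (3) to actually block anything, the bad_bot environment variable still has to be fed into a deny rule, just as BADBOT is in variant (2). Presumably the matching (2.2-style) deny block looks like this:

```apache
# Deny requests flagged by the SetEnvIfNoCase lines above
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```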