Rule file under Linux: .htaccess (manually create a .htaccess file in the site root directory)
<IfModule mod_rewrite.c>
RewriteEngine On
#Block spider
RewriteCond %{HTTP_USER_AGENT} "Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
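To verify the rules take effect, request the site once with a normal user agent and once with an agent from the blocked list. Below is a minimal sketch using only the Python standard library; the host example.com and the exact user-agent strings are placeholders. Apache's [F] flag answers blocked agents with HTTP 403, while robots.txt stays reachable.

import urllib.error
import urllib.request

def status(url, user_agent):
    """Return the HTTP status for a request sent with the given User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code   # e.g. 403 when the agent is blocked
    except urllib.error.URLError:
        return None       # connection dropped instead of an HTTP status

print(status("http://example.com/", "Mozilla/5.0"))           # expect 200
print(status("http://example.com/", "SemrushBot"))            # expect 403 on Apache
print(status("http://example.com/robots.txt", "SemrushBot"))  # robots.txt is exempt: 200

Note that the IIS rule below uses AbortRequest, which closes the connection rather than returning a status code, so on Windows a blocked request shows up as a connection error instead of 403.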
Rule file for Windows Server 2008, 2012, or later: web.config (manually create a web.config file in the site root directory)

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block spider">
          <match url="(^robots\.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="Apache-HttpClient|SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Note: the pattern on the "{HTTP_USER_AGENT}" line lists the user-agent names of unidentified spiders; add further names as needed, separated by "|". The rules above block a default set of unidentified spiders; to block other spiders, add them to the pattern in the same way. For reference, the names of the major spiders:

Google spider: googlebot
Baidu spider: baiduspider
Baidu mobile spider: baiduboxapp
Yahoo spider: slurp
Alexa spider: ia_archiver
MSN spider: msnbot
Bing spider: bingbot
AltaVista spider: scooter
Lycos spider: lycos_spider_(t-rex)
AllTheWeb spider: fast-webcrawler
Inktomi spider: slurp
Youdao spiders: YodaoBot and OutfoxBot
热土 spider: Adminrtspider
Sogou spider: sogou spider
SOSO spider: sosospider
360 Search spider: 360spider
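Some of the names above contain regex metacharacters (the parentheses in lycos_spider_(t-rex), the space in sogou spider), so they should be escaped before being added to the pattern. A small sketch, assuming you keep the names in a Python list and want to generate the "|"-separated alternation; both Apache (PCRE) and IIS URL Rewrite (.NET regex) accept the resulting backslash escapes:

import re

spiders = [
    "googlebot",
    "baiduspider",
    "lycos_spider_(t-rex)",
    "sogou spider",
]
# re.escape backslash-escapes any regex metacharacters so that each
# name is matched literally inside the alternation.
pattern = "|".join(re.escape(name) for name in spiders)
print(pattern)  # paste the result into the RewriteCond / pattern attribute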
Permalink: https://www.qqidc.com.cn/blog/230.html