Nginx Anti-Crawler

Date: 2025-05-03 09:34:19

Original article: http://abublog.com/nginx_agent_deny.html

Go into the conf directory under the nginx installation directory and save the following snippet as agent_deny.conf:

# cd /usr/local/nginx/conf

# vi agent_deny.conf

#Block crawling by tools such as Scrapy
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
    return 403;
}

#Block the listed user agents, as well as requests with an empty user agent
if ($http_user_agent ~ "WinHttp|WebZIP|FetchURL|node-superagent|java/|FeedDemon|Jullo|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|Java|Feedly|Apache-HttpAsyncClient|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|BOT/0.1|YandexBot|FlightDeckReports|Linguee Bot|^$") {
    return 403;
}

#Block request methods other than GET, HEAD and POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
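Before deploying, the long UA alternation can be sanity-checked offline. The sketch below uses grep -E, which behaves the same as nginx's PCRE engine for a plain alternation like this one; block_re is a shortened, illustrative subset of the full list above, and the function name is ours, not nginx's:

```shell
# Offline sanity check of the UA blocklist regex (illustrative subset).
# grep -E without -i mirrors the case-sensitive ~ operator of the second rule.
block_re='WinHttp|WebZIP|HttpClient|Python-urllib|^$'

# is_blocked UA -> exit 0 if the second rule would reject this User-Agent.
# printf adds a trailing newline so an empty UA becomes an empty line,
# which the ^$ alternative matches (that is how the empty-UA block works).
is_blocked() {
    printf '%s\n' "$1" | grep -Eq "$block_re"
}

is_blocked "Python-urllib/3.9" && echo "Python-urllib/3.9: blocked"
is_blocked "Mozilla/5.0 (X11; Linux x86_64)" || echo "Mozilla/5.0: allowed"
is_blocked "" && echo "empty UA: blocked"
```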

Then, insert the following line into the server block of the relevant site configuration:

include agent_deny.conf;
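For context, a minimal server block with the include in place might look like the sketch below; the domain and root are placeholders. A relative include path is resolved against the conf directory, which is why saving the file in /usr/local/nginx/conf above is enough:

```nginx
server {
    listen       80;
    server_name  example.com;      # placeholder domain

    # Pull in the anti-crawler rules saved above
    include agent_deny.conf;

    location / {
        root   html;               # placeholder document root
        index  index.html;
    }
}
```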

After saving, run the following command to gracefully reload nginx:

/usr/local/nginx/sbin/nginx -s reload
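An optional, slightly safer habit (not part of the original post) is to validate the configuration first and reload only if the syntax check passes; the path assumes the same /usr/local/nginx prefix used above:

```shell
# Validate the configuration with -t, then reload only on success.
NGINX=/usr/local/nginx/sbin/nginx
if [ -x "$NGINX" ]; then
    "$NGINX" -t && "$NGINX" -s reload
else
    echo "nginx not found at $NGINX (adjust the path for your install)"
fi
```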