So, I have an application that does some heavy lifting, and bots like to maliciously crawl it for data. It's a search application with a ton of facets.
Normal traffic might make a few requests in a minute, but the malicious traffic hits the site every second to every few seconds (typically 3 or 4). The problem is that while 15 requests a minute is a dead ringer for a bot, it's also plausible for a normal power user (or a group of users at an allowed partner sharing the same IP).
The difference is that normal users don't keep that pace up over a span of several minutes. Here's my setup, which isn't really working:
http {
    ...
    limit_req_zone $limit zone=php:10m rate=30r/m;
    ...
}

server {
    ...
    location ~ \.php$ {
        ...
        limit_req zone=php burst=25 nodelay;
        ...
    }
    ...
}
I'm pretty sure my rate/burst is all out of whack for what I'm trying to catch, and I'm not convinced nginx rate limiting is right for per-minute limits. It seems more aligned with limiting by the second.
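For context on what I've understood so far: nginx enforces `rate=30r/m` as one request every 2 seconds (a leaky bucket), not as a rolling 30-per-minute counter, and `burst` is the number of extra requests a client can "save up" before excess requests are rejected. So my current config lets 25 requests through instantly, then refills one slot every 2 seconds. A sketch of the direction I'm considering (the `$limit` variable is set elsewhere via a map, and these numbers are guesses, not tested values):

```
http {
    # rate=15r/m is enforced as one request every 4 seconds,
    # not as a per-minute counter.
    limit_req_zone $limit zone=php:10m rate=15r/m;

    server {
        location ~ \.php$ {
            # burst=20 absorbs short spikes; nodelay serves them
            # immediately instead of queueing. A bot hitting every
            # 1-4 seconds outruns the 1-per-4s refill and drains the
            # burst within a minute or a few minutes, then gets 503s.
            # A user who makes 15 requests in one minute and then
            # slows down never exhausts it.
            limit_req zone=php burst=20 nodelay;
        }
    }
}
```

If I'm reading the bucket math right, that would distinguish a sustained crawler from a one-minute spike, but I'm not sure I am, hence this question.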
Any insight or assistance would be appreciated!