Dynamic IP Restriction strategy
This morning one of our sites was attacked by a bot probing for vulnerabilities. All requests came from a single IP address, but the bot averaged only 12-16 page requests per minute. It hit a variety of pages, each of which loads 20-40 static resources (images, CSS, JS, etc.).
Armed with this knowledge, what is a good strategy for enabling Dynamic IP Restrictions on IIS 7.5? I can see that I can enable "logging only mode", but I'm not entirely sure how best to analyze the log files to solve this problem.
What I don't want to do is lock out my real users; I only want to abort requests from bots.
We recently tried to set up the Dynamic IP Restrictions module for one of our larger sites, and it has not been easy. Reading the comments on the original question, I had to smile broadly when I saw "Don't waste your time." It's mostly true.
I'll still give a few hints, though, about what you need to look out for:
- You might have to separate your page requests from your static-asset requests, i.e. move the latter to their own domain, so that the restrictions you set up apply only to page loads and not to static assets.
- You'll need to define and test your denial criteria carefully: is it something like 10 requests per second, or 50 requests per 10 seconds? 12 requests per 60 seconds is most likely not a good denial criterion, because it would affect a lot of legitimate users.
- Checking the "Logging Only Mode" checkbox will log pseudo-denied requests to the IIS log, but with a status code of 200 and a substatus code of 502 (so make sure you're logging the substatus). In addition, the Advanced Logging module will not log those requests correctly, so forget about using it if you want to use Dynamic IP Restrictions and stay informed about denied requests.
- You have to continuously watch your logs, and you may eventually have to whitelist some external proxies' IP addresses so that users behind them are treated correctly as separate users rather than as a single user (with the proxy's IP).
- You cannot penalize clients for some amount of time after they hit your request limits; as soon as they throttle down, their requests pass through again.
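As a rough sketch of the kind of denial criteria and proxy whitelisting discussed above, here is what the configuration might look like in web.config. The `dynamicIpSecurity` section, its attributes, and the example IP addresses are based on my understanding of the module's schema and may differ for the version you have installed, so verify them against your own IIS configuration schema before copying anything:

```xml
<system.webServer>
  <security>
    <!-- Assumed section name for the Dynamic IP Restrictions module;
         check your installed version's schema. -->
    <dynamicIpSecurity enableLoggingOnlyMode="true">
      <!-- Example criterion: deny when a client exceeds 50 requests in a
           10-second window. Deliberately looser than 12 requests/minute
           so that legitimate users are unlikely to be caught. -->
      <denyByRequestRate enabled="true"
                         maxRequests="50"
                         requestIntervalInMilliseconds="10000" />
      <!-- Hypothetical proxy address: whitelist it so the many users
           behind it are not counted as a single client. -->
      <allowList>
        <add ipAddress="192.0.2.10" />
      </allowList>
    </dynamicIpSecurity>
  </security>
</system.webServer>
```

With `enableLoggingOnlyMode="true"`, nothing is actually denied; the module only records what it would have denied, which lets you tune the rate threshold before enforcing it.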
We started trying this module out over a month ago and still haven't switched off "Logging Only Mode", because there's just so much that doesn't feel or work right...
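To answer the "how to best look at the log files" part: in logging-only mode you can scan the W3C-format IIS logs for entries with status 200 and substatus 502 and count them per client IP, which shows who would have been denied under your current criteria. A minimal sketch in Python, assuming the `c-ip`, `sc-status`, and `sc-substatus` fields are enabled in your logging configuration (the `#Fields:` header in the log declares the actual column layout):

```python
from collections import Counter

def would_be_denied(log_lines):
    """Count, per client IP, the requests that Dynamic IP Restrictions
    flagged in logging-only mode (status 200, substatus 502)."""
    fields = []
    hits = Counter()
    for line in log_lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            # The log declares its own column layout; remember it.
            fields = line.split()[1:]
            continue
        if not line or line.startswith("#"):
            continue  # skip other comment/header lines
        row = dict(zip(fields, line.split()))
        # Logging-only mode records pseudo-denied requests as 200.502.
        if row.get("sc-status") == "200" and row.get("sc-substatus") == "502":
            hits[row.get("c-ip", "?")] += 1
    return hits

# Example with made-up log data (RFC 5737 documentation addresses):
sample = """\
#Fields: date time c-ip cs-method cs-uri-stem sc-status sc-substatus
2012-05-01 10:00:01 203.0.113.7 GET /page1 200 0
2012-05-01 10:00:02 203.0.113.7 GET /page2 200 502
2012-05-01 10:00:03 198.51.100.9 GET /page1 200 502
2012-05-01 10:00:04 203.0.113.7 GET /page3 200 502
""".splitlines()

print(would_be_denied(sample))  # → Counter({'203.0.113.7': 2, '198.51.100.9': 1})
```

Running something like this over a day's log and eyeballing the top IPs makes it much easier to judge whether your current threshold would catch only bots or also real users.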