Blocking Apache access via user agent string
I've got a scripter who is using a proxy to attack a website I'm serving.
I've noticed that they tend to access the site via software with a certain common user agent string (e.g. http://www.itsecteam.com/en/projects/project1_page2.htm "Havij advanced SQL injection software", which uses a user agent string of Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) Havij). I'm aware that any cracking software worth its salt will probably be able to modify its user agent string, but I'm fine with the scripter having to deal with that at some point.
So, is there any software out there for automatically blocking access & permanently blacklisting by matching user agent strings?
Solution 1:
You can deny access with SetEnvIf (or BrowserMatch) combined with Deny. Example:
SetEnvIfNoCase User-Agent "^Wget" bad_bot
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_bot
SetEnvIfNoCase User-Agent "^EmailWolf" bad_bot
<Directory "/var/www">
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
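On Apache 2.4+, the Order/Allow/Deny directives are deprecated (they live in mod_access_compat); the mod_authz_core equivalent would be something like:

```
<Directory "/var/www">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Directory>
```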
To permanently block them you have to write a custom log file and then use something like fail2ban to ban the offending IPs with iptables.
First, create a LogFormat:
LogFormat "%a %{User-agent}i" ipagent
Add the logging to your vhost or server-wide configuration:
CustomLog /var/log/apache2/useragent.log ipagent
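With that format, each request produces one line containing the client IP followed by the raw user agent, so a hit from the Havij tool would look like this (the IP shown is an example from the documentation range):

```
203.0.113.5 Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) Havij
```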
Then create /etc/fail2ban/filter.d/baduseragent.conf:
[Definition]
failregex = ^<HOST> Mozilla/4\.0 \(compatible; MSIE 7\.0; Windows NT 5\.1; SV1; \.NET CLR 2\.0\.50727\) Havij$
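Before enabling the jail, you can verify that the filter actually matches your log; fail2ban ships the fail2ban-regex tool for exactly this (paths assume the files above):

```
fail2ban-regex /var/log/apache2/useragent.log /etc/fail2ban/filter.d/baduseragent.conf
```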
And add a jail to /etc/fail2ban/jail.conf:
[apache-bad-user-agent]
enabled = true
port = 80,443
protocol = tcp
filter = baduseragent
maxretry = 1
bantime = 86400
logpath = /var/log/apache2/useragent.log
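After reloading fail2ban you can confirm the jail is live and see currently banned IPs:

```
fail2ban-client reload
fail2ban-client status apache-bad-user-agent
```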
Solution 2:
I think I understand your question. I will provide a more detailed explanation if this is what you are looking for. (This will work as a trap for other things as well.)
- Enable the mod_rewrite engine in Apache.
- Create a trap.php; visiting it can do whatever you like. For example, I made it add every visitor's IP to a blacklist that denies access to my site.
- Create a file of the user agents you don't like, one per line, like this:
bad_useragent [tab] black
useragent_bad [tab] black
- Now add a mod_rewrite rule set that looks the user agent up in the map of bad user agents and rewrites to your trap if there is a match. The rules may look like this (note that RewriteMap is only valid in server or virtual-host context, not in .htaccess):
RewriteMap badlist txt:~/bad_useragent_list
RewriteCond %{HTTP_USER_AGENT} (.*) [NC]
RewriteCond ${badlist:%1|white} ^black$ [NC]
RewriteRule (.*) "/trap.php" [L]
- This matches the user agent against the keys in your file. If it isn't found, it is assumed to be "white" and the request passes through unmodified; if it is found and the associated value is "black", the request is rewritten to /trap.php, which does whatever you like. (Keys in a txt RewriteMap cannot contain whitespace, so this works for single-token user agent strings.)
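To check the trap end-to-end, you can spoof a user agent from the command line with curl's -A flag; with useragent_bad mapped to black in the map file, a request like this should be served trap.php (the hostname is a placeholder):

```
curl -A "useragent_bad" http://www.example.com/any/page
```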
- Some possible ideas: have another script watch a common file that trap.php writes IPs to. When this file changes, the watcher reads the new entries, parses out the IP addresses, and adds a rule to iptables blocking all traffic from those addresses. I hope this helps! Again, if you would like more detail, just reply here.
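As a sketch of that watcher idea — assuming trap.php appends lines containing the visitor's IP to /var/www/trap_hits.txt (the path and the helper name are made up for illustration) — the IP extraction can be a small shell function, with the inotify/iptables loop shown as comments since it needs root and inotify-tools:

```shell
# Hypothetical helper: pull unique IPv4 addresses out of the file trap.php
# appends to, so a watcher loop can feed them to iptables.
extract_ips() {
    grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}' "$1" | sort -u
}

# Watcher loop (requires inotify-tools and root; shown as a comment):
#   while inotifywait -e modify /var/www/trap_hits.txt; do
#       extract_ips /var/www/trap_hits.txt | while read -r ip; do
#           iptables -C INPUT -s "$ip" -j DROP 2>/dev/null ||
#           iptables -A INPUT -s "$ip" -j DROP
#       done
#   done
```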