Apache website: How to protect against URL testing

I have a website running on Linux with Apache. In the access log I can see several IP addresses trying to reach common, well-known applications such as phpMyAdmin, probably hoping to compromise the server if one is found.

Of course, I do not have phpMyAdmin publicly available, but I wonder how to protect against this kind of attack. Maybe there is a tool under Linux that can detect this and automatically block the offending IP address? Or a tool that automatically sends me an email when such probing is detected?

What is the state of the art for this?

Thanks for your help.


Solution 1:

ModSecurity(TM) is an open source intrusion detection and prevention engine for web applications; it can also be called a web application firewall. It runs embedded in the web server, acting as a shield in front of your applications and protecting them from attacks.

You can download mod_security from http://www.modsecurity.org/download/

ModSecurity configuration directives are added directly to your Apache configuration file (typically httpd.conf):

    <IfModule mod_security.c>
       # mod_security configuration directives
       # ...
    </IfModule>
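For example, a minimal setup might look like the following. This is only a sketch, assuming the legacy ModSecurity 1.x directive set described in the documentation linked at the end of this answer:

    <IfModule mod_security.c>
       # Turn the request filtering engine on
       SecFilterEngine On
       # Also inspect POST payloads, not just the query string
       SecFilterScanPOST On
       # Default action when a filter matches: log and reject with a 403
       SecFilterDefaultAction "deny,log,status:403"

       # Individual SecFilter / SecFilterSelective rules go here
    </IfModule>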

You can mitigate several classes of attacks by configuring it as follows.

Cross site scripting attacks

    SecFilter "<script"
    SecFilter "<.+>"

The first filter protects only against JavaScript injection using the <script> tag. The second filter is more general and disallows any HTML code in parameters.

SQL/database attacks

    SecFilter "delete[[:space:]]+from"
    SecFilter "insert[[:space:]]+into"
    SecFilter "select.+from"

This can protect you from many SQL-related attacks. These are only examples; you need to craft your filters carefully depending on the actual database engine you use.

A filter like this:

    SecFilterSelective ARGS "bin/"

will detect attempts to execute binaries residing in various folders on a Unix-like operating system.

Buffer overflow attacks

    SecFilterByteRange 32 126

This will only accept requests consisting of bytes in this range (printable ASCII).

If you want to allow multiple ranges, regular expressions come to the rescue. You can use something like:

    SecFilterSelective THE_REQUEST "!^[\x0a\x0d\x20-\x7f]+$"

For further information regarding mod_security, please refer to the following URL:

http://modsecurity.org/documentation/modsecurity-apache/1.9.3/html-multipage/index.html

Solution 2:

The correct approach to this is not to block attempts by malicious bots to scan the contents of your web server.

Focusing on blocking the scans to achieve security by obscurity will not yield any real protection against a determined attacker, or even against a bot that guesses a vulnerable URL correctly in its first couple of attempts, before your countermeasures kick in. Not to mention the impact on a real user whose address gets caught up with the botnet traffic and blocked.

Instead, direct your effort toward ensuring that the content you expose to the world is secure.

Know your web footprint, secure it, and be confident that the bots can scan your web presence all day without doing any damage.

Solution 3:

I'd recommend having a look at fail2ban. It is most commonly used for blocking SSH attacks, but it can work with anything that writes parseable log files. The fail2ban daemon (typically) injects new iptables rules, which is a lot more efficient and scalable than trying to apply filtering in userspace. That said, fail2ban does not preclude filtering on more complex criteria in userspace, using e.g. the referer, browser, cookies, etc.

You can set it to look for a high proportion of 404 errors, or drop in a script at the URLs that are being targeted which pokes fail2ban to close the door; a sketch of such a jail is shown below.
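As a rough sketch (the filter name, URL list, regex, and thresholds below are assumptions you would adapt to your own log format and traffic), a jail that bans clients producing 404s on typical probe URLs could look like this:

    # /etc/fail2ban/filter.d/apache-probe.conf  (hypothetical filter name)
    [Definition]
    # Match access-log lines where a client got a 404 on a commonly probed path;
    # the path list and regex are examples only -- adapt them to your own logs.
    failregex = ^<HOST> .* "(GET|POST) /(phpmyadmin|pma|myadmin|mysql)[^"]*" 404
    ignoreregex =

    # /etc/fail2ban/jail.local
    [apache-probe]
    enabled  = true
    port     = http,https
    filter   = apache-probe
    logpath  = /var/log/apache2/access.log
    maxretry = 3
    findtime = 600
    bantime  = 86400

If you also want the e-mail notification the question asks about, fail2ban's bundled mail actions (for example the sendmail-whois action) can send you a message each time a ban is applied.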