Limit simultaneous connections per IP with Apache2
I am hosting a public CPU-heavy web service, and I would like to configure Apache2 to allow only 1 simultaneous connection per IP address to the location at hand, to prevent single clients from consuming too many resources.
Is there a neat Apache2 solution for this? I have looked into mod_bw, but it does not seem to do the trick (its MaxConnections applies to all users, not per IP). There is also a module called apache2-mod-limitipconn, but it has no precompiled packages and appears to be no longer maintained, as its website is dead. I would prefer something that I can include as a formal dependency in Ubuntu.
Solution 1:
I'm not sure about an Apache module, but you could use iptables for this. Here is a howto about the connlimit module: http://www.cyberciti.biz/faq/iptables-connection-limits-howto/
In your situation, something like the following would work:
/sbin/iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT --reject-with tcp-reset
However, as mentioned in other questions, be careful: this rule might block some legitimate robots (like the Google crawler) or ISPs/organisations that use NAT and share a single IP address among a large number of users.
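To soften that problem, connlimit can also group connections by subnet instead of by a single address, so a whole NATed network shares one budget rather than each client getting its own. A rough sketch (the thresholds are only example values, adjust to your traffic):

# limit each individual IP to 10 concurrent connections on port 80
/sbin/iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 10 -j REJECT --reject-with tcp-reset

# or group by /24 so one shared limit applies to the whole subnet
/sbin/iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 50 --connlimit-mask 24 -j REJECT --reject-with tcp-reset

# inspect and remove the rules later if needed
/sbin/iptables -L INPUT -n --line-numbers
/sbin/iptables -D INPUT <rule-number>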
Solution 2:
You can use the mod_qos httpd module.
QS_SrvMaxConnPerIP <number> [<connections>]
Defines the maximum number of connections per source IP address for this server (virtual host). The optional "connections" argument defines the number of busy connections the server (all virtual hosts) must reach before the limitation kicks in; the default is 0, which means the limitation is always enabled, even when the server is idle.
More details at http://mod-qos.sourceforge.net.
Example:
LoadModule qos_module path_to_module/mod_qos.so
<IfModule mod_qos.c>
# maximum number of connections per IP
QS_SrvMaxConnPerIP 15
</IfModule>
<VirtualHost *:80>
...
</VirtualHost>
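Regarding the question's requirement of a formal Ubuntu dependency: mod_qos is packaged for Ubuntu/Debian. The exact package name may vary by release, but installation should look roughly like this:

sudo apt-get install libapache2-mod-qos
sudo a2enmod qos
sudo service apache2 restart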
Solution 3:
I just had the opposite problem: Apache2 was only serving one page at a time per IP. If a page took a long time to load, a page opened in another tab would not start loading until every earlier request I had made to that server had completed.
I knew it was related to sessions and found the full answer here and here.
Solution 4:
One connection per IP address is not going to work. A web browser will use one connection to download the web page, then 10+ simultaneous connections to fetch all the images, CSS, JavaScript files, etc. So if you limit by IP, the user will get the main page and maybe a few images, and that is all.
The only use case where limiting by IP works is a dedicated download server where you don't want people using download accelerators, i.e. the RapidShare model.
You need to look into how the abusers are abusing your services and target them specifically. If you limit everyone, then everyone suffers.
If it's simply a case of too much traffic, then you need to optimize the site or add more CPU cycles with more/faster hardware.
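For example, if the abuse is concentrated on the one expensive location mentioned in the question, a more targeted option than a blanket per-IP limit is to cap concurrency for that URL alone. A sketch using mod_qos (the /heavy path and the limit of 5 are made-up values; check the QS_LocRequestLimit documentation for your mod_qos version):

<IfModule mod_qos.c>
    # allow at most 5 concurrent requests to the expensive location,
    # while the rest of the site stays unrestricted
    QS_LocRequestLimit /heavy 5
</IfModule>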