HAProxy - http_req_rate
Solution 1:
Natural browsing (i.e. by a person) can produce bursts of connections. For example, if you load a web page, plus the images on that page, plus CSS, plus some JavaScript, you could pretty quickly get to 10 connections in a second.
It is also reasonable that a person loads a home page/landing page and then within a few seconds clicks a second page.
However, it is unlikely that a natural person would generate traffic at a sustained pace over a longer time period - they will not click page after page after page ad infinitum.
This sustained traffic is what (typically) differentiates automated traffic from occasional bursts associated with natural browsing.
To put it another way, the longer the period you average over, the more confident you can be that the traffic exceeding a "connections per second" threshold is the result of automation.
However, choosing a longer period is not without disadvantages: a longer period means higher memory requirements for the stick table and a slower response to abusive traffic once it starts. The choice is a balance between these factors.
Once you have chosen the period to average over, set the request threshold conservatively so that you do not catch (i.e. block) natural browsing.
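To make this concrete, here is a minimal sketch of what such a setup might look like in an HAProxy frontend. The names (`fe_web`, `be_app`), window length, table size, and threshold are all illustrative assumptions, not recommendations; you would tune them against your own traffic.

```
frontend fe_web
    bind :80
    # Track each client IP's request rate over a 30s window.
    # Illustrative sizing: ~100k tracked IPs, entries expire after 1m of inactivity.
    stick-table type ip size 100k expire 1m store http_req_rate(30s)
    http-request track-sc0 src
    # 60 requests per 30s averages out to ~2 req/s sustained - a burst of 10
    # requests for one page load passes, but page-after-page automation does not.
    http-request deny deny_status 429 if { sc_http_req_rate(0) gt 60 }
    default_backend be_app
```

Note how the longer 30s window tolerates the burst described above (loading a page plus its images, CSS, and JavaScript) while still catching traffic that sustains that pace, which is exactly the trade-off the averaging period is meant to capture.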