Christopher Schultz wrote:

To whom it may concern,

On 5/19/15 8:09 AM, javalishixml wrote:
I just understood you. I really appreciate your feedback.


How do we judge that it's a robot? Item 1: we find the request IP is
always the same one. Item 2: our page may contain several
keep-alive connections, but the attacking client only focuses on
a single connection.

Based upon the first request, how can you tell that the robot is going
to make later keep-alive requests?

Based on these 2 items, we think the client is a robot.

Can you write some pseudo-code that shows the algorithm in its
simplest form?

I think combining these 2 items to decide that a client is a robot may be a bit complex. Let's start from the simple case.

If we keep finding the same IP requesting the same URL on our
website many times, can I block those requests at the httpd level?
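The "same IP requesting the same URL many times" check can be sketched as pseudo-code, as Chris asked. Below is a minimal, illustrative Java version: a fixed-window counter keyed on (client IP, URL). The class name and the MAX_HITS/WINDOW_MS thresholds are made up for the example, not taken from any real module.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Minimal sketch of a repeat-request check: block a client once the same
 * IP has hit the same URL more than MAX_HITS times within one window.
 * Thresholds are illustrative only.
 */
public class RepeatRequestDetector {

    static final int MAX_HITS = 100;        // hits allowed per window
    static final long WINDOW_MS = 60_000L;  // window length (1 minute)

    // key = clientIp + "|" + url
    private final Map<String, Counter> counters = new ConcurrentHashMap<>();

    private static final class Counter {
        long windowStart;
        int hits;
    }

    /** Returns true if this request should be blocked. */
    public boolean shouldBlock(String clientIp, String url, long nowMs) {
        String key = clientIp + "|" + url;
        Counter c = counters.computeIfAbsent(key, k -> new Counter());
        synchronized (c) {
            if (nowMs - c.windowStart > WINDOW_MS) {
                c.windowStart = nowMs;  // start a new window
                c.hits = 0;
            }
            c.hits++;
            return c.hits > MAX_HITS;
        }
    }

    public static void main(String[] args) {
        RepeatRequestDetector d = new RepeatRequestDetector();
        long now = System.currentTimeMillis();
        boolean blocked = false;
        for (int i = 0; i < 150; i++) {
            blocked = d.shouldBlock("10.0.0.1", "/login", now);
        }
        System.out.println(blocked);  // prints true: 150 hits exceed MAX_HITS
    }
}
```

In a real deployment this kind of check would live in front of the application (httpd module, filter, or firewall), and you would also want to expire old entries so the map does not grow without bound.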

This sounds like a job for mod_qos, mod_evasive, or mod_security.
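For example, mod_evasive implements roughly the per-IP, per-URL counting described above. A sketch of its httpd.conf configuration (the threshold values here are illustrative, not recommendations):

```apache
<IfModule mod_evasive20.c>
    # Block an IP that requests the same page more than 5 times
    # within a 1-second interval...
    DOSPageCount     5
    DOSPageInterval  1
    # ...or the site as a whole more than 100 times per second.
    DOSSiteCount     100
    DOSSiteInterval  1
    # How long (seconds) the offending IP stays blocked.
    DOSBlockingPeriod 10
</IfModule>
```

mod_qos and mod_security cover similar ground with their own directive sets; check each module's documentation for the exact syntax of the version you install.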

- -chris

+1.
Also, a quick Google search for "apache filtering unwanted requests" gives a bunch of results you may want to explore.


