On Tuesday, December 19, 2006 11:01 PM [EDT], howard chen <[EMAIL PROTECTED]> wrote:
> Has anyone been thinking of using analog to detect abnormal requests?
When I worked on a system that generated dynamic price lists, we occasionally noticed that it was being spidered (by a competitor, we presumed). I would do a quick log analysis with Analog, with all reports off except the Host report, sorted by Page Requests, and showing the number of Requests and the number of Page Requests. Any spider would stick out like a sore thumb: spiders usually don't request images, so the number of Page Requests would match the number of Requests. We could then do some further investigation and decide whether or not to block that IP address. (A caching proxy server might also request only pages, since it might have cached most of the common images, so it would be important to check the request pattern to see whether it indicated ordinary use or more methodical spidering.)
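For reference, the report setup described above can be reproduced with a minimal Analog configuration along these lines (a sketch only; adjust the log and output paths to your own installation):

```
# Turn all reports off, then re-enable just the Host report
ALL OFF
HOST ON
# Sort hosts by page requests; show Requests (R) and Page Requests (P)
HOSTSORTBY PAGES
HOSTCOLS RP
```

A host near the top of that report whose R and P columns are equal is the "sore thumb" pattern described above.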
> For example, for a DoS attack from a remote IP, it would not be efficient to implement detection as a real-time system in something like PHP. But if we analyze the log with analog every 30 minutes and pick out the abnormal requests, it would be quite interesting. And most importantly, analog is fast and will not hurt your system.
We never automated this process; we just used it as an investigative technique when other monitors indicated a problem. But you could certainly use it as a primary monitor to look for specific troublesome patterns.
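If someone did want to automate it, one first cut would be a small script run from cron every 30 minutes that applies the same heuristic directly to the raw access log. This is only an illustrative sketch, not anything Analog provides: the `flag_spiders` name, the image-extension list, the request threshold, and the Common Log Format assumption are all mine.

```python
import re
from collections import Counter

# Paths ending in these extensions are treated as non-page requests,
# mirroring the Requests vs. Page Requests distinction in the Host report.
NON_PAGE = re.compile(r"\.(gif|jpe?g|png|ico|css|js)(\?|$)", re.IGNORECASE)

# Common Log Format: host, identd, user, [date], "METHOD path ..."
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

def flag_spiders(lines, min_requests=50):
    """Return hosts whose requests are ALL page requests (no images),
    i.e. the pattern where Page Requests == Requests."""
    requests, pages = Counter(), Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        host, path = m.groups()
        requests[host] += 1
        if not NON_PAGE.search(path):
            pages[host] += 1
    return [h for h, n in requests.items()
            if n >= min_requests and pages[h] == n]
```

Anything the script flags would still warrant the manual inspection described above (a caching proxy can show the same signature) before you block the address.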
Aengus
+------------------------------------------------------------------------
| TO UNSUBSCRIBE from this list:
|   http://lists.meer.net/mailman/listinfo/analog-help
|
| Analog Documentation: http://analog.cx/docs/Readme.html
| List archives: http://www.analog.cx/docs/mailing.html#listarchives
| Usenet version: news://news.gmane.org/gmane.comp.web.analog.general
+------------------------------------------------------------------------

