Hi Nick,

Not directly answering your question, but I came across a pretty neat
article about detecting naughty bots, keeping them at bay, and sometimes
giving them a taste of their own medicine (if that is the goal of what
you are trying to do):

http://www.fleiner.com/bots/
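
On the log-counting side: the tail/awk one-liner you quoted can be wrapped
into a small shell function and run from cron, or over ssh against each of
your five servers. A minimal sketch, assuming the default Apache common or
combined log format (client IP in the first field); the log path and the
hostnames in the comment are made up, so adjust them for your setup:

```shell
#!/bin/sh
# top_ips: print the most frequent client IPs in the last N lines of an
# Apache access log. Assumes the client IP is the first field, as in the
# default common/combined log formats.
top_ips() {
    log="$1"
    lines="${2:-100000}"   # how far back in the log to look
    top="${3:-20}"         # how many IPs to report
    tail -n "$lines" "$log" | awk '{print $1}' | sort | uniq -c | sort -rn | head -n "$top"
}

# Example of running the same report across several servers
# (hostnames and log path are hypothetical):
# for h in web1 web2 web3 web4 web5; do
#     echo "== $h =="
#     ssh "$h" "tail -n 100000 /var/log/apache2/access.log" | awk '{print $1}' \
#         | sort | uniq -c | sort -rn | head -20
# done
```

Stick a call to that in an hourly cron job and mail yourself the output,
and you have a rough real-time view without touching mysql.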

Cheers
Aaron

----- Original Message ----- 
From: "Nick Le Mouton" <[EMAIL PROTECTED]>
To: "NZ PHP Users Group" <[email protected]>
Sent: Tuesday, November 11, 2008 12:39 PM
Subject: [phpug] [OT] Tracking top IPs


>
> I've been having some problems with some dodgy spiders accessing my
> website and taking it down by overloading it. At the moment we use
> analytics and awstats for stats, but it doesn't give a real time view
> of who's accessing our website.
>
> I found a command to run across the apache access file which will show
> how many times an IP is logged in x lines of the log file.
>
> tail -100000 access.log | awk '{print $1}' | sort | uniq -c |sort -nr
>
> This is fine, but I'd like to setup a system to do this automatically,
> something that would show me easily who the top IPs were for the last
> hour/day (across 5 web servers).
>
> I could do this with mysql, but it wouldn't grab all the image/js/css
> accesses and if the website goes down (because of overloading) then
> mysql would get bogged down and not log.
>
> Can anyone think of a good way to do this? Is there a shell/perl
> script that would do something like what I'm after?
>
> Thanks in advance
> Nick


--~--~---------~--~----~------------~-------~--~----~
NZ PHP Users Group: http://groups.google.com/group/nzphpug
To post, send email to [email protected]
To unsubscribe, send email to
[EMAIL PROTECTED]
-~----------~----~----~----~------~----~------~--~---
