Re: [PHP] Limiting repetitive file access

2003-11-16 Thread Andre Dubuc
On Sunday 16 November 2003 10:45 pm, David Otton wrote:
> On Sat, 15 Nov 2003 23:52:31 -0500, you wrote:
> >Recently, a 'user' repeatedly attempted to access a restricted area of my
> > site over a span of five hours, requesting the same URL again and again
> > [probably by script]. A massive log file was generated. I would like to
> > ban such behavior by limiting the number of successive GETs a user can
> > make (say 4 attempts) before an appropriate action is taken.
> >
> >As a temporary measure (until I can figure out a better way) the URL in
> > question was disabled.
> >
> >What I'd like to do, on a per-file basis using $_SESSION, is combine the
> > IP address with a counter that records the number of times that file was
> > accessed, and limit the number of successive GETs that can be made
> > before the file is no longer accessible.
>
> Sessions won't work unless the client at the other end co-operates by
> returning the session cookie, which an attack script is unlikely to do.
>
> Blocking is never an ideal solution - automated processing can catch
> innocent users, a DDoS can knock most sites over, IP bans can be got around
> by abusing open proxies, and assholes seem to have infinite free time. It's
> rarely worth getting into an arms race with them because ultimately, if
> someone wants your machine off the net badly enough, there's nothing you
> can do about it.
>
> Having said that, the earlier in the chain you block malicious traffic the
> better. ISP > router > firewall > web server > script. I think maybe a
> script that adds a firewall rule when triggered would be effective in your
> case (a quick Google will probably find something like this). Just bear in
> mind that it's not foolproof.
>
> You could also try blocking requests that lack a Referer: header... but
> again it's not really reliable.


Thanks David,

After reading about the options I have at my disposal, I'm thinking that 
one bad 'user' in a year's operation ain't too shoddy. Perhaps the best line 
of attack on this problem is to do nothing at all.

Aside from filling up my webaccess.log, he didn't stop the flow of traffic, 
nor did he get anywhere. I seem to have done more damage than he did! -- I 
just hosed that file good . . . sigh . . . It's been a great weekend - it 
began with a 44-hour power outage, both phone lines dead, then I hosed my 
access file, and I finished it off by taking out my development db -- not 
bad, huh? (Perhaps I need some sleep??)

Tomorrow, I'll have a nice chat with my restrictive ISP and figure out what 
I am allowed to do . . . Personally, I like the firewall trigger -- put 'em 
out of business early!

Thanks again for giving me the 'straight-goods'!

Regards,
andre

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Limiting repetitive file access

2003-11-16 Thread David Otton
On Sat, 15 Nov 2003 23:52:31 -0500, you wrote:

>Recently, a 'user' repeatedly attempted to access a restricted area of my 
>site over a span of five hours, requesting the same URL again and again 
>[probably by script]. A massive log file was generated. I would like to ban 
>such behavior by limiting the number of successive GETs a user can make (say 
>4 attempts) before an appropriate action is taken.
>
>As a temporary measure (until I can figure out a better way) the URL in 
>question was disabled.
>
>What I'd like to do, on a per-file basis using $_SESSION, is combine the IP 
>address with a counter that records the number of times that file was 
>accessed, and limit the number of successive GETs that can be made before 
>the file is no longer accessible.

Sessions won't work unless the client at the other end co-operates by
returning the session cookie, which an attack script is unlikely to do.
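To see why: if the client never sends the session cookie back,
session_start() creates a brand-new session on every request. A minimal
sketch, assuming nothing beyond the standard session extension:

<?php
// Every request that arrives without the session cookie gets a fresh
// session, so a script that ignores cookies looks like a first-time
// visitor on each hit and a per-session counter never climbs past 1.
session_start();
echo session_id(); // differs on every request from a non-cooperating client
?>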

Blocking is never an ideal solution - automated processing can catch
innocent users, a DDoS can knock most sites over, IP bans can be got around
by abusing open proxies, and assholes seem to have infinite free time. It's
rarely worth getting into an arms race with them because ultimately, if
someone wants your machine off the net badly enough, there's nothing you can
do about it.

Having said that, the earlier in the chain you block malicious traffic the
better. ISP > router > firewall > web server > script. I think maybe a
script that adds a firewall rule when triggered would be effective in your
case (a quick Google will probably find something like this). Just bear in
mind that it's not foolproof.
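As a rough illustration of such a trigger script -- a minimal sketch only,
assuming a Linux host with iptables and a sudo rule letting the web server
user run it; the counter file path and threshold are made up for the
example:

<?php
// Minimal sketch, not production code: tally hits per IP in a flat
// file and, past a threshold, shell out to iptables to drop further
// traffic from that address.

$ip    = $_SERVER['REMOTE_ADDR'];
$limit = 4;                        // attempts allowed
$file  = '/tmp/hits-' . md5($ip);  // one counter file per IP

$hits = file_exists($file) ? (int) file_get_contents($file) : 0;
$hits++;
file_put_contents($file, $hits);

if ($hits > $limit) {
    // Block at the firewall so later requests never reach PHP at all.
    exec('sudo /sbin/iptables -A INPUT -s ' . escapeshellarg($ip) . ' -j DROP');
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>

Blocking at the firewall means subsequent requests are dropped before they
ever reach the web server or PHP, which is the point of moving up the chain.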

You could also try blocking requests that lack a Referer: header... but
again it's not really reliable.
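In sketch form, assuming the standard $_SERVER superglobal:

<?php
// Weak heuristic only: refuse requests that arrive without a Referer
// header. Trivial to spoof, and some legitimate clients omit it.
if (empty($_SERVER['HTTP_REFERER'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>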

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Limiting repetitive file access

2003-11-15 Thread Raditha Dissanayake
Hi,

While your solution is feasible, it would still consume processor time and 
memory because you are doing this at a very high level. You would be better 
off solving it at a lower level through proper use of a firewall. What you 
have described sounds like a script-kiddie attempt at denial of service or 
brute-force cracking.



Andre Dubuc wrote:

Hi,

Recently, a 'user' repeatedly attempted to access a restricted area of my 
site over a span of five hours, requesting the same URL again and again 
[probably by script]. A massive log file was generated. I would like to ban 
such behavior by limiting the number of successive GETs a user can make (say 
4 attempts) before an appropriate action is taken.

As a temporary measure (until I can figure out a better way) the URL in 
question was disabled.

What I'd like to do, on a per-file basis using $_SESSION, is combine the IP 
address with a counter that records the number of times that file was 
accessed, and limit the number of successive GETs that can be made before 
the file is no longer accessible.

In a script that checks for bad words, I have used:



if ($_SESSION['text'] == "badwords") {
	// flag the session and bounce the visitor to the "unwanted" page
	$_SESSION['attempt'] = 1;
	header("Location: unwanted.php");
	exit; // stop the script so nothing after the redirect runs
}

[In the file unwanted.php I checked for $_SESSION['attempt'] == 1 and booted 
the user if the condition was met]

However, using this approach I cannot increment this number without resorting 
to a file get/put scheme. Is there a way around this? Is there a better 
approach?
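One answer suggested by the session mechanism itself: $_SESSION values 
persist server-side between requests, so the counter can be incremented 
directly, no file get/put needed -- with the caveat, noted elsewhere in the 
thread, that the client must return the session cookie. A minimal sketch 
(the threshold of 4 and the names follow the question above):

<?php
session_start();

// Initialise the counter on the first visit, then bump it on each
// subsequent request -- the session layer persists it between hits,
// so no file get/put is needed.
if (!isset($_SESSION['attempt'])) {
    $_SESSION['attempt'] = 0;
}
$_SESSION['attempt']++;

// After 4 attempts, boot the visitor to the "unwanted" page.
if ($_SESSION['attempt'] > 4) {
    header('Location: unwanted.php');
    exit;
}
?>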

I've tried .htaccess but the user in question has a dynamic address.

Any help appreciated.
Tia,
Andre
 



--
Raditha Dissanayake.

http://www.radinks.com/sftp/ | http://www.raditha.com/megaupload
Lean and mean Secure FTP applet with | Mega Upload - PHP file uploader
Graphical User Interface. Just 150 KB | with progress bar.
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Limiting repetitive file access

2003-11-15 Thread Andre Dubuc
Hi,

Recently, a 'user' repeatedly attempted to access a restricted area of my 
site over a span of five hours, requesting the same URL again and again 
[probably by script]. A massive log file was generated. I would like to ban 
such behavior by limiting the number of successive GETs a user can make (say 
4 attempts) before an appropriate action is taken.

As a temporary measure (until I can figure out a better way) the URL in 
question was disabled.

What I'd like to do, on a per-file basis using $_SESSION, is combine the IP 
address with a counter that records the number of times that file was 
accessed, and limit the number of successive GETs that can be made before 
the file is no longer accessible.

In a script that checks for bad words, I have used:

if ($_SESSION['text'] == "badwords") {
	// flag the session and bounce the visitor to the "unwanted" page
	$_SESSION['attempt'] = 1;
	header("Location: unwanted.php");
	exit; // stop the script so nothing after the redirect runs
}

[In the file unwanted.php I checked for $_SESSION['attempt'] == 1 and booted 
the user if the condition was met]

However, using this approach I cannot increment this number without resorting 
to a file get/put scheme. Is there a way around this? Is there a better 
approach?

I've tried .htaccess but the user in question has a dynamic address.

Any help appreciated.
Tia,
Andre

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php