I'm using a script called BlackHole that uses PHP and a "hidden" 
directory to catch bad bots and block them from the site by IP address 
[http://perishablepress.com/blackhole-bad-bots/].  It's working very well, but 
I'd like to expand on the idea.

        When looking at our access logs, I see someone "adding code" to our 
URLs in an apparent attempt to hack into our site.  If our URL is like this:

http://www.example.com/pages/bolts.php

        they are adding to it like this:

http://www.example.com/pages/bolts.php/wp-content/themes/functions/thumb.php.cache/external_e19f4bb51bc2262e07d23b79b916c12c.php
or
http://www.example.com/pages/bolts.php/cart.php
or
http://www.example.com/pages/bolts.php/wp-content/themes/functions/thumb.phptimthumb.php
or
http://www.example.com/pages/bolts.php/bratislava/stare-mesto

        Is there a way that I can trap the "extra" info in the URLs and pass 
it to the BlackHole script to ban these attempts as well?  I'm thinking I'll 
need to compare the entered URL with $_SERVER["PHP_SELF"], and if there's extra 
info after the page name, like another "*.php", then ban the IP.  I realize 
that's a rather simplistic explanation and there may be a better way to 
accomplish this; a rough sketch of what I'm picturing is below.
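
        Something along these lines, perhaps, placed near the top of each page. 
Note that blackhole_ban() here is just a placeholder for however the BlackHole 
script actually records/bans an IP; the rest uses the standard $_SERVER values:

<?php
// Sketch only: detect "extra" path info appended after the real script name.
// blackhole_ban() is a placeholder for whatever the BlackHole script exposes.

$script = $_SERVER['SCRIPT_NAME'];   // e.g. /pages/bolts.php
$self   = $_SERVER['PHP_SELF'];      // e.g. /pages/bolts.php/cart.php
$extra  = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';

// Any trailing PATH_INFO, or a PHP_SELF that doesn't match SCRIPT_NAME,
// means something was appended to the real URL.
if ($extra !== '' || $self !== $script) {
    $ip = $_SERVER['REMOTE_ADDR'];
    error_log("URL tampering from $ip: " . $_SERVER['REQUEST_URI']);
    blackhole_ban($ip);              // placeholder: hand the IP to BlackHole
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>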

        If anyone has any examples they can share, I'd really appreciate it.

Thanks,
Marc