On Thursday 30 May 2002 19:42, James Holden wrote:
> I do it a different way - using IPs can be a bit of a bummer since they
> might change, but as often as not the hostnames don't.
>
> I detect spiders using the HTTP_USER_AGENT, which identifies them as,
> say, "kitty once hourly", "GoogleBot" or "Lycos" or some such - most of
> the decent spiders use the user_agent var to identify themselves, and you
> can display alternate info depending on which spider it is.

Do note that some (most?) crawlers don't take kindly to this practice of 
delivering content based on the HTTP_USER_AGENT. If you're found out, you'll 
most likely be removed from the search engine in question.
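
For anyone who wants to try it anyway, the detection itself is only a few 
lines. A rough sketch of the idea (the user-agent substrings and the two 
include filenames are just placeholders, not anything from James' setup):

<?php
// Rough sketch: check $_SERVER['HTTP_USER_AGENT'] against a list of
// crawler name substrings and return the matching name, or false.
function detect_spider()
{
    $agent = isset($_SERVER['HTTP_USER_AGENT'])
        ? $_SERVER['HTTP_USER_AGENT'] : '';

    // Illustrative substrings only - real crawlers identify themselves
    // in various ways, so adjust this list to taste.
    $spiders = array('Googlebot', 'Lycos', 'Slurp');

    foreach ($spiders as $spider) {
        if (stristr($agent, $spider) !== false) {
            return $spider;
        }
    }
    return false;
}

if (detect_spider()) {
    // Crawler detected - serve the alternate page. Bear in mind the
    // cloaking caveat above.
    include 'spider_version.php';   // placeholder filename
} else {
    include 'normal_version.php';   // placeholder filename
}
?>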

-- 
Jason Wong -> Gremlins Associates -> www.gremlins.com.hk
Open Source Software Systems Integrators
* Web Design & Hosting * Internet & Intranet Applications Development *

/*
On the whole, I'd rather be in Philadelphia.
                -- W.C. Fields' epitaph
*/

