On Fri, Feb 22, 2008 at 12:38 PM, Wade Preston Shearer <[EMAIL PROTECTED]> wrote:

> > Do you want them to crawl, but just get throttled? Or no spider?
>
> Sorry, I didn't ask my question well. They are free to crawl as much
> as they would like. I don't want to assign them a session if they are
> a bot/spider. I am looking for a way (besides manually maintaining a
> user-agent list) to automatically disqualify them from getting a
> session assigned to them.
>
>
You could check $_SERVER['HTTP_USER_AGENT'] and/or use get_browser() before
initiating the session.
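Something along these lines, as a minimal sketch. It assumes browscap is
configured in php.ini so get_browser() can report the crawler flag; the
substring fallback list is only an illustration, not a maintained bot list:

<?php
// Sketch: skip session creation for requests that look like bots/spiders.
// Assumes browscap.ini is set up in php.ini so get_browser() has data;
// the substring checks below are illustrative only.

function looks_like_bot()
{
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

    // No user agent at all is a reasonable hint it's not a normal browser.
    if ($agent === '') {
        return true;
    }

    // Ask browscap whether this agent is flagged as a crawler.
    $info = @get_browser($agent, true);
    if (is_array($info) && !empty($info['crawler'])) {
        return true;
    }

    // Cheap fallback: a few obvious substrings (illustrative, not exhaustive).
    foreach (array('bot', 'spider', 'crawl') as $needle) {
        if (stripos($agent, $needle) !== false) {
            return true;
        }
    }

    return false;
}

if (!looks_like_bot()) {
    session_start();
}
?>

That keeps the pages crawlable while only handing out session IDs (and the
storage that goes with them) to visitors that appear to be real browsers.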

-- 
-
http://stderr.ws/
"Insert pseudo-insightful quote here." - Some Guy

_______________________________________________

UPHPU mailing list
[email protected]
http://uphpu.org/mailman/listinfo/uphpu
IRC: #uphpu on irc.freenode.net
