Gerald Richter wrote:
> >
> > Having tested sessions (via Embperl and Apache::Session) I have
> > discovered an awful lot of rows in my database being created by robots.
> > My assumption is that the robots don't support cookies so every time
> > they hit a page on the site a new session is created in the database.
> > When they trawl a discussion board, for example, they can very quickly
> > create hundreds of sessions.
> >
> > How do people deal with this?  The only thing I can think to do is to
> > set a separate cookie on every page on the site, and then only try to
> > create session data if that cookie is actually set.  The disadvantage
> > here is that it's a pain to do and even then I couldn't start a session
> > on the first page they look at.
> >
> 
> One solution would be to delete the session if the user agent is a known
> robot.
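
A sketch of that user-agent check (in Python rather than Perl, just to show
the shape of the idea). The robot patterns and the `session_store` dict are
illustrative stand-ins, not a real robot list or a real Apache::Session store:

```python
import re

# Illustrative robot patterns -- a real deployment would use a
# maintained list of known crawler user agents.
ROBOT_PATTERNS = [r'googlebot', r'slurp', r'spider', r'crawler', r'bot\b']

def is_known_robot(user_agent):
    """Return True if the User-Agent header matches a known robot."""
    ua = (user_agent or '').lower()
    return any(re.search(p, ua) for p in ROBOT_PATTERNS)

def handle_request(user_agent, session_store, session_id):
    """Create or fetch a session, unless the client is a known robot."""
    if is_known_robot(user_agent):
        # Drop any session this robot just caused to be created, so
        # crawlers don't leave one row in the database per page hit.
        session_store.pop(session_id, None)
        return None
    return session_store.setdefault(session_id, {})
```

The weakness, of course, is that this only catches robots you recognise by
name; an unknown crawler still creates sessions.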

I have an idea that may or may not be useful to you.  If the user
comes to a random page without a valid cookie, you redirect them to
the front page.  After that point, they have a valid cookie, and they
can access all the other pages.  The robots would get redirected once,
try all the links on your front page (which would all redirect back to
the front page), and then give up.  
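
The redirect scheme above could be sketched roughly like this (again in
Python as a neutral sketch; `FRONT_PAGE`, the cookie name `session`, and the
`dispatch` helper are all hypothetical names, not part of any real handler):

```python
# Requests without a valid session cookie are bounced to the front
# page, which issues the cookie; cookie-less robots therefore never
# get past the front page and give up after one redirect per link.
FRONT_PAGE = '/'

def dispatch(path, cookies, new_session_id):
    """Return (status, location_or_page, cookie_to_set)."""
    if path == FRONT_PAGE:
        # The front page always serves, setting a cookie if needed.
        sid = cookies.get('session') or new_session_id
        return 200, FRONT_PAGE, sid
    if 'session' not in cookies:
        # No cookie yet: redirect instead of creating a throwaway
        # session row for this hit.
        return 302, FRONT_PAGE, None
    return 200, path, None
```

Only the front-page handler ever creates a session, so a robot trawling a
discussion board costs at most one database row instead of hundreds.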

If your cookie has a long expiry, and you refresh it whenever the user
comes back, this won't stop users from bookmarking pages and returning
to them later.

Useful, maybe ?

Jim

-- 
 Jim Peters         /             __   |  \              Aguazul
                   /   /| /| )| /| / )||   \
 jim@aguazul.      \  (_|(_|(_|(_| )(_|I   /        www.aguazul.
  demon.co.uk       \    ._)     _/       /          demon.co.uk
