----- Original Message -----
From: "Gerald Richter" <[EMAIL PROTECTED]>
To: "Michael Smith" <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Tuesday, November 14, 2000 11:50 AM
Subject: Re: robots and sessions


> >
> > Having tested sessions (via Embperl and Apache::Session) I have
> > discovered an awful lot of rows in my database being created by robots.
> > My assumption is that the robots don't support cookies so every time
> > they hit a page on the site a new session is created in the database.
> > When they trawl a discussion board, for example, they can very quickly
> > create hundreds of sessions.
> >
> > How do people deal with this?  The only thing I can think to do is to
> > set a separate cookie on every page on the site, and then only try to
> > create session data if that cookie is actually set.  The disadvantage
> > here is that it's a pain to do and even then I couldn't start a session
> > on the first page they look at.
> >
>
> One solution would be to delete the session if the useragent is a known
> robot, but you have to add code like
>
> [-
> $r = shift ;
> $r -> DeleteSession if ($ENV{HTTP_USER_AGENT} =~ /robot|otherrobot/) ;
> -]
>
> to the end of each of your pages. We could add something like this to the
> Embperl core, to disable sessions for certain useragents. Not sure if this
> is really a solution?
>
> Gerald

Well there must be hundreds of these HTTP_USER_AGENT strings that you'd have
to filter out (wouldn't there?).  It'd be quite difficult to try to account
for all of them.
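
As a rough sketch of what that filtering would look like (the pattern list here is made up for illustration; a real deployment would load a maintained list of robot User-Agent substrings from a file), the patterns can at least be compiled once into a single case-insensitive regex rather than matched one by one:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical list of known robot User-Agent substrings.
# In practice this list would be long and maintained externally.
my @robot_patterns = qw(googlebot slurp crawler spider robot);

# Compile once into a single case-insensitive alternation.
my $robot_re_str = join '|', map { quotemeta } @robot_patterns;
my $robot_re     = qr/$robot_re_str/i;

sub is_robot {
    my ($user_agent) = @_;
    return defined $user_agent && $user_agent =~ $robot_re;
}

print is_robot('Googlebot/2.1 (+http://www.googlebot.com/bot.html)')
    ? "robot\n" : "browser\n";    # prints "robot"
```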

Also there's the issue of people who choose not to accept cookies; I
wouldn't really want to create sessions for all of them either.
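
The probe-cookie idea could be sketched like this (the cookie name "session_ok" is made up for the example): set a throwaway cookie on every response, and only create a real session once the client sends it back. Robots and cookie-refusing browsers never echo the cookie, so no session rows get created for them -- at the cost, as noted above, of not having a session on the very first page view:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Decide whether to create a session, based on whether the client
# echoed back our probe cookie in its Cookie request header.
sub should_create_session {
    my ($cookie_header) = @_;
    return defined $cookie_header && $cookie_header =~ /\bsession_ok=1\b/;
}

# First request: the client has no cookie yet, so no session.
print should_create_session(undef)
    ? "session\n" : "no session\n";    # prints "no session"

# Later request from a cookie-accepting browser:
print should_create_session('session_ok=1; other=x')
    ? "session\n" : "no session\n";    # prints "session"
```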

Mike


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
