On Wednesday, March 29, 2006 8:26 AM [EDT],
Robert D. <[EMAIL PROTECTED]> wrote:

> Hi Analog-Help:
>
> Once, when I first ran analog/report-magic, I saw a header called
> Known Robots, or something similar. The robots were all clumped
> together.
>
> Now I am trying to get that back whereby in all reports, if the URL
> was from ANY of the robots, it gets grouped there.

Any given request will only occur once in the Request Report. So you can't
have the same request in a "robot clump" and in its normal place in a
single Request Report.

If you want all the requests from robots lumped together in the Request
Report, you'd have to create a report on just the Robot requests, excluding
all non-robot requests.
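
For example, a separate configuration for a robots-only run might look
roughly like this (the patterns and the output file name are only
illustrations - adjust them to the robots you actually see; the leading
letters are dropped so both "Robot" and "robot" style agents match). As far
as I remember, when the first BROW command is an INCLUDE, Analog drops
every agent that doesn't match one of the INCLUDE lines:

  BROWINCLUDE Googlebot*
  BROWINCLUDE msnbot*
  BROWINCLUDE *obot*
  BROWINCLUDE *pider*
  BROWINCLUDE *rawler*
  OUTFILE robots.html

You'd run Analog a second time with that configuration, and keep your
normal configuration for the non-robot report.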

> The only lines I
> put in which I believe may put me back on-track are:
>
> BROWEXCLUDE REGEXPI:robot
> BROWEXCLUDE REGEXPI:spider
> BROWEXCLUDE REGEXPI:crawler

They don't look like valid regular expressions to me.

> BROWEXCLUDE Googlebot*
> BROWEXCLUDE Infoseek*
> BROWEXCLUDE Scooter*
> BROWEXCLUDE Slurp*
> BROWEXCLUDE Ultraseek*
> BROWEXCLUDE Infopath*
> BROWEXCLUDE REGEXPI:NET+CLR+1*
> BROWEXCLUDE MsnBot*
> BROWEXCLUDE Psbot*

These lines will completely exclude all log lines from those agents from
every report - your Request Report won't include any requests from those
agents at all.
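
In other words, with those lines in place your main report is already the
"everything except robots" report; the robots-only report has to come from
a second run of Analog against a configuration like the one sketched above.
If I remember the command line right, something like

  analog +grobots.cfg

adds an extra configuration file (robots.cfg is just a name I've made up) -
though you'd want to make sure the BROWEXCLUDE lines above aren't also in
effect for that run.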

> ROBOTINCLUDE Googlebot/*
> ROBOTINCLUDE pBot/*
> ROBOTINCLUDE MsnBot*

This just tells Analog to treat the matching browser strings as Robots in
the Operating System Report.
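
So if all you're after is that grouping, something along these lines should
be enough (again, the patterns are only examples):

  ROBOTINCLUDE Googlebot*
  ROBOTINCLUDE msnbot*
  ROBOTINCLUDE *obot*
  ROBOTINCLUDE *pider*
  ROBOTINCLUDE *rawler*

That changes how those agents are classified in the Operating System
Report, but it doesn't pull their requests together anywhere else.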

Aengus

