What is the most elegant way to exclude robots from the request report?
Thank you,
Boris.
ROBOTINCLUDE and ROBOTEXCLUDE won't help in this situation,
because they determine which browsers count as robots in the Operating
System Report. A robot (for example, Googlebot) appears in the server
logs as a User Agent or Browser, so BROWINCLUDE and BROWEXCLUDE are the
way to include or exclude them.
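As a minimal sketch, a configuration along these lines should drop common robots from the browser-based reports. The agent patterns below are only examples; check your own logs for the exact User Agent strings your server records:

```
# Exclude common robots by their User Agent strings.
# These patterns are illustrative -- substitute the agent
# names that actually appear in your log files.
BROWEXCLUDE Googlebot*
BROWEXCLUDE *Slurp*
BROWEXCLUDE *crawler*
BROWEXCLUDE *spider*
```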
Is it possible to limit the displayed requests to a specific directory?
I did a directory report but that just gives me a total number for the
directory.
Thanks,
Carl
___
Carl Snow
Purdue University Libraries
Network Access Librarian
[EMAIL PROTECTED]
765-494-2764
On Thursday, August 12, 2004 6:09 PM [GMT],
Snow, Carl E. [EMAIL PROTECTED] wrote:
Is it possible to limit the displayed requests to a specific
directory? I did a directory report but that just gives me a total
number for the directory.
FILEINCLUDE /directory/*
Aengus
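As a sketch, in a configuration file that single directive is all that should be needed; the /directory/ path below is a placeholder for your own path. If I recall the analog inclusion rules correctly, once the first FILE command is an INCLUDE, everything else is excluded by default:

```
# Count only requests under one directory.
# /directory/ is a placeholder -- substitute your real path.
# With an INCLUDE as the first FILE command, all other files
# should be excluded automatically.
FILEINCLUDE /directory/*
```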
I am using analog to analyze EZproxy log files. What do I need to do to get
a more readable output for long names in the file column like:
http://www.morrisville.edu:2048/login?url=http://infotrac.galegroup.com/itweb/morrisville
I would prefer something like