On Tue, Aug 26, 2008 at 10:07:43AM -0500, Tim Donohue wrote:
> So, although I think it was already mentioned, I'd add as a requirement 
> for a good Statistics Package:
> 
> * Must filter out web-crawlers in a semi-automated fashion!

+1!  Suggestions as to how?

The Rochester mods could be augmented to filter out the easiest cases
more simply.  Some well-behaved crawlers can be spotted automatically.
(No, I don't recall how.)  The filter rules could also be made more
flexible than a single list of fixed-size netblocks (if memory
serves).  I've been meaning to work on these at some point, but
haven't yet reached That Point.
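
To fix ideas, here's a minimal sketch of what a more flexible filter
might look like.  The class name, User-Agent substrings, and example
netblocks are all mine, for illustration only -- this isn't taken
from the Rochester patch or any other existing one:

    import java.net.InetAddress;
    import java.net.UnknownHostException;
    import java.util.Arrays;
    import java.util.List;

    // Hypothetical sketch -- not from any existing DSpace patch.
    public class CrawlerFilter
    {
        // Substrings that well-behaved crawlers put in User-Agent.
        private static final List<String> AGENT_PATTERNS = Arrays.asList(
            "googlebot", "slurp", "msnbot", "spider", "crawler");

        // Example CIDR netblocks; a real list would be configurable.
        private static final List<String> NETBLOCKS = Arrays.asList(
            "66.249.64.0/19", "207.46.0.0/16");

        /** True if the request looks like it came from a crawler. */
        public static boolean isCrawler(String userAgent, String ip)
        {
            if (userAgent != null)
            {
                String ua = userAgent.toLowerCase();
                for (String pattern : AGENT_PATTERNS)
                    if (ua.contains(pattern))
                        return true;
            }
            for (String block : NETBLOCKS)
                if (inBlock(ip, block))
                    return true;
            return false;
        }

        // True if an IPv4 address falls inside a CIDR block, so blocks
        // of any prefix length can be listed, not just one fixed size.
        private static boolean inBlock(String ip, String cidr)
        {
            try
            {
                String[] parts = cidr.split("/");
                int bits = Integer.parseInt(parts[1]);
                int mask = (bits == 0) ? 0 : -1 << (32 - bits);
                return (toInt(ip) & mask) == (toInt(parts[0]) & mask);
            }
            catch (UnknownHostException e)
            {
                return false;
            }
        }

        private static int toInt(String ip) throws UnknownHostException
        {
            byte[] b = InetAddress.getByName(ip).getAddress();
            return ((b[0] & 0xff) << 24) | ((b[1] & 0xff) << 16)
                 | ((b[2] & 0xff) << 8) | (b[3] & 0xff);
        }
    }

User-Agent matching catches the well-behaved crawlers cheaply; the
netblock test catches the ones that lie, at the cost of maintaining
the list.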

Crawler filtering sounds like something that might be abstracted from
the various existing statistics patches and provided as a common
service.  We should all have to invent this wheel only once.
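
If it were pulled out into a shared component, the surface could be
tiny -- something like the following (the interface name and
signature are mine, just as a strawman):

    /**
     * A hypothetical shared service that each statistics package
     * could consult before counting a request.  Illustrative only.
     */
    public interface SpiderDetector
    {
        /** Should this request be excluded from usage statistics? */
        boolean isSpider(String ipAddress, String userAgent);
    }

Then every stats patch would draw on one maintained list of agents
and netblocks instead of each carrying its own.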

-- 
Mark H. Wood, Lead System Programmer   [EMAIL PROTECTED]
Typically when a software vendor says that a product is "intuitive" he
means the exact opposite.


