"Bruce Bradbury" <[EMAIL PROTECTED]> wrote:

> Our request report shows 6000 requests for a particular (pdf) file.
> Does that really mean that 6000 people have tried to download it?

Requests for PDF files are often partial (byte-range) requests: a PDF reader
typically fetches the file a piece at a time, so a single download can show up
as several logged requests. Whether this happens depends on your server, but
from what I understand, most servers now support byte-range requests.
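To illustrate, a byte-range fetch looks something like this in the raw HTTP exchange (the path and byte offsets here are made up for the example; the 206 status is what such a request returns when the server supports ranges):

```
GET /files/report.pdf HTTP/1.1
Host: www.example.com
Range: bytes=32768-65535

HTTP/1.1 206 Partial Content
Content-Range: bytes 32768-65535/482912
```

Each such exchange is one line in your log, so one reader paging through the PDF can generate many "requests".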

> Or does this include a large number of robot hits?

It's extremely unlikely that a robot would request that file but nothing else
on your site. So if you think robots are significantly inflating the requests
for that file, they'll be having the same effect on your whole site.

The simplest way to answer your question, though, is to run a report that uses
FILEINCLUDE to restrict it to just that PDF file. If you then look at the Host
report and the Browser report (off by default), you should get a better sense
of what's happening. In particular, if you see that the 6,000 hits were
generated by only 200 different hosts, that should answer your first question.
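As a rough sketch, the configuration for such a report might look like this (the file path is hypothetical, and the exact ON/OFF command names for the Host and Browser reports may differ by Analog version - check your documentation):

```
# Restrict the report to the one PDF file in question
FILEINCLUDE /papers/report.pdf

# Turn on the reports that show who is making the requests
HOST ON       # assumed command name: Host report (distinct requesting hosts)
BROWSER ON    # assumed command name: Browser report (off by default)
```

Comparing the request count against the number of distinct hosts and the mix of browsers versus robot user-agents should tell you how many of the 6,000 requests plausibly came from people.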

> Is there a rule-of-thumb for how many requests are genuine human
> requests?

Different websites would have different rules of thumb. There really is no
general answer.

Aengus

+------------------------------------------------------------------------
|  This is the analog-help mailing list. To unsubscribe from this
|  mailing list, go to
|    http://lists.isite.net/listgate/analog-help/unsubscribe.html
|
|  List archives are available at
|    http://www.mail-archive.com/[email protected]/
|    http://lists.isite.net/listgate/analog-help/archives/
|    http://www.tallylist.com/archives/index.cfm/mlist.7
+------------------------------------------------------------------------
