Jeremy Wadsack wrote:

>Jim Sander wrote:
>
>> If you're going to commit to a "fudge factor" then why not simply use
>> that factor against some concrete metric- a ratio of requests for 
>> pages to unique IP addresses during your site's busiest 15 minutes 
>> maybe? I'd say that number would be at least as valid a measure of 
>> "stickiness" as anything else and is a TON less intensive 
>> computationally.
>
>I find this a much more helpful response than the Random Number Generator 
>(sorry, Aengus).

Fair 'nuff - I wasn't entirely serious. I still think my way gives the 
best "bang for the buck", though :-)

But even Jim's suggestion only gives you a site-specific metric. Some 
sites are far more likely to be visited by AOL users than others, for 
example, and the ratio for a site with a lot of AOL users will be 
severely skewed by AOL's proxy configuration. And that's the whole point 
- there is no way to express this in a general fashion.
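(For what it's worth, Jim's ratio is cheap to compute. A minimal sketch - using made-up common-log-format lines and a simple fixed 15-minute bucketing, both my own assumptions rather than anything Jim specified - might look like this:)

```python
from collections import defaultdict

# Hypothetical sample log lines in Apache common log format:
# ip - - [dd/Mon/yyyy:HH:MM:SS zone] "METHOD path HTTP/x" status bytes
LOG_LINES = [
    '1.2.3.4 - - [01/Jan/2003:10:03:11 +0000] "GET /index.html HTTP/1.0" 200 512',
    '1.2.3.4 - - [01/Jan/2003:10:05:42 +0000] "GET /about.html HTTP/1.0" 200 341',
    '5.6.7.8 - - [01/Jan/2003:10:07:09 +0000] "GET /index.html HTTP/1.0" 200 512',
    '9.9.9.9 - - [01/Jan/2003:11:30:00 +0000] "GET /index.html HTTP/1.0" 200 512',
]

def busiest_window_ratio(lines, minutes=15):
    """Return (requests, unique IPs, ratio) for the busiest fixed window."""
    buckets = defaultdict(list)  # window key -> list of client IPs seen
    for line in lines:
        ip = line.split(" ", 1)[0]
        # Timestamp sits between the square brackets.
        stamp = line.split("[", 1)[1].split("]", 1)[0]
        day, hh, mm = stamp.split(":")[:3]
        # Quantise the minute to the window size to get a bucket key.
        buckets[(day, hh, int(mm) // minutes)].append(ip)
    ips = max(buckets.values(), key=len)  # IPs in the busiest window
    return len(ips), len(set(ips)), len(ips) / len(set(ips))

requests, unique_ips, ratio = busiest_window_ratio(LOG_LINES)
print(requests, unique_ips, ratio)  # -> 3 2 1.5
```

Of course, that ratio of 1.5 requests per IP is exactly the number an AOL-heavy site would distort - several proxies can front one user, or one proxy many users - which is my point above.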

And I recognize that there can be value in such metrics. But a measure 
such as ROI is only valuable as a "rule of thumb" metric because it's 
based on very solid data. Log files just don't contain accurate 
information on the things that you are proposing to measure. I can be 
reasonably sure that a site that records 100,000 page requests a day 
gets more traffic than one that gets 10,000, but I can't reasonably make 
the same judgement about a site that records 12,000 requests without 
digging a lot deeper.

Aengus




------------------------------------------------------------------------
This is the analog-help mailing list. To unsubscribe from this
mailing list, send mail to [EMAIL PROTECTED]
with "unsubscribe" in the main BODY OF THE MESSAGE.
List archived at http://www.mail-archive.com/[email protected]/
------------------------------------------------------------------------
