Stephen Turner wrote:
> I get lots of mails like this defending "approximations", or as you put
> it "fuzziness". The problem is, I don't think they're approximations. I
> strongly suspect they're often closer to 5 or 10 times out. Think of it this
> way and you'll perhaps realise why analog doesn't provide them.
In the marketing world, 5% accuracy is considered the holy grail. 10%-20%
accuracy (if I understand what "5 or 10 times out" means) would be phenomenal.
In marketing systems, data mining techniques are used to lift the essentially
random 2%-3% response rate to as much as 5%. If some data can be gleaned from
logfiles that helps improve this result, why not use it?
This isn't to say that Analog should be the product to do this. Analog's goal is
website logfile analysis and reporting, and it is (many times proven) the most
accurate tool available for that. Too many hosting companies offer other
products that claim to provide specious data, such as visitor tracking, without
properly understanding, or even educating their clients about, the inaccuracy of
such 'statistics'.
Jim Sander wrote:
> If you're going to commit to a "fudge factor" then why not simply use
> that factor against some concrete metric- a ratio of requests for pages to
> unique IP addresses during your site's busiest 15 minutes maybe? I'd say
> that number would be at least as valid a measure of "stickiness" as
> anything else and is a TON less intensive computationally.
I find this a much more helpful response than the Random Number Generator
(sorry, Aengus). In business and marketing there is a plethora of quotients
and ratios, such as ROI, used to assess the quality of data, values, investment
potential, response potential, and so on. Rather than limiting ourselves to
known quantities and basic derived values (like average pages per day), why not
consider the value of some more advanced (yet less speculative) derived values
that may provide insight into web site performance, especially, as Jason
suggested, when compared with the same quantities in other time periods or on
other sites?
At last count, I think there were estimated to be on the order of 10,000 sites
running Analog (and you want to talk about inaccurate numbers :). What if all
these webmasters started looking at quantities such as a "maximal page-ip ratio"
(to coin a term :) and comparing them with other results? Say a webmaster runs a
dozen sites. At first these numbers would certainly mean nothing to her. But as
she compares them across different sites and different periods, some meaning
would eventually emerge from some of the quantities.
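To make that concrete, here is a rough sketch of the kind of calculation I have
in mind, along the lines of Jim's suggestion: pages requested per unique IP
address during the busiest 15-minute window of a log. This is purely
illustrative Python, not Analog code; the Common Log Format regex, the list of
"non-page" extensions, and the function name are all my own assumptions.

# Illustrative sketch only -- not Analog code. Assumes Common Log Format and
# treats anything that isn't an image, stylesheet, or script as a "page".
import re
from collections import defaultdict
from datetime import datetime, timedelta

LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "\S+ (\S+)')
NON_PAGE = ('.gif', '.jpg', '.jpeg', '.png', '.css', '.js', '.ico')

def maximal_page_ip_ratio(logfile, window_minutes=15):
    """Pages requested per unique IP in the busiest window of the log."""
    buckets = defaultdict(lambda: {'pages': 0, 'ips': set()})
    with open(logfile) as f:
        for line in f:
            m = LOG_RE.match(line)
            if not m:
                continue
            ip, stamp, path = m.groups()
            if path.lower().endswith(NON_PAGE):
                continue  # count page requests only, not inline objects
            # e.g. "10/Oct/2000:13:55:36 -0700" -> parse, ignoring the zone
            t = datetime.strptime(stamp.split()[0], '%d/%b/%Y:%H:%M:%S')
            # floor the timestamp to the start of its 15-minute bucket
            bucket = t - timedelta(minutes=t.minute % window_minutes,
                                   seconds=t.second)
            buckets[bucket]['pages'] += 1
            buckets[bucket]['ips'].add(ip)
    if not buckets:
        return 0.0
    busiest = max(buckets.values(), key=lambda b: b['pages'])
    return busiest['pages'] / len(busiest['ips'])

Run over the same site month after month, or over a dozen sites on the same
day, a number like this starts to be comparable even if it means nothing in
isolation.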
Analog may not be the "testing ground" to determine which quantities are helpful
and which aren't, but it may provide a springboard for developing this kind of
result. If we can provide marketing people with new "web quantities", rather
than trying to force stateless HTTP data into marketing quantities like
"stickiness", "cum", and "score", we might actually develop results that are
useful for everybody.
Jeremy Wadsack
Wadsack-Allen Digital Group