I'm aware that this is not what people think of when they think of 
stats. But the fact remains: if you are simply given the final 
number -- 33% of all co-eds in a certain year married professors -- 
you are very unlikely to question the underlying data. And what I 
think is more to the point: it was years before anyone asked why 
that had happened.

But suppose it had been a sample of 600 and the percentage had been 
the same: what would that tell us about the sample? Absolutely 
nothing except that a third of them acted a certain way -- which 
slowed down absolutely no one when it came to drawing conclusions 
about the behavior of the women in this set.
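
If you want to see how little a sample of 6 constrains that "33%," a 
back-of-the-envelope sketch (plain Python, Wilson score interval; 
both scenarios are hypothetical) makes it concrete:

import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

print(wilson_interval(2, 6))     # ~ (0.10, 0.70): "33%" of 6 is compatible with almost anything
print(wilson_interval(200, 600)) # ~ (0.30, 0.37): the same 33% of 600 is actually informative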

The problem with data is that they don't tell us WHY. Let's assume 
that we get a datum that 90% of all visitors to a site drop out on a 
page. So, we make some changes and now we have a 10% drop-out rate on 
the same page. Which -- if any -- of those changes made the 
difference? Maybe it was because in the meantime we passed into the 
Christmas gift-giving season and people were more tolerant of being 
asked for certain data (or whatever). Perhaps it was because some 
large corporation changed our status so we could be accessed 
internally. Maybe someone with a particularly influential blog 
recommended us. Maybe another similar site came onto the scene that 
influenced our users to think of our methods as standard. Basically, 
the data themselves tell us bugger all. And I suspect we don't really 
care; we just want to be able to attribute the improvement to our 
work...and it doesn't even matter that we can't say *which* of the 
changes is most important. Perhaps one of the changes, made in 
isolation, would have driven the drop-out rate on that page to nearly 
zero. Perhaps the other changes actually negated that underlying improvement.
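
A toy simulation of exactly this trap (all numbers invented): an 
external "seasonal" effect moves the drop-out rate on its own, and 
the before/after comparison cannot distinguish it from our changes:

import random

random.seed(0)

def dropout_rate(n_visitors, base_rate, season_boost=0.0):
    """Simulated drop-out; season_boost acts independently of anything we shipped."""
    rate = max(0.0, base_rate - season_boost)
    drops = sum(1 for _ in range(n_visitors) if random.random() < rate)
    return drops / n_visitors

# Before: 90% drop-out. After: we shipped several changes *and* the season turned.
before = dropout_rate(10_000, base_rate=0.90)
after = dropout_rate(10_000, base_rate=0.90, season_boost=0.80)
print(f"before: {before:.0%}, after: {after:.0%}")
# Roughly "before: 90%, after: 10%" -- and nothing in these two numbers
# says whether our work or the season did it.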

And on the Web (as with all human endeavor) there's no way to isolate 
the changes. You can't do a pure A/B test because there are just too 
many possible reactions at any point -- all of which will be 
categorized as "did what we planned" or "did something else."
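
Even in the ideal case, the arithmetic of isolating changes runs away 
from you: a full factorial test of k independent changes needs 2^k 
variants, each with enough traffic to mean anything. A hypothetical 
back-of-envelope (the traffic figures are made up):

visitors_per_day = 5_000   # hypothetical traffic
visitors_per_arm = 2_000   # hypothetical minimum per variant

for k in range(1, 9):
    arms = 2 ** k          # every on/off combination of k changes
    days = arms * visitors_per_arm / visitors_per_day
    print(f"{k} changes -> {arms:4d} variants -> ~{days:6.1f} days of traffic")
# Five simultaneous changes already means 32 variants; eight means 256.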

My main issue with statistics in site development/interaction 
design/user experience is that they tend to obscure more than they 
illuminate. This is because everyone thinks they understand 
statistics -- they took math and know that 65% is more than 15%, or 
they took intro (or even advanced) statistics and know how to do all 
the various mathematical tricks involved in deriving statistics -- 
and they're wrong. If you've never learned how to make statistics 
lie, then you have no business using them for any purpose other than 
-- knowingly -- supporting your point, whatever it may be. Almost all 
statistics usage today confuses correlation with causality, and 
overall that's incredibly dangerous.
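
A minimal sketch of that correlation/causality trap (the variables 
are invented for illustration): two metrics driven by a common cause 
correlate strongly even though neither causes the other.

import random

random.seed(1)

# Hypothetical: holiday traffic (the common cause) drives both metrics.
holiday = [random.random() for _ in range(1000)]
signups = [h * 0.8 + random.gauss(0, 0.05) for h in holiday]
tickets = [h * 0.6 + random.gauss(0, 0.05) for h in holiday]

def corr(x, y):
    """Pearson correlation, computed by hand."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy)

print(f"corr(signups, tickets) = {corr(signups, tickets):.2f}")
# ~0.94 -- yet signups don't cause tickets; the season drives both.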

Katie


At 10:07 PM -0500 11/27/07, William Evans wrote:
>Your example is not stats. A sample set of 6 is called anecdote. 
>Turning it into a percentage is not stats. There's no amount of 
>bootstrapping that will make it so either. If you are not using a 
>statistician fluent in regression analysis and using SPSS or SAS, then 
>you cannot lay claim to doing quant. Numbers don't lie. People lie. 
>People also ask the wrong questions and then interpret the answers 
>the wrong way based on assumptions, bias, ignorance or stupidity, but 
>real quant, properly done by qualified people, is eminently 
>useful. (IMHO)
>
>will evans
>user experience architect
>[EMAIL PROTECTED]
>617.281.1281
>
>
>On Nov 27, 2007, at 8:46 PM, Katie Albers <[EMAIL PROTECTED]> wrote:
>
>>At 6:10 PM -0700 11/27/07, Robert Hoekman, Jr. wrote:
>>>>Do you think analyzing data using tools like Omniture and Coremetrics
>>>>should
>>>>fall under the user experience umbrella?
>>>
>>>
>>>Definitely falls under UX. So much can be learned about human behavior from
>>>stats, it's unreal. And stats don't lie, which is more than we can say about
>>>humans (even when these "lies" are unintentional).
>>>
>>>-r-
>>
>>Oh dear. Oh my. If you're consulting a statistician who can't make
>>any set of data say anything you want them to say, then you should
>>find a better statistician. Of course statistics lie. Statistics,
>>properly manipulated, can tell you just about anything about anyone in
>>any situation. It's like the old joke about the difference between a
>>bookkeeper and an accountant: when you ask how much money you made
>>last year, the bookkeeper will answer the question and the accountant
>>will ask you how much money you want to have made.
>>
>>Data don't have meaning without context, and context is amazingly
>>flexible. To give just a few examples that leap to mind whenever
>>someone says that statistics don't lie, I cite the following:
>>
>>A study early in the co-education process of a previously all-male
>>college said that 1/3 of all women admitted had married faculty
>>members. Mind you, there were only 6 women who'd been admitted, and
>>the social life of the college was all frat-based and they imported
>>girls for events, thank you very much. Both of the male faculty
>>members in question were also brand-new PhDs.
>>
>>As we all know, 50% of all marriages end in divorce. Except that they
>>don't and they never have. One year in the early '60s, a study
>>noticed that in a particular year there were half as many divorces
>>as marriages. You'll never find anyone (except me) who will call
>>your attention to the fact that those data are unrelated to the
>>conclusion: the divorces granted in a given year come from marriages
>>begun over many prior decades, so the two counts describe entirely
>>different populations.
>>
>>The point is not that the numbers are wrong, nor even that they are
>>"false." Both were intended to elucidate the behavior of a certain
>>group of people under certain circumstances, but they tell us
>>absolutely nothing about human behavior -- except that in the US (at
>>least) we tend to believe things if there are numbers attached to them.
>>
>>There are a million examples...many much more pointed than
>>these...and books are constantly being written on the application and
>>misapplication of statistics, but the central fact remains: If you
>>want someone to believe what you're saying, find a number that seems
>>to support it.
>>
>>Katie
>>--
>>
>>----------------
>>Katie Albers
>>[EMAIL PROTECTED]


-- 

----------------
Katie Albers
[EMAIL PROTECTED]
________________________________________________________________
*Come to IxDA Interaction08 | Savannah*
February 8-10, 2008 in Savannah, GA, USA
Register today: http://interaction08.ixda.org/

________________________________________________________________
Welcome to the Interaction Design Association (IxDA)!
To post to this list ....... [EMAIL PROTECTED]
Unsubscribe ................ http://www.ixda.org/unsubscribe
List Guidelines ............ http://www.ixda.org/guidelines
List Help .................. http://www.ixda.org/help
