Re: [IxDA Discuss] Asking questions to participants in a positive or negative way?

2008-05-25 Thread Caroline Jarrett
I wrote:

: You ought to allow users to have the opinions that they have - even if
:  those opinions include 'don't know' or 'don't care' (or
:  both).
: 
:  The answer options you offer should depend solely on the answers that your
:  users want to give - not upon how many users there are.
: 
:  If you don't know what answers your users want to give, then interview them
:  to find out before running your survey. And by the way -
:  you should do that anyway (i.e., interview some users first)  if you want
:  anything like good results from your survey.
: 
:
And Chiwah asked:

: Do you mean that when a user chooses neutral for a question, it has a
: meaning? And if most of my users choose neutral, does it mean that my question
: is wrongly formulated? Then, in both cases, should I interview them to find out
: why they chose the neutral option?
:
: But in this case, does that mean that I should include, for each question, a
: checkbox asking if they don't care, don't know, or if they sometimes felt one
: way or the other?
:
Possibly. It is definitely the case that users choose 'neutral' for many 
reasons other than that they are neutral.

It is also definitely the case that you should interview users on the topics 
that you want to survey. Surveys aren't a way of 
finding out users' opinions. They are a way of finding out how opinions are 
distributed in a population. If you choose the wrong 
opinions to ask them about, you will get poor results.
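
To make that concrete: what a survey question gives you is a distribution of 
answers across your sample - 'don't know' and 'don't care' included, if people 
actually hold those views. A rough sketch in Python (the responses are invented, 
just to show the shape of what you end up reporting):

from collections import Counter

# Invented responses to one survey question, as people actually answered it.
responses = [
    "agree", "agree", "don't know", "disagree", "don't care",
    "agree", "don't know", "neutral", "agree", "don't care",
]

counts = Counter(responses)
total = len(responses)

# Report the distribution, not a single 'average opinion'.
for option, count in counts.most_common():
    print(f"{option:12s} {count:2d}  ({count / total:.0%})")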

For example, a classic way to get poor results is to ask users a series of 
questions on a topic that they don't know about or don't 
care about.

But it's not necessarily a good idea to include specific checkboxes for 'don't 
know' and 'don't care' with _each_ question. It might 
be that they don't know or don't care about the whole topic. It might be that 
'don't know' is a commonly held view for some of your 
questions, with 'don't care' being rare - but that 'don't care' is common for other 
questions, and 'don't know' is rare. It might be that 
'do have a view but don't want to give it to you' is a common opinion.

The only way to find out is to interview some users to get a feeling for the 
types and ranges of opinions that they do have. Then 
you construct your questions. Then you test your questionnaire, and interview 
the test participants about it. By this point you have 
a good chance of getting a decent questionnaire put together, and that's half 
the battle of a survey.
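
For example, once you have that pilot data, a quick tally per question shows 
where 'don't know' and 'don't care' actually turn up, and that can decide which 
options each question needs. A rough sketch in Python - the questions, answers 
and the 20% cut-off are all invented, just to show the idea:

# Invented pilot interview data: answers gathered per question.
pilot_answers = {
    "Q1 ease of use":   ["easy", "don't know", "don't know", "hard", "don't know"],
    "Q2 pricing":       ["fair", "too high", "don't care", "don't care", "fair"],
    "Q3 documentation": ["good", "good", "won't say", "poor", "good"],
}

THRESHOLD = 0.20  # arbitrary cut-off for 'common enough to offer as an option'

for question, answers in pilot_answers.items():
    n = len(answers)
    dont_know = sum(a == "don't know" for a in answers) / n
    dont_care = sum(a == "don't care" for a in answers) / n
    options = [label for label, rate in
               [("don't know", dont_know), ("don't care", dont_care)]
               if rate >= THRESHOLD]
    print(f"{question}: offer {options or 'neither'} "
          f"(don't know {dont_know:.0%}, don't care {dont_care:.0%})")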

(The other half of the battle of a survey is deciding what you want to find out 
about in the first place, getting a good sampling 
strategy, analysing the pilot and actual data, reporting, and doing 
something about what you find).

Aside: one classic mistake people make is to think: 'we don't have time to do 
any face-to-face user research such as usability 
testing or field studies - we'll survey them instead'. But in fact, a good 
survey is at least 10, and sometimes nearer 100, times 
harder and more time-consuming than a few field studies or a bit of usability 
testing.

A second aside: I use the term 'survey' to mean the end-to-end process of 
gathering user data or opinions using a predetermined set 
of questions, including the process of deciding what the questions should be. I 
use 'questionnaire' to mean the predetermined set of 
questions itself. Many survey methodologists use the term 'instrument' instead 
of 'questionnaire'.

Best,
Caroline Jarrett
[EMAIL PROTECTED]
07990 570647

Effortmark Ltd
Usability - Forms - Content

We have moved. New address:
16 Heath Road
Leighton Buzzard
LU7 3AB 




Re: [IxDA Discuss] Asking questions to participants in a positive or negative way?

2008-05-21 Thread Caroline Jarrett
From: chiwah liu [EMAIL PROTECTED]
:
: I don't know if I am right, but for me, the neutral option depends on the
: number of users:
: - If we don't have enough users to reach statistical significance (let's
: say fewer than 100 users) for our survey, we should add a neutral option.
: The users who don't have any idea can bias the survey.
:
: - Now if we have enough users to reach statistical significance (200-300+
: users), we can force them to choose, because those with no opinion would give
: a random answer. That means if my scale is between 1 and 4, I should have the
: same number of users answering 2 as answering 3. If this happens, then I can
: suppose that users don't really have an idea about the answer. Otherwise, they
: might have preferences, and it shouldn't be biased because it is statistically
: significant.
:
:
No. I think the phrase 'force them to choose' shows exactly why this is a bad 
idea.
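
(For what it's worth, the check being described would look something like the 
sketch below - a chi-square test on made-up forced-choice counts, using scipy. I 
show it only to make the proposal concrete, not to recommend it.)

from scipy.stats import chisquare

# Made-up tallies from a forced-choice 1-4 scale (no neutral, no 'don't know').
counts = {1: 30, 2: 58, 3: 61, 4: 51}

# Goodness-of-fit test: are the two middle options chosen about equally often?
stat, p_value = chisquare([counts[2], counts[3]])

if p_value > 0.05:
    print(f"p = {p_value:.2f}: 2 and 3 look equally common - but that still doesn't")
    print("tell you whether people had no view, or simply weren't offered the view they had.")
else:
    print(f"p = {p_value:.2f}: 2 and 3 differ, so the answers look like genuine preferences.")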

You ought to allow users to have the opinions that they have - even if those 
opinions include 'don't know' or 'don't care' (or 
both).

The answer options you offer should depend solely on the answers that your 
users want to give - not upon how many users there are.

If you don't know what answers your users want to give, then interview them to 
find out before running your survey. And by the way - 
you should do that anyway (i.e., interview some users first)  if you want 
anything like good results from your survey.

There's a longer version of my views at:
http://www.usabilitynews.com/news/article1269.asp

Best
Caroline Jarrett
[EMAIL PROTECTED] 


