Hi Jared,

How are you? It didn't seem you were alone in the "accusing" (your word;-)) 
camp.

I ALWAYS explain to clients that:

- ET does not equal measuring "seeing" (because seeing is a cognitive action); 
what we're actually measuring is the CORRELATION between "seeing" and 
point-of-regard fixations and saccades.

- ET measures foveal point of regard and NOT peripheral vision, which people 
ALSO use to gather information about the stimulus, whether it's a screen, a 
room or anything else. So yes, you can "see" things off fovea (whether they're 
actually there or not is another question;-))

- Calibration quality in ET is key if we are to reduce error margins to 
acceptable levels - the error margin basically translates into a drift between 
the reported gaze position and the true point of regard across the X and Y 
coords. We need to take this into account when defining Areas of Interest for 
analysis (see the rough sketch after this list).

- ET sampling rate is another consideration - different machines sample at 
different rates, so yes, there can be missed data (brief fixations or saccades 
can fall between samples; the sketch below puts some numbers on it).

- Look out for how methodology can change behaviour (the "think aloud" vs 
"silent task" issue)

- ET results should be addressed ONLY within the scope and context of the tasks 
that were given to the respondents, i.e. don't use results from one task to 
imply something else. 

- Usual caveats about sample sizes (qual vs quant) and statistical projection 
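
To make the calibration and sampling-rate points concrete, here's a rough
back-of-the-envelope sketch in Python. It isn't from any vendor's SDK or from
our own analysis scripts - the viewing distance, pixel density and error
figures are just assumed values for illustration.

import math

def aoi_padding_px(error_deg, viewing_distance_cm, px_per_cm):
    # Convert a calibration error in degrees of visual angle into the
    # pixel margin to add around an Area of Interest so that drifted
    # gaze points still land inside it.
    error_cm = viewing_distance_cm * math.tan(math.radians(error_deg))
    return error_cm * px_per_cm

def samples_per_event(event_duration_ms, sampling_rate_hz):
    # Rough count of gaze samples an event of a given duration yields.
    return event_duration_ms / 1000.0 * sampling_rate_hz

# Assumed setup: ~0.5 deg average calibration error, 65 cm viewing
# distance, ~38 px per cm (roughly a 96 dpi desktop display).
print(aoi_padding_px(0.5, 65, 38))    # about 21 px of padding per side
print(samples_per_event(150, 60))     # 150 ms fixation at 60 Hz: ~9 samples
print(samples_per_event(30, 60))      # 30 ms saccade at 60 Hz: ~2 samples, easily lost

The point is simply that the "slop" around an AOI and the granularity of the
data both fall straight out of numbers like these, so they're worth stating to
clients up front.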


Where I'm finding ET really interesting is with larger sample sizes. 

We're looking right now at examples coming from the 100-person study about 
online surveys (a whole 'nother controversy;-)) we did earlier this year. 
What's interesting about that is we have ET data AND survey answers - which 
themselves imply that respondents "read" questions and "saw" labels, because 
they selected items and input text answers too. Seeing how these line up - or 
not - is providing some really interesting learnings.
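
For what it's worth, the line-up itself looks roughly like the sketch below.
The fixation records, AOI names, threshold and survey structure are all made
up here for illustration - they're not our actual export format or anyone's
schema.

from dataclasses import dataclass

@dataclass
class Fixation:
    respondent_id: str
    aoi: str            # Area of Interest the fixation fell in, e.g. "Q3_label"
    duration_ms: float

def fixated(fixations, respondent_id, aoi, min_ms=100):
    # Did this respondent fixate the AOI long enough to plausibly have read it?
    total = sum(f.duration_ms for f in fixations
                if f.respondent_id == respondent_id and f.aoi == aoi)
    return total >= min_ms

# Answering a question implies "reading" it; the gaze record may or may
# not back that up (hypothetical respondent and values below).
survey_answers = {("r017", "Q3_label"): "Somewhat agree"}
fixations = [Fixation("r017", "Q3_label", 80),
             Fixation("r017", "Q3_options", 420)]

for (rid, aoi), answer in survey_answers.items():
    print(rid, aoi, answer, "fixated label:", fixated(fixations, rid, aoi))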

Have a great day!

Kate 
[email protected]
+1 514 502-5862






________________________________
From: Jared Spool <[email protected]>
To: Kate Caldwell <[email protected]>
Cc: [email protected]
Sent: Friday, August 21, 2009 9:40:42 AM
Subject: Re: [IxDA Discuss] Eye-Tracker software/hardware recommendations


On Aug 20, 2009, at 9:43 AM, Kate Caldwell wrote:

> I have an SMI system in our facility in downtown Montreal.
> I'm very interested in the discussion. The pros and cons of using ET
> for usability testing seem pretty well described above.
> 
> At the same time, I dislike what I understood as the suggestion that
> some practitioners are using ET to con clients. NO methodology or
> tool should be offered (honestly) without being clear about its
> deliverables, benefits and limitations.


Kate,

I agree. Given that I'm the one making the suggestion, (and I think it was more 
of an accusation than a suggestion,) I'd like to say that I also think that we 
need to be honest about what we do, especially to ourselves.

I'd be interested in hearing the disclaimers you give your clients before 
presenting inferences from eye tracking data.

Jared


