On Oct 2, 2008, at 12:25 PM, Benjamin Ho wrote:

I work for Tyler Technologies - developing software for the public
sector.  Our users are financial, operations, HR, payroll etc.

We're presenting a session on how usability and UCD/UXD influence
the design of our products and hence, how our users work better
because of it.

To top it off, we thought an interactive component would help them
understand what this means through [their own] experience.

Hi Benjamin,

I think I have a good idea of what you're trying to do right now. There's a bunch of activities you could do.

I didn't know how much time you wanted to allocate to this portion of your presentation, so I've included some time estimates to help you decide. Also, you weren't clear about what you wanted the attendees to get out of the exercise, so I've included a potential goal for each.

Activity Option #1: Making a PB&J Sandwich
Minimum Time: 20 minutes
Goal: To reinforce the importance of clear user assistance

This is a classic. (I first saw it demonstrated in 1972 by my sixth grade English teacher and I think it's the only thing about her class I retained.) You ask each attendee to write down instructions for assembling a peanut butter & jelly sandwich. Then, taking the raw materials (bread, peanut butter, jelly, a knife) and a randomly chosen set of instructions, you proceed to follow the directions LITERALLY.

For example, if the author never mentions removing the bread from the package, you proceed to assemble the sandwich with the bread still in the bag. "Put the jelly on the bread" is funny in that context. The more literally you interpret the instructions, the funnier it gets. Make the point that this is what real users do without realizing it.


Activity Option #2: Testing Lego Construction
Minimum Time: 40 minutes
Goal: To reinforce the benefits of usability testing

We use this for training people on simple observation and moderator skills. We purchase inexpensive Lego sets (well, as inexpensive as it gets, like this one: http://is.gd/3x0B) and have small teams conduct a sample usability test, with one person assembling the kit and two others acting as observers (or one as moderator, if we've done the training).

If you can't get the budget for Lego sets, it also works with origami sets (and there's a ton of origami instructions on the web).


Activity Option #3: What's Changed?
Minimum Time: 10 minutes
Goal: To help participants see the impact of the work you've done

This is a good way for people to see how you've had an impact on their work. Show before and after screen shots of designs you've worked on, without explaining the differences. (Ideally, you can display them simultaneously on two screens or have high-res printouts they can compare side-by-side.)

Have the audience suggest differences. Then, ask them to provide reasons why you might've made them. You can compare their rationale to yours. It's a good opportunity to explain the research you've done and how it has influenced your approach to design.


Activity Option #4: The Focus Quiz
Minimum Time: 15 minutes
Goal: To demonstrate how focus can change during observation

We use this to train teams on how to observe during field studies. (I wrote about it here: http://is.gd/3wZL) You give each person a different criterion to observe in the room (such as "all the round items") and ask them to write down every object that matches.

Then, you have the people with the same criterion name the objects they observed, without naming the criterion itself. Everyone else tries to guess the criterion. It's a demonstration of how you notice some things only when you're looking for them.


Activity Option #5: Guess the Reason
Minimum Time: 15 minutes
Goal: To show the differences between observations and inferences

We use this to train teams on the difference between an observation and an inference (which I wrote about here: http://is.gd/3x2e). You display a screen shot and cite a specific observation from testing or analytics, such as "6 out of 8 participants we observed didn't scroll beyond the first screen."

Then you ask the audience to suggest reasons why this might've happened. What was it that made the users behave that way? We use the different answers to show that different inferences could result in different changes to the design. We then talk about how we'd construct research to identify which inference is the one we should design for.


Activity Option #6: Human Bar Charts
Minimum Time: 15 minutes
Goal: To demonstrate the range of individual differences and to collect data on audience diversity

This is a new exercise we just started doing. It has the benefit of demonstrating how people are different, while giving us some data on our audience. We pass out a survey with scales, such as "On a scale of 1 to 5, rate how important these features are to your work" (and then we list 5-10 features that the audience would use).

We've placed the numbers 1 through 5 on the wall. We ask the audience to stand next to the number that represents their rating for each question. It's fun to see people move around, plus it helps you see the areas where everyone agrees and where opinions diverge.

Jeff Patton told me he's done this with two dimensions simultaneously. He created two 1-to-10 axes on the floor, then had attendees in his workshop stand at the intersection of "How well their organization implemented Agile techniques" and "How well their organization implemented UCD techniques". It gave him a great snapshot of how many folks were well versed in both areas. (During the exercise, he used the mic to have some of the "outliers" explain what their organizations were or weren't doing.)


Activity Option #7: KJ Analysis
Minimum Time: 40 minutes
Goal: To identify top issues surrounding a focus question

If you've got 40 minutes and a good wall for post-its, you can do a KJ analysis (http://is.gd/3x32). Posing a focus question (such as "What's the most important change you'd like to see in our product?"), you have groups of 8-10 folks walk through the brainstorming and organizing steps, concluding with ratings.

The largest audience I've done this with is about 340 people (34 teams of 10 in a very large ballroom). Every team worked on the same focus question ("What can we do to improve our field?") and practically every team came up with the same top 3 answers. It was amazing how much consensus there was, even though everyone worked in separate teams.


Hope that all helps.

(Wow! Don't be surprised if this shows up on my blog. I didn't really mean to write so much, but then I really got into it. :) )

Jared

Jared M. Spool
User Interface Engineering
510 Turnpike St., Suite 102, North Andover, MA 01845
e: [EMAIL PROTECTED] p: +1 978 327 5561
http://uie.com  Blog: http://uie.com/brainsparks

________________________________________________________________
Welcome to the Interaction Design Association (IxDA)!
To post to this list ....... [EMAIL PROTECTED]
Unsubscribe ................ http://www.ixda.org/unsubscribe
List Guidelines ............ http://www.ixda.org/guidelines
List Help .................. http://www.ixda.org/help