
Group Activities to Demonstrate Usability and Design

by Jared Spool

Over at the IxDA Discussion List, Benjamin Ho asked about activities he could use at the end of a presentation he was giving at his company’s annual user conference. That got me thinking about the different exercises we use when we’re training, and it seemed like a good time to share some of them.

Activity Option #1: Making a PB&J Sandwich
Minimum Time: 20 minutes
Goal: To reinforce the importance of clear user assistance

This is a classic. (I first saw it demonstrated in 1972 by my sixth grade English teacher and I think it’s the only thing about her class I retained.) You ask each attendee to write down instructions for assembling a peanut butter & jelly sandwich. Then, taking the raw materials (bread, peanut butter, jelly, a knife) and a randomly chosen set of instructions, you proceed to follow the directions LITERALLY.

For example, if the author never mentions removing the bread from the package, you proceed to assemble the sandwich with the bread still in the bag. “Put the jelly on the bread” is funny in that context. The more literally you interpret the instructions, the funnier it gets. Make the point that this is what real users do when they don’t already know what the author intended.

Activity Option #2: Testing Lego Construction
Minimum Time: 40 minutes
Goal: To reinforce the benefits of usability testing

We use this for training people on simple observation and moderator skills. We purchase inexpensive Lego sets (well, as inexpensive as it gets, like this one) and have small teams conduct a sample usability test, with one person assembling the kit and two others acting as observers (or one as moderator, if we’ve done the training).

If you can’t get the budget for Lego sets, it also works with origami sets (and there’s a ton of origami instructions on the web).

[Image: Lego Police Motorcycle Kit]

Activity Option #3: What’s Changed?
Minimum Time: 10 minutes
Goal: To help participants see the impact of the work you’ve done

This is a good way for people to see how you’ve had an impact on their work. Show before and after screen shots of designs you’ve worked on, without explaining the differences. (Ideally, you can display them simultaneously on two screens or have high-res printouts they can compare side-by-side.)

Have the audience suggest differences. Then, ask them to provide reasons why you might’ve made them. You can compare their rationale to yours. It’s a good opportunity to explain the research you’ve done and how it has influenced your approach to design.

Activity Option #4: The Focus Quiz
Minimum Time: 15 minutes
Goal: To demonstrate how focus can change during observation

We use this to train teams on how to observe during field studies. (I wrote about it back in 2006.) You give each person a different criterion to observe in the room (such as “all the round items”) and ask them to write down the items they find.

Then, you have the people who shared the same criterion name the objects they observed, without revealing the criterion itself. Everyone else tries to guess it. It’s a demonstration of how you notice some things only when you’re actively looking for them.

Activity Option #5: Guess the Reason
Minimum Time: 15 minutes
Goal: To show the differences between observations and inferences

We use this to train teams on the difference between an observation and an inference. You display a screen shot and cite a specific observation from testing or analytics, such as “6 out of 8 participants we observed didn’t scroll beyond the first screen.”

Then you ask the audience to suggest reasons why this might’ve happened. What was it that made the users behave that way? We use the different answers to show that different inferences could result in different changes to the design. We then talk about how we’d construct research to identify which inference is the one we should design for.

Activity Option #6: Human Bar Charts
Minimum Time: 15 minutes
Goal: To demonstrate the range of individual differences and to collect data on audience diversity

This is a new exercise we just started doing. It has the benefit of demonstrating how people are different, while giving us some data on our audience. We pass out a survey with scales, such as “On a scale of 1 to 5, rate how important these features are to your work” (and then we list 5-10 features that the audience would use).

We’ve placed the numbers 1 through 5 on the wall. We ask the audience to stand next to the number that represents their rating for each question. It’s fun to see people move around, plus it helps you see the areas where everyone agrees and where opinions diverge.

Jeff Patton told me he’s done this with two dimensions simultaneously. He created two 1-to-10 axes on the floor, then had attendees in his workshop stand at the intersection of “How well their organization implemented Agile techniques” and “How well their organization implemented UCD techniques”. It gave him a great snapshot of how many folks were well versed in both areas. (During the exercise, he used the mic to have some of the “outliers” explain what their organizations were or weren’t doing.)

Activity Option #7: KJ Analysis
Minimum Time: 40 minutes
Goal: To identify top issues surrounding a focus question

If you’ve got 40 minutes and a good wall for post-its, you can do a KJ analysis. Posing a focus question (such as “What’s the most important change you’d like to see in our product?”), you have groups of 8-10 folks walk through the brainstorming and organizing steps, concluding with ratings.

The largest audience I’ve done this with is about 340 people (34 teams of 10 in a very large ballroom). Every team worked on the same focus question (“What can we do to improve our field?”), and practically every team came up with the same top 3 answers. It was amazing how much consensus there was, even though everyone worked in separate teams.

I’d be interested in any exercises you’ve come up with. Finding new, engaging ways to talk about what we do makes me very happy.