Interviewing


Most of the research described in this book boils down to one technique: the interview. Observation is critical, but to really know the user's experience, you have to ask him or her about it, and that's an interview. The usability interview—the other tool that's a basic part of nearly all user experience research—differs from the kind of interview an investigative journalist or a prospective employer would hold. It's more formal, more standardized, and—as a kind of nondirected interview—it tries to remove the perspective of the person asking the questions from the interview entirely.

The Interview Structure

Nearly every user experience interview, whether it's a one-person lunchtime chat or a ten-person focus group, has a similar underlying structure. It's an hourglass shape that begins with the most general information and then moves to more and more specific questions before stepping back for a bigger perspective and concluding with a summary and wrap-up. Here is one way of dividing a standard interview process into six phases.

  1. Introduction. All participants introduce themselves. In groups, it's important to know that the other people in the group are somewhat like you in order to feel comfortable, so a group introduction emphasizes the similarities between all the participants, including the interviewer. In contrast, an individual interview introduction establishes the role of the interviewer as a neutral, but sympathetic, entity.

  2. Warm-up. The process of answering questions or engaging in a discussion needs everyone to be in an appropriate frame of mind. The warm-up in any interview is designed to get people to step away from their regular lives and focus on thinking about the product and the work of answering questions.

  3. General issues. The initial product-specific round of questions concentrates on the issues that surround the product and how people use it. The focus is on attitudes, expectations, assumptions, and experiences. Asking these kinds of questions early prevents the assumptions of the product development team from skewing people's perceptions. Often, the product isn't even named during this phase.

  4. Deep focus. The product, or product idea, is introduced, and people concentrate on the details of what it does, how it does it, whether they can use it, and what their immediate experience of it is. For usability testing, this phase makes up the bulk of the interview, but for contextual inquiry, where the point is to uncover problems, it may never enter the discussion.

  5. Retrospective. This phase allows people to evaluate the product or idea in a broader light. The discussion is comparable to the "General issues" phase, but the discussion is focused on how the ideas introduced in the "Deep focus" phase affect the issues discussed earlier.

  6. Wrap-up. This is generally the shortest phase of the interview. It formally completes the interview so that the participants aren't left hanging when the last question is asked, and it brings the discussion back to the most general administrative topics.
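The six phases above can be captured as a simple, reusable template for writing interview scripts. This is a minimal sketch in Python; the one-line purpose summaries paraphrase the text, and the structure itself (a list of name/purpose pairs) is an assumption about how you might organize your own guides, not part of the method:

```python
# A sketch of the hourglass interview structure as a guide template.
# Phase names come from the text; the purpose summaries are paraphrases.
PHASES = [
    ("Introduction",   "Establish roles; emphasize similarity or neutrality."),
    ("Warm-up",        "Shift focus from daily life to the topic at hand."),
    ("General issues", "Attitudes and experiences around the product area."),
    ("Deep focus",     "Details of the product or idea itself."),
    ("Retrospective",  "Revisit the general issues in light of the deep focus."),
    ("Wrap-up",        "Formal close; general administrative topics."),
]

def print_guide(phases=PHASES):
    """Print a numbered outline to use as an interview script skeleton."""
    for i, (name, purpose) in enumerate(phases, start=1):
        print(f"{i}. {name}: {purpose}")

print_guide()
```

A template like this makes it easy to see at a glance whether a draft script follows the hourglass shape before you do a dry run.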

Warning

Do a dry run with every new interview script. Run through it with a colleague or a sample participant, complete with all recording devices and prototypes, and then revise it appropriately.

Nondirected Interviewing

A famous scientist once asked the following question on a survey:

Does your employer or his representative resort to trickery in order to defraud you of a part of your earnings?[1]

This is a leading question. Before you read on, think about what makes this a leading question. What in it implies a "right" answer? What is the actual information the author is trying to elicit? What would have to be different for the question not to be a leading question?

The scientist who wrote it was Karl Marx, and he clearly had an answer that he was expecting, and it wasn't "no."

Leading questions are the bane of all social research since they inject the prejudices of the person asking a question into a situation that should be completely about the perspective of the person answering it. But avoiding directed questioning is easier said than done. It requires constant vigilance on the part of the person asking the questions and a deeply held belief in the need to know people's thoughts unconditionally.

Nondirected interviewing is the process of conducting interviews that do not lead or bias the answers. It's the process of getting at the user's thoughts, feelings, and experiences without filtering those thoughts through the preconceptions of the interviewer.

The Neutral Interviewer

As the person writing and asking the questions in a nondirected interview, your job is to step outside everything you know and feel about your product. Forget all the hard work and creativity. Put away all hopes for success and all fears of failure. Ignore everything you've ever heard or thought about it. See it in a completely neutral light, as if it's not yours at all. It's merely a thing you're asking questions about, a thing that you care nothing about.

This seems harsh, but it's necessary in order to be able to understand the feedback people give you, both positive and negative, and relate that to the process of making the product into what they want and need, not what you think they want and need. Otherwise, you'll always be seeing either the silver lining or the cloud, when you need to be seeing both.

Zen aside, asking questions so as to not bias the respondent's answer involves a lot of self-imposed distance and a rigorously critical examination of your assumptions. This can be especially difficult when the product under examination is one you are intimately familiar with or one you have a lot of interest in. At first, it's going to feel like you're expending a lot of energy not to ask the obvious questions or that your questions are coming out stilted. With some experience, it becomes clearer which questions lead people and how to phrase questions so that you get the most natural responses. Eventually—when you've achieved nondirected question enlightenment—your questions will sound natural, analysis will be easier, and the unbiased answers you get will give you greater confidence in your results.

Composing Nondirected Questions

Most important, every question should be focused on the person answering it. It should focus on experience, not extrapolation. Our understanding of our own behavior rarely corresponds to how we really behave. When we try to put ourselves into others' shoes, we idealize and simplify. That's useful in trying to understand people's ideals, but it's rarely useful in understanding their behavior. A question such as "Is this a useful feature?" can be easily misinterpreted as "In the universe of all things, do you think that someone somewhere could find some use for this feature?" Even if most people take it at face value, the potential of misunderstanding makes all replies questionable. "Is this feature valuable to the work you do right now?" clarifies the perspective.

Similarly, questions should concentrate on immediate experience. People's current behavior better predicts their future behavior than do their predictions. If you ask people "Is this interesting to you?" they may imagine that at some point they could find it interesting and say yes. But the things that are interesting in theory are quite different from the things that people will remember and return to. If they find something compelling right now, they're likely to continue to find it compelling. Thus, the responses to "If it were available right now, would you use it? Why?" will be more useful.

Questions should be nonjudgmental. The person answering the question should not think that you're expecting a specific answer or that any answer is wrong. You can (and should) state this explicitly, but it works better if the question reinforces that view. "Don't you think that this would be better if it were also available on PDAs?" implies that the person asking the question thinks that it would be a good idea and that they will disapprove if they hear otherwise. "If this feature were available tomorrow on PDAs, would you use it?" doesn't imply that there's an expected answer (though it suffers from being a binary question, as described later). An even better approach would be to ask, "Is there any other way you'd like to use a feature like this?" and then prompt to discuss PDAs after they've stated their initial thoughts.

Questions should be focused on a single topic. A question that has an "and" or an "or" linking two ideas leads to ambiguity since it's often unclear which part of the question is being answered. "How would this product be useful to you in school or at work?" is actually two questions. An answer to it may insufficiently differentiate between them.

Keep questions open-ended. If given a limited choice, people will choose one of the options, even if their view lies outside those options or if more than one is acceptable. They'll adjust their definitions of the options and pick the one that's closest to how they feel. But that's not how they really feel. You should always provide an out from a close-ended question, unless you're absolutely sure that the options cover all the possibilities. That's rarely the case since you're most often looking for the shades of meaning. "Which feature from the following list is most important to you?" assumes that there are features that are important, and it assumes that there is one that's more important than any other. A better way would be to say "Rate from 1 to 5 how important each of the following features is to you, where 1 is least important and 5 is most important. Put 0 if a feature is completely unimportant. Write down any features we may have missed" or, ignoring the feature naming scheme entirely, "Does the product do anything that's particularly useful to you? If so, what is it? What makes it useful?"
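The rating-scale rewrite above can be sketched as a structured survey item. Everything here is an illustrative assumption — the field names, the dict-based format, and the feature list are hypothetical, not drawn from any real product:

```python
# A sketch of the rating-scale question as a structured survey item.
# Field names and the feature list are hypothetical illustrations.
rating_item = {
    "prompt": ("Rate from 1 to 5 how important each of the following "
               "features is to you, where 1 is least important and 5 is "
               "most important. Put 0 if a feature is completely unimportant."),
    "scale": list(range(0, 6)),            # 0 provides the "out"
    "features": ["search", "history", "sharing"],  # hypothetical features
    "free_text": "Write down any features we may have missed.",
}

def validate_response(item, answers):
    """Check that every feature got an answer on the allowed scale."""
    return all(answers.get(f) in item["scale"] for f in item["features"])

print(validate_response(rating_item, {"search": 5, "history": 0, "sharing": 3}))
```

Note how the 0 option and the free-text field build the "out" directly into the instrument, so respondents are never forced to pick among options that don't reflect how they feel.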

Avoid binary questions. They're an especially insidious form of close-ended question. Binary questions take the form "yes/no," "true/false," or "this/that," and they force people to make a black-and-white choice when their attitude may not lie near either extreme. "Is this a good product?" misses a lot of the subtlety in people's attitudes. Although it may be nice to get a quick sample of people's off-the-cuff opinions, it's much more valuable to know what they find good and bad about the idea, rather than just whether they think the whole thing is good or bad. "What, if anything, do you like about this product?" gets at the same information without forcing a verdict.
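Some of these wording problems can even be caught mechanically when drafting a script. The sketch below applies a few heuristic regular-expression checks for binary openings, leading phrases, and double-barreled constructions. The patterns are illustrative assumptions, far from exhaustive, and no substitute for a careful read and a dry run of each question:

```python
import re

# Heuristic checks for the question-wording problems described above.
# These patterns are illustrative assumptions, not a validated linter.
CHECKS = [
    (r"^(is|are|do|does|did|would|will|can|could|have|has)\b",
     "binary/close-ended opening: consider an open-ended rewrite"),
    (r"\b(don't you|wouldn't you|isn't it|most people say)\b",
     "leading phrase: implies an expected answer"),
    (r"\b(and|or)\b.*\?",
     "possibly double-barreled: may ask two things at once"),
]

def lint_question(question: str) -> list:
    """Return warnings for a draft interview question (empty if none fire)."""
    q = question.strip().lower()
    return [msg for pattern, msg in CHECKS if re.search(pattern, q)]

print(lint_question("Is this a good product?"))
print(lint_question("What, if anything, do you like about this product?"))
```

Here the first question trips the binary check while the second passes cleanly, matching the rewrites suggested above. A real script review would still need a human eye for loaded words and tone.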

Running a Nondirected Interview

A nondirected interview is conducted just as you would any other interview, except that you have to listen more closely to the meaning of your words and the words of the person you're talking to for signs of bias. There are a number of things you can do to increase the quality of the responses.

Define terms. Words are ambiguous and easily misused. "That thing" can refer to a button, a feature, or the whole site. Personal definitions of words can be different from either the dictionary definition or the development team's definition. Someone may speak of a simple function as a "module," whereas the development team may call complex clusters of functions "modules." When using a technical term, make sure that you clearly define it first. Whenever possible, use the respondent's definition of a word (even if it's not how you use it), but make sure that you understand what that definition is first (which may mean asking the respondent to define it). This is especially important in group interactions, where everyone can come in with different definitions.

Don't force opinions. There are times when we just don't have an opinion about something. We may have never thought about a given question in qualitative terms, or we may not have enough information about it in order to form an opinion. When asked for an opinion, most people will form one, but it's not going to be carefully considered or deeply held. When asking a question that requires an opinion, it's good to make sure that the people answering are likely to have an opinion already. "Would this be better if it were done automatically?" may not make any sense to someone who has no experience with "this."

Restate answers. One of the best techniques to cut through problems with questions is to bounce the respondent's answer back at him or her using different words. It clarifies a lot of the subtlety of terminology and verifies that you've understood the answer and that the respondent understood the question. Immediately after someone has finished a thought, you can say something like "So I hear you saying that ..." and state it as you just understood it, but using different words. However, avoid substituting the "correct" terminology for the words that the person has used. Investigate his or her understanding of the terminology first. So if someone refers to the "order summary," but it's really the "confirmation page," ask the person to elaborate what he or she expects to find on an "order summary" before using the term confirmation page in restating the point.

Follow up with examples, but always wait for an undirected answer first. Sometimes people understand a question, but may not know how to start answering it. If you are precise with your wording, it shouldn't be an issue. Occasionally, though, you may want to ask a question that's intentionally broad, to see how people understand a concept or what their most general thoughts are. Prepare an example (or two) for the questions you feel may need examples. After the participants have given their initial answers, you can refocus their thoughts with an example. Say you're running a focus group that's brainstorming new features. If they're defining features too narrowly and seem to have reached an impasse, you can say, "Now what if it were to email you whenever items you liked were on sale?" and see if the participants can come up with other ideas along the same lines. Don't give more than a couple of examples since that tends to frame people's perceptions too strongly.

Use artifacts to keep people focused on the present and to trigger ideas. Artifacts are the material products of people's work: the notes, the papers, the tools, and so on. Bring participants back to their immediate environment by asking questions that have to do with the physical objects (or the software objects) that they deal with on a regular basis. When someone is talking about "shopping carts" in the abstract, ask about "this shopping cart." When you're in the field and they're talking about how a certain procedure is done, ask them to show it to you with the actual objects. The idealized situation people imagine and discuss in the abstract is often different from the practical situation in which they live, and the objects they use help remind them of the grungy details that are missing from the ideal.

Note

Observers can be present during interviews. Having an observer present makes the interview less intimate, but observers can be useful as note takers or just as a second set of eyes. The extent of their participation should be determined by the moderator, but there generally shouldn't be more than one in-room observer, and he or she should always be introduced. I've found that it works well to create special times when observers are allowed to ask questions.

Be aware of your own expectations. Watch for situations that surprise you or when you find yourself predicting the interviewees' next statement. Despite the exhortations at the beginning of this section, it's impossible to be a blank slate when coming into an interview situation. There are going to be things you assume or expect from the interaction, and these are going to affect how you run the interview. If you're aware of these assumptions, it makes avoiding them easier.

Never say the participant is wrong. Even if someone's understanding of how a product works or what it's for is completely different from what was intended, never tell the person that his or her perspective is wrong. Study the person's perspective and try to understand where it comes from and why he or she has it. It may well be that the person's understanding doesn't match others' or yours, but it's never wrong.

Listen carefully to the questions that are asked of you. Questions reveal a lot about how people understand a product or a situation, and they're important to understanding people's experience and expectations. Probe why people are asking the question. If someone asks, "Is that how it's supposed to work?" for example, answer with a question that reveals more of the person's mental model: "Is that how you think it works?" or "Is that how you expected it to work?"

Keep questions simple, both in language and in intent. Use questions to uncover assumptions and perceptions, not prove points or justify actions. A good question does the minimum necessary to elicit a perspective or view, and no more. Analysis of the answers will provide the meaning that can prove and justify. Questions should focus on getting the clearest raw information.

Always review your tapes. It's easy to miss a key statement or a subtle distinction when relying on your memory and notes. Always spend some time with your tapes—whether audio or video— verifying that your views of the discussion accurately represent what happened and how future research can be conducted better.

Common Problems

  • Close-ended questions that should be open-ended. "Which of these three logos do you like the most?" is not particularly useful if they don't like any of them. "Is there anything you like about any of these logos?" will tell you what underlying characteristics people find compelling, if any. That will allow you to tailor the logo to those characteristics rather than to an arbitrary choice.

  • Questions with complex answers posed as binary questions. "Is the Daily Update an important feature to you?" ignores all the reasons it would or would not be. Maybe they don't plan on checking the site every day, but a weekly update would be great. Maybe there's no need for an update at all. "Is there anything about the Daily Update that you find interesting?" will tell you which parts of it are interesting.

  • Loaded words or words with multiple meanings. Be precise in the words that you use. "When you're trying to find something in a site and you get hopelessly lost, what do you do?" "Hopelessly" is imprecise. It can be interpreted by one person as meaning "pretty lost" and by another as "lost without any possibility of ever finding anything." Rewrite the question as "What do you do if, in the course of looking for something on a site, you realize that you don't know how to get back to an earlier point?"

  • Asking people to predict the future. As mentioned earlier, when people try to project their actions into the future, they often oversimplify and idealize to the extent that their predictions have little to do with what they actually do. People are much better at explaining the reasons for their actions as they're doing them than they are at predicting their actions ahead of time. If you're interested in how someone will behave in a given situation, put him or her into that situation (or a suitable simulation).

  • Invocation of authority or peer pressure. For example, "Most people say that it's pretty easy to find information with this tool. Was that your experience, too?" or "Our designers have a lot of experience making navigation tools, and they came up with this one. How well did it work for you?" These questions can almost always be simplified to the actual question being asked: "Describe your experience using this tool."

  • Assuming you know the answer. I've found myself half-listening to a response to a question, assuming that it's going to be a variant on what I've already heard, only to do a double take when someone answers in a way that I'm totally unprepared for. Sometimes people even use many of the same words as what you're expecting, but a critical negation or spin may reverse or fundamentally change the meaning of what they're saying. Listen carefully to every word.

  • Assuming that they can answer the question. Not everyone knows what they know and what they don't know. If you ask someone whether something is the best in its class, you're assuming that he or she is familiar enough with all the products in the class and that he or she can make a balanced, knowledgeable evaluation of all the products.

Problems don't just arise in the formulation of questions. The interpretation of answers also depends on the way questions are asked. There are a couple of behaviors to watch out for when asking questions, so that you can catch them and follow up quickly, making later analysis less ambiguous.

  • People won't always say what they believe. Sometimes they'll say yes to avoid conflict when they mean no. Watch for the clues about what they really mean. These can take the form of hesitant answers or answers that are inconsistent with previous statements. There can be even more subtle cues, such as someone shaking his or her head no while saying yes or suddenly losing articulation. Attempt to catch such situations as they're happening and ask the person to clarify. Often, just giving the person the floor gives him or her the confidence to say what he or she really means.

  • People will sometimes answer a different question from the one you asked. In a situation where someone is thinking hard about a topic—maybe because he or she is in the middle of a task or trying to remember a situation—he or she may easily mishear the specifics of your question. Sometimes participants have their own agenda and really want to discuss things you're not asking about. Listen carefully for what they're really saying and whether it's directly related to what you're asking. If it's clearly off track, interrupt, and ask the question again, using slightly different wording and emphasis. Don't be afraid to be persistent.

When to Break the Rules

Clearly, following all these rules and suggestions will make for a pretty dry conversation, and that may be worse than the bias it eliminates. People should feel comfortable talking to you and answering questions honestly. You should feel comfortable talking to them.

So take all these rules as suggestions when constructing your questions and try to follow through as much as possible. However, feel free to improvise and humanize your interviews by providing examples or letting the participant "off the hook" if a question seems too difficult to answer as it was posed. An interview can be both nondirected and comfortable. Ultimately, the best interview is the one that provides the information you need when you need it. What it takes to do that will be different in every interview. These rules and guidelines will help you get the best information you can, but only you will know how to implement them appropriately.

Sidebar: Videotaping Interviews

Every single interview and interaction should be videotaped, if at all possible. Many people consider video documentation a fancy form of audio recording. Sometimes that's true, but video can reveal crucial moments in an interaction that just can't be captured on audio. A shrug while someone says yes when he or she really means no can be the key to understanding the person's perspective correctly. A momentary pause of the mouse over one button before clicking on another can reveal the core confusion in a feature. Plus, video frees the moderator from having to simultaneously take notes and think about moderating.

Videotaping is quite inexpensive and, if introduced and placed carefully, quickly disappears into the background for most people, so it's a relatively unobtrusive technique. The video camera can be introduced in the beginning of the interview, placed on a tripod in an inconspicuous location, and the interview can continue normally. The tape then becomes a permanent record that can be mined for critical nuances and exact quotations (both verbal and physical).

Photography uses less equipment and allows you to collect a close-up record of specific items and arrangements in an interview, but it creates a disruptive process where the researcher stops the flow of conversation in order to take a picture. However, in some situations—such as on-location contextual inquiry interviews in security-conscious organizations—it's the only way to document. In those cases, it should be coupled with an audio recording of the interview.


[1] T.B. Bottomore and Maximilien Rubel, eds., Karl Marx: Selected Writings in Sociology and Social Philosophy (New York: McGraw-Hill, 1956), p. 208; as cited in Earl Babbie, Survey Research Methods (Belmont, California: Wadsworth, 1990), p. 37.




Observing the User Experience: A Practitioner's Guide to User Research
ISBN: 1558609237
Year: 2002