The Presentation


The way results are presented is nearly as important as the report that summarizes them, maybe more important. The complexity of a product's user experience is often difficult to understand from a description, and functionality details are often easier to explain with a careful demonstration than with a text description.

But presentations are more than just readings of the report. Creating and delivering effective presentations is an art to which this book can do little justice. From high school speech classes to motivational executive speeches, there is a lot of presentation advice in the world. That said, there are several points that are particularly appropriate to user research presentations.

  • Prepare your audience. Before talking about specifics, the audience should be put into a mind-set that helps them understand the information they're about to get. Apart from giving a quick summary of the procedure, I usually give a variation of the following speech: "The results of this user research will not make your product problem-free and universally loved. They are issues that we observed while talking to people and watching them use your product. By fixing these problems, you will be making an incremental change that we believe will make your product more in tune with the needs and abilities of your audience. But that doesn't change the fundamental nature of the product or the business model that it rests upon. If it's something that people don't want, then they won't want it no matter how likable or easy to use it is."

  • Pick your points carefully. Carefully choose the observations you're going to highlight. For an hour-long presentation, one or two major themes backed up with five to ten important points and another five or so secondary points is a lot of information for most audiences and gives little time for digressions or questions. This means that generally only the "must know" topics get covered.

  • Use real examples. Seeing a problem "live" is much more effective than having it described. The closer the audience gets to seeing the exact behavior or attitude, the better. A videotape of several groups of people saying the same thing or people making the same mistake over and over is much more effective than a description. When that's not possible, a quick demo of the problem using the actual product is also quite effective. If that's not possible, then a description using actual participant names and quotations can be used. For example, "Jim spent three minutes examining the front door before he gave up and said that there must not be a search interface. When it was pointed out to him, he spent another minute apologizing for being 'so stupid' despite the moderator's statements that it wasn't his fault." Any such description or demonstration, however, should not last more than a minute or two.

  • Emphasize the user market's perspective. Whenever people start creating solutions, or even when they try to understand the nature of problems, they naturally begin with their own perspective. Since this is almost never the same perspective as the users', this can lead to erroneous or distorted conclusions by the development team. When results are presented, they should be presented with a strong emphasis on the user perspective. There should be a thematic thread throughout the presentation underscoring the differences between the users' ideas and the ideas of the development team.

  • Use terminology sparingly, and only after defining it. When people hear a lot of words they don't understand, they feel either out of touch or condescended to (or both). Sometimes, however, it's useful for the audience to know a term for a technical concept because it's important in understanding the report. A single sentence defining information architecture as "the discipline of categorizing and organizing information in a way that makes sense to a given audience" can make its use much more palatable. In addition, know the audience's terminology, and highlight differences between their definition and that used in the report. A group of radio editors, for example, considered the word headline to refer only to the title of an important breaking news story, whereas the designers used it to refer to a text presentation style. There was a lot of confusion about the implications of the experience research before the terminology difference was discovered.

  • Use numbers carefully. Humans tend to see numbers as objective and absolute, even when it's explained that they're not. Numbers, histograms, and pie charts look good in presentations, but unless they're backed by hard data, try to avoid them. Otherwise, the audience may fixate on the numbers as an exact measurement rather than as a presentation of ideas.

  • Leave a third of your time for questions. If a presentation ends early and there are no questions, no one will complain, but if an audience has questions and no time is left for discussion, they are likely to be upset. Anticipating what an audience is going to ask and producing "canned" answers can be quite beneficial.

  • Always practice. Do the whole thing from beginning to end by yourself at least once, and at least once in front of someone else.

Presenting to Specific Groups

Different audiences require different presentation approaches. The exact needs of each audience are going to be unique, but there are some generalizations that can be made about certain specific groups.

Engineers

The software engineer's traditional role is as a problem solver. The waterfall development process is completely geared to this, and it's difficult to think about development otherwise. Engineering time spent investigating the underlying causes of problems is seen as more wasteful than time spent creating solutions to ameliorate them. In most cases, problems are assumed to have been understood in the specification stage and need no further research by the time they reach engineering. This perspective creates an engineering culture that's almost entirely focused on solutions, not problems.

When presenting to engineers, this solution-focused perspective should be expected. It's important to prepare the group with instructions on how to understand and interpret the results. Early on in the presentation, I usually say something like, "This is not a laundry list of fixes that should be addressed and forgotten, it's a group of symptoms that point to underlying problems. Sometimes you can cure the problems by treating the symptoms, but often you need to understand the root cause in order to effectively treat the problem." This is intended to get the audience thinking about the larger context of the product's problems rather than sketching code in the report margins.

The "larger context" philosophy can be taken too far, however, and can cause people to conclude that making any change is futile unless the whole product is revamped from scratch (which, in most cases, will never happen). Explanations of wide contextual issues should be balanced with short-term changes that can immediately improve the user experience.

Engineering is a fairly scientific discipline, and engineers are often better versed in the procedures and philosophies of the natural sciences than the social sciences. Thus, they are often dubious of the nonstatistical methods used in much user experience research. Explaining the procedures and why they are valid can anticipate and address many engineers' reservations. Likewise, it's useful to focus on facts rather than introduce hypotheses or opinions since concrete facts will tend to carry more weight with the audience.

Be especially careful about presenting suggestions that have already been discussed and discarded by the engineers, unless your recommendations are derived from a drastically different way of examining the problem. For example, an interface development group discarded creating a portion of an interface using Macromedia Flash because they felt that their users' computers and network connections would be too slow even though it would have provided for an easier development environment and a more elegant experience solution. A survey of the site's audience showed that the vast majority of their primary user base had high-speed network connections and late-model computers. They would likely have the technical capabilities to use Flash, but since the engineering group had already discarded Flash as a solution, its reintroduction required an additional explanation of why it would work.

Engineers also tend to want to solve the problem for all possible cases, which tends to encompass all possible users and uses. "What about users with text-only browsers?" is often heard as a typical example of this perspective. Of course they're right, and it is bad to intentionally cut out any portion of the user population, but the constraints imposed by attempting to satisfy all possible audiences usually require expending more resources than the rewards of catering to those audiences. Telling the engineering audience the reasoning behind the user profile and that user research is merely a prioritization of needs, not a complete list, can help them accept it as a subset of "all possible users."

Visual Designers

Designers tend to fall asleep as soon as you show them a task analysis, so I try to keep them awake at all costs.

—Victoria Bellotti, senior scientist at Xerox PARC (personal email)

Even though visual designers would seem to be of the opposite temperament to engineers, their perspectives overlap in one important aspect. The designers' role has also traditionally been defined as a problem solver. They create solutions to a different class of problems than engineers, but based on specifications nonetheless. They're likely to sketch in the margins, too, but they'll be scribbling identity and interface elements rather than object hierarchies.

In addition to being solution oriented, many visual designers have moved into interaction design from identity design (often from print). The problems facing identity designers more often involve creating a strong initial impact and communicating a single message rather than the sustained series of interrelated messages that software design often requires. Friction can arise because good design of the latter does not imply good design of the former. Preparing an audience of designers may require first explaining and emphasizing the distinction between the identity of a product and its functionality. I sometimes use a writing analogy: the style of the writing and its grammatical correctness are not at odds with each other. Grammar is the functionality of a document, whereas style is its identity. Spell-checking a document does not change its style, and user research is like spell checking.

Note

If e. e. cummings and William S. Burroughs are brought up as counterexamples, point out that the purposes of their art were different from that which most Web sites are designed for. In this literary analogy, most Web sites are more like the Sharper Image catalog than Naked Lunch.

Once this distinction has been established, designers often need to be reassured of their role as problem solvers in this interactive domain. One of the most painful things for a designer (or any other solution-oriented profession) is to work by committee. When created by a group, a solution often ends up with the lowest common denominator, the least elegant option. User experience research does not replace—and should not imply that it is going to replace—any of the designers' problem-solving creativity or their visual authority. Research is not a method of creating solutions, but a method of prioritizing problems. I sometimes say, "It's likely that none of the problems you see here are new to you. What this process does is help prioritize problems before you start solving them. Even if nine of ten users say they hate the bright green background color and love earthtones, that does not mean that you need to make the interface brown. What matters is that users pay attention to color, not their choice of colors."

Designers, in general, are not as concerned with the scientific validity of research as with its face validity (it feels correct) and consistency. Thus, observations and recommendations should immediately make sense and fit into the general model under which a site is being designed.

Whenever possible, use visuals and interactivity to support your observations. Ideally, the designers would actually observe some of the research in real life. Video highlights and live demonstrations of phenomena are also effective, as are screen shots. Jeff Veen, a user experience researcher and designer, prefers to create a workshop environment in the presentation, instead of a lecture. As Jeff says, "It's hard to sit in a conference room and write code, but you can sit in a conference room and move things around." Problems can be examined, and rough functionality can be mocked up quickly. It's possible to iterate quickly based on the results of the research, giving designers concrete things to work on while presenting the results of the research.

Marketing

The practice of marketing is focused on understanding desire. Marketing involves satisfying a target audience's desires by letting them acquire or use a product or service.

The tools used by marketing researchers are often quantitative and comparative. Surveys are used to create lists of competitors and competitive attributes. Since user experience research often focuses on the reasons behind choices, rather than just the choices themselves, it can inform marketing researchers trying to understand the data they've gathered. Marketing communications staff can find the information useful in crafting the image of the product they want to present.

Thus, when user research results are presented to a group of marketers, they should be focused on the reasons behind people's choices. Why are they attracted to the product? What drives them away? What keeps them coming back? What can drive them away forever? User experience research presents a cohesive, consistent mental model that helps explain users' existing behavior and expectations while helping to predict future behavior and uncovering barriers that prevent them from purchasing the product (or joining the service, or whatever is important to the short-term business success of the product). A summary of the mental model should probably be the most important point you make in the presentation.

Upper Management

As an audience, management's perspectives can be greatly varied. Some managers are concerned only with the long-term, large-scale issues that affect their companies and products. Their concerns are strategic, and only the topics that affect their perspective (their "50,000-foot view") are important. Others are much more involved in the minutiae of how their product works and how their business is run. They feel most comfortable when they know the gritty details of how their customers perceive their product.

Effective presentations are geared toward the interests and perspectives of the participating executives. Their agendas (as far as you know them) should be taken into account when crafting your presentation. As the expert source for information affecting the product, you should be aware of what's really important to them.

The easiest way to find this out is to ask them. Arrange for several short phone calls with the key stakeholders who are going to be in the meeting. Introduce the subject of the discussion, and ask them if there are any topics that they're particularly interested in or questions that they've wanted answered. A frequent request is for numerical metrics (as described in Chapter 18); these are used to gauge success and to investigate the value of the research. Finding out which metrics are most critical to management (is it the number of new leads, the number of new people signing up for the service, the amount of resources expended on support?) can set the cornerstone for the report and presentation.

In structuring the actual presentation, don't gloss over or simplify important findings, but present the information in such a way that it can assist managers in decisions that they make. Punctuating discussions of general trends with examples and recommendations is often more effective than presenting lists of recommended changes or completely abstract discussions.

Allow plenty of time for questions. Even with background research on your audience's interests, it's often difficult to predict what information people will need, and since company executives have a broad range of responsibilities and interests, their concerns are especially difficult to predict.

Common Problems

A number of problems regularly occur in presentations that can lead to all kinds of uncomfortable stammering and hedging by the presenter. Anticipating these and preparing for them can reduce a lot of on-the-spot consternation.

  • "This is not statistically significant!" This is often heard when presenting qualitative results from focus groups, user tests, or contextual inquiry. It's true, the results in these research methods are most often not statistically significant, but that's not the purpose of the research. The goal of qualitative research is to uncover likely problems, understand their causes, and create credible explanations for their existence, not to determine the precise proportion of problems within the population at large. Another way to counter this objection is by showing that different kinds of research—say, usability tests and log file analysis—show the same behavior. If several different research techniques show people behaving in the same way, it's much more convincing that that's how they actually behave.

  • Conflicting internal agendas. Sometimes there are conflicting agendas in the company that get revealed through user confusion. For example, there might be friction between the online sales group and the dealer support groups. One is trying to increase the company's direct sales effort while the other is trying to appease a dealer network that feels threatened by the direct sales. The site may reflect the designers' attempts to relieve this tension by creating a bifurcated interface, but this shows up as a major point of confusion to the users. In such a situation, describing the user confusion may trigger a surfacing of internal tensions, with the research results serving as a catalyst. If you do not work for the company, it's best to stay out of the discussion and serve as an impartial resource to both camps. However, it's also important to make sure that all sides understand the meaning and limitations of the research. Defend your findings and clarify the facts of your research, but let the two groups decide the implications of those facts. If you're part of the company, however, then it's especially important to defend the impartiality of the research.

  • "This user is stupid." This can be stated in a number of ways: "the user is unsophisticated," "the user doesn't have the right amount of experience," and so on. If participants were recruited based on a target audience profile that the project team accepted, then the users are members of the target audience, and either their experience needs to be respected as the experience of the target audience or the target audience needs to be specified further.

  • "User X is not our market." Similarly, if the target audience was rigorously defined at the start of the research and the recruiting followed the definition, then User X should be a member of the target audience. If the audience feels otherwise, then it's important to discuss the definition of the target market and how the user does not fit that. It's also useful to discuss whether the observed problems are caused by the specific individual or whether they can occur in a larger segment of the market.

  • "User X did Y; therefore, everyone must do Y" (often followed by a variation of "Told you so!"). This is overgeneralization. Although trends observed in the behavior of several users underscore issues to be studied and phenomena to be aware of, a single data point does not qualify as a trend. Thus, one user's behavior (or even several users' behavior) could be interesting and could point to a potential problem, but it doesn't represent the experiences of the population at large. If there's an internal debate about the prevalence of a certain phenomenon, a single instance one way or another will not resolve it (but it may be a good thing to research as part of a future project). As Carolyn Snyder, principal at Snyder Consulting, says, "From one data point, you can extrapolate in any direction."

  • "They all hated the green, so we need to make it all white, like Yahoo." People, especially solution-oriented people, often fixate on a superficial observation, assuming that addressing it solves the underlying problem. It's the old "treating the symptom, not the disease" problem and should be avoided. Steer discussions away from specific solutions while stressing underlying issues.

  • Be aware of stealth problems. These are frequently severe problems that affect the user experience but that participants don't complain about or are so fundamental to the idea of the product that they're never discussed, by either the developers or the research participants. These problems can be difficult to uncover and discuss without questioning the basic nature of the product. One site, for example, consolidated the process of contacting and donating to nonprofit organizations, yet was itself a for-profit company. People universally rejected its for-profit nature, and the shadow of that rejection was cast over the experience of every user's examination of the site's feature set even though it was rarely mentioned outright. Any discussion of readability or navigation paled in comparison to the importance of discussing how to present the site's fundamental nature such that the audience did not reject it.

Understanding the desires and expectations of the presentation audience is not very different from understanding the needs of a software product's audiences. The media are different, but the issues are the same. In general, the best strategy to overcome people's doubts and answer their questions is to understand them and anticipate their needs. Sometimes, this can be in the form of expectation setting, but mostly it's in preparation. Knowing your audience's agendas and questions is more than just good showmanship; it's critical to moving a research project from being considered interesting information to being seen as an indispensable tool.




Observing the User Experience: A Practitioner's Guide for User Research