Example


This is a short report summarizing a test on another Webmonkey prototype for the site's development team. It builds on the previous testing that the site had gone through and focuses on the changes made to the front door and the renaming of various sections in the site.

Executive Summary

Five Web developers were shown the functional prototype for Webmonkey 4.0. In general, they liked it, especially the tutorials and the color scheme, but some of the organization confused them (specifically, the difference between the "Categories" and "Tutorials" sections). The new folder-like navigation metaphor made sense to everyone, and they wished it were on every page. Everyone saw the "Cool Tools" section but thought it was an ad and ignored it, and although they liked the "Inspiration" section, they expected it to be more than just animation in the long run.

Finally, a couple of people said that it would be cool to have Webmonkey link to good, useful external content in an unbiased way, since it would be useful to them and would reinforce Webmonkey's street cred.

Executive summaries are very useful when communicating results. The vice president of product development may never read the whole report, but a couple of paragraphs giving a 50,000-foot view of the results are likely to be read. When sending the report by email, put the executive summary in the body of the message and attach the full report (which should still include the executive summary).

Procedure

Five people who spend a significant amount of time developing Web sites were invited. They were first asked some preliminary questions about their general net usage and where they went for developer information (both on the Web and in general). They were then shown the site prototype and asked to go through it in detail, concentrating on specific features, including the folder-style navigation and the Cool Tools section. After giving their responses to the front door, they were asked to scroll down through one of the top stories, talking about their experience with the interface and their thoughts on the content. They were then asked to look for some specific content as a way of gauging their understanding of the site's layout. Finally, they were asked some wrap-up and blue-sky questions, and the test was concluded.

A brief description of the procedure demystifies the process and gives report recipients the context they need to understand the results.

Evaluator Profiles

Michael

Michael spends more than 40 hours a week on the Internet, 20 hours of which is spent making Web pages, including design, programming, and production. Of all the development sites, he likes Webmonkey because of its "broad range." He also regularly reads "Flash Zone" because it can give him tutorials that he can't get in print. For CGI work, he follows another site, "CGI Resources" (the "CGI Zone"? The specific site wasn't clear from the interview).

John

John spends 30 hours a week on the Internet, half of which is work related. He spends at least 10 hours a week making Web sites, including design, markup, and code. He uses reference books and Webmonkey for technical Web-related information. He also goes to "SGML University" and has never been to builder.com. Most of the time, he goes to these sites with specific questions. In general, he would like developer sites to be better organized by topic.

David

David spends 20–30 hours a week on the Internet; 75% of that time is work related, and 5% to 10% is spent doing Web development, most of which is design. His main sources of technical information are Webmonkey and notes from school. He has never seen builder.com and goes to Webmonkey both for technology updates and for answers to specific questions.

[remaining profiles omitted]

Evaluator profiles are useful both to help the report reader understand the context in which people's statements were made and as a way to personify the participants to those who were unable to observe the tests. Like the user profiles created in Chapter 7, these profiles help personalize the abstract concept of a product's users and make the results that much more immediate.

Observations

General Observations
  1. People like tutorials above all else. All the evaluators were drawn to the tutorials, sometimes to the exclusion of other content. The tutorials section was often the first one mentioned when the evaluators were asked where they would click next. It was also the section people preferred to go to for general information, even though there was a broader range of content in the "Categories" section.

  2. Almost everyone said they liked the color scheme. Without being asked about it, most of the evaluators volunteered that they really liked the color scheme on the homepage.

  3. People generally come to development sites with specific questions in mind, not to see "the latest." When asked whether they go to sites like Webmonkey to catch up on the latest technology or to get answers to specific questions, people generally said that it was to answer specific questions.

Likewise, when asked how they preferred to navigate through sites like Webmonkey, people said that they preferred searching, rather than browsing, since that brought them closer to the specific information they were looking for.

Features
  1. There was confusion between the content people would find in the "Categories" section and the "Tutorials" section (and, to a lesser extent, between the "Tutorials" section and the "Guides" section). Partly because of the ambiguity of the "Categories" name and partly because of the similar—but not completely identical—labels in the two sections, people were confused about what they would find in each.

    Whenever possible, include screenshots.

  2. The "Cool Tools" section was noticed early by nearly everyone, but treated as a big ad and, thus, ignored by most. Until it was pointed out to a number of the evaluators that there was nonadvertising content in "Cool Tools," they did not appear to notice it. Most pointed to the picture and the price as indicators of why it was considered to be advertising content.

  3. A couple of people saw and liked the idea behind the "First Time Here" link.

  4. People didn't know to go to "Backend" for CGI topics and were unsure of what kinds of things would be found there. One person mentioned that he'd prefer it be called "Server Stuff" or something similar.

  5. People didn't notice the reference pop-up on the main page at first, and when they did, they weren't sure about its relationship to the content accessible from the folders in the left-hand margin. However, almost everyone found it to be a useful tool with contents that made sense (except for "ISO Entities"). A couple of people suggested that it be put in the left-hand margin along with the folders.

Navigation
  1. Everyone understood the folder metaphor on the front door.

  2. The inconsistent content and appearance of the left-hand margin navigation were somewhat confusing. A number of people mentioned that they were surprised that the navigation in the left-hand margin changed from the front door to the subsections and the tutorials. Several mentioned that they would have preferred a continuation of the folder metaphor from the front door.

  3. People generally understood the pathname-like breadcrumb navigation at the top of the page, though not everyone noticed it. The biggest disconnect came when people jumped to a "Tutorial" directly from the top page (thus expecting the path to be something like "home/tutorial/javascript") but the path read "home/categories/javascript/tutorial," which did not match their expectation.

Naming
  1. The "Categories" name wasn't clear. People weren't sure what "Categories" was referring to, and one person didn't even see it as a section to be clicked on.

  2. "Guides" wasn't clear as a section title. There was confusion in most of the evaluators between the "Guides" section and the tutorials.

  3. Likewise, "eBiz" wasn't clear. Although not everyone was asked about it, the couple of people who were didn't know what to expect on the other side.

  4. "Heard on the Street" was ambiguous. Without looking at it when the participants were asked to define what the "Heard on the Street" section was and how it was different from the other content sections, most people said that it was a "most recent" section or that it high-lighted some news development.

Conclusion

The combination of the attraction of the tutorials concept and the unclear wording of "Categories" caused people to frequently ignore the "Categories" section entirely.

A lot of the confusion comes from the ambiguity of single-word names. "Categories" and "Guides," although they adequately describe the sections once people have seen them, give people little information beforehand, since, as words, they're quite general. Thus, the naming of sections (and maybe everything on the site in general) has to be done with the user's context in mind. What may, in retrospect, make perfect sense may be confusing and ambiguous before a definition is provided.

Quotations

Michael

"You know the functionality is out there, you just want to know how to put it together."

"When I started going there, it was a beginning site, it was very good for that, but then it kind of stayed there and I moved on." [re: Builder.com]

"I saw 'Cool Tool Pick,' and I thought that this would say GoLive and this would be Dreamweaver and this would be something else."

"If one of them weren't there, it might be easier [to differentiate between them]." [re:"tutorials" vs."categories"]

John

"It stands out without being the obnoxious Wired magazine look."

"If I were coming here to find a specific answer on something, I would go to 'Tutorial.'"

"'Categories' is everything and these are just subtopics."

"I would prefer to see the latest tutorial [rather than Inspiration] at the top."

[remaining quotations omitted]

A couple of sentences of evaluators' actual words often illustrate the points you're trying to convey better than a paragraph of explanation. Readers can then see for themselves the patterns you're describing. When placed next to each point, quotations reinforce each point as it's made. When presented all at once, they communicate the feel of a usability test.

Usability tests are one of the workhorses of user experience research. They can be done quickly and inexpensively, and provide a lot of immediately actionable information. Too often they're used as the only form of user feedback, but when used correctly, they're an invaluable tool.



