Once you've run one survey, don't consider the survey process over. As your site grows and changes, so will your audience and your knowledge of them. Following up with qualitative research and tracking changes in your audience can help guide your other research and let you anticipate your audience's needs rather than just react to them.
Survey research tells you what people feel and think about themselves, their behavior, and your product, but it's too limited a technique to say much about why they feel that way. For that, you need to follow up with qualitative research.
When trying to understand people's values and their causes, one of the best tools is the focus group (described in detail in Chapter 9). For example, if you are running a satisfaction survey and your audience says that they're unsatisfied with a certain feature or features, it's almost impossible to understand from the survey alone why they're unsatisfied. Is it the idea of the feature? Is it the implementation? Is it the way it interacts with other features? It's difficult to know without asking people directly, but without first running a survey, a focus group series may concentrate on a different, less important feature set than what really matters to the audience.
To understand people's actual behavior, rather than how they report their behavior in the survey, direct observation is important. Contextual inquiry (Chapter 8) can reveal the situations in which people make certain decisions, whereas log analysis (Chapter 13) reveals the pure patterns of their actions. If, in your survey, people say that they read online news two to three times an hour, it's possible to get an idea of the accuracy of that by actually observing a group of people for a couple of hours during the day. If only a few follow the "two to three times an hour" pattern, then you may take that fact with a grain of salt when interpreting the results.
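A log-analysis check like the one described can be sketched in a few lines. This is only an illustration: the log records, user IDs, and the two-hour observation window below are all invented, and real log analysis would of course work from actual server logs.

```python
from collections import defaultdict

# Hypothetical access-log records: (user_id, hour) pairs collected
# over a two-hour observation window. These values are invented.
log_entries = [
    ("u1", 9), ("u1", 9), ("u1", 10), ("u1", 10), ("u1", 10),
    ("u2", 9),
    ("u3", 9), ("u3", 10),
]

# Count visits per user per hour.
visits = defaultdict(lambda: defaultdict(int))
for user, hour in log_entries:
    visits[user][hour] += 1

# Flag users whose observed behavior matches the self-reported
# "two to three times an hour" pattern in every observed hour.
def matches_claim(per_hour, low=2, high=3):
    return all(low <= n <= high for n in per_hour.values())

matching = [u for u, per_hour in visits.items() if matches_claim(per_hour)]
print(matching)  # only u1 actually reads two to three times in each hour
```

If only a small fraction of observed users end up in `matching`, that's the signal to weight the self-reported frequency accordingly.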
Usability testing (Chapter 10) and other think-aloud techniques can reveal people's decision making and what functionality leads to their perceptions of the product. If they don't like it, maybe it's because they can't use it. Or maybe they like it because it's fast. Or maybe the speed doesn't matter and they don't like it because the button text is red on black or they can't find what they're looking for. From a survey alone, it's difficult to know what causes people's opinions, but once you know what those opinions are, they help focus the questions of later research.
By running the same survey in the same way at regular intervals, it's possible to track how your site's audience changes. So, for example, as a certain kind of service becomes more popular, it's likely to attract more and more mainstream users. But how many more? What defines "mainstream"? Repeatedly presenting the same survey to a similar number of people who are invited in the same way reveals whether the profiles change and, if they do, in what ways.
If you determine a set of "core" characteristics that define your audience, you can field additional surveys that ask additional questions that deepen your knowledge. So if you determine that the most important factors that define your audience are the level of their computer experience, the frequency of their computer use, and what software they use, you can field surveys that—in addition to asking these questions—ask further questions to probe their preferences, their satisfaction, the common ways they use your product, and so on. Asking all this on one survey may be impossible for purposes of length, but spreading the "noncore" questions among similarly sized groups with a similar composition can give you deeper knowledge than you could acquire otherwise.
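A split-questionnaire design like this can be as simple as giving every respondent the core block plus one randomly assigned "noncore" module. The sketch below is hypothetical: the question and module names are invented, and a real instrument would draw them from your own core characteristics.

```python
import random

# The "core" questions every respondent answers (names are invented).
core_questions = ["computer_experience", "computer_use_frequency", "software_used"]

# Hypothetical "noncore" modules, each short enough to keep the survey brief.
noncore_modules = {
    "preferences": ["favorite_feature", "least_favorite_feature"],
    "satisfaction": ["overall_satisfaction", "would_recommend"],
    "usage": ["tasks_performed", "visit_frequency"],
}

def build_survey(respondent_id, seed=None):
    """Return the question list for one respondent: all core questions
    plus one randomly chosen noncore module."""
    rng = random.Random(seed)
    module = rng.choice(sorted(noncore_modules))
    return core_questions + noncore_modules[module]
```

Because every group answers the same core questions, you can verify that the subgroups have a similar composition before pooling or comparing their "noncore" answers.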
There are times when you want to know how your audience changes in reaction to a specific change. It could be a major interface change, or it could be an advertising campaign. Identical surveys conducted before and after a significant change in a site or its marketing can reveal how users' opinions, or the makeup of the user population, change as a result of the product changes.
A pre/post survey is, as its name implies, run before and after a certain event. The results are compared to see what, if any, effect these changes had on the population. Was a new demographic group attracted to the product after an ad campaign? Are the users more satisfied since the redesign?
Before running a pre/post survey, it's important to determine what variables you will be observing. What do you expect will change as a result of the changes you're about to implement? What do you not want to change? Write your survey with those issues in mind, making sure to include appropriate questions that will address these issues.
It's also important to understand the effects of timing on these surveys, so that the "pre" survey is fielded before the change has begun to affect the audience and the "post" survey when the effects are greatest. When do you expect the most significant change to happen? Will it be immediate, or will it take a while to affect the population? Do you expect there to be a buzz around the changes you're about to make? Taking these things into consideration well ahead of the change can minimize the "noise" in observations between the two groups.
In general, multiple surveys can monitor not just what changes happen in your audience, but how the audience changes. Ideally, you should run two surveys before the change and compare them to get an idea of the natural variation in the way people respond to your survey (the natural bias in people's answers). Several surveys after the change can help you track how the changes progress. For example, running one survey a week after your change and a second one several months later may tell you which changes were short term and which were long term. Even a post-survey a year after a pre-survey is possible if the product does not change in a way significant to what you're testing.
When fielding multiple surveys, the most critical thing is to keep the surveys as similar as possible. Don't change the wording, the presentation, or the way that people are invited to take them. Analyze them the same way. Then, compare the analyses with an eye for the element that you think changed between the two surveys. Say your changes were made to capture a different market—was your market in fact different? Was it different in the ways you had expected?
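One common way to compare two identically fielded surveys is a chi-square test of independence on the response counts for a question. The sketch below computes the statistic by hand for a two-answer question; the counts are invented for illustration, and real data would have more answer categories (and correspondingly more degrees of freedom).

```python
# Compare the answer distribution of one question across the "pre" and
# "post" surveys with a chi-square test of independence (2 x 2 case).
# These counts are invented for illustration.
pre  = {"satisfied": 120, "unsatisfied": 80}   # responses before the redesign
post = {"satisfied": 150, "unsatisfied": 50}   # responses after the redesign

rows = [pre, post]
categories = ["satisfied", "unsatisfied"]
total = sum(sum(r.values()) for r in rows)

# Sum (observed - expected)^2 / expected over every cell, where "expected"
# is the count we'd see if the two populations answered identically.
chi_sq = 0.0
for r in rows:
    row_total = sum(r.values())
    for c in categories:
        col_total = sum(row[c] for row in rows)
        expected = row_total * col_total / total
        chi_sq += (r[c] - expected) ** 2 / expected

# 3.841 is the 5% critical value for 1 degree of freedom.
print(f"chi-square = {chi_sq:.2f}, differs at p < .05: {chi_sq > 3.841}")
```

If the statistic clears the critical value, the two populations answered differently; whether they differ *in the ways you expected* still takes a look at the cross-tabs themselves.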
Again, the most important thing when analyzing the data from multiple surveys is to make sure that you have set out your questions in advance and that you've focused your whole survey effort on answering those questions. Otherwise, you risk snow blindness in the whiteout of data that surveys can generate.
This chapter merely scratches the surface of what surveys can do. The possible combinations of survey methods are limitless. When used carefully with supporting research, they can provide insight into who your audience really is and what they think.
This is a survey that was written to profile the users of a radio network's Web site and to find out the general categories of information that are driving people to go to the site. It was designed to reveal visitors' expectations in order to optimize the presentation of the content and to provide constraints for subsequent qualitative research. Secondary goals were to prioritize site functionality and to perform a basic analysis of the competitive landscape.
| Question (answer format) | Reason |
|---|---|
| [Pop-up] | For consistency with previous survey; to verify news radio listenership |
| [Pop-up] | Comparison with previous surveys; cross-tab vs. functionality; cross-tab vs. reason for visit |
| (Choose only one) [Radio buttons] | Find out general reason for visiting |
| [Pop-up] | Cross-tab with reasons |
| (Choose only one) [Radio buttons] | If general reason is news- or information-related, find out more specific information about cause of visit |
| [Pop-up] | To see which programs people are explicitly coming to see; to see which programs appear in "Other" |
| (Check all that apply) [Checkboxes] | To find out the general topics of interest |
| (Check all that apply) [Checkboxes] | Competitive analysis |
| [Radio button grid with "not valuable," "somewhat valuable," and "extremely valuable" buttons] | To get an idea of the desirability of different kinds of content offerings |
| [Radio button grid with "never," "sometimes," and "often" buttons] | To get an idea of the desirability of different kinds of site features |
| [Radio button grid with "not important," "somewhat important," and "very important" buttons] | What qualities do people value in stories? Timeliness; background; original perspective |
| [Pop-up] (six demographic questions) | All demographics questions, for advertising profiling and to compare with previous survey research, both online and offline |