Three months out of college and into a job that required him to sit at a desk most of the day, Jeff realized he was gaining weight. He resolved to start an exercise program.
A friend suggested that Jeff buy a heart rate monitor, a wrist-worn computer that looks like a watch and receives heart rate signals from a chest strap. The system would make it easier for him to track his heart rate and stay within his target zone while exercising.
Jeff had never paid much attention to his heartbeat before, but this device made it easy. He wore the device while working out at the corporate gym. He also wore it during the day and sometimes even to bed. That way, he could have the system store readings at periodic intervals while he slept. Jeff figured his resting heart rate while sleeping would be a good indicator of how much progress he was making in getting aerobically fit.
From the beginning of modern computing, computers were created to be tools that had two basic functions: storing data and performing calculations.[1] The early view of computers was a narrow one. In 1943, Thomas Watson, then chairman of IBM, infamously projected that “there is a world market for maybe five computers.”[2] The idea of a personal computer probably seemed outlandish, if anyone thought of it at all.
Computers as tools have come a long way in just over 50 years, as the opening anecdote illustrates. They perform so many functions, from word processing to bookkeeping to health monitoring, that many of us would feel lost without them. In the future, it will become increasingly clear that computers can be used as persuasive tools, designed to change attitudes and behaviors—to motivate people to exercise, buy more products, donate to charity, stay in touch with family members, or pursue a new career, to name a few potential applications. This chapter focuses on the use of computers as persuasive tools—the first corner of the functional triad (Figure 3.1).
Figure 3.1: Computers as persuasive tools.
[1] In 1946 the ENIAC (Electronic Numerical Integrator and Computer) was introduced. This was the world’s first “fully electronic, general-purpose (programmable) digital computer” (http://www.kurzweilai.net). For a recent book on the ENIAC, see S. McCartney, ENIAC: The Triumphs and Tragedies of the World’s First Computer (New York: Berkley Publishing Group, 2001).
[2] The now-famous 1943 comment by IBM chair Thomas Watson is widely cited.
A persuasive technology tool is an interactive product designed to change attitudes or behaviors or both by making desired outcomes easier to achieve.
For purposes of captology, I define a persuasive technology tool as an interactive product designed to change attitudes or behaviors or both by making a desired outcome easier to achieve. I have identified seven types of persuasive technology tools: reduction, tunneling, tailoring, suggestion, self-monitoring, surveillance, and conditioning technologies.
This chapter describes each of the seven types of persuasive technology tools, discusses the principles underlying them, and provides examples of actual or potential uses of each. Each type of tool applies a different strategy to change attitudes or behaviors. Although I list the seven types as separate categories, in reality a persuasive technology product usually incorporates two or more tool types to achieve a desired outcome.
When a long-distance phone company tries to persuade you to change your carrier, it doesn’t make you fill out forms, cancel your previous service, or sign any documents. You simply give your approval over the phone, and the new company takes care of the details. This is an example of a reduction strategy—making a complex task simpler.
I once used a reduction strategy to persuade my family to write letters to me when I moved to Europe. Before leaving, I presented a gift to each of my family members: a set of stamped envelopes with my new address already written on them—a primitive reduction technology. I hoped that reducing the number of steps required to drop me a note would persuade my family to write to me regularly. It worked.
Reduction technologies make target behaviors easier by reducing a complex activity to a few simple steps (or ideally, to a single step). If you purchase products on Amazon.com, you can sign up for “one-click” shopping. With one click of a mouse, the items you purchase are billed automatically to your credit card, packed up, and shipped off. The reduction strategy behind “one-click” shopping is effective in motivating users to buy things.[3]
Psychological and economic theories suggest that humans seek to minimize costs and maximize gains.[4] The theory behind reduction technologies is that making a behavior easier to perform increases the benefit/cost ratio of the behavior. Increasing the perceived benefit/cost ratio increases a person’s motivation to engage in the behavior more frequently.[5]
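The benefit/cost reasoning above can be expressed as a toy model. This sketch is illustrative only (the linear cost-per-step assumption and the numbers are mine, not from the text): holding perceived benefit fixed while a tool removes steps raises the benefit/cost ratio, and with it the modeled motivation.

```python
# Toy model of the reduction principle: motivation as a benefit/cost ratio.
# The linear "cost per step" and the specific numbers are illustrative
# assumptions, not measurements from the text.

def motivation(benefit: float, steps: int, cost_per_step: float = 1.0) -> float:
    """Perceived benefit/cost ratio for a behavior requiring `steps` actions."""
    return benefit / (steps * cost_per_step)

# Writing a letter the ordinary way: find the address, address the envelope,
# buy a stamp, write the note, mail it -- five steps.
ordinary = motivation(benefit=10.0, steps=5)

# With pre-addressed, pre-stamped envelopes: write and mail -- two steps.
reduced = motivation(benefit=10.0, steps=2)

assert reduced > ordinary  # fewer steps, higher benefit/cost ratio
```

In this model, the gift of stamped, addressed envelopes does nothing to the benefit side; it works purely by shrinking the denominator.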
In the process of simplifying a behavior or activity, reduction technologies also may increase a person’s self-efficacy, or the person’s belief in his or her ability to perform a specific behavior. This, in turn, can help the person to develop a more positive attitude about the behavior, try harder to adopt the behavior, and perform it more frequently.[6]
Principle of Reduction
Using computing technology to reduce complex behavior to simple tasks increases the benefit/cost ratio of the behavior and influences users to perform the behavior.
At capitoladvantage.com, we can find a more detailed example of a reduction strategy. Suppose you wanted to increase grassroots participation in policy making, in how news gets written, and in the political process in general. That’s what the people behind capitoladvantage.com wanted. So they created an online system that makes it simpler for people in the United States to share their views with their elected leaders. The leading product in “cyberadvocacy,” the “CapWiz” system takes the complexity out of sharing your views (Figure 3.2).
Figure 3.2: CapWiz simplifies the process of writing to elected officials.
The goal of CapWiz is “to empower, activate, educate, and mobilize constituencies to influence policymakers and the media to achieve public affairs objectives”[7]—in other words, to get ordinary citizens involved in public affairs. And apparently, this approach is working. On any given day (at the time of this writing), the system sends out between 20,000 and 45,000 constituent messages.[8]
You don’t have to search for a name, address, paper, or stamps. You simply go to a site using the CapWiz system (at the time of this writing these included AOL, Yahoo!, MSN, and USA Today, among others) and enter your zip code. “Write all your elected officials with just one click,” the Capitol Advantage site tells users.
When I wrote my representatives, I found it takes more than one click, but the CapWiz system has reduced the complexity significantly. To further reduce complexity (perhaps too much), organizations can use CapWiz to provide a template letter for their members to send to government officials.
[3] In investigating the effectiveness of Amazon’s one-click shopping method, Stacy Perman concluded that the technique does increase sales, although Amazon would not release any data to her. In the August 2000 issue of Business 2.0 (see http://www.business2.com/articles/mag/0,1640,6864|6925,00.html), she writes:
Amazon.com’s patented one-click technology, which allows shoppers to place an order by clicking a single button, is simplicity itself. We asked Amazon what the feature has done for its business—whether it has increased sales, how many books are sold via one-click, and so on—but the company politely declined to give us any information. Here’s what we know about one-click: Apart from the fact that everyone we know uses it, Amazon’s big rival, Barnesandnoble.com, liked it a lot too and built a similar feature into its site. Amazon felt sufficiently threatened that it asked Barnesandnoble.com to cease and desist. So we’re assuming that one-click has been a success for Amazon—but that it would rather keep just how successful it is several clicks away from the rest of us.
[4] Various theories in both cognitive science and social psychology account for our natural inclinations to do a cost/benefit assessment. One of the most explicit is expectancy theory (or valence-instrumentality-expectancy theory), which posits that behavior results from expectations about what alternatives will maximize pleasure and minimize pain. A noted work in this domain is
V. H. Vroom, Work and Motivation (New York: John Wiley and Sons, 1964; reprinted Malabar, FL: Krieger Publishing Company, 1982).
Much of economics hinges on assessments of cost/benefit. For an overview, see J. Taylor, Economics, 3rd ed. (New York: Houghton Mifflin Company, 2001).
[5] A. Bandura, Self-Efficacy: The Exercise of Control (New York: W. H. Freeman, 1997).
[6] A. Bandura, Self-Efficacy: The Exercise of Control (New York: W. H. Freeman, 1997). For an online article by Bandura on self-efficacy, see http://www.emory.edu/EDUCATION/mfp/BanEncy.html.
[7] The stated goal of CapWiz can be found at http://www.e-advocates.com/aboutus.html.
[8] To see the current volume of messages sent by CapWiz, click on the “current stats” link found at http://capitoladvantage.com.
Another way that computers act as persuasive tools is by leading users through a predetermined sequence of actions or events, step by step. I refer to this strategy as “tunneling.” Using a tunneling technology is like riding a roller coaster at an amusement park: once you board the ride, you are committed to experiencing every twist and turn along the way.
When you enter a tunnel, you give up a certain level of self-determination. By entering the tunnel, you are exposed to information and activities you may not have seen or engaged in otherwise. Both of these provide opportunities for persuasion.
People often put themselves into tunnel situations voluntarily to change their attitudes or behaviors. They may hire personal trainers who direct them through workouts, sign up for spiritual retreats that control their daily schedules, or even check themselves into drug rehab clinics.
Tunneling technologies can be quite effective. For users, tunneling makes it easier to go through a process. For designers, tunneling controls what the user experiences—the content, possible pathways, and the nature of the activities. In essence, the user becomes a captive audience. If users wish to remain in the tunnel, they must accept, or at least confront, the assumptions, values, and logic of the controlled environment.
Finally, tunneling technologies are effective because people value consistency. Once they commit to an idea or a process, most people tend to stick with it, even in the face of contrary evidence.[9] This is particularly true in the case of tunnel situations that people have freely chosen.
Software installation provides a good example of tunneling technology. For the most part, installing software today is simple; the computer takes you through the process, step by step. At a few points in the installation tunnel, you can choose which aspects of the application to install and where, but you are still in the tunnel.
This is where the potential to persuade comes in. To persuade you that the application is a great product with many features you’ll appreciate, an installation program may give you a promotional tour while the software is being copied onto your hard drive. The tour may congratulate you for making a smart choice, point out how the software will help you, and show people happily using the features. It may even advertise other company products. Because you’re in the installation tunnel, you are a captive audience, seeing information you would not have seen otherwise.
Registration on Web sites is another form of tunneling. To gain access to many sites’ services or content, users must go through a registration process.
During the registration process at eDiets.com, currently the leading diet Web site, the Web page gathers information about you while presenting offers for premium services or other products. After the program asks questions about your attitude toward weight loss (“I find I often eat in response to tension and stress”—strongly agree, agree, slightly agree, and so on), it then offers you an audiotape program designed “to create the mental changes needed for success.”[10] The point is that the eDiets tunnel leads users through a step-by-step process that enables them to identify weaknesses in resolve, creating a need that is immediately filled by the audiotape offer.
Principle of Tunneling
Using computing technology to guide users through a process or experience provides opportunities to persuade along the way.
Tunneling can be an ethical and useful persuasion strategy. A retirement software program could lead users through the various steps of analyzing their financial situation, setting financial goals, and taking action to meet these goals. A health site could lead users through a series of questions designed to identify poor health habits and take steps to improve them.
On the other hand, tunneling can include negative or even unethical elements. Back to our software installation example: Many software programs include product registration as part of the installation procedure. At some point the program asks you to enter your personal information—your name, company, and other contact information. Often, it seems there is no easy way to avoid giving away your personal information and still successfully complete the software installation. Depending on the nature and scope of the personal information demanded to complete the installation, some programs might be considered unethical since they essentially put you into a corner where you have little choice: either give up the personal information or risk a faulty installation. (One unintended consequence is that some users who want to maintain their privacy simply give false information.)
In the worst-case scenarios, tunneling technologies may border on coercion, depending on how they are designed. To avoid coercion, designers of tunneling technology must make clear to users how they can exit the tunnel at any time without causing any damage to their system.
[9] S. Plous, The Psychology of Judgment and Decision Making (New York: McGraw-Hill, 1993).
[10] The text from eDiets.com was found at http://www.ediets.com/dietprofile/dietprofile20.cfm.
A tailoring technology is a computing product that provides information relevant to individuals to change their attitudes or behaviors or both. Tailoring technologies make life simpler for computer users who don’t want to wade through volumes of generic information to find what’s relevant to them.
Psychology research has shown that tailored information is more effective than generic information in changing attitudes and behaviors.[11] Much of the research has taken place in the area of health interventions, in which information has been tailored to match people’s education level, type and stage of disease, attitude toward the disease, and other factors.
Tailoring technology can be embedded in a variety of persuasive technology products. One example: A word processing application might suggest that you increase your working vocabulary by learning a word each day (the program has noticed that you use a relatively small set of words). You might be more motivated to follow up on this suggestion if the application provided tailored information showing the limited range of your working vocabulary, as well as a comparison chart that shows that you are well below the vocabulary level of others in your profession.
The Web offers some good examples of tailoring information to individuals to achieve a persuasive result. Consider Chemical Scorecard (Figure 3.3) found at scorecard.org. Created by Environmental Defense (formerly known as the Environmental Defense Fund), this site encourages users to take action against polluting organizations and makes it easy to contact policy makers to express concerns.[12]
Figure 3.3: The Web site scorecard.org provides tailored information in order to persuade visitors to take action against polluters.
When users enter their zip code, the site lists names of the polluting institutions in their area, gives data on chemicals being released, and outlines the possible health consequences. The technology can also generate a map that enables you to see the location of pollution sources relative to where you live, work, or attend school.
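The mechanics behind this kind of tailoring are easy to sketch. The following minimal Python version is hypothetical (the data, field names, and figures are invented placeholders, not Scorecard's actual database or API); it shows the core move of filtering a generic dataset down to what is relevant to one user's zip code.

```python
# Hypothetical sketch of zip-code tailoring. The records and field names
# below are invented placeholders, not Scorecard's actual data or API.

POLLUTION_DB = [
    {"zip": "94301", "company": "Acme Chemical", "chemical": "benzene", "lbs_per_year": 4200},
    {"zip": "94301", "company": "Widget Corp", "chemical": "toluene", "lbs_per_year": 900},
    {"zip": "10001", "company": "Eastside Plating", "chemical": "chromium", "lbs_per_year": 150},
]

def tailored_report(zip_code: str) -> list[dict]:
    """Return only the polluters relevant to this user's area,
    sorted so the biggest emitters appear first."""
    local = [row for row in POLLUTION_DB if row["zip"] == zip_code]
    return sorted(local, key=lambda row: row["lbs_per_year"], reverse=True)

for row in tailored_report("94301"):
    print(f'{row["company"]}: {row["lbs_per_year"]} lbs/year of {row["chemical"]}')
```

The persuasive work here is done not by new information but by relevance: the same database row reads very differently when it names a polluter down the street.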
This tailored information can be compelling. The report I generated for my area of Silicon Valley identified hazards I didn’t know about, and it identified the companies that were the major offenders. To my surprise, the polluters included a few companies with stellar reputations, including one of my former employers.
I also learned from Chemical Scorecard that exercising in the neighborhood of my YMCA may not be a good idea. A company located next to my running path emits almost 10,000 pounds of dichlorofluoroethane each year; this chemical is a suspected cardiovascular toxicant. Such tailored information can influence people to change attitudes and behavior. It certainly changed mine; after consulting the site, I began to run on a treadmill inside the gym, rather than outside.
Many sites provide tailored information for commercial purposes. More and more e-commerce Web sites are suggesting additional items for consumers to buy, based on information gathered in previous visits. This form of tailoring can be effective. Not only can a site recommend more products to buy when the customer revisits; if the customer opts in, it also can email discount coupons, offer newsletters to keep customers informed of new products and promotions, or use other online techniques to persuade customers to do more business with the site.
Principle of Tailoring
Information provided by computing technology will be more persuasive if it is tailored to the individual’s needs, interests, personality, usage context, or other factors relevant to the individual.
It’s not surprising that tailored information is more effective. But what may be surprising is that the mere perception that information has been tailored is likely to make a difference, according to some scholars.[13] In other words, information doesn’t have to be personally relevant; it just has to appear that way.
Why does this work? When people believe messages are tailored for them, they pay more attention.[14] They will then process the information more deeply, and—if the information stands up to scrutiny—they will be more likely to be persuaded.[15]
Unfortunately, the fact that people are more likely to be persuaded if they simply perceive that information has been tailored for them enables designers to apply tailoring techniques in unethical ways. Suppose an interactive financial planning product gathers information about the user, then recommends that he invest mostly in tax-free bonds. The research on tailoring suggests the user will consider this path seriously. In reality, the advice engine may give everyone the same information—or, even worse, it may advise potential investors according to what would provide the greatest profit to the company behind the service. But the appearance of taking the user’s special needs into account will make the advice more compelling.
Chemical Scorecard tailors information to individuals, but it does not tailor information for context. That’s the next big step for this and other tailoring technologies. In the case of Chemical Scorecard, tailoring information for context would mean taking the information from the system’s databases and providing it to people during the normal routines of life.
Imagine a young couple shopping for a home. A tailoring technology in their car could inform them about the environmental status of the neighborhoods they are considering. Or a portable tailoring technology could inform me about toxic chemicals anywhere I jog.
Conceptually, it’s easy to make the leap from personalized information to contextualized information. But from a technology and practical standpoint, there’s a long way to go to make this a reality. To deliver contextualized information, the technology would have to not only locate you but also determine, among other things, whether you are alone or with others, what task you are performing, whether you are in a rush or at leisure, and what kind of mood you are in. All of these are important elements in determining an effective persuasion strategy. Then there are practical and social issues such as who will pay for the required technology and how privacy will be maintained. As such hurdles are overcome, tailoring technologies will have a greater impact on attitudes and behavior.
[11] For a review of tailoring in the context of computer technology, see H. B. Jimison, Patient specific interfaces to health and decision-making information, in R. L. Street, W. R. Gold, and T. Manning (eds.), Health Promotion and Interactive Technology: Theoretical Applications and Future Directions (Mahwah, NJ: Lawrence Erlbaum, 1997). See also:
a. V. J. Strecher et al., The effects of computer-tailored smoking cessation messages in family practice settings, J Fam Pract, 39: (1994).
b. C. S. Skinner, V. J. Strecher, and H. Hospers, Physician recommendations for mammography: do tailored messages make a difference? Am J Public Health, 84: 43–49 (1994).
c. M. K. Campbell et al., The impact of message tailoring on dietary behavior change for disease prevention in primary care settings, Am J Public Health, 84: 783–787 (1994).
d. M. W. Kreuter and V. J. Strecher, Do tailored behavior change messages enhance the effectiveness of health risk appraisal? Results from a randomized trial, Health Educ. Res., 11(1): 97–105 (1996).
e. James O. Prochaska and John C. Norcross, Changing for Good (Avon Books, 1995).
[12] See http://www.scorecard.org.
[13] For example, see J. R. Beniger, Personalization of mass media and the growth of pseudo-community, Communication Research, 14(3): 352–371 (1987).
[14] S. J. Ball-Rokeach, M. Rokeach, and J. Grube, The Great American Values Test: Influencing Behavior and Belief through Television (New York: Free Press, 1984). See also J. R. Beniger, Personalization of mass media and the growth of pseudo-community, Communication Research, 14(3): 352–371 (1987).
[15] R. E. Petty and J. T. Cacioppo, Attitudes and Persuasion: Classic and Contemporary Approaches (Dubuque, IA: Wm. C. Brown, 1981).
One soggy winter day, 15 students stood on the edge of a bridge that spans Highway 101 in Palo Alto, California. Each student held a poster painted with a bold orange letter. Lined up in order, the letters spelled out a simple but provocative message for the Silicon Valley drivers below: “W-H-Y N-O-T C-A-R-P-O-O-L?” The automobiles below were moving at a snail’s pace, bumper to bumper. However, one lane was nearly empty: the carpool lane. It’s hard to imagine a driver trapped in the rush hour crawl who didn’t—at least for a moment—reconsider his or her commute strategy. “Yeah, why not carpool? I could be home by now.”
Principle of Suggestion
A computing technology will have greater persuasive power if it offers suggestions at opportune moments.
This anecdote illustrates the potential impact of making a suggestion at the most appropriate time. That’s the principle behind another type of persuasive tool that I call “suggestion technology,” which I define as an interactive computing product that suggests a behavior at the most opportune moment. To be viable, a suggestion technology must first cause you to think, “Should I take the course suggested here, or should I continue along my current path?”
The dynamics underlying suggestion technology date back at least 2,000 years, to a principle of persuasion called kairos. Discussed by ancient Greek rhetoricians, kairos means finding the opportune moment to present your message.[16] (In Greek mythology Kairos was the youngest son of Zeus and the “god of the favorable moment.”)
Suggestion technologies often build on people’s existing motivations—to be financially stable, to be healthy, to be admired by others. The suggestion technology simply serves to cue a relevant behavior, essentially saying, “Now would be a good time to do X”—to get out of growth stocks and into government bonds, to change the air filter in your home’s heating system, to send a card to a friend you haven’t seen in a while, to call a customer to see if she needs more of your product. For the technology to be successful, the suggested action must be compelling and timely enough that you implement it.
One familiar example of a suggestion technology is the Speed Monitoring Awareness and Radar Trailer (SMART),[18] a portable trailer (Figure 3.4) that can be placed at the side of the road to monitor the speed of oncoming vehicles. If you’ve seen SMART before, you’ve likely seen it in school zones and neighborhoods where drivers tend to exceed the posted speed limit.
Figure 3.4: The SMART trailer is designed to influence drivers by using information comparison as a suggestion.
As a driver approaches the trailer, SMART senses how fast the car is traveling, as far away as about 90 yards. It then displays the car’s speed on a large output device, big enough that the driver can read it from afar. In most versions of this device, the trailer also shows the speed limit for the street, allowing drivers to compare their actual speed with the posted limit.
The goal of SMART is to suggest that drivers reevaluate their driving behavior. It creates a decision point about driving speed at the right time—when people are driving too fast.
SMART doesn’t make an explicit suggestion; the suggested behavior is implicit: drive within the posted speed limit. The motivation to act on the suggestion comes from within the driver—either a fear of getting a speeding ticket or a sense of duty to drive safely.
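SMART's feedback loop is simple enough to express in code. The sketch below is a hypothetical reconstruction (the posted limit, the trigger condition, and the display wording are my assumptions, not the trailer's actual firmware): read the approaching car's speed and show it beside the limit, so that the comparison itself carries the implicit suggestion.

```python
# Hypothetical sketch of the SMART trailer's display logic. The posted
# limit and the exact wording are assumptions for illustration only.

SPEED_LIMIT_MPH = 25  # assumed posted limit for the street

def display_message(measured_mph: int, limit_mph: int = SPEED_LIMIT_MPH) -> str:
    """Show the driver's speed next to the posted limit. The comparison is
    the suggestion -- no explicit command is issued."""
    return f"YOUR SPEED: {measured_mph}  |  SPEED LIMIT: {limit_mph}"

def is_opportune_moment(measured_mph: int, limit_mph: int = SPEED_LIMIT_MPH) -> bool:
    """The 'right time' to suggest: only when the driver is going too fast."""
    return measured_mph > limit_mph

if is_opportune_moment(34):
    print(display_message(34))  # driver sees 34 vs. 25 and draws the conclusion
```

Note that the trigger condition encodes the kairos idea: the device makes its suggestion precisely when the driver is speeding, and stays purely informational otherwise.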
Timing is critical for a suggestion technology to be effective. The technology must identify the right time to make the suggestion. But what is the “right” time?
Although classical rhetoricians emphasized the importance of kairos in persuasion, they did not leave behind practical guidelines on how to recognize or create moments that would be most opportune. However, psychologists have identified some characteristics that define opportune moments of persuasion: When people are in a good mood, they are more open to persuasion.[19] When they find their current world view no longer makes sense, people are more open to adopting new attitudes and opinions.[20] In addition, people are more likely to be persuaded to comply with a request when they can take action on it immediately or when they feel indebted because of a favor they’ve received,[21] a mistake they have made,[22] or a request they recently denied.[23]
These are simple examples of opportune moments. In reality, the timing issues in persuasion are not easily reduced to guidelines. Timing involves many elements in the environment (ranging from the physical setting to the social context) as well as the transient disposition of the person being persuaded (such as mood, feelings of self-worth, and feelings of connectedness to others).
To illustrate the difficulty of creating opportune moments of persuasion, consider a concept that two students in my Stanford lab explored,[17] using Global Positioning System (GPS) technology to identify a person’s location. Theoretically, by using GPS you could create a suggestion technology to persuade a person to do something when she is at a specific location.
The students created a prototype of a stuffed bear that McDonald’s could give away to children or sell at a low price. Whenever the bear came near a McDonald’s, it would begin singing a jingle about French fries—how delicious they are and how much he likes to eat them.
The toy was never implemented, but you can imagine how kids could be cued by the bear’s song and then nag the parent driving the car to stop by McDonald’s. You could also imagine how the technology might backfire, if the parent is in a hurry, in a bad mood, or doesn’t have the money to spend on fast food. The point is, while the geography may be opportune for persuasion, the technology doesn’t have the ability to identify other aspects of an opportune moment: the parent’s state of mind, financial situation, whether the family has already eaten, and other variables. (In this example, there also are obvious ethical concerns related to manipulating children—a topic I’ll explore in Chapter 9.)
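The students' location trigger amounts to a geofence check. Here is a minimal sketch under stated assumptions (the coordinates, the 200-meter radius, and all function names are hypothetical; the text does not describe the prototype's actual implementation):

```python
# Hypothetical geofence trigger, in the spirit of the students' prototype.
# Coordinates, the radius, and the trigger itself are invented for illustration.
import math

EARTH_RADIUS_M = 6_371_000

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def near_restaurant(bear_lat, bear_lon, site_lat, site_lon, radius_m=200):
    """True when the toy is inside the trigger radius of a restaurant --
    the only 'opportune moment' signal the device can actually sense."""
    return distance_m(bear_lat, bear_lon, site_lat, site_lon) <= radius_m

# A nearby location triggers the jingle; a distant one does not.
assert near_restaurant(37.4440, -122.1600, 37.4445, -122.1600)      # tens of meters away
assert not near_restaurant(37.4440, -122.1600, 37.5000, -122.1600)  # several km away
```

What the sketch lacks is precisely the point: position is the only input, so the parent's mood, finances, and mealtime, the rest of what makes a moment opportune, remain invisible to the device.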
Suggestion technology can be used for macrosuasion, as in the case of SMART, whose purpose is to promote safe driving. It also can be used for microsuasion—persuasion that’s part of a larger interactive system. A personal finance application may suggest that you pay your utility bill today, a software management system could suggest that you back up your data soon, or an electronic reminder service may suggest that you buy and ship your mother’s birthday gift early next week. The key to the success of such technology applications is creating a decision point at or near the time when it’s appropriate to take action.
[16] James Kinneavy and Catherine Eskin, Kairos in Aristotle’s Rhetoric, Written Communication, 11(1): 131–142 (January 1994). See also Stephen P. Witte, Neil Nakadate, and Roger D. Cherry (eds.), A Rhetoric of Doing: Essays on Written Discourse in Honor of James L. Kinneavy (Carbondale, IL: Southern Illinois University Press, 1992).
For more on kairos, see http://www.sagepub.co.uk/journals/details/issue/sample/a009710.pdf.
[18] You’ll find more information about these trailers at http://www.kustomsignals.com.
[19] For more details on the effects of moods on persuasion processes, see the following:
a. Diane M. Mackie and Leila T. Worth, Feeling good, but not thinking straight: The impact of positive mood on persuasion, in J. Forgas (ed.), Emotion and Social Judgments (Oxford: Pergamon Press, 1991), 201–219.
b. Richard E. Petty, Faith Gleicher, and Sara M. Baker, Multiple roles for affect in persuasion, in J. Forgas (ed.), Emotion and Social Judgments (Oxford: Pergamon Press, 1991), 181–200.
c. N. Schwarz, H. Bless, and G. Bohner, Mood and persuasion: Affective states influence the processing of persuasive communications, Advances in Experimental Social Psychology, 24: 161–199 (1991).
d. Joel B. Cohen and Charles S. Areni, Affect and consumer behavior, in T. Robertson and H. Kassarjian (eds.), Handbook of Consumer Behavior (Englewood Cliffs, NJ: Prentice Hall, 1991), pp. 188–240.
[20] Unless the circumstances are unusual (such as belonging to a doomsday group whose predictions fail) or people feel personally threatened, what psychologists call “disconfirming information” will lead people to experience cognitive dissonance and seek new beliefs and actions that are more consistent. The classic work in this area is L. Festinger, A Theory of Cognitive Dissonance (Stanford, CA: Stanford University Press, 1957).
For a more recent treatment of dissonance and consistency issues in persuasion, see Chapter 5 (“Motivational Approaches”) in R. Petty and J. Cacioppo, Attitudes and Persuasion: Classic and Contemporary Approaches (Dubuque, IA: Wm. C. Brown Publishers, 1981).
For studies of how self-worth mediates assimilation of disconfirming information, see Geoffrey L. Cohen, Joshua Aronson, and Claude M. Steele, When beliefs yield to evidence: Reducing biased evaluation by affirming the self, Personality & Social Psychology Bulletin, 26(9): 1151–1164 (2000).
[21] For more on how the rule of reciprocity works as well as its role in compliance, see the following:
a. A. W. Gouldner, The norm of reciprocity: A preliminary statement, American Sociological Review, 25: 161–178 (1960).
b. M. S. Greenberg, A theory of indebtedness, in K. Gergen, M. S. Greenberg, and R. H. Willis (eds.), Social Exchange: Advances in Theory and Research (New York: Plenum, 1980), pp. 3–26.
c. M. S. Greenberg and D. R. Westcott, Indebtedness as a mediator of reactions to aid, in New Directions in Helping, Vol. 1, (Orlando, FL: Academic, 1983), pp. 85–112.
[22 ]For more on how people comply with requests in order to affirm their self-concept after it has been threatened, see
a. Claude M. Steele, The psychology of self-affirmation: Sustaining the integrity of the self, in L. Berkowitz (ed.) et al., Advances in Experimental Social Psychology, Vol. 21: Social Psychological Studies of the Self: Perspectives and Programs (1988), 261–302.
b. Amy Kaplan and Joachim Krueger, Compliance after threat: Self-affirmation or self-presentation? Current Research in Social Psychology, Special Issue, 4(7) (1999).
c. For an online literature review by Kaplan, see http://www.uiowa.edu/~grpproc/crisp/crisp.4.7.htm.
[23 ]For a relatively recent meta-analysis of “door in the face” research, see Daniel J. O’Keefe and Scott L. Hale, An odds-ratio-based meta-analysis of research on the door-in-the-face influence strategy, Communication Reports, Special Issue: 14(1): 31–38 (2001).
[24 ]My Stanford students Daniel Berdichevsky and Kavita Sarin carried out early explorations into location-based persuasion.
The next type of persuasive technology tool is self-monitoring technology. This type of tool allows people to monitor themselves to modify their attitudes or behaviors to achieve a predetermined goal or outcome. Ideally, self-monitoring technologies work in real time, giving users ongoing data about their physical state (or inferences about their mental state, based on physical feedback), their location, or their progress on a task. The goal is to eliminate the tedium of measuring and tracking performance or status. This makes it easier for people to know how well they are performing the target behavior, increasing the likelihood that they will continue to produce the behavior. [25 ]
In addition, self-monitoring technologies feed the natural human drive for self-understanding. [26 ]Like personality inventories or aptitude tests, self-monitoring technologies can help people learn about themselves. For this reason, using self-monitoring technologies may be intrinsically motivating. [27 ]
Heart rate monitors (Figure 3.5) are a good example of a self-monitoring technology designed to change behavior. Often worn directly on the body, these devices monitor a person’s heart rate during exercise. By using these wearable computers, users can track their heart rate accurately and easily.
Figure 3.5: Heart rate monitors allow people to modify their exercise levels to achieve a target heart rate.
Heart rate monitors help people modify their physical behavior so their heart rate stays within a predetermined zone. The more advanced devices make it easier to stay within the desired zone by sounding an audio signal when your heart beats too quickly or too slowly, so you know whether to decrease or increase your level of exertion.
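The feedback logic behind such a device is simple to state in code. Here is a minimal sketch; the function name and zone boundaries are illustrative, not taken from any actual product:

```python
def zone_feedback(heart_rate, low=120, high=150):
    """Return an audio cue when the heart rate drifts outside the
    target zone [low, high]; stay silent while the user is in the zone."""
    if heart_rate > high:
        return "beep: slow down"   # exertion too high
    if heart_rate < low:
        return "beep: speed up"    # exertion too low
    return None                    # in the zone: no signal needed
```

Run in a loop against live sensor readings, this is all the device needs to relieve the user of consciously tracking the number.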
Heart rate monitors not only help people modify their behaviors, they also change attitudes in two ways. First, because the technology makes it easy to track your heart rate, you no longer need to focus on how far you’ve jogged or how fast you’re going; you simply monitor your heart rate, which is the best indicator of an effective workout. Using a heart rate monitor shifted my attitude about exercise; I became more concerned about my heart rate than about adhering to a specific exercise regimen. Having a tool like a heart rate monitor can also change a person’s general attitude about exercise. Because the device provides information on the person’s physiological status, working out can be more interesting and, for some people, more fun.
Principle of Self-Monitoring
Applying computing technology to eliminate the tedium of tracking performance or status helps people to achieve predetermined goals or outcomes.
Sometimes self-monitoring tools can be quite specialized. Tanita Corporation markets a jump rope (Figure 3.6) with a built-in monitor that lets users know how many calories they’ve burned as well as how many jumps they’ve completed. [28 ]Not only does the device make it easier to track a person’s level of exercise, but getting concrete feedback from the device likely provides motivation to perform the activity.
Figure 3.6: The HealthyJump product tracks calories burned while jumping rope.
A team of my students [29 ]created a conceptual design that illustrates how self-monitoring technology could work to change language behavior. They targeted a behavior that they themselves had problems with: using the word “like” too often (“I went to class and it was, like, so crowded” and “I was, like, ‘Wow, I can’t find a place to sit.’”). The student team knew they and most people in their age group had this language quirk. They were worried about speaking this way in job interviews or on the job: “It’s, like, I’ve almost completed the presentation for tomorrow’s client meeting.”
In my students’ conceptual design, they showed how someone could use next-generation mobile phone systems to self-monitor and eliminate or reduce this language quirk. While their conceptual design included various facets, the essence of the idea is that a word recognition system would listen to them as they talked on the mobile phone. Whenever they used the word “like,” the phone would give a signal, making them aware of it. The signal could be a vibration or a faint audio signal that only the speaker could hear. In this way, the speaker could be trained to use the word “like” less frequently.
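Assuming a speech recognizer that delivers a stream of transcribed words, the monitoring step itself is trivial. A hypothetical sketch (the recognizer is assumed, not implemented):

```python
def like_alerts(recognized_words):
    """Yield the position of each standalone 'like' in a stream of
    recognized words -- each moment the phone would vibrate or chirp.
    (A real system would get these words from a speech recognizer.)"""
    for i, word in enumerate(recognized_words):
        if word.lower().strip(".,!?'\"") == "like":
            yield i
```

The persuasive work is done not by the detection but by the immediate, private signal that makes the speaker aware of the habit in the moment.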
[25 ]Various theories support the idea that people are more likely to do things that are easy to do. See, for example, A. Bandura, Self-Efficacy: The Exercise of Self-Control (New York: W. H. Freeman, 1997).
[26 ]L. Festinger, A theory of social comparison process, Human Relations, 7: 117–140 (1954).
[27 ]The concept of intrinsic motivation—the natural desire to do something because it is inherently rewarding—is discussed in more detail in Chapter 8.
[28 ]You can find out more about the HealthyJump jump rope at http://tanitascale.com/healthyjump.html.
While self-monitoring technology enables individuals to learn about themselves, surveillance technology enables them to learn about others. For the purposes of captology, surveillance technology is defined as any computing technology that allows one party to monitor the behavior of another to modify behavior in a specific way. [30 ]
Of all the types of persuasive technology tools in this chapter, surveillance technology is the most common in today’s marketplace. There are applications for tracking how employees use the Internet, how teenagers drive, how phone support workers serve customers, and many more.
As early as 1993, one researcher reported that 26 million American workers were monitored through their computers. [31 ]Another figure from 1998 showed that two-thirds of major U.S. companies electronically monitor their workers. [32 ]And in 2001, a survey released by the American Management Association reported that over 77% of major U.S. firms used some form of high-tech workplace surveillance, a number that they say had doubled since 1997. [33 ]One reason that surveillance technology is so common is that it works: Surveillance has long been an active research topic in the field of social psychology, and the overwhelming conclusion is that observation has powerful effects on how people behave. When people know they’re being watched, they behave differently. According to the research, if others can observe a person’s actions and can reward or punish the person for them, the person is likely to make his actions meet the observer’s expectations. [34 ]
Principle of Surveillance
Applying computing technology to observe others’ behavior increases the likelihood of achieving a desired outcome.
Hygiene Guard is one example of a surveillance technology. [35 ]The system (Figure 3.7), which monitors hand washing, is installed in employee restrooms to make sure workers follow hygiene rules. The system uses sensor technology located in various places: on the employee’s ID badge, in the restroom ceiling, and at the sink. It identifies each employee who enters the restroom. After the employee uses the toilet facilities, it verifies that the employee stands at the sink for 30 seconds. If not, the system records the infraction.
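The verification rule at the core of such a system can be sketched in a few lines. The names and record format below are hypothetical; only the 30-second threshold comes from the description above:

```python
def hygiene_check(badge_id, sink_enter, sink_exit, required_seconds=30):
    """Return an infraction record if the badge-holder spent less than
    the required time at the sink; return None if the rule was followed."""
    time_at_sink = sink_exit - sink_enter
    if time_at_sink < required_seconds:
        return {"badge": badge_id, "seconds_at_sink": time_at_sink}
    return None
```

Everything persuasive about the system lies outside this function: employees change their behavior because they know such a check is being run.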
Figure 3.7: Hygiene Guard is a surveillance system that tracks employee hand washing.
Another example of surveillance technology is AutoWatch, a computer system that enables parents to monitor the driving behavior of their children. [36 ](Makers of this system suggest AutoWatch can also let you monitor how employees drive corporate vehicles.) According to company literature, AutoWatch is a “little black box” that records driving speed, starts and stops, and other data. Parents can then remove the device from the vehicle and download the information to a PC.
At first glance, AutoWatch seems a reasonable idea: Parents should be able to monitor how their children drive. However, the product literature suggests that parents “conceal the AutoWatch unit under the dash or under the seat.” [37 ]AutoWatch is a covert technology when installed this way. When used covertly, AutoWatch is no longer a persuasive technology because its goal is not to motivate or influence; it’s just secretly monitoring.
This brings up a key point: For surveillance technologies to effectively change behaviors, they must be overt, not covert. Delivery companies sometimes post a message on the back of their trucks: “How am I driving?” with a toll-free number to report problems. The fact that the truck drivers know others can report them for reckless driving probably motivates them to drive more safely.
Contrast this with a covert installation of AutoWatch. How will teens be motivated to avoid driving recklessly if they don’t know their driving is being monitored by AutoWatch? When used covertly, AutoWatch is geared toward punishment, not persuasion. There are important ethical questions surrounding the use of covert technologies, but I will not address them here, since covert technologies by definition are not persuasive technologies.
While surveillance technologies may use the threat of punishment to change behavior, they also can be designed to motivate people through the promise of rewards. For example, parents could use the AutoWatch system to reward their teens for driving safely, perhaps providing teens with gas money for future driving.
In terms of workplace surveillance, several companies have created systems that track employee behavior and reward them for doing what their company wants done. [38 ](Rather than calling these products “surveillance systems,” companies may refer to them as “incentive systems” or “incentive management technology.” [39 ])
An Illinois-based company called Cultureworx has created a product that can track employee behavior throughout the day. And it can reward employees who do things that meet company policies or help boost the bottom line. If a company wants employees in its call centers to use its customer relationship management (CRM) software, inputting customer information and results of each customer contact, the Cultureworx system can track employee performance along these lines. The harshness of tracking employees this way is softened somewhat because the surveillance system gives points that can be exchanged online for products from places like Eddie Bauer and Toys R Us. (However, it’s not clear to what extent employees would embrace such a system simply because it offers rewards.)
While surveillance can be effective in changing behavior, in many cases it leads to public compliance without private acceptance. Some theorists describe this as “compliance versus internalization.” [40 ]People will behave in a prescribed way if they know they are being observed, but they will not continue the behavior when they are no longer being observed unless they have their own reasons for doing so. [41 ]In other words, without motivation to internalize the behavior, the conformity and compliance effects will weaken and often disappear when a person is no longer being observed.
The real power in surveillance technology lies in preventing infractions; surveillance should focus on deterrence, not punishment. Even so, using surveillance as a motivating tool is not the most noble approach to persuasion, even when it leverages the promise of rewards rather than the fear of punishment. The use of surveillance technology also raises serious ethical questions about maintaining the privacy and dignity of individuals. (We’ll explore the ethical concerns in more detail in Chapter 9.)
[29 ]The “Like’s Gone” concept was created by captology students Marissa Treinen, Salvador Avila, and Tatiana Mejia.
[30 ]Strictly speaking, surveillance technology is not interactive in the way I’ve described other interactive computing products to this point; these products focus on interactivity between the technology and the end user. However, I consider surveillance systems to be persuasive technology under my definition, as they do incorporate a limited kind of interactivity. Input comes from one person (the user) and output is sent to another person (the observer), who then closes the loop by interacting with (rewarding or punishing) the person observed.
[31 ]K. Bell DeTienne, Big brother or friendly coach? Computer monitoring in the 21st century, The Futurist, pp. 33–37 (1993).
[32 ]Jennifer Granick, Big Brother: Your boss, Wired (July 1998).
[33 ]The American Management Association (AMA) conducts an annual survey of workplace testing and monitoring. For a summary of its recent results, showing the increased prevalence of workplace monitoring and surveillance since 1997, see http://www.amanet.org/research/pdfs/ems_short2001.pdf.
[34 ]J. C. Turner, Social Influence (Pacific Grove, CA: Brooks/Cole, 1991).
[35 ]Hygiene Guard is produced by Net/Tech International in New Jersey.
[36 ]For more information on AutoWatch, see http://drivehomesafe.com/control_teendriver_speeding_driverlicense2.htm.
[37 ]The user manual for AutoWatch suggests various ways to conceal this device. To see an online version of the manual, go to http://www.obd2.com/pdffiles/Userawfw.pdf.
[38 ]For a long list of companies that provide incentive management solutions, see http://www.workindex.com/extrefs.asp?SUBCATID=1714.
[39 ]For a brief article explaining incentive management in call centers, see http://www.callcentermagazine.com/article/CCM20010627S0002.
The last type of persuasive technology tool is what I call “conditioning technology.” A conditioning technology is a computerized system that uses principles of operant conditioning to change behaviors.
B. F. Skinner was the leading proponent of operant conditioning, [42 ]which peaked in popularity four decades ago and now is controversial in some circles. In simple terms, operant conditioning (also called “behaviorism” and “instrumental learning”) is a method that uses positive reinforcements—or rewards—to increase the instances of a behavior or to shape complex behaviors. [43 ](Operant conditioning also may involve the use of punishments to decrease the instances of a behavior, but this approach is fraught with ethical problems and is not, in my view, an appropriate use of conditioning technology. [44 ]As a result, I will not discuss it further here.)
If you’ve ever tried to train a dog to do tricks, you’ve probably used operant conditioning. By rewarding your dog with praise, petting, or a snack after a successful performance, you’ve given positive reinforcement. When you praise your child, send a thank-you note to a friend, or give someone a gift, you are subtly shaping their future behavior, whether you intend to or not. Operant conditioning is pervasive among human beings. [45 ]
Computers also can use operant conditioning to bring about behavior changes. In my classes at Stanford, my students and I explored various approaches to using high-tech conditioning technology. Two engineering students built a prototype of “Telecycle” (Figure 3.8), an exercise bike connected to a TV through a small computer. [46 ]In this system, as you pedal at a rate closer to the target speed, the image on the TV becomes clearer. If you slow down too much or stop pedaling, the TV picture becomes fuzzy, almost worthless. The students hypothesized that receiving a clearer picture would be reinforcing and produce the desired behavior change—exerting more effort on the exercise bike.
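The reinforcement mapping at the heart of the prototype reduces to a single function. A sketch under assumed values (the target cadence and tolerance below are illustrative guesses, not the students’ actual figures):

```python
def picture_clarity(cadence, target=90.0, tolerance=30.0):
    """Map pedal cadence (rpm) to TV picture clarity in [0.0, 1.0]:
    fully sharp at the target speed, fading toward unwatchable as
    the rider's speed drifts away from it."""
    error = abs(cadence - target)
    return max(0.0, 1.0 - error / tolerance)
```

Because clarity tracks effort continuously, every pedal stroke toward the target is immediately reinforced, which is exactly what operant conditioning calls for.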
Figure 3.8: The Telecycle research prototype, a simple application of operant conditioning.
The Telecycle is a simple example of using reinforcement for a specific target behavior. More complex uses of operant conditioning can be found in many computer games.
While game designers rarely talk about their designs in terms of behaviorism, [47 ]good game play and effective operant conditioning go hand in hand. From a designer’s perspective, one mark of a good computer game is that players want to keep playing. The bottom line is that game designers seek to change people’s behaviors. Ideally, players become obsessed with the game, choosing it above other computer games or above other things they could be doing with their time.
Computer games provide reinforcements through sounds and visuals. The rewards also come in other ways: through points accumulated, progression to the next level, rankings of high scores, and more. Discussed in Chapter 1 as an example of microsuasion, Warcraft III is just one of thousands of computer games using reinforcement to keep players engaged with the game.
Computer games may be the purest example of technology using operant conditioning. They are effective platforms for administering reinforcements and punishments, with a bit of narrative and plot layered over the top.
To be most effective, positive reinforcement should immediately follow the performance of the target behavior. However, the reinforcement need not follow every performance of the behavior. In fact, to strengthen an existing behavior, reinforcers are most effective when they are unpredictable. Playing slot machines is a good example: Winning a payoff of quarters streaming into a metal tray is a reinforcer, but it is random. This type of unpredictable reward schedule makes the target behavior—in this case, gambling—very compelling, even addictive.
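In code, a variable-ratio schedule of this kind is nothing more than a biased coin flip on each response. A minimal sketch (the reward probability is arbitrary, and the `roll` parameter is a testing convenience, not part of any real system):

```python
import random

def variable_ratio_reward(p=0.1, roll=random.random):
    """Deliver a reward on a random fraction p of responses.
    In use, `roll` stays random.random, so from the player's side
    each individual reward is unpredictable -- the property that
    makes this schedule so compelling."""
    return roll() < p
```

Because no single response predicts a payoff, the behavior resists extinction: the player keeps responding through long dry stretches, just as at a slot machine.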
A good example of the use of periodic reinforcement can be found at TreeLoot.com (Figure 3.9). When you visit the site, you’ll be asked to click on the image of a tree. After you click, you will get feedback from the system, depending on where you click. Sometimes you’ll just get a message to click again. Other times, the system will tell you that you’ve won “Banana Bucks.”
Figure 3.9: TreeLoot.com uses periodic reinforcement to persuade visitors to keep playing at the site.
Although the TreeLoot experience is more complicated than what I’ve just described, the relevant point is that TreeLoot.com behaves much like a slot machine: it offers unpredictable rewards to reinforce a behavior. Some of my students have admitted to spending hours clicking over and over on the tree image in hopes of hitting it big. Like pigeons pecking a lever to get a food pellet, some of my students—and thousands of other people—keep clicking on the TreeLoot image. Operant conditioning is undeniably powerful.
As noted earlier, operant conditioning can be used not just to reinforce behavior but to shape complex behaviors. Shaping is a process of reinforcing behaviors that approximate the target behavior. You find this approach in animal training. Through shaping, dolphins can be trained to jump out of the water over a rope. At the beginning of the training, the dolphin receives a reward for swimming over a rope that is resting on the bottom of the tank. Then the rope is moved up a few feet in the water, and the dolphin gets rewarded for swimming over it, not under it. The process continues until the rope is out of the water.
Technology could be designed to shape complex behaviors in a similar way. For example, conditioning technology might be used to foster collaboration among employees working in different locations. While there’s no clear formula for collaboration, certain activities are likely to indicate collaboration is taking place: email to colleagues, sharing of documents, follow-up phone calls, and scheduling appointments, to name a few examples. It’s possible that a computer system could be designed to shape such collaborative behavior by reinforcing the elements of collaboration.
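The rope-raising logic of shaping reduces to a loop that reinforces any performance clearing the current criterion, then demands slightly more. A minimal sketch with illustrative values:

```python
def shaping_schedule(performances, start=0.0, step=0.5):
    """Reinforce each performance that meets the current criterion,
    then raise the criterion, like lifting the dolphin's rope."""
    criterion, reinforced = start, []
    for level in performances:
        if level >= criterion:
            reinforced.append(True)
            criterion += step       # demand a bit more next time
        else:
            reinforced.append(False)
    return reinforced
```

A shaping system for workplace collaboration would work the same way in principle, with “performance” measured by proxies such as shared documents or follow-up messages rather than jump height.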
Principle of Conditioning
Computing technology can use positive reinforcement to shape complex behavior or transform existing behaviors into habits.
Despite the widespread use of operant conditioning in everyday life, we have yet to see the full potential of using computer-based operant conditioning to affect human behavior (except in making computer games compelling or nearly addictive). My lab is currently researching what technology can do to reinforce behavior; we’re studying sounds, images, and other digital experiences to develop an effective repertoire of “digital rewards.” But we apply our new knowledge with caution. Like surveillance, this use of technology raises ethical questions, especially since operant conditioning can change our behaviors even when we don’t consciously recognize the connection between the target behavior and the reinforcement given.
[40 ]H. C. Kelman, Compliance, identification, and internalization: Three processes of attitude change, Journal of Conflict Resolution, 2: 51–60 (1958).
[41 ]Can compliance lead to internalization? Yes, in some cases. At least four theories argue that behaviors can eventually change attitudes, even if the behaviors were initially motivated by extrinsic factors. These theories include cognitive dissonance, self-perception, self-presentation, and self-affirmation. For brief reviews, see D. R. Forsythe, Our Social World (Pacific Grove, CA: Brooks/Cole, 1995). While I won’t describe each theory here, I should note that scholars don’t agree on whether these are competing or complementary theories. However, taken together, one thing seems clear: behaviors—even those motivated by compliance strategies—can eventually change attitudes.
[42 ]B. F. Skinner, About Behaviorism (New York: Alfred A. Knopf, 1974).
[43 ]Operant conditioning is more complex than I describe here. For a readable overview of operant conditioning (my lab’s favorite introduction to the topic), see R. A. Powell, D. G. Symbaluk, and S. E. MacDonald, Introduction to Learning and Behavior (Belmont, CA: Wadsworth, 2002).
[44 ]For an example of software that punishes, see B. Caldwell and J. Soat, The hidden persuader (booby-trapped software), InformationWeek, 42(3): 47 (1990).
[45 ]An intriguing (and readable) book on this topic is Karen Pryor, Don’t Shoot the Dog: The New Art of Teaching and Training (New York: Bantam Doubleday Dell, 1999).
The research on intrinsic and extrinsic motivation shows that the gentler the intervention to achieve the desired behavior change, the better the long-term outcome. [48 ]It’s good to keep that research in mind when considering persuasive technology tools. For example, if a suggestion technology can produce the desired behavior, that approach should be used rather than surveillance technology. Not only will the gentler suggestion technology produce better results, it will do so without raising the ethical issues relating to surveillance.
In many cases, effective persuasion requires more than using a single tool or strategy. Some of the examples I’ve described in this chapter are combinations of persuasive technology tools: The heart rate monitor is a combination of self-monitoring and suggestion technologies; it monitors the heart rate, and it notifies the user when the rate wanders beyond the preset zone. The chemical scorecard at scorecard.org uses tailoring technology to provide targeted information and reduction technology to make it easy to take action—to send e-mail and faxes to government officials and offending companies. It even writes the letter for you, including relevant details.
Whether you are designing, analyzing, or using persuasive technology, look for these natural synergies as different tool types come together to create a persuasive interactive experience.
For updates on the topics presented in this chapter, visit www.persuasivetech.info.
[46 ]Ajit Chaudhari and Emily Clark were my students who created the Telecycle prototype.