One important implication of the research is remarkably profound: If we want users to like our software, we should design it to behave like a likeable person. If we want users to be productive with our software, we should design it to behave like a good human work mate. Simple, huh?

Nass and Reeves say that software should be "polite" because politeness is a universal human behavioral trait. (Which actions are considered polite might vary from culture to culture, but the trait is present in all cultures.) Our high-cognitive-friction products should follow this simple lead and also be polite. Many high-tech products interpret politeness to mean that it's okay to behave rudely as long as they say "please" and "thank you," but that is emphatically not what politeness is all about. If the software is stingy with information, obscures its process, forces the user to hunt around for common functions, and is quick to blame the user for its own failings, the user will dislike the software and have an unpleasant experience. This will happen regardless of "please" and "thank you," and regardless, too, of how cute, representational, visually metaphoric, content-filled, or anthropomorphic the software is. On the other hand, if the interaction is respectful, generous, and helpful, the user will like the software and have a pleasant experience. Again, this will happen regardless of the composition of the interface; a green-screen command-line interface will be well liked if it can deliver on these other points.

What Is Polite?

What exactly does it mean for software to be friendly or polite? What does it mean for software to behave more like humans? Used-car salesmen wear handsome clothes, smile broadly, and are filled with impressive information, but does that make them likeable? Humans are error prone, slow, and impulsive, but it doesn't follow that software with those traits is good.
Human beings have many other qualities that are present only conditionally but that make them well suited to the service role. Software is always in the service role.[4]
Most good software engineers are at a disadvantage in the politeness realm. Robert X. Cringely says that programmers
You can see how the concept of "politeness" or even "humanness" can be a stumbling block when we ask programmers to be the interpreters of such fuzzy concepts. They struggle with the idea of making computers behave more like humans, because they see humans as weak and imperfect computing devices. I asked my friend Keith Pleas, who is well known in the engineering community as an articulate, expert programmer sensitive to user-interface issues, about making software more human. Keith interpreted adding "humanness" as adding imprecision to the interaction. He replied:
Keith's response is natural from the programmer's point of view. True, the computer would never give you an approximate bank balance, but then the computer wouldn't differentiate between taking one tenth of a second to say you have "about $500" in your account, versus taking 17 minutes to say you have "exactly $503.47." A really polite, more-human program would immediately say you have "about $500" and then inform you that it can give you a more precise figure in a few additional minutes. Now it would be your choice whether to invest more time for additional precision. This is an application of the principle of commensurate effort: if you want more information, you will sympathize with the need to spend more time.
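The fast-estimate-then-precise-answer behavior described above can be sketched in a few lines of code. This is a hypothetical illustration only, not any real banking API: the transaction data, function names, and rounding choices are all invented, and in a real system the slow path would be the expensive query, not a second pass over the same list.

```python
def approximate_balance(transactions):
    """Fast path: a rough figure the software can offer immediately."""
    # Round to the nearest $100, so $503.47 is reported as "about $500."
    return round(sum(transactions), -2)

def precise_balance(transactions):
    """Slow path: the exact figure, for users who choose to wait."""
    return round(sum(transactions), 2)

# Invented sample data: a deposit and two withdrawals netting $503.47.
transactions = [1200.00, -350.25, -346.28]

print(f"about ${approximate_balance(transactions):.0f}")    # shown at once
print(f"exactly ${precise_balance(transactions):.2f}")      # shown on request
```

The point of the split is the one the text makes: the program volunteers the cheap answer unprompted and lets the user decide whether the precise one is worth the extra wait.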