Hack 74. Maintain the Status Quo

People don't like change. If you really want people to try something new, chuck the idea of persuading them straight off and simply get them to give it a go.

By default, people side with what already is and with what happened last time. We're curious, as animals go, but even humans are innately conservative. As the Dice Man, who delegates all decisions to chance in Luke Rhinehart's classic 1970s novel of the same name, was told: "It's the way a man chooses to limit himself that determines his character. A man without habits, consistency, redundancy, and hence boredom, is not human. He's insane."1

In this hack we're going to look at our preference for the way things are and where this tendency comes from. I'm not claiming that people don't change (obviously this happens all the time and is the most interesting part of life), but, in general, people are consistent and tend toward consistency. Statistically, if you want to predict what people will do in a familiar situation, the most useful thing you can measure is what they did last time. Past action correlates more strongly with future behavior than any other variable psychologists have tried to measure.2 If you're interested in predicting who people will vote for, what they will buy, what kind of person they will sleep with, anything at all really, finding out what tendencies they've exhibited or what habits they've formed before is the most useful information at your disposal. You're not after what they say they will do (not what party, brand, or sexual allegiance they tick on a form), nor the choice they think they're being pressured into making. Check out what they actually did last time and base your prediction on that. You won't always be right, but you will be right more often by basing your guess upon habit than upon any other single variable.
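As a toy illustration of that claim, here's a minimal sketch in Python (with invented, deliberately streaky choice data) comparing a "repeat the last choice" predictor against simply guessing the person's overall favorite:

    # Invented data: one person's successive choices between brands A and B.
    choices = ["A", "A", "A", "A", "B", "B", "B", "B", "A", "A", "A", "A"]

    # Habit baseline: predict that each choice repeats the previous one.
    habit_hits = sum(prev == cur for prev, cur in zip(choices, choices[1:]))
    habit_acc = habit_hits / (len(choices) - 1)

    # Stated-preference baseline: always predict the most common choice.
    favorite = max(set(choices), key=choices.count)
    favorite_acc = sum(c == favorite for c in choices[1:]) / (len(choices) - 1)

    print(f"predict-last-choice accuracy: {habit_acc:.2f}")        # 0.82
    print(f"predict-overall-favorite accuracy: {favorite_acc:.2f}")  # 0.64

Because behavior tends to come in runs, the habit baseline wins here: with these made-up numbers it scores 0.82 against 0.64 for guessing the favorite.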

This bias is the result of a number of factors, not least the fact that people's previous choice is often the best one or the one that best reflects their character. But we also have mental biases,3 like the biases we have about numbers [Hack #70], which produce consistent habits and an innate conservatism.

Biases in reasoning are tendencies, not absolutes. They make up the mental forces that push your conclusions one way or the other. No single force ever rules completely, and in each case several forces compete. We're mostly trying to be rational, so we keep a lookout for things that might have biased us and discount them. Even if we know we can't be rational, we mostly try to be at least consistent. This means that often you can't give the same person the same problem twice if it's designed to evoke different biases: they'll spot the similarity between the two presentations and know their answers should be the same.

I'm carelessly using the word "rational" here, in the same way that logicians and people with a faith in pure reason might. But the study of heuristics and biases should make us question what a psychological meaning of "rational" could be. In some of the very arbitrary situations contrived by psychologists, people can appear to be irrational, but often their behavior would be completely reasonable in most situations, and even rational considering the kind of uncertainties that normally accompany most choices in the everyday world.

T.S.

But some biases are so strong that you can feel them tugging on your reason even when the rational part of your mind knows they are misleading. These "cognitive illusions" work even when you present two differently biased versions of the choice side by side. The example we're going to see in action is one of these.

7.6.1. In Action

I'm going to tell you in advance that the two versions of the problem are logically identical, but I know (because your brain evolved in the same way mine did) that you'll feel as if you want to answer them differently despite knowing this. If your supreme powers of reason don't let you feel the tug induced by the superficial features of the problem (the bit that conveys the bias), take the two versions and present them to two different friends.

Here we go . . .

7.6.1.1. Version 1

A lethal disease is spreading through the city of which you are mayor. It is expected to kill 600 people. Your chief medical adviser tells you that there is a choice between two treatment plans. The first strategy will definitely save 200 people, whereas the second strategy has a one-third chance of saving 600 people and a two-thirds chance of saving no one. Which strategy do you choose?

7.6.1.2. Version 2

A lethal disease is spreading through the city of which you are mayor. It is expected to kill 600 people. Your chief medical adviser tells you that there is a choice between two treatment plans. The first strategy will definitely kill 400 people, whereas the second strategy has a one-third chance that nobody will die and a two-thirds chance that 600 people will die. Which strategy do you choose?

Do you feel it? The choices feel different, even though you know they are the same. What's going on?

7.6.2. How It Works

At least two things are going on here. The first is the effect of the way the choice is presented: the framing effect. Anyone who has ever tried to persuade someone of something knows the importance of this. It's not just what you say, but how you say it, that is important when presenting people with a choice or argument. The second thing is a bias we have against risking an already satisfactory situation: we're much more willing to take risks when we're in a losing position to begin with. In the examples, the first frame makes it look like you stand to gain without having to take a risk; the choice is between definitely saving 200 people versus an all-or-nothing gamble. The second frame makes it appear as though you start in a losing position (400 people down) and can risk the all-or-nothing gamble to potentially improve your standing. In experimental studies of this dilemma, around 75% of people favor not gambling in the first frame, with the situation reversed in the second.4
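If you want to convince yourself that the two framings really are numerically identical, here is a minimal sketch in Python working through the arithmetic of the dilemma:

    # The mayor's dilemma: 600 people at risk under either framing.
    population = 600

    # Option 1: 200 saved for certain, i.e., 400 dead for certain.
    certain_saved = 200
    certain_deaths = population - certain_saved

    # Option 2: one-third chance everyone is saved, two-thirds chance no one is.
    gamble_expected_saved = (1 / 3) * population + (2 / 3) * 0
    gamble_expected_deaths = population - gamble_expected_saved

    print(f"certain option: {certain_saved} saved, {certain_deaths} die")
    print(f"gamble: {gamble_expected_saved:.0f} saved on average, "
          f"{gamble_expected_deaths:.0f} die on average")

Both options work out to 200 saved and 400 dead on average; the only thing that differs between the two versions is whether the wording counts the living or the dead.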

So why do we gamble when we think we might lose out, but have a bias to avoid gambling on gains? I'm going to argue that this is part of a general bias we have toward the way things are. Let's call it the "status quo bias." This is probably built into our minds by evolution: nature's way of saying "If it ain't broke, don't fix it."

With habits, it is easy to see why the status quo bias is evolutionarily adaptive. If you did it last time and it didn't kill you, why do it differently? Sure, you could try things differently, but why waste the effort, especially if there's any risk at all of things getting worse?
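To put the logic of "it didn't kill you" in slightly more formal dress, here's a crude simulation sketch in Python. All the numbers are invented: the habitual option always yields food, while a novel option yields twice the food half the time and nothing (fatally) the other half, so both options have the same average payoff and differ only in risk:

    import random

    random.seed(1)

    ROUNDS = 20        # foraging decisions per simulated lifetime
    LIFETIMES = 10_000

    def survives(explore_prob):
        """One lifetime: True if the agent never hits the fatal outcome."""
        for _ in range(ROUNDS):
            gambled = random.random() < explore_prob
            if gambled and random.random() < 0.5:
                return False  # the novel option came up empty: "it killed you"
        return True

    for p in (0.0, 0.05, 0.25, 1.0):
        rate = sum(survives(p) for _ in range(LIFETIMES)) / LIFETIMES
        print(f"explore {p:.0%} of choices: survival rate {rate:.3f}")

Even a modest taste for novelty is punished here (exploring 25% of the time drops survival over 20 rounds to roughly 7%), which is one way of seeing how a built-in bias toward the status quo could pay for itself.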

7.6.3. In Real Life

There's a way to hack this habit bias, and it's well-known to advertisers. If people generally stick with what they know, the most important thing you can do is get them started on your product in the first place (hence the value of kids as a target market). But you can also make use of the bias: people choose based on what they did before, so advertising is more effective when it influences what people choose than when it tries to change how they feel about that choice. Even if there was no good reason for someone using your product in the first place, the fact that they did once establishes a strong bias toward them doing so again. A computer user may prefer one browser, but if another one comes bundled with her new operating system, we can bet that's what she'll end up relying on. You may have no rational reason for choosing Brand A over Brand B when you buy jam, but if the manufacturers of Brand B can get you to try it (maybe with a free sample or a special offer), they've overcome the major barrier that would have stopped you from buying it next time.

Status quo bias works for beliefs as well as behaviors. In many situations we are drawn to confirm what we already know, rather than test it in a way that might expose it to be false [Hack #72].

It's an experience I've had a lot when debugging code. I do lots of things that prove to me it must be the bug I first thought it was, but when I fix that bug, my code still doesn't work.

It's not just me, right?

T.S.

Another manifestation of our preference for the way things are is the so-called endowment effect,5 whereby once we have something, however we acquired it, we demand more to give it up than we would have paid to obtain it. In one study, students were given a mug, worth $6, bearing their university emblem. In a subsequent trading game they wanted an average of around $5 to give up their mug, whereas students without mugs were willing to offer an average of only around $2 to buy one. The mere sense of ownership that came with being given the mug was enough to create a difference between how the two groups valued the object. This is just one of the ways in which human behavior violates the rationality supposed by classical economic theory.
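To get a feel for what that valuation gap does to trading, here's a rough simulated sketch in Python. The 2.5x markup is taken from the $5 versus $2 averages above; the uniform valuations and group sizes are invented for illustration:

    import random

    random.seed(42)

    N = 1000  # simulated mug owners, and as many potential buyers

    # Assume everyone's underlying valuation comes from the same distribution,
    # so with no endowment effect about half of random owner/buyer pairs
    # should find a mutually agreeable price.
    asks = [random.uniform(1, 6) for _ in range(N)]
    bids = [random.uniform(1, 6) for _ in range(N)]

    def trade_rate(owner_markup):
        """Fraction of random pairs that trade when owners inflate their
        selling price by the given multiplier."""
        trades = sum(ask * owner_markup <= bid for ask, bid in zip(asks, bids))
        return trades / N

    print(f"no endowment effect: {trade_rate(1.0):.0%} of pairs trade")
    print(f"with the ~2.5x ownership markup: {trade_rate(2.5):.0%} of pairs trade")

With identical valuations about half the pairs trade, as classical theory predicts; inflate the owners' asking prices by the observed factor of 2.5 and trade nearly dries up.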

So we can see that if you want people to give something up, you shouldn't give it to them in the first place, and if you want to introduce something new, you should make people try it before trying to persuade them to accept it. If you can't do this, you should at least try to introduce the new elements as part of a familiar experience.

7.6.4. End Notes

  1. Rhinehart, L. (1971). The Dice Man.

  2. Ajzen, I. (2002). Residual effects of past on later behavior: Habituation and reasoned action perspectives. Personality and Social Psychology Review, 6, 107-122. See also: Ouellette, J. A., & Wood, W. (1998). Habit and intention in everyday life: The multiple processes by which past behavior predicts future behavior. Psychological Bulletin, 124, 54-74.

  3. Wikipedia has an enjoyable, if unstructured, list of cognitive biases (http://en.wikipedia.org/wiki/List_of_cognitive_biases). A good introduction to cognitive biases and heuristics is Nicholls, N. (1999). Cognitive illusions, heuristics, and climate prediction. Bulletin of the American Meteorological Society, 80(7), 1385-1397 (http://ams.allenpress.com/pdfserv/i1520-0477-080-07-1385.pdf).

  4. Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211, 453-458.

  5. Kahneman, D., Knetsch, J. L., & Thaler, R. H. (1991). Anomalies: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives, 5(1), 193-206. A reverse of the endowment effect is the windfall effect, in which people value less highly money they didn't expect to come to them (like lottery wins and inheritances).


