Programmers Focus on What Is Possible to the Exclusion of What Is Probable


Programmers share the mathematician's abstract view of complex systems, so it is not surprising that they look at things differently from most people. Here's what I mean: Imagine that you flipped a coin 1,000,000 times, and 999,999 times the coin landed heads up. To a mathematician, the assertion that "the coin always lands heads up" is false. That single tails-up result disproves the assertion. In mathematical terms, a proposition is true only if it is always true, and this way of thinking is very familiar and reasonable to Homo logicus because, not surprisingly, it's the way computers behave.

On the other hand, most normal people will declare the proposition true because of the preponderance of heads to tails. They also will claim that not only is the proposition true, but it is overwhelmingly, convincingly, indisputably true. The odds are a million to one! In the context of human behavior, million-to-one odds are definitive. They are odds beyond consideration. There's a better chance that I will get hit by lightning, accidentally fall off a bridge, or win the lottery than that the coin will land tails up.

The probability that the proposition is true is enormous, and Homo sapiens live in a world of probabilities. However, there is always that possibility that the proposition is false, and programmers live in the world of possibilities. If it might happen, it is something that must be considered. In the world of software, the world of precisely articulated propositions, enormously remote possibilities are issues that cannot be ignored.


Programmers call these one-in-a-million possibilities edge cases.[3] Although these oddball situations are unlikely to occur, the program will crash whenever they do unless preparations are made. The likelihood of any given edge case is small, but the cost of a lack of preparedness is immense, so these remote possibilities are very real to the programmer. The fact that an edge case will crop up only once in every 79 years of daily use is no consolation to the programmer. What if that one time is tomorrow?

[3] They are also variously called corner cases, special cases, and boundary conditions.
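
To make the idea concrete, here is a minimal C sketch of my own; the book contains no code, and the function and values are invented for illustration. The probable case is a non-empty list of numbers; the possible case is an empty one, where an unguarded integer division by zero would crash the program.

    #include <stdio.h>
    #include <stddef.h>

    /* A hypothetical illustration of edge-case thinking: the guard
       below is the preparation the text describes. Without it, an
       empty array leads to an integer division by zero, which
       typically crashes the program. */
    int average(const int *values, size_t count)
    {
        if (count == 0)        /* the one-in-a-million edge case */
            return 0;          /* handled instead of crashing */

        long sum = 0;
        for (size_t i = 0; i < count; i++)
            sum += values[i];
        return (int)(sum / (long)count);
    }

    int main(void)
    {
        int data[] = {3, 5, 7};
        printf("%d\n", average(data, 3));   /* the probable case: 5 */
        printf("%d\n", average(data, 0));   /* the possible case: 0 */
        return 0;
    }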

Arguably, the single most important difference between amateur programmers and experienced professionals is the journeyman's obsessive emphasis on preparing for edge cases. This fanatical preparation for the possible has the inevitable consequence of obscuring the probable, and it results in products whose interaction is encrusted with little-used or never-used controls that obscure the frequently used ones. Users' most common complaint is that software is hard to use because it has too many options, all jumbled into the interface without any discrimination.

The profusion of unneeded and unwanted features brought about by the programmer's possibility thinking is an excellent example of what Po Bronson means by programmers being "generous in their selfishness." They give us lots of what they want.


A common joke among programmers is that there are only three numbers: 0, 1, and infinity. In the world of computer processing, this makes a lot of sense. In the binary world inside a computer, a process either happens or it doesn't: 1 or 0. If any process can happen more than once, that means it can happen an infinite number of times.
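
A hedged illustration of the joke, again my own C rather than anything from the book: nothing in the loop below distinguishes two repetitions from two billion, so once code is written to run "more than once," the count is effectively unbounded.

    #include <stdio.h>

    /* The loop body has no notion of "a few" versus "many":
       the same code runs whether n is 2 or 2,000,000,000. */
    void repeat(long n)
    {
        for (long i = 0; i < n; i++)
            printf("iteration %ld\n", i);
    }

    int main(void)
    {
        repeat(2);   /* to the code, this call is no different... */
        repeat(5);   /* ...from this one, or from two billion */
        return 0;
    }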

Setup and shut-down code is written so that it can be executed only once. If the program tries to execute it a second time, the computer will probably crash, or at least provoke some major errors. Other parts of programs are designed for more than one execution. Almost any part of any program that can be executed a second time without crashing can also be executed as many times as desired. For the code, and for the programmer's Homo logicus point of view, there is little difference between two executions and two million or two billion executions.
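
The once-only constraint is typically enforced with an explicit guard. Here is a minimal single-threaded sketch, my own rather than anything from the book; real programs express the same idea with initialization flags or constructs such as C11's call_once.

    #include <stdbool.h>
    #include <stdio.h>

    static bool initialized = false;   /* guards the one-shot setup */

    /* Setup code that must run exactly once: a second execution would
       re-open files or re-allocate buffers, so the guard turns any
       repeat call into a harmless no-op instead of a potential crash. */
    void init_once(void)
    {
        if (initialized)
            return;
        initialized = true;
        puts("setup runs exactly once");
    }

    int main(void)
    {
        init_once();
        init_once();   /* safe: the guard makes this a no-op */
        return 0;
    }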

Humans are different. They understand 0 and 1, but they also have a firm grasp on 2, 7, and 31. Most humans have a harder time visualizing a million things than they do visualizing 300 things. A typical human does things in quantities that fall in a programmer's no-man's-land. Enthusiastic amateur skiers, for example, might go skiing a dozen weekends each season. Over a span of 40 years of active skiing, that is fewer than 500 times in a lifetime! A modern digital computer can process 500 things in the blink of an eye. Even an enthusiastic user of a program will use it only a few thousand times, yet programmers think in terms of an infinite number of occurrences.

Good programmers purposefully turn a blind eye to practical numbers such as 500 because doing so ensures that their programs will be better able to handle a possible 501st occurrence. This is what Po Bronson means when he says, "Blindness improves their vision."


