Skin in the Game


One strong cultural determinant of software engineering is that it is done alone. Programmers sit alone. Only one programmer can type in code at one time. Code is largely invisible inside a computer, and it is almost never read. Reading someone else's code is less like reading a book than it is like reading someone's lecture notes, written in a private, inscrutable shorthand. Programming is so complex that it takes single-minded focus and lots of uninterrupted time. Programmers have a strong sense of this insularity and of what it implies. Nobody can have significant control over what a programmer does inside his own program. Programmers know that the quality of their code is largely a matter of their own conscientiousness. The boss can demand quality, but the boss isn't going to invest the time and effort required to verify that such quality exists. It can take more time to decipher a programmer's code than it took to write it. Programmers know this, and they know that their personal decisions and actions have more leverage on the final product and the user's satisfaction than any other consideration. Ultimately, they will personally hold the bag for the product's success. They know that they have a lot of skin in the game.

The lonely work of the programmer gives him a strong sense of his power. Some programmers are uncomfortable with the sense of power, but they are even more uncomfortable delegating authority to others with less skin in the game. When marketers, managers, or designers give advice to them, programmers regard the suggestions with a healthy dose of skepticism. If they take the advice and it turns out to be bad, they know the advisor will be long gone and that the blame will fall squarely on the programmer.

Letting programmers do their own design results in bad design, but it also has a collateral effect: The programmers lose respect for the design process.

Programmers have been successfully bluffing their way through the design process for so long that they are conditioned to disregard its value. When a trained interaction designer is finally hired, the programmer naturally treats the designer's work dismissively.

This leads to a general lack of respect for the interaction designer, the design process, and sadly, the design itself. This disrespect reinforces the cultural valuation of the design as opinion and vague advice, rather than as a clear, specific, and unequivocal statement. Because the programmer rightly assumes that his fancy carries equal weight to mere opinion, he feels free to cherry-pick elements of the design from the specification. Instead of seeing the written design specification as a blueprint, he sees it as the op-ed page of the newspaper. Some items are interesting but untrue; others are true but irrelevant. Unfortunately, the programmer is making these decisions on the basis of implementation considerations or on a self-referential basis, so they are frequently wrong.

On the other hand, every programmer has horror stories to tell of good products that failed because of dunderheaded design imperatives from managers who were equally confused about what users might be looking for. I remember one senior executive who hated to type, demanding that all of his company's programs be controllable only by the mouse. I also remember another senior executive who was clumsy with a mouse, declaring that all of his company's programs must be controllable only with the keyboard. These destructive, self-referential designs caused despair to ripple through both companies.


Certainly some programmers are consciously malicious and destructive, but judging from the many programmers I have met, they are as rare as hen's teeth. Their training and discipline are so tough that it is inevitable, as they reach the peak of their abilities, that they come to see nonprogrammers as less competent. Software engineers respect others in their own areas, but when a nonprogrammer ventures into the world of programming, as Moody describes, programmers become condescending or even elitist.

The programmer has every right to sneer at the amateur who pokes his nose into the highly technical world of software development. Likewise, if the programmer knocked on the controller's door and began recalculating business ratios, the controller would be justified in sneering at the presumption and arrogance of the interloping programmer.

The difficulty arises because designing interaction and implementing interaction are so thoroughly mixed in the typical development process. Although a manager might request a change in the program's behavior, she wouldn't presume to ask the programmer to use different construction methods. But because the behavior and its implementation are so tightly bound, it is impossible to assail one without appearing to assail the other. This is part of the difficulty Moody observed at Microsoft.

Most people involved in the creation of software-based products want theirs to be easy to use. Consequently, they are constantly encroaching on the programmers. The developers never have a surplus of time, so this poaching can make them testy. Many retreat into solitude and communicate only reluctantly with other, nonprogramming team members. Tamra Heathershaw-Hart related this story to me about getting information from programmers when she worked as a technical writer:

I discovered that bribery worked a lot better than begging. I used chocolate most of the time. The bribery method worked so well that I once had an engineering manager apologize on his knees in public for forgetting to tell me about a product change. (Yes, he got his treat anyway.) At one company I had a chocolate-craving engineer tell me all his co-workers' changes, just so he could get their chocolate. Before the bribery method I spent a lot of overtime hours trying to figure out what stuff in the product had changed. Afterwards, I managed to cut my overtime by more than half.

This anecdote is amusing because if we have any experience at all in the software-development business we recognize its truth. If you heard a story about a company's controller having to bribe an accounts-receivable clerk with chocolate to get information on today's deposits, you'd be astonished, indignant, and incredulous.


Many executives are accustomed to having their subordinates respond immediately to any directive or even mild suggestion that they might offer. They imagine that programmers, being technical people, are not very high up the totem pole of authority and will obediently follow direction from their higher-ups. From the programmer's point of view, the executive doesn't have any skin in the game, so obedience is problematic. The independent-minded software engineer won't change his code just because someone tells him to, regardless of the magnitude of that person's title.

If you want to change some existing code, you have to first change the programmer's mind. He will have a vested interest both in the existing code and in avoiding the seemingly unnecessary effort of changing it. You cannot merely demand, let alone ask, but you must present a rational, defensible reason for making the change. It must be presented in terms the engineer can understand, and it has to be presented by someone with skin in the game.

Paul Glen's book Leading Geeks[2] is a remarkably accurate and revealing analysis of how programmers think and behave. If you wish to learn more about programmers and programming culture, I strongly recommend Glen's book.

[2] Paul Glen, Leading Geeks: How to Manage and Lead the People Who Deliver Technology. New York: John Wiley & Sons, 2003. ISBN 0-7879-6148-5.

Scarcity Thinking

One of the strongest influences on software design is what I call scarcity thinking. It comes from two forces working in concert. The newness of the computer software industry is well known, and our very youth conspires to make us a nonintrospective industry. We are too busy assimilating new technologies to reflect on the misconceptions surrounding the older ones. Consequently, the software industry resounds with myths and misunderstandings that go quite unquestioned.

Astonishingly, the simple and obvious fact that computers are vastly more powerful, cheaper, and faster than they were just a few years ago hasn't really penetrated the practice of software construction. Consequently, most software products don't work very hard to serve the user. Instead, they are protective of the central processing unit (CPU), under the mistaken impression that it is overworked. The result is that software-based products tend to overwork the human user. Design guru Bill Moggridge calls this attitude "be kind to chips and cruel to users."

In the last decade, the incredible advances in computer construction have put awesome power on the average desktop for bargain prices. Any student or homemaker can have power that General Motors' corporate data-processing center would have lusted after in 1974. Yet most software is still built today with tools, technologies, methods, and mind-sets that come directly from that world of scarcity thinking. Developers are conditioned to ask themselves, "Can we fit it in? Will it respond fast enough? What nonessentials can we discard to make it more efficient?" That mind-set forces out of consideration more-relevant questions, such as, "Will the user understand it? Can we present this information in a way that makes sense? Is the sequence of instructions appropriate for what the user wants? What information does the user need most?"

With few exceptions, most CPUs spend the overwhelming majority of their time idle, doing nothing. Yes, some processes are compute-bound, but they are much fewer and rarer than we are led to believe by hardware vendors who want to sell us the latest, greatest, and most-powerful electronic wonders. It would not be in their best interest to let consumers know that their CPUs work hard only in very brief spurts and sit idle 75% to 80% of the time.

Just two or three decades ago, computers were so weak and precious that any good idea was likely to be restrained by the feebleness of the host computer. The main thrust of computer science back then was to develop technologies that relieved the strain on the scarce computing resource. Such widely used technologies as the relational database, ASCII code, file systems, and the BASIC language were designed primarily to ease the load on the computer. Software written during that time gave priority to performance at the expense of other considerations, such as ease of use. But don't forget that prewritten code is like a force of nature, and much of that old code, written for weak computers, is running on modern, abundantly powerful systems.


