5.1 Tenet 6: Use software leverage to your advantage

Let's suppose you're one of the world's best programmers. Every piece of code you write turns to gold. Your applications become instant hits the day they're released. Critics on Web sites shower your work with praise, and your software adorns the covers of the trade rags. Your programs are truly unique.

Unfortunately, being "one of a kind" poses a problem for you. The uniqueness that distinguishes your work becomes the chain that binds you. If you do all the work yourself, you can only do so much. Unless you can find a way to off-load some of it, you will burn yourself out long before you achieve your maximum potential.

5.1.1 Good programmers write good code; great programmers borrow good code

The best way to write lots of software is to borrow it. By borrowing software, we mean incorporating other people's modules, programs, and configuration files into your applications. In producing a derivative work, you augment the previous developers' efforts, carrying their implementations to new heights of utility. Their software becomes more valuable as it finds a home in more applications; your software becomes more valuable because your investment in it has been reduced relative to its return. It's a mutually beneficial situation.

Although you may have lowered your investment in an application, you need not settle for reduced profits. Applications built by integrating other people's code can sell for considerable amounts of money. They also tend to grab significant market share because they usually reach the market before those developed by competitors. The old adage "the early bird gets the worm" holds especially true here. If you can be the first with a hot new application, it doesn't matter that you achieved your position by using other people's work. Potential customers just want to know whether your software can do the job. They are less interested in how your software works than in what it can do for them.

Leveraging other people's code can result in powerful advantages for the individual programmer, too. Some programmers believe that they protect their job security by writing the code themselves. "Since I write good code, I'll always have a job," they reason. The problem is, writing good code takes time. If you have to write every line of code used in an application, you will appear slow and inefficient. The real job security belongs to the software developer who can cut and paste modules together quickly and efficiently. Such developers produce so much software in so short a time that companies consider them indispensable.

I recall a less-than-top-notch software engineer who couldn't program his way out of a paper bag. He had a knack, however, for knitting lots of little modules together. He hardly ever wrote any of them himself, though. He would just fish around in the system's directories and source code repositories all day long, sniffing for routines he could string together to make a complete program. Heaven forbid that he should have to write any code. Oddly enough, it wasn't long before management recognized him as an outstanding software engineer, someone who could deliver projects on time and within budget. Most of his peers never realized that he had difficulty writing even a rudimentary sort routine. Nevertheless, he became enormously successful by simply using whatever resources were available to him.

5.1.2 Avoid the not invented here syndrome

Symptoms of the "not invented here" (NIH) syndrome appear in the finest of organizations. When a group refuses to recognize the value of another group's application, when one would rather write an application from scratch than use one "off the shelf," when software written elsewhere goes unused simply because it was written elsewhere, NIH is at work.

Contrary to popular belief, NIH does not expand creativity. Viewing another's work and declaring that you could do it better doesn't necessarily make you more creative. If you start from scratch and redesign an existing application, you're engaging in imitation, not creativity. By avoiding NIH, however, you open doors to new and exciting worlds of engineering design. Since less time is spent rewriting existing routines, you can devote more time to developing new functional capabilities. It's like starting out in a Monopoly game owning hotels on Boardwalk and Park Place. You don't have to spend half the game trying to acquire the property and build the hotels.

NIH can be especially dangerous with today's emphasis on standardization in the software industry. Standards drive software vendors toward commoditization. All spreadsheets begin to look alike, all word processors provide the same capabilities, and so on. The resulting oversupply of available software to accomplish everyday tasks drives prices down, thus limiting profitability. Every vendor needs a spreadsheet, a word processor, and so on, just to stay in the game. But few vendors can afford to produce those staples from scratch. The most successful companies will be those that "borrow" the software, leaving them the opportunity to create enhancements or "added value," in industry jargon.

I was once on a team of software engineers building a GUI for a window system. We had an idea to mimic another popular interface on the market. Since the other was so successful, we reasoned, ours would surely be a hit as well. The plan was to rewrite the user interface from scratch, making it more efficient in the process.

We had two obvious strikes against us. First, in attempting to write a more efficient program, we would have to take some steps that would result in nonportable software. By "hardwiring" the application to our target architecture, we severely limited the size of our potential market. Second, it would take several months to write the user interface from scratch. While we were busy writing our own user interface, the developers of the one we were imitating weren't exactly twiddling their thumbs. They were busy adding features and enhancements to their software. By the time we released our version, theirs would be at least a generation removed from ours.

Fortunately for us, we were too blind then to realize that the other company's user interface would have evolved considerably while we were developing ours. Instead we became concerned that we might become involved in a patent infringement suit if our software looked and felt too much like the one we were bent on imitating. So we ran our ideas past one of our corporate lawyers. He opened our eyes to a more interesting possibility.

"Instead of duplicating the other company's work, why not use their software in our product?" he asked. We all took a deep swallow on that one, and the letters P-R-I-D-E got stuck in our throats. His suggestion dealt a real blow to the NIH tradition we had carefully nurtured and guarded over the years. The toughest part was admitting that his idea made sense.

So we set about learning how to incorporate the other vendor's software into our product and enhance it. It was a painful endeavor, given our history of wanting to do it our own way. Eventually, we released a set of programs built using the other software as a base. The result? Customers praised our efforts at compatibility. They bought our product because it offered value above and beyond competitors' packages while remaining compatible with de facto industry standards. We had a winner on our hands.

Take note of the phrase "added value," for it holds the key to success in the software realm of the 1990s and beyond. As computer hardware has become a commodity, so software will proceed down the same path. The various ongoing standardization efforts virtually guarantee this. Prices of software will drop, since every major vendor will be providing similar capabilities. Software companies will then have two options: They can watch their profit margins shrink to zero or else preserve them by adding value to standard applications. They must invent features that differentiate their products from industry standards as they simultaneously retain compatibility with those same standards. To survive, companies must meet the challenge of the conflicting goals of uniformity and independence. The way to win will be to add glitter to the wheel, not to reinvent it.

Sounds a lot like the hurrah coming from the Open Source movement, doesn't it? You're catching on. Later we shall see how Linux vendors and consultants take the idea of added value and turn it into the basis for a viable business model.

NIH adherents are under attack on several fronts in today's business environment. New high-capacity storage technologies, such as DVDs, pose a huge threat. The same shiny discs used to distribute the latest movies can also store gigabytes of programs and data cheaply. Inexpensive media like these have the potential to alter the software landscape permanently. As CDs and DVDs have become commonplace, storage technology has taken an irreversible leap forward.

On a recent visit to a local PC show, I discovered a CD containing over 2,400 programs selling for a mere $5. (Admittedly, this is an example of "low-tech" these days, but it further illustrates the point.) It is hard to justify rewriting programs available elsewhere for less than a penny apiece—unless you're a software developer who can live for months on a few cents.

Perhaps the biggest threat to NIH comes from high-speed Internet access technologies such as cable modems and DSL. These make it possible for anyone to download thousands of programs for free, without so much as a trip to a PC show. The programs are just a few mouse clicks away.

When good software becomes available at zero cost, NIH all but disappears. Any method of software development that does not incorporate software developed by others becomes too expensive. This is not to say that no new code will ever be written. Most new software will either (1) enhance and extend existing software, or (2) realize a completely new application. Linux and the Open Source community are uniquely positioned to take advantage of either scenario.

5.1.3 Allow other people to use your code to leverage their own work

Software engineers tend to hoard their source code. It's as if they believe they have written a unique contrivance, a magic formula that could change the world, a secret they alone possess. They harbor subconscious fears that if they give the source away, they will no longer control this mysterious pearl of great price.

First, software is no magic formula. Anyone with a reasonably logical mind can write the stuff. You can be clever or trite, but all software boils down to a series of calculated statements that cause the hardware to perform certain well-defined operations. A programmer who has never seen the source code can disassemble even the best program. Disassembly is a slow, tedious process, but it can be done.
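
On a Linux system, for instance, a single command is enough to get started. The sketch below is illustrative only: objdump is the GNU binutils disassembler, and /usr/bin/ls stands in for any binary whose source you have never seen.

    # Page through the disassembly of a program you have no source for.
    # (objdump ships with GNU binutils; /usr/bin/ls is just an example target.)
    objdump -d /usr/bin/ls | less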

What about the question of control? A commonly held belief in the computer world is that whoever controls the source code "owns" the program. This is partly true. A company that holds the source code exercises some authority over who modifies a program, who can obtain runtime licenses for it, and so on. Unfortunately, this ability to control the life of a piece of software protects only the company's investment of time in the program's development, not the software itself. It cannot prevent the onset of imitators (clones) that seek to emulate its features and functions. Most ideas that are good enough for a company to invest in are also good enough for its competitors to invest in. It's only a matter of time before imitations appear as other vendors strive to catch the wave. The most successful software then becomes the one that appears on the most computers. Companies operating in a proprietary fashion find themselves at a significant disadvantage here.

Unix owes much of its success to the fact that its developers saw no particular need to retain strong control of its source code. Most people believed it lacked any real value. They regarded Unix as a curious oddity, suitable for research labs and universities, but not much else. No one—except its developers—considered it a serious operating system. Consequently, one could obtain its source code for a pittance.

Soaring development costs caused hardware vendors to reduce their investment in the software needed to make their platforms marketable. A pittance for an operating system soon looked like a very good deal. Unix flourished as a result. Whenever anyone wanted to save the expense of developing an operating system for a new computer, they turned to Unix. Even today many still consider it among the least expensive operating systems available.

Low cost has made Linux the platform of choice for many software houses today. By building on the kernel, programming interfaces, and applications provided by the Linux system developers, these companies use software leverage to tremendous advantage. In avoiding the cost of writing an operating system and a set of suitable applications, they can focus instead on enhancing their own applications to provide superior and value-added customizations. This puts them in a stronger position in the software world than those companies that must first invest in operating system development before they can produce applications. Companies operating in this fashion often sell expertise, rather than the software itself. The software business then becomes one where companies provide consulting and customization services, instead of functioning as software manufacturers.

5.1.4 Automate everything

One powerful way to leverage software to your advantage is to make your machines work harder. Any task you do manually today that your computer could do for you is a waste of your time. Even in modern engineering labs, a surprising number of skilled personnel still rely on crude, manual methods to accomplish their daily tasks. They ought to know better, but old habits die hard. Here are some clues that you may be working harder than necessary while your computer sits around with little to do:

  • Do you use hard copy often? Once data or text is printed on paper, managing it becomes a manual process. This is terribly inefficient, not to mention a waste of paper. Traditional-style corporate managers often fall into this trap.

  • Do you sort data or count lines and objects manually? Most operating environments, especially Linux, provide tools to perform these tasks much faster than you can.

  • How do you find files on the system? Do you locate them by browsing your directories one by one? Or do you create a list of your files and scan it with an editor or browsing tool? The Linux commands find, grep, and locate can be a powerful combination in these situations (see the sketch following this list).

  • When trying to find a particular item in a file, do you scan the file manually, relying on your eyes to lead you to the right place? Or do you employ a browser or editor "search" command to let the system do the scanning for you?

  • Do you use a history mechanism if your command interpreter provides one? A history mechanism allows you to invoke previous commands using a shorthand method. The Linux shells csh and bash are especially good in this regard.

  • If you have a system with multiple-window capability, do you use only one window at a time? You benefit by opening two or more windows simultaneously. You can designate one window as the space in which you work (edit, compile, etc.). The other becomes your test space. Even better, some Linux window managers offer multiple-desktop capabilities. These allow you to manage several windows as a group. The resulting efficiency gains are often dramatic.

  • How often do you use cut-and-paste facilities? If you find yourself frequently entering long strings manually, you're probably not making the most of these. One person I know keeps a window on the screen containing frequently used strings. He cuts the strings from the window and then pastes them in other windows as he needs them, saving himself much typing.

  • Does the command interpreter you use provide command and/or file completion capabilities? Do you use these capabilities to accelerate your input, saving yourself many keystrokes?
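
To make a few of these points concrete, here is a small sampler of the commands involved. It is a sketch only; the file names and search patterns are illustrative, not taken from a real system.

    # Count things instead of counting by hand.
    wc -l report.txt                   # how many lines in a file?
    ls src/*.c | wc -l                 # how many C source files?

    # Sort data instead of eyeballing it.
    sort -n -k2 results.txt | head     # ten smallest values in column 2

    # Find files by name, then search inside them.
    find ~/projects -name '*.conf' -print
    locate httpd.conf                  # consults the prebuilt locate database
    grep -rn 'max_connections' ~/projects/etc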

The use of software leverage through increased automation can result in huge productivity gains. I once worked at a place where a significant part of the job involved culling small amounts of information from a diverse collection of sources. In a study we conducted, we observed that each individual was spending as many as 20 hours a week locating the necessary data. This figure did not even include the time spent reading and verifying the information once it had been located. A simple tool was written to index the information from this wide variety of sources for rapid retrieval. The payoff? People using the tool spent as few as three hours a week locating the same data that had previously taken 15 hours or more to find. Office productivity soared. Everyone was excited. What was once a cumbersome research job was now a painless, highly interactive series of queries carried out by machines. This freed our time to tackle tougher problems that the machines could not handle.
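
A toy version of such a tool fits in a dozen lines of shell. The sketch below assumes the sources are plain-text files gathered under one directory; the paths and the flat "file:line:text" index format are this sketch's own inventions, not a description of the actual program.

    #!/bin/sh
    # index-query: a toy version of the indexing tool described above.
    SRCDIR=${1:-$HOME/sources}      # where the documents live (illustrative)
    INDEX=$HOME/.source-index       # flat index: one "file:line:text" per line

    # Build step: let grep tag every line of every document with its origin.
    grep -rn '' "$SRCDIR" > "$INDEX"

    # Query step: an interactive loop of rapid lookups against the index.
    while printf 'query> ' && read -r pattern; do
        grep -i -- "$pattern" "$INDEX" | head -n 20
    done

The real tool was doubtless more sophisticated, but the principle is the same: build the index once, then let the machine do the searching.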

Every time you automate a task, you experience the same kind of leverage that my aunt enjoyed when she found others to sell Tupperware for her. Each command procedure and each program you invoke sends your computer off on a wild spree to complete a task. Instead of convincing people to do your bidding, you direct a well-tuned machine to carry out certain procedures according to instructions you specify. The faster the machine, the bigger the leverage. With next year's machine, your leverage will have an even greater impact. And, of course, a machine never gets tired or demands a larger percentage of the profits.

In the first part of this chapter we explored some principles concerning the use of leverage, both from a general standpoint and specifically with respect to how leverage applies to software. We have seen how it is important to become a "software scavenger" with an eye toward leveraging the work of others. After discussing the troublesome NIH syndrome, we stressed the value of sharing your work. Finally, we touched upon the obvious but often overlooked desirability of using the computer to leverage your time by automating daily tasks as much as possible.

Now that we have laid a foundation, it's time to build upon that foundation with another element of Unix, the shell script. Shell scripts capitalize on software leverage in interesting ways. They make it easy for both naïve and expert users to tap into the incredible potential found in Unix. Experienced Unix programmers use them religiously. You should, too.

Shell scripts bear some resemblance to other command interpreters and control mechanisms, such as batch files under MS-DOS and DCL command files under OpenVMS. Unlike these other implementations, however, Unix shell scripts exist in an environment ideally suited for indirect command execution. To highlight their significance, we include a special tenet in the Unix philosophy for them.
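
As a foretaste, consider the classic word-frequency pipeline, a famous example in the Unix literature. The script is nothing but borrowed programs strung together, yet it performs a task that would take pages of code in most programming languages.

    #!/bin/sh
    # wordfreq: print the ten most frequent words on standard input.
    # Every stage below is a borrowed program; the only "new" code is glue.
    tr -cs 'A-Za-z' '\n' |   # split the input into one word per line
    tr 'A-Z' 'a-z' |         # fold everything to lowercase
    sort |                   # bring identical words together...
    uniq -c |                # ...and count each group
    sort -rn |               # most frequent first
    head -n 10

Invoke it as "sh wordfreq < somefile" and a handful of existing tools does all the work.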


