For me, three major events happened in 1969:
As a student at a major eastern university, I had stumbled on a new and fascinating business of writing software for computers. In those days there was no such thing as going down to your local computer store and buying a shrink-wrapped box of software off the shelf. In fact, with the average computer having fewer than four thousand words of memory, a processor measured in millions of instructions per minute instead of billions of instructions per second, and a purchase price of hundreds of thousands of dollars, software was written to make maximum use of the machine's power. Software was written from scratch, with the customer defining the inputs they could supply and the outputs they desired. The software team (whether in-house or consultants) was then responsible for writing the software that made that transformation. If the software did not work, the writers were not paid. Obviously, like any tailor-made item, the software cost a lot of money, but so did the hardware.
Fortunately for me, the Digital Equipment Corporation User Society (DECUS) had a library to which people could contribute programs they had written so that other people could use them. Authors would submit their code to the DECUS library, and the library would then put out a catalog of useful programs, with the cost of copying and postal mailing paid by the person requesting a copy. The software was free; the distribution (and the cost of printing the catalog, and so on) was not.
Through DECUS I got a lot of free software that I could use. This was good, since as a student I did not have much money to buy tailored software. It was either the software or beer, and the five dollars I would spend in 1969 for copying a text editor was the equivalent of ten pitchers of beer. Imagine how thin I would have been had I tried to purchase "commercial" software.
Why did these people contribute the software they had written? They had written the software because they needed it for their own work or research, and they (graciously) thought that perhaps someone else might be able to use it also. They also (rightly) hoped that others might help them improve it, or at least give them ideas for improvement.
The second major event of 1969 was that Ken Thompson, Dennis Ritchie and a few other researchers at AT&T Bell Laboratories in New Jersey started writing an operating system that eventually became known as "UNIX". As they wrote it, they developed a methodology in which the operating system would be made up of small, re-usable components rather than large, monolithic programs. In addition, they developed an architecture that would allow the operating system to be portable, eventually running on everything from the smallest of embedded devices to the largest of supercomputers, no matter what the instruction set of the main CPU looked like. Eventually this operating system escaped to universities and then to corporate America, where it was used mostly on server machines.
The third major event, one that would change my life and the life of millions of other people in a significant way, but at the time hardly noticed by anyone other than two proud parents in Helsinki, was the birth of Linus Torvalds, who was later to become the architect of the Linux kernel.
Over the next ten years computer science moved at a steady pace. Companies invested in new software and techniques. As each company reached further with what it was doing with computers, it continued to purchase tailor-made software to run on machines that were still (by today's standards) impossibly small, impossibly slow and impossibly expensive. Yet the people at AT&T working on Unix, and the universities that were helping them, continued their work of creating an operating system that encouraged the re-use of software, trading the possibility of more efficient code for the efficiency of not having to re-write and re-invent the wheel. Universities and computer science researchers liked having the source code for the operating system so they could collaborate on making it better. Life was good.
Then in the early 1980s three major events again happened.
The first was the commercialization of Unix. By this time the cost of minicomputers had dropped to the point where the costs of programming and training users were beginning to overtake the cost of the basic hardware. Sun Microsystems created a market in which customers were demanding an "Open System", one that ran on many processor types, rather than the existing "proprietary" systems such as MVS, MPE, VMS, and the other commercial operating systems. Companies such as Digital Equipment Corporation, IBM and Hewlett Packard began to think about creating "commercial" versions of Unix for themselves. In order to protect their investments, and to get a lower royalty price from AT&T, they put out binary-only distributions. The source code for Unix systems was placed at such a high price that few could afford it. Wholesale sharing of improvements to Unix came to a halt.
The second event was the advent of the microprocessor. At first this created an atmosphere of software sharing through bulletin boards, magazines and computer clubs. But then a new concept came about: "shrink-wrapped" software, written for "commodity" processor architectures such as Intel or Motorola, with the hardware produced in the hundreds of thousands of units. Initially with CP/M, and later with MS-DOS and Apple's OS, the number of shrink-wrapped products grew. I still remember walking into my first computer store and seeing lots of computers from different companies, with different hardware architectures, on the shelf, but only three or four software products (a word processor, a spreadsheet, some type of "modem software"). Of course none of these came with source code. If you could not get them to work using the documentation that came with them, you were stuck. Open collaboration in writing software was slowly replaced by binary-only code, end-user licenses (telling you how you should use the software you bought), software patents and eternal copyrights.
Fortunately back in those days the number of customers for each company was still small. You might even get some help when you called their support line. But trying to build on top of what these companies had done was close to impossible.
The third event was actually a result of the first two. In a small office at MIT, a researcher named Richard Stallman decided that he liked hacking on the source code of Unix and other software packages. He hated his ever-decreasing access to source code, and in 1984 he started a project to write a complete operating system whose source code would forever be available to people. As a side-effect of having freely distributable sources, the software would forever be free of cost for people who wished to pull down the source code and compile their own operating system. He called this project "GNU", for "GNU's Not Unix", to show his displeasure at having the sources of Unix taken away from him.
Time marched on. Microsoft became a dominant force in the operating system business, while other system vendors brought out various versions of Unix, most of them incompatible with each other in the name of "innovation". A whole market of shrink-wrapped, inflexible software emerged, each packaged program written for a commodity market whose customers each had unique needs.
Then around 1990 three more events happened:
The first was that a small group of Unix professionals were sitting around a table comparing the different types of software in different markets and one of them asked (and it may have been me):
"Why do you like Unix?"
Most of the people at the table did not know what to answer at first, but as the question lingered in the air each of them started to give their reasons for liking the operating system to which they had such fierce loyalty. Ideas such as "re-use of code", "efficient, well-written utilities" and "simple, yet elegant" were mentioned and expanded upon. None of us noticed that one person was writing down these ideas as we brought them out. Those ideas became the core of the first edition of this book, and it was the first time that someone had written a "philosophy book" about Unix.
Why was a book on the "Philosophy of Unix" needed? To help first-time Unix users to understand the true power and elegance of the system. To show them that a little thought in writing programs using the tools and structures of Unix could save them significant amounts of time and effort. To help them extend the system, not work against it. To help them stand on the shoulders of those who had gone before.
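As a purely illustrative sketch (not taken from the book itself), consider the kind of small tool that philosophy leads to: a filter that does one simple thing, reads standard input, writes standard output, and leaves the rest of the job to other programs in a pipeline. The program name lower and the file notes.txt below are invented for the example.

/*
 * lower.c -- a hypothetical example of a small Unix-style filter:
 * it does one thing (fold text to lowercase), reading standard input
 * and writing standard output, so other tools can finish the job,
 * for example:
 *
 *     cc -o lower lower.c
 *     lower < notes.txt | sort | uniq -c | sort -rn
 */
#include <ctype.h>
#include <stdio.h>

int main(void)
{
    int c;

    /* Copy input to output one character at a time, folding case. */
    while ((c = getchar()) != EOF)
        putchar(tolower(c));

    return 0;
}

No single piece of such a pipeline does anything impressive, yet chaining a few of these filters together often replaces a program that would otherwise have to be written from scratch.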
The second event was that, by 1990, a lot of the GNU project had been written: command interpreters, utilities, compilers, libraries, and more. These pieces of the GNU project, in conjunction with other freely available software such as sendmail, BIND, and the X Window System, were missing only the core part of the operating system, called the "kernel".
The kernel was the brains of the system: it controlled when programs ran, what memory they were given, what files they had access to, and other things. It had been left for last because kernels were changing day to day and improving all the time, and a kernel with no utilities or command interpreters would not have been useful. Having all the tools available for use on other operating systems and other kernels, on the other hand, had been very useful over the years.
The third event came in December of 1990, when Linus Torvalds, by then a university student at the University of Helsinki, got a new Intel 386 computer. Linus recognized that the Microsoft operating system of the time did not take advantage of all the power of the 386, and he was determined to write a kernel and mate it with the rest of the freely available software to create an entire operating system. Almost as an afterthought he decided that the kernel should be licensed under the GNU General Public License. He announced the project to some newsgroups on the Internet, and work commenced.
In April of 1994 I watched as Kurt Reisler, chairman of the Unix Special Interest Group (SIG) of DECUS, tried to gather funds to bring a programmer to the United States to talk to the SIG about the project he was working on. Eventually I asked my management at Digital to fund the effort, mostly on the faith that Kurt usually had good insight into things. In May of 1994 I attended the DECUS event in New Orleans, met Linus Torvalds, saw the operating system for the first time, and my life was changed forever.
I have spent the years since then advocating a re-implementation of the Unix operating system I loved so much, but one built in a way that encourages people to look at, modify and improve the source code that others have created before them. Actually, the result has grown far beyond what most people would think of as "Unix", since the free software movement now includes databases, multimedia software, business software and other software valuable to millions of people.
Once again the tide of software production had changed. Hardware had become so sophisticated and so cheap, collaboration on the Internet so easy, and the flow of information so fast that at long last large groups could come together to develop software that helped them solve their own problems. No longer did software have to be built in lofty "cathedrals", using expensive machinery, created by self-proclaimed druids of architecture. Cries of "here is the code" came from people who now had the means to make major contributions to computer science from their homes and classrooms, whether in the United States, Brazil, China, or "even" Helsinki, Finland. Tens of thousands of projects started, with hundreds of thousands of programmers helping, and the movement keeps expanding at an ever-increasing speed.
The future of computer programming will not be small groups of huddled programmers trying futilely to create code that matches 100% of everyone's needs. Instead, large bodies of software will exist on the net in source code form, begging for consultants and value-added resellers to pull them down and create solutions by tailoring that code exactly to each customer's needs.
So you see, the concept of Linux and the GNU project, while appearing to be the "next step" of the Unix Philosophy, is really only the return from a wayward path. Everything stated in the Unix Philosophy's first edition is just as true today, perhaps even more so. The addition of source code availability allows you to see exactly how these masters of code created their systems, and challenges you to create even faster code with greater capabilities.
May you stand on the shoulders of giants, and touch the stars.
Carpe Diem,
Jon "maddog" Hall
Executive Director
Linux International