Linux was developed as a clone of Unix. In other words, the developers of Linux built their system without using the programming instructions, also known as the source code, used to build Unix.
Because Linux is a Unix clone, you can use most of the same command-line commands on either operating system.
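Because the command set is shared, a script built from standard POSIX utilities runs unchanged on either system. A minimal sketch (the file path /tmp/demo.txt is an arbitrary example):

```shell
# These POSIX utilities behave the same on Unix and Linux systems.
printf 'alpha\nbeta\ngamma\n' > /tmp/demo.txt   # create a three-line file
grep -c 'a' /tmp/demo.txt                       # count lines containing "a" -> 3
wc -l < /tmp/demo.txt                           # count lines in the file -> 3
rm /tmp/demo.txt                                # clean up
```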
Although it would have been easier to adapt Unix for the personal computer, important historical reasons lie behind the development of Linux. And the way Linux was developed drives the way Linux developers, companies, and users work today.
Computers were once quite expensive. They were the domain of universities and larger corporations. There was a lot of demand for these early computers; to support this demand, a number of computer scientists developed the concept of time-sharing, where multiple users are connected to the same computer simultaneously.
Even though computers have become more powerful and less expensive, we have returned to this notion of time-sharing. Today, administrators are quite familiar with the concept of the time-sharing system: it is now known as the multiuser server. One network often includes multiple servers; your username may be the same across all of these servers. In fact, it's fair to say that we're all time-sharing users on the biggest network of all: the Internet.
Let's take a look at some of the developments that occurred along the road to Linux.
One of the early time-sharing projects was Multics (Multiplexed Information and Computing Service), a joint project between MIT, AT&T's Bell Labs (now Lucent Technologies), and General Electric. Although Bell Labs withdrew from the project in 1969, two of their developers, Ken Thompson and Dennis Ritchie, still had an itch for what would become the multiuser operating systems we know today.
Thompson and Ritchie continued development work through the early 1970s. Perhaps the key to their success was their development of the C programming language for writing the kernel and a number of basic commands, including those in the Bourne shell.
When Unix was developed in 1969, AT&T was a regulated monopoly in the United States. Various court and regulatory rulings and agreements kept AT&T out of the computer business.
In 1974, AT&T distributed Unix to the University of California for the cost of the manuals and tapes. It quickly became popular at a number of universities. Nevertheless, AT&T was not allowed to make money from it.
Bell Labs has a history of groundbreaking research. The company had some of the best minds in the world working on fundamental problems. Bell Labs wanted the goodwill of the academic community. Since AT&T wasn't allowed to make money from software, it kept the license for Unix and distributed the operating system with source code to universities for a nominal fee. In exchange, AT&T's lawyers insisted that the license explicitly state that Unix came with no warranty. This release technique became known as open source.
The timing was good. Various universities adapted the Unix source code to work with three different kinds of computers available at the time: mainframes, minicomputers, and microcomputers.
At about the same time, the U.S. Department of Defense's Advanced Research Projects Agency (ARPA) wanted to set up a nationwide communications network that could survive a nuclear war. Most universities on this ARPA network used Unix. TCP/IP was built on Unix and eventually became the communication protocol for the ARPANET. The ARPANET eventually developed into the Internet that you know today. Unix and derivative clones, like Linux, are critical parts of the Internet.
AT&T retained the license to Unix through the 1980s. When the U.S. government settled the AT&T antitrust suit in 1982, one of the provisions allowed AT&T to go into the computer business. This became known as the AT&T consent decree. At that point, AT&T was able to sell the Unix operating system and source code with all the protections associated with a copyright.
The programmers who used Unix wanted to keep the advantages of an open-source operating system. They wanted the ability to customize the software and, as academics, to share the results. The Unix users of the time had the high level of knowledge that made open-source software worthwhile.
Ironically, AT&T was never very successful at selling Unix and eventually sold the rights to the operating system. The direct successor is now owned by the SCO Group, which also owns the rival SCO (formerly Caldera) Linux distribution.
The SCO Group has recently filed suit against IBM over Unix. The suit is controversial; many in the Linux community see it as a threat.
At the time, with their limited budgets, universities did not have the money to purchase the now-proprietary Unix, and they did not want to have their academic freedoms limited by copyrights. Generally, academics are most comfortable when they can share all of their data. To this end, Douglas Comer developed Xinu (Unix, spelled backwards) in 1983 to illustrate operating system structures in a classroom setting. In 1986, Andrew Tanenbaum developed Minix as a Unix clone and free alternative. Like Linux, Minix does not use Unix's source code, and therefore does not infringe on any of AT&T's Unix copyrights.
Even before the consent decree, Bill Joy of the University of California worked on Unix. He also started work on the Berkeley Software Distribution (BSD), which, like Unix, was released under an open-source-style license. A number of BSD utilities were incorporated into later versions of Unix. In 1982, Joy became a cofounder of Sun Microsystems.
Several other operating systems are closely related to Unix, as shown in Table 1.2.
AIX         The Advanced Interactive eXecutive operating system, developed by IBM; used with high-end CPUs such as the Power4 and RS64 IV (64-bit PowerPC chips).
BSD         The Berkeley Software Distribution, an open-source alternative to Linux.
HP-UX       Developed by Hewlett-Packard; version 11i is developed for 64-bit RISC and Itanium CPUs.
IRIX        Developed by Silicon Graphics for 64-bit CPUs.
Linux       The free operating system clone of Unix.
Solaris     Developed by Sun Microsystems for its UltraSPARC CPUs.
Tru64 Unix  Formerly known as Digital Unix; optimized for 64-bit CPUs.
UnixWare    The successor to AT&T's version of Unix, now owned by the SCO Group.
One telling trend is that a number of these companies are moving toward using Linux on many of their servers. While this book is based on the 32-bit Red Hat Linux kernel, a 64-bit Red Hat kernel is available.
Some of the work of the academic community eventually became something of a rebellion. In its early stages, it was led by Richard Stallman and his Free Software Foundation (FSF). (For more information, see the website at www.fsf.org.)
Stallman started work on the GNU's Not Unix (GNU) project in 1984. He summarized the focus of the FSF in his introductory Usenet message: "I consider that the golden rule requires that if I like a program I must share it with other people who like it." Stallman's purpose was to set up a group where the free sharing of software would be strongly encouraged. To realize his dream, Stallman needed an operating system free of the code that was then copyrighted by AT&T.
The FSF developed the General Public License (GPL) to build a body of free software protected from those who would use it to create proprietary closed-source systems. This same license still protects Linux today; you can read it in Web Chapter 4, which can be found on the Sybex website at www.sybex.com.
By 1991, the FSF had cloned all of the major components of a Unix-style operating system, except the kernel.
Richard Stallman developed the GPL to bring the advantages previously available with Unix to the general software community. He wanted to develop a license that would protect software from anyone who would hide its source code. GNU software is licensed under the GPL. While you can read the full text in Web Chapter 4, three basic principles lie behind the GPL:
All GPL software must be distributed with a complete copy of the source code. The source code must include clear documentation.
Any software added to GPL software must also be clearly documented. If the new software interacts with the GPL software, the package as a whole must be distributed as GPL software.
Any GPL software comes without a warranty.
In 1991, Linus Torvalds was a graduate student in Finland. He was not happy with the operating systems available for his new computer with a 386 CPU. So he put together a kernel to allow some operating system components to communicate with computer hardware. By 1995, several companies assembled Linus's kernel with the GNU software of the FSF to produce the first Linux distributions.
Richard Stallman and the people of the FSF believe that the Linux operating system is more properly known as GNU/Linux because it combines a large number of GNU-licensed programs, commands, and utilities with one Linux kernel.