Thanks to my editor, Andy Oram, who valiantly fought many battles—a few with me—in seeing this book to completion. Andy's concern to make this the best book possible was continually evident and greatly appreciated. Andy suggested many improvements and caught several errors that would have caused me serious embarrassment. Way to go, Andy! Thanks also to Margot Maley of Waterside Productions, Inc., who brought this authorship opportunity to my attention.
Several reviewers, some working for O'Reilly & Associates and some working elsewhere, commented on the manuscript and suggested helpful corrections and improvements. In particular, I'd like to thank the following people for taking time away from their busy schedules to review this latest edition: Jason Hall, Andy Oram, Chip Turner. I greatly appreciate their assistance and readily confess that any errors in the manuscript were added by me after their reviews and so are entirely my responsibility.
My family—Jennifer, Patrick, and Sara—provided their customary compassion and assistance during this latest authorship experience. Thanks, guys.
I also acknowledge the love, concern, and support of my savior, Jesus Christ. His perfect love is entirely undeserved.
Chapter 1. Why Run Linux?
Welcome to Linux, the operating system everyone's talking about. Unlike the weather—which proverbial wisdom says you can't do anything about—you can do something about Linux. You can run it on your own PC, so that you can see firsthand what the talk is about and perhaps contribute suggestions to its future development.
This chapter is the first leg of your journey into the land of Linux. Here, you'll learn whether this particular journey is right for you and what you can expect down the road. If you're impatient to get started, you can jump ahead to the next chapter, which helps you prepare your PC for installing Linux. But, if you'd like to know more about the history and capabilities of Linux, read on.
1.1 Why Red Hat Enterprise Linux and Fedora?
This book explains Release 3 of Red Hat Enterprise Linux WS and Release 1 of Fedora Core. As explained in the following sections, Red Hat offers several Linux products, or distributions as they're called. Moreover, other companies sell or freely provide Linux distributions. Why, then, does this book focus on Red Hat Enterprise Linux WS and Fedora Core?
From the standpoint of market share, Red Hat is the leading provider of Linux distributions in the U.S. and worldwide. To many people, Red Hat Linux is Linux. And, among the various Linux distributions provided by Red Hat, Red Hat Enterprise Linux WS and Fedora Core stand out as the most appropriate distributions for desktop users, especially those in corporate environments. To understand why this is so, it's necessary to understand more about Linux, operating systems, and open source (http://www.opensource.org) software.
1.2 What Is Linux?
Linux is an operating system, a software program that controls your computer. Most PC vendors load an operating system—generally, Microsoft Windows—onto the hard drive of a PC before delivering the PC; so, unless the hard drive of your PC has failed or you've upgraded your operating system, you may not understand the function of an operating system.
An operating system handles user interaction with a system and provides a comfortable view of the system. In particular, it solves several problems arising from variation among hardware. As you're aware, no two PC models have identical hardware. For example, some PCs have an IDE hard drive, while others have a SCSI hard drive. Some PCs have one hard drive; others have two or more. Most PCs have a CD-ROM drive, but some do not. Some PCs have an Intel Pentium CPU, while others have an AMD Athlon, and so on.
Suppose that, in a world without operating systems, you're programming a new PC application—perhaps a new multimedia word processor. Your application must cope with all the possible variations of PC hardware. As a result, it becomes bulky and complex. Users don't like it because it consumes too much hard drive space, takes a long time to load, and—because of its size and complexity—has more bugs than it should. Operating systems solve this problem by providing a standard way for applications to access hardware devices. Thanks to the operating system, applications can be more compact, because they share the commonly used code for accessing the hardware. Applications can also be more reliable, because common code is written only once—and by expert systems programmers rather than by application programmers.
As you'll soon learn, operating systems do many other things as well; for example, they generally provide a filesystem so you can store and retrieve data and a user interface so you can control your computer. However, if you think of a computer's operating system as its subconscious mind, you won't be far off the mark. It's the computer's conscious mind—applications such as word processors and spreadsheets—that do useful work. But, without the subconscious—the operating system—the computer would cease breathing and applications would not function.
1.2.1 Desktop and Server Operating Systems
Now that you know what an operating system is, you may be wondering what operating systems other PC users are using. According to the market research firm IDC, Microsoft products account for over 90 percent of sales of desktop operating systems. But because Linux is a free operating system, Linux sales are a mere fraction of actual Linux installations. Unlike most commercial operating systems, Linux is not sold under a per-seat license; a company is free to purchase a single Linux CD-ROM and install Linux on as many systems as it likes. So, sales figures understate the popularity of Linux. Moreover, it's important to consider who uses a product and what they use it for, rather than merely the number of people using it. Linux is particularly popular among power users who run web sites and databases and write their own code. Hence, though Linux is popular, its influence is even greater than its popularity suggests.
Later in this chapter you'll learn how Linux is distributed, but notice that Linux was termed a free operating system. If you have a high-speed Internet connection, you can download, install, and use Linux without paying anyone for anything (except perhaps your Internet Service Provider, who may impose a connection fee). It's anyone's guess how many people have downloaded Linux, but it appears that about 10 million computers now run Linux.
This book focuses on how Linux can be used on the desktop. However, if you plan to set up a Linux server and are unfamiliar with Linux and Unix, this book is a great starting point.
This book will take you through the basics of setting up and using Linux as a desktop system. After you've mastered what this book offers, you should consult Running Linux, by Matt Welsh, Matthias Kalle Dalheimer, Terry Dawson, and Lar Kaufman (O'Reilly & Associates, Inc.), a more advanced book that focuses on setting up and using Linux servers. You might also enjoy Linux in a Nutshell, by Ellen Siever, Stephen Figgins, and Aaron Weber (O'Reilly); this book puts useful Linux reference information at your fingertips.
1.2.2 How Linux Is Different
Linux is distinguished from other popular operating systems in three important ways:
1.2.3 The Origins of Linux
Linux traces its ancestry back to a mainframe operating system known as Multics (Multiplexed Information and Computing Service). Multics was one of the first multiuser computer systems and is still in use today. Participating in its development, which began in 1965, was Bell Telephone Labs, along with the Massachusetts Institute of Technology (MIT) and General Electric.
Two Bell Labs software engineers, Ken Thompson and Dennis Ritchie, worked on Multics until Bell Labs withdrew from the project in 1969. One of their favorite pastimes during the project was playing a multiuser game called Space Travel. Without access to a Multics computer, they found themselves unable to indulge their fantasies of flying around the galaxy. Resolving to remedy this, they decided to port the Space Travel game to run on an otherwise unused PDP-7 computer. Eventually, they implemented a rudimentary operating system they named Unics, as a pun on Multics. Somehow, the spelling of the name became Unix.
Their operating system was novel in several respects, most notably its portability. Most previous operating systems had been written for a specific target computer. Just as a tailor-made suit fits only its owner, such an operating system could not be easily adapted to run on an unfamiliar computer. In order to create a portable operating system, Ritchie and Thompson first created a programming language called C. Like assembly language, C let a programmer access low-level hardware facilities not available to programmers writing in a high-level language such as FORTRAN or COBOL. But, like FORTRAN and COBOL, a C program was not bound to a particular computer. Just as a ready-made suit can be altered here and there to fit a purchaser, writing Unix in C made it possible to easily adapt Unix to run on computers other than the PDP-7.
As word of their work spread and interest grew, Ritchie and Thompson made copies of Unix freely available to programmers around the world. These programmers revised and improved Unix, sending word of their changes back to Ritchie and Thompson, who incorporated the best improvements in their version of Unix. Eventually, several Unix variants arose. Prominent among these was BSD (Berkeley Software Distribution) Unix, written at the University of California, Berkeley, in 1978. Bill Joy—one of the principals of the BSD project—later became a founder of Sun Microsystems, which sold another Unix variant (originally called SunOS and later called Solaris) to power its workstations. In 1984, AT&T, the parent company of Bell Labs, began selling its own version of Unix, known as System V.
1.2.4 Free Software
What Ritchie and Thompson began in a distinctly noncommercial fashion ended up spawning several legal squabbles. When AT&T grasped the commercial potential of Unix, it claimed Unix as its intellectual property and began charging a hefty licensing fee to those who wanted to use it. Soon, others who had implemented Unix-like operating systems were distributing licenses only for a fee. Understandably, those who had contributed improvements to Unix considered it unfair for AT&T and others to appropriate the fruits of their labors. This concern for profit was at odds with the democratic, share-and-share-alike spirit of the early days of Unix.
Some, including MIT scientist Richard M. Stallman, yearned for the return of those happier times and the mutual cooperation of programmers that had existed. So, in 1983, Stallman launched the GNU (GNU's not Unix) project, which aimed at creating a free Unix-like operating system. Like early Unix, the GNU operating system was to be distributed in source form so that programmers could read, modify, and redistribute it without restriction. Stallman's work at MIT had taught him that, by using the Internet as a means of communication, programmers could improve and adapt software at incredible speed, far outpacing the fastest rate possible using traditional software development models, in which few programmers actually see one another's source code.
As a means of organizing work on the GNU project, Stallman and others created the Free Software Foundation (FSF), a nonprofit corporation that seeks to promote free software and eliminate restrictions on the copying, redistribution, understanding, and modification of software. Among other activities, the FSF accepts tax-deductible charitable contributions and distributes copies of software and documentation for a small fee, using this revenue to fund its operations and support development activities.
If you find it peculiar that the FSF charges a fee—even a small fee—for "free" software, you should understand that the FSF intends the word free to refer primarily to freedom, not price. The FSF believes in three fundamental software freedoms:
1.2.5 The Linux Kernel
By the early 1990s, the FSF had obtained or written all the major components of the GNU operating system except for one: the kernel. About that time, Linus Torvalds, a Finnish computer science student, began work on a kernel for a Unix-like system. Linus had been working with Minix, a Unix-like operating system written by Andrew Tanenbaum primarily for pedagogical use. Linus was disappointed by the performance of the Minix kernel and believed that he could do better. He shared his preliminary work with others on Internet newsgroups. Soon, programmers around the world were working together to extend and improve his kernel, which became known as Linux (for Linus's Minix). As Table 1-1 shows, Linux grew rapidly. Linux was initially released on October 5, 1991, and as early as 1992, Linux had been integrated with GNU software and other open source software (www.opensource.org) to produce a fully functional operating system, which became known as Linux after the name of its kernel.
Work on Linux has not ceased. Since the initial production release, the pace of development has accelerated as Linux has gained support for non-Intel processors and even multiple processors, along with sophisticated TCP/IP networking facilities such as firewalling and network address translation (NAT). Versions of Linux are now available for such computer models and architectures as the PowerPC, the Compaq/DEC Alpha, the Motorola 68k, the Sun SPARC, the MIPS, IBM mainframes, and many others. Moreover, Linux does not implement an obscure Unix variant: it generally complies with the POSIX (Portable Operating System Interface) standard that forms the basis of the X/Open specifications of The Open Group.
1.2.6 The X Window System
Another important component of Linux is its graphical user interface (GUI; pronounced gooey), the X Window System. Unix was originally a mouse-less, text-based system that used noisy teletype machines rather than modern video monitors. The Unix command interface is very sophisticated and, even today, some power users prefer it to a point-and-click graphical environment, using their video monitors as though they were noiseless teletypes. Consequently, some remain unaware that Unix long ago outgrew its text-based childhood and now provides users a choice of graphical or command interfaces.
The X Window System (or simply X) was developed as part of MIT's Project Athena, which began in 1984. By 1988, MIT released X to the public. Responsibility for X has since been transferred to The Open Group. The XFree86 Project, Inc. provides a freely redistributable version of X that runs on Intel-architecture PCs.
X is a unique graphical user interface in three major respects:
1.2.7 Linux Distributions
Because Linux can be freely redistributed, you can obtain it in a variety of ways. Various individuals and organizations package Linux, often combining it with free or proprietary applications. Such a package that includes all the software you need to install and run Linux is called a Linux distribution. Table 1-2 shows some of the most popular Linux distributions.
Red Hat Enterprise Linux, Mandrake Linux, SuSE, and Slackware are packaged by commercial companies, which seek to profit by selling Linux-related products and services. However, because Linux is distributed under the GNU GPL, you can download the source code related to these distributions from the respective companies' web sites and make additional copies. (Note, however, that you cannot necessarily make additional copies of proprietary software that these companies may distribute with their Linux distribution.) Debian GNU/Linux is the product of volunteer effort conducted under the auspices of Software in the Public Interest, Inc. (http://www.spi-inc.org), a nonprofit corporation. Fedora Core and Gentoo are also the products of volunteer efforts. However, Red Hat provides significant support to the team responsible for Fedora Core.
1.2.8 Red Hat's Linux Distributions
Red Hat formerly provided a single series of Linux distributions known as Red Hat Linux. Red Hat Linux included distinct offerings for various uses (for instance, workstations versus servers), but the offerings were all based on one stable core. Now Red Hat provides several Linux distributions, and it's worth understanding the strengths and weaknesses of each. At the same time, remember that the distributions all include essentially the same software.
Red Hat came to recognize that its distribution appeals to two very different types of users. One type is business clients who want stable software that comes with support and who are willing to pay significant money for that support. If the software changes slowly, that is fine with such users, because it means fewer wrinkles and less time spent on upgrades. The other type loves to experiment and make use of the newest, most advanced features of Linux. These users don't mind upgrading frequently, but they are not running mission-critical systems and don't want to pay for support. In fact, they're used to getting Linux at no cost.
Red Hat decided that by offering multiple distributions, it could make both types of users happy—and get revenue from its efforts—while benefiting from the testing and experimentation of the user community. Thus, any software that bears the trademark "Red Hat" is licensed software. It is backed by Red Hat in some manner, often through support contracts that include upgrades. It is expected to change only once every two or three years. By contrast, the freely downloadable version of the distribution is called "Fedora Core" or simply "Fedora." Fedora comes without Red Hat support. New versions of Fedora are released every few months and contain a lot of new features.
Right now, the Red Hat Enterprise Linux distribution and Fedora are very similar. To readers of this book, they essentially work the same way, and only some details of installation differ. We include the publisher's edition of Fedora Core on two CDs, but we have tested the material thoroughly with both Fedora Core and Red Hat Enterprise Linux, so the book applies to both.
Naturally, if you install another version of either system, you may find minor differences between the menus or software versions in this book and the ones on your system. However, most of the things described in this book have been stable for some time and are not likely to change quickly.
Despite the outward similarity between Red Hat Enterprise Linux and Fedora Core, these distributions reflect divergent goals and methods. The Fedora Core project team promises 2-3 releases of its distribution per year, a rapid development pace that enables it to incorporate the most recent versions of Linux software in a timely manner. However, Fedora Core's rapid release cycle does not afford the opportunity to perform the quality assurance appropriate for enterprise environments, where reliability and security are crucial. Moreover, updates to a given release of Fedora Core are promised to be available only until 2-3 months after a subsequent release. Thus, users of Fedora Core can expect to install a new release every 6-9 months or leave their systems vulnerable to security flaws for which updates are no longer available. Consequently, Fedora Core is best suited to hobbyists and others interested in sampling the bleeding edge of Linux technology and features.
Red Hat Enterprise Linux has a longer release cycle of 12-18 months, and updates to each release are promised to be available for five years. Consequently, users of Red Hat Enterprise Linux will not need to reinstall their operating system as often as users of Fedora Core. Moreover, the longer release cycle permits third-party certification of Red Hat Enterprise Linux applications and hardware vendor support of systems running Red Hat Enterprise Linux, considerations that are important, even crucial, to enterprise users. However, the long release cycle is not without its drawbacks. For instance, at any given time, several releases of Red Hat Enterprise Linux may be in use. So, if you're using a release of Red Hat Enterprise Linux other than Red Hat Enterprise Linux WS 3, you'll probably find some differences between what's installed on your system and what's shown in this book.
1.2.9 Linux Features and Performance
The origins of Linux and the availability of its source code set it apart from other operating systems. But most users choose an operating system based on features and performance—and Linux delivers these in spades.
Linux runs on a wider range of hardware platforms than other operating systems, and it runs adequately even on less costly, less powerful systems. Moreover, Linux systems are generally highly reliable.
But this impressive inventory of selling points doesn't end the matter. Let's consider some other technical characteristics of Linux that distinguish it from the pack:
1.2.9.1 Filesystem reliability
Microsoft claims that its NTFS filesystem is so reliable that you'll probably never need special software tools to recover lost data—truth is, Microsoft provides no such tools. Despite Microsoft's ambitious claims, some Windows NT users report that NTFS reliability is less than satisfactory.
Here's a case in point: When my Windows NT workstation crashed a little over a year ago, I discovered that its NTFS filesystem was damaged. I searched the Microsoft web site for recovery instructions and tools and found nothing that helped. So I went to my local software store and purchased a third-party disk recovery tool for Windows NT. When I opened the box, I was angered to discover that it supported recovery of FAT and FAT32 data, but not NTFS data.
Eventually, I recovered 95 percent of my data by using a free Linux utility that was able to open the damaged NTFS partition and copy its files. If I'd been without Linux, I'd be without my data.
1.2.9.2 The command-line interface
If you're an old computer dog who remembers the days of MS-DOS, you may have a fondness for what's now called the MS-DOS Prompt window or the Command Line Interface (CLI). However, if you've worked exclusively within the Windows point-and-click environment, you may not fully understand what the MS-DOS Prompt window is about. By typing commands in the MS-DOS Prompt window, you can direct the computer to perform a variety of tasks.
For most users, the MS-DOS Prompt is not as convenient as the GUI offered by Windows. That's because you must know the commands the operating system understands and must type them correctly if you expect the operating system to do your bidding.
However, the MS-DOS Prompt window lets you accomplish tasks that would be cumbersome and time-consuming if performed by pointing and clicking. Linux comes with a similar command interface, known as the shell. But, the word similar fails to do justice to the Linux shell's capabilities, because the MS-DOS Prompt provides only a fraction of the capabilities provided by the Linux shell.
You may have used the MS-DOS Prompt and, finding it distastefully cumbersome, forever rejected it in favor of pointing and clicking. If so, you'll be pleasantly surprised to see how easy it is to use the Linux shell. You'll certainly be pleased—perhaps amazed—by the enormous power it offers. Moreover, you can customize the operation of the Linux shell in almost limitless ways, choose from among a variety of shells, and automate your work by combining commands into files called shell scripts. You'll learn more about the Linux shell in Chapter 7.
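To give a small taste of that power, here's a sketch of the kind of pipeline the shell makes possible; the word list is invented for illustration, and each pipeline chains small programs into a larger tool in a way the MS-DOS Prompt can't match:

```shell
# Three small programs cooperate through a pipeline: printf emits a
# word list, sort orders it, and uniq discards the duplicates.
printf 'pear\napple\npear\nbanana\n' | sort | uniq
# prints:
#   apple
#   banana
#   pear

# Adding one more stage turns the same pipeline into a word-frequency
# counter, with the most common word listed first.
printf 'pear\napple\npear\nbanana\n' | sort | uniq -c | sort -rn
```

Each command in the chain does one job well; the shell's pipe operator (`|`) glues them together, which is the design philosophy Unix and Linux inherit.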
1.2.9.3 Developing portable code
If you're a programmer, you'll also admire the ease with which it's possible to develop portable, Unix-compliant software. Linux comes with a suite of software development tools, including an assembler, C/C++ compilers, a make application, and a source code librarian. All of these are freely distributable programs made available under the terms of the GNU GPL.