Appendix B. Internet 101

IN THIS APPENDIX

        In the Beginning: 1962-1969

        UNIX Is Born: 1969-1973

        The Internet's Formative Years: 1972-1975

        Moving On: The '90s Internet

This appendix discusses the Internet's early history. If you already know the story, feel free to skip it.


 


In the Beginning: 1962-1969

Our setting is the early 1960s (1962, to be exact). Jack Kennedy was in the White House, and the Beatles had just recorded their first hit single ("Love Me Do"). Americans were enjoying an era of prosperity. Elsewhere, however, Communism was spreading, and with it came weapons of terrible destruction.

In anticipation of nuclear war, the United States Air Force charged a small group of researchers with a formidable task: creating a communication network that could survive a nuclear attack. Their concept was revolutionary: a network that had no centralized control. If one (or 10, or 100) of its nodes were destroyed, the system would continue to run. In essence, this network (designed exclusively for military use) would survive the apocalypse itself (even if we didn't).

The individual most responsible for the Internet's existence was Paul Baran. In 1962, Baran worked at Rand Corporation, the think tank charged with developing this concept. Baran imagined a network in which all machines could communicate with one another. This was a radical concept that ran against the grain of conventional wisdom. However, Baran knew that centralized networks were simply too vulnerable to attack. In his now-famous memorandum titled On Distributed Communications: I. Introduction to Distributed Communications Network, Baran wrote:

The centralized network is obviously vulnerable as destruction of a single central node destroys communication between the end stations.

The Rand Corporation has generously made this memorandum and the report delivered by Baran available via the World Wide Web. The document(s) can be found at http://www.rand.org/publications/RM/baran.list.html.

Baran was referring to the way most computer networks were constructed. In the old days, networks relied on mainframes. These were large, powerful machines that housed centralized information. Users accessed that information through terminals wired directly to the mainframe. Data would travel from their terminals, down the cable, and to the mainframe. The mainframe would then distribute that data to other terminals. This was a very effective method of networking but had disastrous security implications. For example, terminals could not communicate directly with one another. Hence, if the mainframe were destroyed, the network would be destroyed. This placed our national networks at considerable risk.

Baran had a simple solution: Design a network where all points could communicate with one another. In many ways, this design bore similarities to the national telephone system. As Baran explained:

In practice, a mixture of star and mesh components is used to form communications networks. Such a network is sometimes called a "decentralized" network, because complete reliance upon a single point is not always required.

Baran's proposal was thorough, right down to routing conventions. He envisioned a system whereby data could dynamically determine its own path. For example, if the data encountered a problem at one crossroads of the Net, it would find an alternate route. This system was based on certain rules. For instance, a network node would only accept a message if it had adequate space to store it. Equally, if all lines were currently busy, the message would wait until a new path became available. In this way, the network would provide intelligent data transport. Baran also detailed other aspects of the network, including the following:

        Security

        Priority schemes (and devices to avoid network overload)

        Hardware

        Cost

Unfortunately, Baran's ideas were ahead of their time. The Pentagon had little faith in such radical concepts. Baran delivered an 11-volume report to defense officials, who promptly shelved it. As it turned out, the Pentagon's shortsightedness delayed the birth of the Internet, but not by much. By 1965, the push was on again. Funding was allocated to develop a decentralized computer network, and, in 1969, that network became a reality. The system was called ARPANET.

As networks go, ARPANET was pretty basic. It consisted of links between machines at four academic institutions (Stanford Research Institute, the University of Utah, the University of California at Los Angeles, and the University of California at Santa Barbara). One of those machines was a DEC PDP-10. These ancient beasts are now more useful as furniture than computing devices. However, I mention the PDP-10 here to briefly recount another legend in computer history.

It was at roughly that time that a Seattle, Washington, company began providing computer time-sharing. (This was a system whereby corporate clients could rent CPU time. They were generally charged by the hour.) The company took on two bright young men to test its software. In exchange for their services, the boys were given free dial-up access to a PDP-10. (This would be the equivalent of getting free access to a private bulletin board system.) Unfortunately for the boys, the company folded shortly thereafter, but the learning experience changed their lives. At the time, they were just old enough to attend high school. Today, they are billionaires. Can you guess their identities? The two boys were Bill Gates and Paul Allen.

In any event, ARPANET had very modest beginnings: four machines connected by telephone. At the time, this seemed like an incredible achievement. However, the initial euphoria of creating ARPANET quickly wore off when engineers realized they had several serious problems. One problem was this: They didn't have an operating system suitable to create the massive network Baran had conceived.

Fate would now play a major role. Halfway across the United States, researchers were developing an obscure operating system. Their work, which occurred simultaneously with the development of ARPANET, would change the world forever. The operating system was called UNIX.


 


UNIX Is Born: 1969-1973

The year was 1969 (the same year that ARPANET was established). A man named Ken Thompson from Bell Labs (with Dennis Ritchie and Joseph Ossanna) developed the first version of UNIX. The hardware was a Digital Equipment Corporation (DEC) PDP-7. The software was homegrown, written by Thompson himself.

Thompson's UNIX system bore no resemblance to modern UNIX. For example, modern UNIX is a multiuser system. (In other words, many users can work simultaneously on a single UNIX box.) In contrast, Thompson's first prototype was a single-user system and a bare bones one at that. However, I should probably define the term bare bones.

When you think of an operating system, you probably imagine something that includes basic utilities, text editors, help files, a windowing system, networking tools, and so forth. That's because today, end-user systems incorporate great complexity and user-friendly design. Alas, the first UNIX system was nothing like this. Instead, it had only the most necessary utilities to operate. For a moment, place yourself in Ken Thompson's position. Before you create dozens of complex programs like those just mentioned, you are faced with a more practical task: getting the system to boot.

Thompson did eventually get his UNIX system to boot. However, he encountered many problems along that road. One was that the programming language he used wasn't well suited to the task. Once again, fate played a tremendous role. At roughly that same time, another researcher at Bell Labs, Dennis Ritchie, created a new programming language called C.

About C

C is often used to write language compilers and operating systems. I examine C here because it drastically influenced the Internet's development. Here is why.

Today, nearly all applications that facilitate communication over the Internet are written in C. Indeed, both the UNIX operating system (which forms the underlying structure of the Internet) and TCP/IP (the suite of protocols that traffic data over the Net) were developed in C. If C had never emerged, the Internet as we know it would never have existed at all.

C's popularity is based on several factors:

        C is small and efficient.

        C code is easily portable from one operating system to another.

        C can be learned quickly and easily.

However, only the first of these facts was known to AT&T Bell Labs researchers when they decided to rewrite UNIX in C. That's exactly what they did. Together, Thompson and Ritchie ported UNIX to a DEC PDP-11/20. From there, UNIX underwent considerable development. Between 1970 and 1973, UNIX was completely rewritten in C. This was a major improvement and eliminated many bugs inherent in the original UNIX system.
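To make the portability claim concrete, here is a minimal sketch (mine, not the author's). Suppose a file named hello.c contains the complete C program shown below; the filename and the compiler name cc are simply conventional choices, and some systems provide gcc instead.

        /* hello.c -- a complete C program; the same source builds on virtually any UNIX */
        #include <stdio.h>

        int main(void)
        {
            printf("hello, world\n");
            return 0;
        }

Compiling and running it from the shell takes two commands:

        $ cc -o hello hello.c      # or: gcc -o hello hello.c
        $ ./hello
        hello, world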


 


The Internet's Formative Years: 1972-1975

Briefly, I turn away from the ongoing development of UNIX and C because, between 1972 and 1975, advances were being made in other areas. These advances would have a strong bearing on how and why UNIX was chosen as the Internet's operating system.

The year was 1972. ARPANET had some 40 hosts. (In today's terms, that is smaller than many local area networks, or LANs.) It was in that year that Ray Tomlinson, a member of Bolt, Beranek, and Newman, Inc., forever changed Internet communication. Tomlinson created electronic mail.

Tomlinson's invention was probably the single most important computer innovation of the decade. Email allowed simple, efficient, and inexpensive communication. This naturally led to an open exchange of ideas and interstate collaboration between folks researching different technologies. Because many recipients could be added to an email message, these ideas were more rapidly implemented. From that point on, the Net was alive.

Another key invention emerged in 1974: Vinton Cerf and Robert Kahn invented the Transmission Control Protocol (TCP). This protocol was a new means of moving data across the network in small pieces and then reassembling those fragments at the other end.

Note

TCP is the primary protocol used on the Internet today. It was developed in the early 1970s and was ultimately integrated into Berkeley Software Distribution UNIX. It has since become an Internet standard. Today, almost all computers connected to the Internet run some form of TCP.
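As a rough, hands-on illustration (my own addition, not part of the original text), you can watch TCP at work from any UNIX shell by opening a connection to a web server's port 80 by hand; the hostname below is only a placeholder. The first command opens the TCP connection; the second line is a minimal HTTP request typed into that connection, followed by a blank line, after which the server's reply streams back over the same connection and the session closes.

        $ telnet www.example.com 80
        GET / HTTP/1.0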

 

By 1975, ARPANET was a fully functional network. The groundwork had been done, and it was time for the U.S. government to claim its prize. In that year, control of ARPANET was given to an organization then known as the United States Defense Communications Agency. (This organization would later become the Defense Information Systems Agency.)

What remained was to choose the official operating system for ARPANET. The final choice was UNIX. The reasons that UNIX was chosen over other operating systems were complex. In the next section, I discuss those reasons at length.

UNIX Comes of Age

Between 1974 and 1980, UNIX source code was distributed to universities throughout the country. This, more than anything else, contributed to the success of UNIX.

First, the research and academic communities took an immediate liking to UNIX. Hence, it was used in many educational exercises. This had a direct effect on the commercial world. As explained by Mike Loukides, an editor for O'Reilly & Associates and a UNIX guru:

Schools were turning out loads of very competent computer users (and systems programmers) who already knew UNIX. You could therefore "buy" a ready-made programming staff. You didn't have to train them on the intricacies of some unknown operating system.

Also, the source was free to universities, and therefore UNIX was open to development by students. This openness quickly led to UNIX being ported to other machines, which only increased the UNIX user base.

Note

Because UNIX source is widely known and available, more flaws in the system security structure are also known. This is in sharp contrast to proprietary systems. Proprietary software manufacturers refuse to disclose their source except to very select recipients, leaving many questions about their security as yet unanswered.

 

UNIX continued to gain popularity, and in 1978, AT&T decided to commercialize the operating system and demand licensing fees (after all, it had obviously created a winning product). This caused a major shift in the computing community. As a result, in a stunning move to establish creative independence, the University of California at Berkeley created its own version of UNIX. The Berkeley distribution was extremely influential, serving as the basis for many modern forms of commercial UNIX.

So, in brief, UNIX was chosen for several reasons, including the following:

        UNIX was a developing standard.

        UNIX was an open system.

        UNIX source code was publicly available for scrutiny.

        UNIX had powerful networking features.

UNIX and the Internet Evolve Together

Once UNIX was chosen as the Internet's operating system, advances in UNIX were incorporated into the Internet's design. Thus, from 1975 onward, UNIX and the Internet evolved together. And, along that road, many large software and hardware manufacturers released their own versions of UNIX. The most popular versions are listed in Table B.1.

Table B.1. Commercial Versions of UNIX and Their Manufacturers

        UNIX Version            Software Company
        SunOS & Solaris         Sun Microsystems
        HP-UX                   Hewlett-Packard
        AIX                     IBM
        Digital UNIX            Compaq
        Linux                   Open source (multiple distributors)

Many of these UNIX flavors run on high-performance machines called workstations. Workstations differ from PC machines in several ways. First, workstations contain superior hardware and are therefore more expensive. This is due in part to the limited number of workstations built. In contrast, PCs are mass produced, and manufacturers constantly look for ways to cut costs. A consumer buying a new PC motherboard therefore has a much greater chance of receiving faulty hardware. Moreover, workstations are typically more technologically advanced than PCs. For example, onboard sound, Ethernet, and SCSI were standard features of workstations in 1989. In fact, onboard ISDN was integrated not long after ISDN was developed.

Linux is an interesting version of UNIX. It was designed to run on PC hardware and is freely available. This combination, plus the reliability of Linux, has made it an important platform for Internet servers.

Note

Technological advantages of workstations aren't always immediately apparent, either. Often, the power of a workstation is under the hood, obscured from view. For example, many workstations have extremely high throughput, which translates to blinding speeds over network connections and superb graphics performance. In fact, SGI and Sun now make machines that have absurd throughput, measuring hundreds of gigabytes per second.

 

High-end performance comes at a terrific price. In the past, workstations would set you back five, or even six, figures. Naturally, for average users, these machines are cost prohibitive. In contrast, PC hardware and software are cheap, easily obtainable, simple to configure, and widely distributed. However, over the past few years, workstations have dropped greatly in price and now are just slightly more expensive than PCs.

However, we are only concerned with UNIX as it relates to the Internet. As you might guess, that relationship is strong. Because the U.S. government's Internet development was implemented on the UNIX platform, UNIX contains the very building blocks of the Net. No other operating system had ever been so expressly designed for use with the Internet.

Let's have a brief look at UNIX before continuing.

The Basic Characteristics of UNIX

Modern UNIX runs on disparate hardware, including IBM-compatibles and Macintoshes. Installation differs little from that of other operating systems. Most vendors provide CD-ROM media. On workstations, installation is performed by booting from a CD-ROM. You are usually given a series of options, and the remainder of the installation is automatic. On other hardware platforms, a boot disk that loads a small installation routine into memory generally accompanies the CD-ROM.

Starting a UNIX system is also similar to booting other systems. The boot routine runs quick diagnostics on all existing hardware devices, checks the memory, and starts vital system processes. In UNIX, common system processes started at boot include the following:

        Electronic mail services

        General network services

        Logging and system administration services

After the system boots, a login prompt appears. Here, you provide your username and password. When login is complete, you are dropped into a shell environment.

Note

A shell is an environment in which commands can be typed and executed. The shell interprets those commands and passes them to the operating system for execution. In MS-DOS, for example, the shell is COMMAND.COM. The user interfaces with the shell by typing commands (for example, the command DIR to list directories). In this respect, at least in appearance, basic UNIX marginally resembles MS-DOS. All commands are entered using the shell. Output of commands appears on the monitor unless you specify otherwise.
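Here is a minimal sketch of what such a shell session looks like (the filename is only a placeholder). Output normally appears on the screen, but it can just as easily be redirected to a file.

        $ ls -l                      # list the current directory in long format, much like DIR in DOS
        $ ls -l > listing.txt        # the same command, with its output redirected to a file
        $ more listing.txt           # page through that file on the screen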

 

Navigating directories works much as it does on a DOS system. DOS users can easily find their way around a UNIX system using the conversion information in Table B.2. The UNIX commands listed here operate identically or very similarly to their DOS counterparts.

Table B.2. Command Conversion Table: UNIX to DOS

        DOS Command                      UNIX Equivalent
        cd \<directory>                  cd /<directory>
        dir                              ls -l
        dir \directory                   ls /directory
        dir /w                           ls
        chkdsk drive                     fsck drive/partition
        copy filename1 filename2         cp filename1 filename2
        edit filename                    vi filename
        fc filename1 filename2           diff filename1 filename2
        find text_string                 grep text_string
        format drive                     format drive/partition
        mem /c | more                    more /proc/meminfo
        move filename1 filename2         mv filename1 filename2
        sort filename                    sort filename
        type filename | more             more filename
        help <command>                   man <command>
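As a short illustration (not from the original text) of a few of the UNIX commands in Table B.2, assume two small text files named old.txt and new.txt:

        $ grep error old.txt         # print every line of old.txt containing the string "error"
        $ diff old.txt new.txt       # show how the two files differ, line by line
        $ man grep                   # display the online manual page for grep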

To learn more about basic UNIX commands, go to http://www.geek-girl.com/Unixhelp/. This archive is a comprehensive collection of information about UNIX. Or for good printed documentation, I recommend UNIX Unleashed (ISBN 0-672-31411-8), a title that provides many helpful tips and tricks on using this popular operating system.

What Kinds of Applications Run on UNIX?

UNIX runs many different applications. Some are high-performance programs used in scientific research and artificial intelligence. However, not all UNIX applications are so specialized. Popular, commercial applications also run on UNIX, including Adobe Photoshop, Corel WordPerfect, and other products commonly associated with PCs.

In all, modern UNIX is like any other platform. Window systems tend to come with suites of applications integrated into the package. These include file managers, text editors, mail tools, clocks, calendars, calculators, and the usual fare.

A rich collection of multimedia software can be used with UNIX, including movie players, audio CD utilities, recording facilities for digital sound, two-way camera systems, multimedia mail, and other fun things. Basically, just about anything you can think of has been written for UNIX.

UNIX in Relation to Internet Security

UNIX security is a complex field. It has been said that UNIX is at odds with itself, because the same advantages that make UNIX a superb server platform also make it vulnerable to attack. UNIX was designed as the ultimate networked operating system, providing you with the ability to execute almost any application remotely and transparently. (For example, UNIX enables you to perform tasks on one machine from another, even though those boxes are located thousands of miles apart.) As such, by default, UNIX remote services will accept connections from anywhere in the world.

Moreover, UNIX is an open system, and its code is publicly available. So, just as researchers can look at UNIX code and find weaknesses, so can computer criminals, crackers, and other malcontents. However, UNIX is a mature operating system, and over the years many advances have been made in UNIX security. Some of these advances (many of which were implemented early in the operating system's history) include the following (a brief illustration appears after the list):

        Encrypted passwords

        Strong file and directory-access control

        System-level authentication procedures

        Sophisticated logging facilities
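Here is a brief sketch of the second item, file and directory access control (the filename, owner, group, and dates shown are placeholders). Every UNIX file carries owner, group, and permission bits, and the owner can tighten them with chmod:

        $ ls -l notes.txt
        -rw-r--r--   1 alice    users        1024 Jun  1 09:30 notes.txt
        $ chmod 600 notes.txt        # remove all access for the group and for other users
        $ ls -l notes.txt
        -rw-------   1 alice    users        1024 Jun  1 09:30 notes.txt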

UNIX is therefore used in many environments that demand security. Hundreds of programs are available to tune up the security of a UNIX system. Many of these tools are freely available on the Internet. Such tools can be classified into four basic categories:

        Security-audit tools

        System-logging tools

        Intrusion-detection tools

        Encryption tools

Security-audit tools are programs that automatically detect holes within systems. These check for known vulnerabilities and common misconfigurations that can lead to security breaches. Such tools are designed for wide-scale network auditing and, therefore, can be used to check many machines on a given network (thousands, if you want). These tools are advantageous because they automate baseline security assessments. However, these tools are also liabilities, because they provide powerful capabilities to crackers who can obtain them just as easily.

System-logging tools record the activities of users as well as system messages. These logs are written to plain text files or to files organized in one or more database formats. Logging tools are a staple resource in any UNIX security toolbox. Often, the logs generated by such utilities form the basis of evidence to build a case against a cracker. However, deep logging of the system can be costly in terms of disk space and bandwidth.
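For example (a sketch of my own; the log file's exact name and location vary from one UNIX flavor to another), an administrator can watch system messages arrive in real time and review recent logins:

        $ tail -f /var/log/messages      # follow new system log entries as they are written
        $ last | more                    # page through the record of recent logins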

Intrusion-detection tools are programs that automatically detect patterns that suggest an intrusion is under way. In some respects, intrusion-detection tools can be viewed as intelligent logging utilities. The difference is that the logs are generated, analyzed, and acted upon in real time.

Lastly, encryption tools allow data to be encrypted. Data might be encrypted on the hard drive so that others cannot read it. Data being sent across the Internet can also be encrypted, so that people cannot intercept and read the transmission.
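As a simple example (mine, not the author's; it assumes the freely available GnuPG package is installed and uses a placeholder filename), a file can be encrypted with a passphrase before it is stored or sent across the Internet:

        $ gpg -c secrets.txt             # prompts for a passphrase and writes secrets.txt.gpg
        $ gpg secrets.txt.gpg            # decrypts the file again, given the same passphrase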

Despite these superb tools, however, UNIX security is difficult to achieve. UNIX is a large and complicated operating system, and hiring true UNIX security experts can be costly. Although these people aren't particularly rare, most of them already occupy key positions in firms throughout the nation. As a result, consulting in this area has become a lucrative business.


 


Moving On: The '90s Internet

So, this history of the Net is edging up on 1990. By that time, the Internet was used almost exclusively by either military or academic personnel. Casual users probably numbered several hundred thousand, if that. And the network was managed by the National Science Foundation, an entity that placed strict restrictions on the network's use. Flatly stated, it was forbidden to use the Internet for commercial purposes.

This placed the NSF in a unique position. Although the Internet was not user-friendly (all access was command-line only), the network was growing in popularity. The number of hosts had grown to some 300,000. Within months, the first freely available public access Internet server was established, and researchers were confronted with the inevitable. It was only a matter of time before humanity would storm the beach of cyberspace.

Amidst debates over cost (operating the Internet backbone required substantial resources), NSF suddenly relinquished its authority in 1991. This opened the way for commercial entities to seize control of network bandwidth.

However, the public at large had not yet come aboard. Access was still command-line based and far too intimidating for the average user. It was then that a tremendous event occurred that changed the history of not just the Internet, but the world: The University of Minnesota released new software called Gopher. Gopher was the first Internet navigation tool for use in GUI environments. The World Wide Web browser followed soon thereafter.

In 1995, NSF retired as overseer of the Net. The Internet was completely commercialized almost instantly as companies across America rushed to get connected to the backbone. These companies were immediately followed by the American public, which was empowered by new browsers such as NCSA Mosaic, Netscape Navigator, and Microsoft Internet Explorer. The Internet was suddenly accessible to anyone with a computer, a windowing system, and a mouse.

As more users flocked to the Net, Internet service providers cropped up everywhere. These were small, localized companies that provided basic gateway access to the general public. For $20 a month, anyone with a computer and a modem could enjoy Internet connectivity. And, it wasn't long before monster corporations (such as America Online and Prodigy) jumped on the bandwagon. This caused the Internet user base to skyrocket.

The late 1990s saw a huge rise in businesses using the Internet. Shopping on the Internet became a reality, with millions, if not billions, of dollars spent across the Internet every day. New companies, such as Yahoo! and Amazon, rose out of nowhere to become huge in just a couple of years. However, a lack of bandwidth remained a stumbling block for people wanting to fully utilize the Internet.

The Future

There have been many projections about where the Internet is going. Most of these projections are cast by marketers and spin doctors anxious to sell more bandwidth, more hardware, more software, and more hype. A significant number of people now have access to high-speed Internet connectivity if they are willing to pay for it. Two technologies for this have emerged: DSL (Digital Subscriber Line) and cable modem.

DSL is a high-speed phone line that ranges in speed from 128Kbps to more than 1Mbps. The problem with DSL is that the maximum speed varies depending on how far you are from the phone company central office. If you are far enough away, you cannot get DSL service at all. Also, getting a DSL line successfully installed is challenging, and it often takes weeks to get the line working.

Having used both DSL and cable modem on a regular basis, my opinion is that cable modem is the far superior technology. Cable modem uses the cable TV lines that come into your home. However, some people do not have cable TV available in their area and some cable providers do not offer Internet access.

DSL providers like to claim that your neighbors slow down your cable modem access because you share the same cable line. Only on a poorly designed cable system will you notice this problem. In reality, DSL suffers from the same problem on the link between the central office and your ISP. Therefore, this argument used by DSL providers is invalid. Your high-speed Internet connection is only as good as your provider, regardless of whether you use DSL or cable.

These technologies are always on, and people leave their computers hooked to the Internet 24 hours a day, which causes new security issues.

Additionally, many people think that these high-speed technologies will make real-time audio and video a reality on the Internet.

While decent-quality audio is usually possible, video, even on DSL and cable, remains problematic. Pauses are frequent, and quality is sacrificed. Faster Internet access technologies are needed to make watching movies across the Internet a reality.

The Internet is about to become an important part of many Americans' lives, if it isn't already. Banks and other financial institutions are now offering banking over the Internet. Within five years, this could replace the standard method of banking.

Additionally, much stock trading has moved from traditional brokers to the Internet. One common problem for employers is that some of their employees spend a good part of the day checking their stock prices because they can get current quotes so easily. Also, the volume of "day trading" has exploded because the Internet makes it so much easier to do. Some have struck it rich, and others have let greed get the best of them while looking for quick, easy money.


 


Summary

This appendix briefly examined the birth of the Internet. Next on the agenda are the historical and practical points of the network's protocols, or methods of data transport. These topics are essential for understanding the fundamentals of Internet security.


 


