How Computing Started and Has Evolved

The world of computing was dominated by the mainframe just over 20 years ago. Few people had access to computers, and then only via the nearest IT department. The Personal Computer (PC) and the Graphical User Interface (GUI) changed that, opening the doors to computing for tens of millions of people and transforming the computer into a mass-market product. Corporations realized that networks of PCs and PC-based servers could change the way they did business.

When the Internet arrived, it revolutionized communications, created a rich source of information and entertainment, and added an "e" to business. Now close to 300 million people use the World Wide Web. According to International Data Corporation, more than a quarter of a trillion dollars in business will be transacted over the Internet this year alone.

System Design Issues

Computers and the Internet developed in parallel for some time and in some sense drive each other. However, they did not previously support each other as much as they could have. For example, some Internet and Web technologies—initially developed in the late 1960s and early 1970s and enhanced and refined in the early 1990s—are constrained by design to the least common denominator of computing devices connected to the Internet. They do not even take advantage of the enormous capabilities of today's least expensive PCs.

At the same time, most application software designed for PCs does not yet fully exploit the capabilities of internal corporate networks, let alone the power of a global network capable of supplying electronic applications on demand. The client/server development model that dominated the past decade tried to harness some of this power.

Client/Server Development

Client/server development breaks an application down into two or more layers, usually these three: the presentation layer, which encapsulates interaction with the user; the application layer, which encapsulates business rules and object persistence; and the data layer, which encapsulates access to database management systems. Objects in each of these layers perform small, discrete functions.
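The three-layer separation can be sketched as follows. This is a minimal illustration, not code from any real framework; the class and method names (OrderDataLayer, place_order, and so on) are hypothetical.

```python
# Minimal sketch of the three client/server layers described above.
# All names here are illustrative assumptions, not a real API.

class OrderDataLayer:
    """Data layer: encapsulates access to a database (here, just a dict)."""
    def __init__(self):
        self._db = {}
    def save(self, key, record):
        self._db[key] = record
    def load(self, key):
        return self._db.get(key)

class OrderApplicationLayer:
    """Application layer: encapsulates business rules and persistence."""
    def __init__(self, data):
        self._data = data
    def place_order(self, order_id, quantity):
        if quantity <= 0:  # a business rule lives in this layer
            raise ValueError("quantity must be positive")
        self._data.save(order_id, {"quantity": quantity})
    def order(self, order_id):
        return self._data.load(order_id)

class OrderPresentationLayer:
    """Presentation layer: encapsulates interaction with the user."""
    def __init__(self, app):
        self._app = app
    def show_order(self, order_id):
        record = self._app.order(order_id)
        return f"Order {order_id}: {record['quantity']} unit(s)"

app = OrderApplicationLayer(OrderDataLayer())
ui = OrderPresentationLayer(app)
app.place_order("A-100", 3)
print(ui.show_order("A-100"))  # Order A-100: 3 unit(s)
```

Because each layer talks only to the layer beneath it, any layer can in principle be replaced, or moved to another machine, without rewriting the others.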

Because an application could be broken into multiple components, developers gained the ability to distribute those components among multiple PCs within a corporate network, thereby harnessing otherwise idle processing power by dedicating a machine to each component. To distribute these objects, however, each object had to contain information about how and where the other objects could be found. The objects therefore became tightly coupled.
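A hypothetical sketch of the tight coupling described above: the calling object bakes the location and wire details of its partner directly into itself, so moving that partner to a different host, port, or protocol breaks every caller. The host name, port, and service name here are invented for illustration.

```python
# Illustration of tight coupling in distributed objects: the proxy
# hard-codes how and where the remote object can be found.
# All values below are hypothetical.

class OrderServiceProxy:
    # Location and protocol details embedded in the object itself.
    HOST = "appserver01.corp.example"
    PORT = 8094
    PROTOCOL = "iiop"  # e.g., the wire protocol used by CORBA

    def endpoint(self):
        # Every caller depends on these exact details staying true.
        return f"{self.PROTOCOL}://{self.HOST}:{self.PORT}/OrderService"

proxy = OrderServiceProxy()
print(proxy.endpoint())  # iiop://appserver01.corp.example:8094/OrderService
```

If the order service later moves or changes protocol, this proxy and everything built on it must be rebuilt, which is why such systems demanded the homogeneous, centrally controlled environments discussed next.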

This coupling required a homogeneous computing environment so the system could run at peak efficiency. While corporations were able to control their own computing environment (at least to some extent), as soon as these applications crossed a firewall to the Internet, the computing environment changed drastically from one Web site to the next.

Needs of These Systems

Today's Internet combines the old mainframe model and the current client/server model. Despite the availability of bandwidth, information is still locked up in centralized databases, with gatekeepers (Web servers) controlling access. Users rely on these servers to perform most operations, just like the mainframe timesharing model. The Web server, in turn, invokes objects on other servers to perform several tasks simultaneously. Still, even with this interaction between Web servers, application servers, and database servers, Web sites are isolated islands of information and functionality.

In this environment the user must adapt to the technology instead of vice versa. Corporate administrators and planners face additional challenges. While the introduction of server farms has made the overall computing experience more reliable, it has also made system management more complex. Performance measurement, capacity planning, and operations management are more challenging in today's multitier, multifunction Web sites.

What we need is a model for developing software that lets developers exploit both the power of the PC and the power of the global network, making truly distributed computing possible. Furthermore, this framework must separate data from the way the data is presented: it must support a true peer-to-peer network in which information flows freely between devices.



XML Programming Bible
ISBN: 0764538292
Year: 2002
Pages: 134
