Building Blocks

The .NET platform is the direct result of a major shift in computer application architecture that took place during the 1990s. To fully appreciate the significance of the new Microsoft platform, it's necessary to examine this architectural shift, as well as its causes and continuing impact on the computer community.

The Applications Architectural Shift

When Internet technology, notably the Web, moved into the computing mainstream in the mid-1990s, the model for business computing changed dramatically. This shift (Figure 1.1) was centered on the industry's notion of client/server computing, which until this time was very complex, costly, and proprietary.


Figure 1.1 Application architecture shifts since 1970

Computing on the Web

The Web model is characterized by loosely connected tiers of diverse collections of information and applications that reside on a broad mix of hardware platforms. Remember, the driving force behind the Internet since its inception has been the desire to provide a common information-delivery platform that is scalable, extensible, and highly available. This platform is flexible by design and not limited to one or two computing tiers. The only real limits to application development in the Internet world are computer capacity and the imagination of the application designer.

As the Web browser rapidly became ubiquitous and Web servers proliferated throughout companies, it was clear—despite the best efforts of client/server software producers to Web-enable their products—that a radically different way of thinking about the application model was needed. The developer community was the first to confront the "lowest common denominator" approach to business computing. Clearly, new techniques and tools were required to meet the technology shifts and challenges facing developers.

Technology Shifts and Developer Challenges

As the Internet revolution took hold and new technology appeared, developers faced several challenges that existing design models and tools couldn't address adequately. These challenges were centered on the following issues:

  • Heterogeneous environments
  • Scalability
  • Rapid application development and deployment
  • Platform administration and management
  • Network-aware application design

Heterogeneous Environments

One of the earliest, and perhaps biggest, challenges was the need to build applications that could readily fit into heterogeneous environments. In most large organizations there was a mix of terminals, rich clients, and thin (Web) clients. In addition to accommodating the client base, new applications had to interact with legacy data and applications hosted on mainframe and mid-range computers, often from different hardware vendors.

Scalability

Prior to the influx of Internet technologies, scalability was a relatively easy issue to manage. To begin with, the computing environment was essentially a closed system because there was a limited amount of remote access by staff, customers, or business partners. This meant that the size of the user base and their usage patterns for given applications and services were well known. Strategic planners had ample historical data on which to base their projections for scaling the computing environment to match consumer demand.

Next, the application development life cycle typically spanned several years. Once again, planners had ample time to plan for system and application scaling.

Finally, microcomputers hadn't yet realized their full potential—many still viewed them as something slightly smarter than a terminal—and their deployment throughout corporations was just starting to take off. As time passed, the expectation grew that the desktop would become part of any given application.

While the microcomputer was redefining how people worked, Internet technology, notably the Web, altered the corporate mindset. Initially, this new technology was viewed as an ideal low-cost method for sharing information throughout the organization. Not only was it inexpensive, but it also made it very easy for users to do their own development, and internal Web sites (intranets) quickly appeared on the computing landscape.

The foundation for scalability planning started to erode, and when companies opened their doors to the outside world, it crumbled completely. The new design paradigm said that systems had to be designed to accommodate anywhere from fewer than one hundred to more than one million users.

Rapid Application Development and Deployment

The intranet and Internet phenomenon highlighted the possibility of, and need for, rapid application deployment. The corporate intranet experience clearly demonstrated that business applications could be built quickly. An added bonus was the simplicity of URL-based deployment. The net result was that business managers and users began to question the entire traditional development platform and process. They were no longer prepared to wait several years before being able to use an application. From an investment perspective, the business community questioned any investment in applications that would be legacy systems by the time they were completed.

As businesses expanded their applications horizon from the intranet to the Internet, the notion of rapid application development was redefined even further. In order to be competitive, applications needed to be created virtually on demand for immediate use—just-in-time (JIT) development. To achieve this, the developers needed to completely revamp and revitalize their approach to applications development.

Platform Administration and Management

As with any aspect of computer technology, things aren't perfect in the Internet/Web world. The Information Technology (IT) professionals that embraced this new application model discovered that along with freedom and flexibility came a completely new set of administration and management issues. These issues revolved around clients, applications, and hosts.

The browser, coming as it did from the grass roots, left most organizations in the position of not having a browser standard. (Day-to-day support and upgrade issues themselves were often a logistical nightmare.) From a development perspective, the lack of standardization meant that application designers had to accommodate the core and extended HTML rendering capabilities of each browser version.

Application deployment was even more difficult to manage, because system administrators had to contend with large numbers of content publishers rather than a single developer group. The management of this aspect of Web-based computing became increasingly difficult as businesses bought into the idea of providing data-driven, dynamic content. The scope of the Web programming model was broadened by the need to include diverse data stores and accommodate several different scripting languages.

Any Webmaster from the initial days of Internet-based business applications can attest to the hours of painstaking, manual work required to keep even a medium-sized site operating properly and continuously—because another aspect of the Internet phenomenon was the users' expectation of 24-hour/7-day access. Support demand increased as servers were added to accommodate increased traffic demands on Web sites.

Unfortunately, the Web's designers and advocates neglected to include a set of tools for managing the platform; it was left to the IT community to come up with a solution.

Network-Aware Applications

The final challenge facing developers is a result of the advances made in portable computer technology and the decline in cost for portable computers, such as laptops, notebooks, and palmtops. Coupled with the global access made possible by the Internet, mobile computing has grown at a rate comparable to the Web. Recent figures indicate that laptop sales now exceed those of desktop computers.

Offline, or disconnected, use is no longer the exception. The user community expects to be able to use applications and services in both online and offline mode. The application developer must be able to provide this capability in an application.
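One common way to provide online/offline capability is to queue work locally while disconnected and replay it when the connection returns. The sketch below illustrates that pattern in Python; all names (Server, NetworkAwareClient) are invented for the example and are not part of any real API.

```python
# Hypothetical sketch of a network-aware client: operations are queued
# locally while offline and flushed to the server once connectivity
# returns. All class and method names here are illustrative.
from collections import deque

class Server:
    """Stand-in for a remote application server."""
    def __init__(self):
        self.received = []

    def submit(self, op):
        self.received.append(op)

class NetworkAwareClient:
    def __init__(self, server):
        self.server = server
        self.online = False
        self.pending = deque()  # operations captured while disconnected

    def do(self, op):
        if self.online:
            self.server.submit(op)
        else:
            self.pending.append(op)  # work continues offline

    def reconnect(self):
        self.online = True
        while self.pending:  # replay queued work in original order
            self.server.submit(self.pending.popleft())

server = Server()
client = NetworkAwareClient(server)
client.do("update record 1")  # captured locally: client is offline
client.do("update record 2")
client.reconnect()            # both updates now reach the server
print(server.received)        # ['update record 1', 'update record 2']
```

The same store-and-forward idea underlies message-queuing services such as MSMQ, discussed later in this chapter.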

An Overview of Distributed Web Applications

The .NET initiative addresses the challenges facing organizations by combining an architectural vision with a complete set of Microsoft technologies. These technologies can be used to develop, deploy, and support n-tier, distributed applications. Highly integrated but flexible, this product suite enables developers to build end-to-end business solutions: solutions that can leverage existing architectures and applications. Let's take a look at the philosophy behind, and the major elements of, this platform.

Philosophy and Benefits

The key tenet of distributed Web-based applications is the logical partitioning of an application into three tiers:

  • Presentation
  • Business logic
  • Data access and storage

By partitioning applications along these lines, using component-based programming techniques, and by fully utilizing the services provided by the Microsoft Windows operating system, developers can build highly scalable and flexible applications. Table 1.1 summarizes the major benefits that can be derived by adopting the distributed Web-based applications model.
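The three-tier partitioning above can be sketched in a few lines of code. This is a minimal illustration in Python (not a Windows/COM implementation); each tier is a separate class so it can be developed, deployed, and scaled independently, and every name here is invented for the example.

```python
# A minimal sketch of the three-tier partitioning: presentation,
# business logic, and data access/storage. All names are illustrative.

class DataTier:
    """Data access and storage: hides the store behind a simple API."""
    def __init__(self):
        self._orders = {}

    def save_order(self, order_id, amount):
        self._orders[order_id] = amount

    def load_order(self, order_id):
        return self._orders[order_id]

class BusinessTier:
    """Business logic: validation and rules; no UI or storage details."""
    def __init__(self, data):
        self.data = data

    def place_order(self, order_id, amount):
        if amount <= 0:
            raise ValueError("order amount must be positive")
        self.data.save_order(order_id, amount)
        return order_id

class PresentationTier:
    """Presentation: formats results for a client, rich or thin."""
    def __init__(self, logic):
        self.logic = logic

    def handle_request(self, order_id, amount):
        confirmed = self.logic.place_order(order_id, amount)
        return f"Order {confirmed} accepted"

app = PresentationTier(BusinessTier(DataTier()))
print(app.handle_request("A-100", 25.0))  # Order A-100 accepted
```

Because each tier talks to the next only through a narrow interface, any one of them can be replaced—say, swapping the in-memory store for a database—without touching the others.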

Table 1.1 The Benefits of Using the Windows Platform to Build Applications

  • Rapid development: Use the declarative programming techniques of the Component Object Model (COM) and snap-together components.
  • Scalability: Use Windows Component Services to manage thread, resource, distribution, and concurrency issues.
  • Easy deployment and management: Use Windows operating system services to contain or reduce the cost of deployment and tie it into a management schema.
  • Support for disconnected clients: Build rich clients that continue to work after a user is disconnected.
  • Ease of customization: Use standard end-user and programming tools to customize components.
  • Support for multiple data stores: Use data services to enable the application to access databases, the message system, and the file system.
  • Integration and interoperability: Use Windows services to access data and communicate with heterogeneous systems.

Platform Components

A simple application model consists of a client that communicates with the middle tier, which itself consists of the application server and an application containing the business logic. The application, in turn, communicates with a back-end database that is used to supply and store data.

Let's look at the elements of each tier in more detail, starting with the Presentation layer, which is supported by Presentation Services.

Presentation Services

The Presentation layer consists of either a rich or a thin client interface to an application. The rich client, which uses the Microsoft Win32 API, provides a full programming interface to the operating system's capabilities and uses components extensively. Although arguably not as robust as the rich client, nor able to offer the same performance levels, the thin client (Web browser) is rapidly becoming the interface of choice for many developers. A developer can take advantage of several simple yet robust scripting languages to build business logic that can be executed on any of the three application tiers. With full support for HTML and the DHTML and XML object models, the thin client provides a visually rich, flexible, and interactive user interface to applications. Thin clients also have the added advantage of greater portability across platforms.

Business Logic/Application Services

This layer is divided into application servers (Internet Information Services [IIS], Site Server, and SNA Server) and the services available to support clients. Web application logic, typically consisting of Active Server Pages (ASP) written in Microsoft® Visual Basic® Scripting Edition (VBScript), is processed in the IIS server space. Either ASP- or COM-based applications can be written to take advantage of Microsoft Transaction Server (MTS), Message Queuing (MSMQ), directory, and security services. Application services, in turn, can interact with several data services on the back end.
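The essence of the middle tier is that page logic runs in the server's space and only rendered HTML travels to the thin client. The sketch below shows that flow as a Python analogue of an ASP-style handler (the original would be VBScript in IIS); the function and data names are invented for illustration.

```python
# Illustrative Python analogue of server-side page logic: the request
# is processed on the server and only rendered HTML is returned to the
# thin client. Names (render_order_page, orders) are hypothetical.
def render_order_page(order_id, lookup):
    """Build an HTML page for one order.

    `lookup` is any callable mapping an order id to its total,
    standing in for a data-services call (e.g., via ADO in ASP).
    """
    amount = lookup(order_id)
    return (
        "<html><body>"
        f"<h1>Order {order_id}</h1>"
        f"<p>Total: {amount:.2f}</p>"
        "</body></html>"
    )

orders = {"A-100": 25.0}               # stand-in for the data tier
html = render_order_page("A-100", orders.get)
print(html)                            # the only thing the browser sees
```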

Data Access and Storage

The data services that support data access and storage consist of:

  • Microsoft ActiveX Data Objects (ADO), which provides simplified programmatic access to data by using either scripting or programming languages.
  • OLE DB, which is Microsoft's established universal data-access provider.
  • XML, which is a markup standard for specifying data structures.

XML is a recent standard put forward by the Internet community. Whereas HTML focuses on how information is rendered by the browser and displayed on the screen, XML focuses on describing the structure of the data itself.
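The distinction can be made concrete with a small example: the XML below carries structure and meaning (an order, its customer, a total with a currency), not presentation. The element and attribute names are invented for the example; parsing uses Python's standard library.

```python
# The same order as self-describing XML data, parsed with the standard
# library. Element/attribute names (order, customer, total) are
# invented for illustration.
import xml.etree.ElementTree as ET

xml_doc = """
<order id="A-100">
  <customer>Contoso</customer>
  <total currency="USD">25.00</total>
</order>
"""

root = ET.fromstring(xml_doc)
print(root.get("id"))                      # A-100
print(root.find("customer").text)          # Contoso
print(root.find("total").get("currency"))  # USD
```

Any tier—browser, middle-tier component, or back-end service—can consume this document and decide for itself how (or whether) to display it.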

System Services

Elements within each segment of our model are fully supported by the Windows operating system, which among its many services provides directory, security, management, and communications services that work across the three tiers. The programming tools that make up the Visual Studio version 6.0 development system enable developers to build application components across the tiers.

.NET Enterprise Servers

The Microsoft .NET Enterprise Servers tier extends the distributed applications vision. The basic philosophy and objectives—to provide a model and tools for building n-tier, distributed business solutions—have not changed. The diagram shown in Figure 1.2 illustrates the .NET Enterprise Servers tier, including Application Center, and shows how the .NET platform fits in the Microsoft platform.


Figure 1.2 Positioning Application Center in the Microsoft business platform

The most notable additions to the platform, and most important from our perspective, are the new server technologies in the Enterprise Servers layer. It was obvious that customers needed an economical and easy way to scale their Web server farms (also known as Web clusters) to accommodate increasing traffic. Plus, tools were needed for deploying and managing content and applications on these servers. The solution is Application Center, a product that:

  • Addresses issues related to scaling-out Web-based applications across multiple servers.
  • Accommodates deployment of content and applications across clusters.
  • Transparently load-balances and distributes work across a cluster.
  • Provides proactive monitoring of health and performance metrics.
  • Supports performance testing to enable scaling for next-generation applications.


Microsoft Application Center 2000 Resource Kit, 2001