Just as participants in the software value chain (including nonsoftware companies) maintain ongoing business relationships (see section 6.3), so do participants within the software creation industry. Monolithic software solutions are today the exception rather than the rule; in most cases, a total solution of value to users integrates content from a number of software companies.
Example A single desktop computer with a standard suite of office software might serve the needs of some users, and the software on such a platform might come from a single supplier like Apple Computer, Microsoft, or Sun Microsystems. Even in this simple case there will likely be contributions from other suppliers. For example, Microsoft Word XP includes modules and content acquired from other suppliers, like the equation editor, the document version comparer, parts of the spelling correction system, thesaurus, hyphenators, and dictionaries for many different languages, as well as some templates and fonts. Further, when the user browses the Web, the Web server may well be open source software (like Apache) or proprietary Web server software from another supplier (like IBM WebSphere or BEA WebLogic).
A pervasive issue is the coordination of software suppliers so that their solutions are either automatically composable or can at least be purposefully integrated. This creates a conspicuous need and opportunity for different software companies to coordinate through business or cooperative arrangements. As in other industries, coordination can take many forms, the extremes being proprietary bilateral business relationships on the one hand, and cooperative standards processes open to all interested parties on the other.
Some business relationships within the software industry take the traditional supplier-consumer form, although this does not evoke standard images of warehouses, shipping, and inventory. Since software can be freely replicated, a supplier need only provide a single copy to the customer together with the appropriate authorization, in the form of a licensing agreement (see chapter 8) spelling out the terms and conditions, for the customer to replicate the software in its products or to provision within its environment.
Where one software supplier is incorporating modules supplied by others, those modules must be integrated. This system integration step frequently requires modification of the purchased modules. The license permitting, changes to acquired modules may be made by the integrator (this requires source code). More often, the supplier makes these changes, and these repairs or refinements benefit all customers. Generally, the process and issues to be addressed are similar to end-user acquisition of software (see section 6.3.4). Differences do arise if the customer passes this software through to its own customers rather than provisioning it internally. Thus, the revenue stream expected from its customers, rather than its internal value proposition, becomes an element of price negotiation. Further, there is the issue of customer support: How do operators and users obtain support for modules acquired rather than developed by their immediate supplier? Is this a two-step process, or should the supplier directly support operators and users? Of course, customers generally prefer an integrated customer support solution. A common form of licensing agreement refers to the indirect suppliers as original equipment manufacturers (OEMs) and leaves all customer support with the immediate supplier of a product. Internally, that supplier will draw on technical support from the OEM.
Example Comparable OEM agreements exist between manufacturers of personal computers and the suppliers of preinstalled operating systems and applications. Customers receive integrated support from the computer manufacturer, who may in turn depend on the OEM when it cannot deal with an issue.
Recall that the API is an interface designed to support a broad class of extensions (see section 4.3.4). The open API allows one software supplier to extend or incorporate software from another supplier without establishing a formal business relationship. The owner of the open API exports services through an interface that is documented and where the software license allows for the unfettered use of this interface unconstrained by intellectual property restrictions and without payment. Technically, it is possible for a module from another supplier to invoke actions at this interface, which requires that it be accessible through mechanisms embodied in industry-standard infrastructure. One of the roles of infrastructure is to enable the composability of modules from different suppliers, and the API is one of the key constructs made possible.
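The extension mechanism an open API enables can be sketched in code. The following is a minimal, hypothetical illustration (the names SpellChecker, FrenchSpellChecker, and WordProcessor are invented for this sketch, not taken from any actual product): one supplier publishes a small, documented interface, and any other supplier can implement it without a formal business relationship.

```python
# Hypothetical sketch of an open API: the host product publishes an
# interface (SpellChecker) that third-party modules may implement.

class SpellChecker:
    """The open API: documented, and free for any supplier to implement."""
    def check(self, word: str) -> bool:
        raise NotImplementedError

class FrenchSpellChecker(SpellChecker):
    """A third-party module plugging into the API."""
    WORDS = {"bonjour", "merci"}
    def check(self, word: str) -> bool:
        return word.lower() in self.WORDS

class WordProcessor:
    """The host product: composes with any SpellChecker implementation."""
    def __init__(self) -> None:
        self.checkers: list[SpellChecker] = []
    def register(self, checker: SpellChecker) -> None:
        self.checkers.append(checker)
    def is_spelled_correctly(self, word: str) -> bool:
        return any(c.check(word) for c in self.checkers)

wp = WordProcessor()
wp.register(FrenchSpellChecker())
print(wp.is_spelled_correctly("bonjour"))  # True
```

The host never names the third-party class; it depends only on the published interface, which is precisely what allows modules from different suppliers to compose.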
Example An application may offer an API that allows other suppliers to add value to that application, for instance, in the common gateway interface to a Web server that allows other applications to display their content via a Web browser. This API is technically feasible because the operating system provides mechanisms for one program to interact with another program executing on the same host.
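The common gateway interface mentioned above illustrates the pattern concretely: the Web server runs an external program, passes the request in environment variables such as QUERY_STRING, and reads the response from the program's standard output. A minimal sketch (the HTML content is invented for illustration):

```python
#!/usr/bin/env python3
# Minimal sketch of a CGI program. The Web server launches it as a
# separate process, supplies the request via environment variables
# (here, QUERY_STRING), and relays its standard output to the browser.
import os

def respond(query: str) -> str:
    # A CGI response is an HTTP header block, a blank line, then the body.
    body = "<html><body><p>You asked for: %s</p></body></html>" % query
    return "Content-Type: text/html\r\n\r\n" + body

if __name__ == "__main__":
    print(respond(os.environ.get("QUERY_STRING", "")))
```

Because the interface is defined in terms of processes, environment variables, and standard streams, any language and any supplier can participate, which is what made CGI an effective extension point for Web servers.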
It should be emphasized that not all interfaces in a software product are APIs. Most interfaces are proprietary, designed for internal interaction of modules and neither documented nor made available through technical means to software from other suppliers. Other interfaces may be documented and technically available, but because they are designed for a specific and narrow purpose, they fail to qualify as an API. Further, suppliers reserve the right to change internal interfaces but implicitly or explicitly commit to extending but not changing an API (so as not to break other modules depending on it). Choosing to include an API in a software product is a serious business decision. Besides potential benefits, there are significant costs. Future versions of the product will either have to maintain compatibility, possibly requiring substantial engineering effort, or abandon existing clients of the API, risking dissatisfied customers and opening an opportunity for competitors.
If future extensions can be anticipated and are of broad interest, the supplier may prefer to create and market these extensions itself rather than cede this opportunity to other suppliers by building in an API. Infrastructure software derives its value from the applications it supports, and the API is the enabler. To the application software supplier, the API may be a vehicle by which other suppliers or even customers can customize that application to more specific (company or vertical industry) needs, increasing its value.
An alternative to the API is to offer contract development services to customize software. The supplier may maintain a services organization that contracts for customizations or extensions to meet specific customer needs.
The API is a software interface for software executing within a single host. A similar idea can be achieved over the network, where software from one supplier can interface with software from another supplier using the network. In this case, the API is replaced by a network protocol with similar business issues and characteristics.
An industry standard is a specification that is commonly agreed upon, precisely and completely defined, and well documented so that any supplier is free to implement and use it. Of course, it may or may not be widely adopted or uniformly implemented. In the software industry, the most common targets for standardization are architectural decomposition and the interfaces or network protocols defining the interactions of modules within that architecture. This type of standard seeks interoperability among modules, either within the same host or across the network (see section 4.5).
Example The USB (universal serial bus) port is a standard interface to personal computer peripherals that includes physical (plug geometry, functions of the different pins) and electrical (voltage and waveform) characteristics, as well as the formats of bit streams that allow messages to be passed between a CPU and a peripheral. Like most standards, it does not address the complementarity of function in the computer and the peripheral, such as printing a page or communicating over a telephone line.
Another common target for standardization is the representation used for specific types of information, so that information can be exchanged among different applications or within an application.
Example JPEG and MPEG are popular industry-standard compressed representations for pictures and audio/video, respectively. They allow one application to capture music in a file and another application to access that file and recreate the music. MP3 is a popular standard for sharing compressed music based on the audio portion of MPEG.
Programming languages are often standardized as well.
An open standard is available for anybody to implement, well documented, and unencumbered by intellectual property restrictions, so any supplier is free to implement the standard without making prior business arrangements.
Example Many open standards are created by independent standardization bodies in which a number of companies collaborate in finding a mutually satisfactory solution. The body that creates the open Internet standards (including IP, UDP, and TCP) is the Internet Engineering Task Force (IETF). The World Wide Web Consortium (W3C) defines open standards for the evolution of the Web.
"Openness" is not an absolute because some of these properties can be relaxed, making the standard less open but still not closed or proprietary.
Example Sometimes a standard encumbered by intellectual property rights may be considered open if a promise has been made to exercise those rights in a measured fashion. In the most common arrangement, in return for inclusion in the standard the owner promises that a license will be granted to any and all under terms that are reasonable (moderate in cost) and nondiscriminatory (the same terms for all). For instance, the MP3 audio compression standard is an ISO/IEC standard, but is still covered by patents held by Fraunhofer IIS-A and Thomson multimedia that require licensing and fee payment for any but private and small-scale use.
Other interfaces, protocols, or representations may carry more restrictions and still be labeled an industry standard, even if not considered an open standard.
Example Java is promulgated as a programming language, associated virtual machine for supporting portable execution, and an environment for portable and mobile code (see section 4.4). The specifications and associated tools were first developed by Sun Microsystems, which maintained licensing terms intended to prevent variants. Sun imbued Java with the characteristics of a standard (widely promulgated and used) while retaining control through intellectual property laws (see chapter 8). Among those provisions, any implementations that use the Java trademark must meet certain acceptance tests.
The industry standard helps coordinate suppliers of complementary products, but it is not the only such mechanism. The supplier-customer business relationship allows a supplier to buy rather than make some portion of its software product. The API enables a one-to-many relationship, where one software supplier deliberately creates an opportunity for all other suppliers to extend or exploit its product without the need for a formal business relationship. The industry standard excels at supporting a multilateral relationship among suppliers. The typical approach is to define and document an interface or a network protocol that can be exploited by many companies. In contrast to the API, where one supplier maintains a proprietary implementation of one side of an interface and allows other suppliers to define products on the other side of that interface, a standardized interface allows companies to define products that support the interface from either side.
From the customer and societal perspectives, open standards allow competition at the subsystem level: suppliers can create competitive substitutes for subsystems and know that the customer will have available the necessary complementary subsystems from other suppliers to forge a complete solution. Similarly, customers can mix and match subsystems from different suppliers if they feel this results in a better overall solution in dimensions such as price, features, performance, and quality. Modules can be replaced without replacing the entire system, reducing switching costs and lock-in. The disadvantage is that the customer must integrate subsystems from different vendors. In spite of standards, this additional integration takes time and effort, and sometimes introduces problems.
Example The PC offers open standards for serial and parallel connections between CPU and peripherals, including modems, printers, and display, so customers can mix and match PCs and peripherals from different manufacturers. Apple Computer pursued a more monolithic approach with the original Macintosh, which had the advantage that the system was operational out of the box. As the PC platform has matured, plug-and-play technology has made integration more seamless, and vendors like Dell accept customized system orders and perform the integration for the customer. Meanwhile, the Macintosh has moved toward an open standards approach (supporting open industry standards such as the universal serial bus). Today this distinction between the two platforms is minimal.
Network effects sometimes drive standardization (see section 3.2.3) in a multi-vendor solution. The incentive for standardization in this case is to avoid the proliferation of multiple networks, with the resulting fragmentation, reduced value to users, and loss of the benefits of positive feedback.
Example The peer-to-peer architecture for distributed applications creates a need for standards to support interoperability among peers (see section 4.5.3). Without such a standard, users could participate in the application only with users who have adopted that same vendor's solution. This is illustrated by instant messaging, where several firms offer services (AOL, Microsoft, and Yahoo, among others) that are currently incompatible, creating fragmented networks. The DVD illustrates the benefit of a standardized information representation that tames indirect network effects. Two industrial consortiums proposed incompatible standards for video playback but ultimately negotiated a single standard, driven by concern about the market-dampening effect of network effects and consumer confusion if two or more standards were marketed, and by pressure from content suppliers, who shared these concerns.
An industry standard is the outcome of a process, sometimes a long and messy one. Influences on the eventual standard may be user needs, market forces, the interests of or strategy pursued by individual suppliers, and occasionally government laws or regulations. The possibilities range from a de facto standard to a de jure standard. The de facto standard begins life as a proprietary interface or protocol, but through market forces becomes so commonly adopted by many companies that it is an industry standard in fact (Shapiro and Varian 1999a). In the case of interfaces, some de facto standards begin life as APIs. Undocumented proprietary interfaces are less likely to become de facto standards because they prohibit (using intellectual property laws) or discourage participation by other suppliers.
Example The Hayes command set started as an API chosen by a leading voiceband modem manufacturer and was initially offered by most suppliers of telecommunications software to control the Hayes modem. Since this API was widely supported by software, other modem manufacturers began to implement the same API, and it became a de facto standard. Later, Hayes attempted to force other modem manufacturers to pay royalties based on patented technology it had incorporated into the implementation. Another example is operating system APIs, which allow programs to send a packet over the network or save a file to disk. The primary purpose is encouraging application software suppliers to build on the operating system; a diversity of applications provides greater value to users. A side effect is to potentially enable an alternative infrastructure software supplier to independently implement and sell a direct substitute operating system, except to the extent that the API may be protected by intellectual property restrictions (see chapter 8). Such independent reimplementation is unlikely for an operating system, however, because of the large investment and unappealing prospect of head-to-head competition with an entrenched supplier with substantial economies of scale (see chapter 9).
In the case of both interfaces and protocols, de facto standards often begin as an experimental prototype from the research community.
Example The protocol and data format used to interact between client and server in the Web (HTTP and HTML) allows a Web browser and server to compose regardless of who supplies the client and server. It began as a way to share documents within a research community. Later, it was popularized by the innovative Mosaic Web browser from the University of Illinois, which provided a general graphical user interface. Today, there are more than a half-dozen suppliers of servers and browsers, and within the limits of the imprecise definitions of HTML, any server can interoperate with any browser using these standards (and their successors). Similarly, the socket is an operating system API that allows applications to communicate over the network. It has become a de facto standard resident in several operating systems, but it started as an API for the Berkeley UNIX operating system from the University of California.
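The socket API mentioned above is available today in essentially every operating system and programming environment. A minimal sketch of its use, via Python's standard socket module (the echo behavior is invented for illustration): a server and client on the same machine compose through the de facto standard interface, regardless of who implemented either side.

```python
# A trivial client/server pair using the Berkeley socket API, the
# de facto standard interface for network communication, as exposed
# by Python's standard "socket" module.
import socket
import threading

def server(sock: socket.socket) -> None:
    conn, _ = sock.accept()
    data = conn.recv(1024)
    conn.sendall(b"echo: " + data)   # echo the request back
    conn.close()

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

t = threading.Thread(target=server, args=(listener,))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
listener.close()
print(reply)  # b'echo: hello'
```

The same bind/listen/accept/connect/send/receive actions appear, with minor variations, in C, Java, and every other mainstream environment, which is exactly the sense in which the socket became a de facto standard.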
At the other end of the spectrum, the de jure standard is sanctioned by a legal or regulatory entity.
Example Regulatory forces are most likely to impose themselves when some public resource like the radio spectrum is required. In most countries there is a single legally sanctioned standard for radio and television broadcasting, as for wireless telephony. In the latter case the United States is an exception; the Federal Communications Commission specifically encouraged the promulgation of several standards. These standards deal with the representation and transmission of voice only across the wireless access link and admit conversions between them, so end-to-end voice conversations remain possible and direct network effects do not intervene. Another example is the Ada programming language, defined by the U.S. Department of Defense and imposed on its contractors until the late 1990s.
There are many cases intermediate to de facto and de jure, some of which are discussed later in conjunction with standards processes.
As applied to interfaces and network protocols, an essential first step in defining such standards is to locate interfaces or protocols that are candidates for standardization. This is an architectural design issue for purposes of standardization as well as implementation. There are several approaches to determining where there should be an interface or protocol to standardize. The first is to explicitly define the location of an interface as part of the standardization process. Such a decomposition is called a reference model, a partial software architecture covering the aspects of the architecture relevant to the standard. A reference model need not be a complete architecture; for example, modules within an implementation may be hierarchically decomposed from a single reference-model module, an implementation choice not directly affecting compliance with the standard.
Example CORBA is a standard for a middleware infrastructure supporting object-oriented distributed systems promulgated by the Object Management Group. One of its primary contributions is a reference model for a number of common services that support such distributed applications.
A second approach to defining the location of a standardized interface is creating an interface and putting it into the public domain as a standard or letting it grow into a de facto standard.
Example Desktop computer vendors (both IBM and Apple) defined a number of interfaces that grew into industry standards, including interfaces to monitor and peripherals, an API for application programs, and standards for the bus that supports expansion cards.
Third, the location of an open interface might be defined by market dynamics or be a side effect of the preexisting industrial organization. These types of interfaces typically follow the lines of core competencies, such as integrated circuit manufacture and infrastructure or application software.
Example When IBM designed its first PC, it made decisions on outside suppliers that predefined some interfaces within the design, and those interfaces later evolved into de facto standards. By choosing an Intel microprocessor rather than developing its own, IBM implicitly chose an instruction set for program execution. By deciding to license its operating system (MS-DOS) from Microsoft (which importantly targeted this instruction set) rather than develop its own, IBM adopted operating system APIs that were later used by alternative operating systems (for example, Linux uses a FAT32 file system adopted from DOS, and Novell marketed a version of DOS). These choices reduced the time to market but also created an opportunity for other suppliers, including the PC clone manufacturers (Compaq was an early example) and AMD (which manufactures microprocessor chips compatible with Intel's).
Standards also address a serious problem in software engineering. In principle, a new interface could be designed whenever any two modules need to compose. However, the number of different interfaces must be limited to contain the development and maintenance costs arising from a proliferation of interfaces. Besides this combinatorial problem, there is the open-world problem. The open-world assumption in systems allows new modules to be added that weren't known or in existence when the base system was created—this is the motivation for APIs. It is impractical (indeed impossible) to have a complete set of special-case or proprietary interfaces to connect a full range of modules that may arise over time. A practical alternative is to define a limited set of standardized interfaces permitting interoperability over a wide range of functionality and complementarity.
Example The CORBA standards standardize IIOP, a network protocol layered on top of TCP, which allows modules (in this case, the most limited case of objects) to interface with one another in a similar way whether they reside on the same host or different hosts. In effect, IIOP hides the details of the underlying network protocols (potentially multiple) and multiple platform implementations of those protocols behind a familiar interface. While individual applications would be free to develop a similar capability on a proprietary basis, an industry-standard solution reduces the number of implementations that are developed and maintained.
Interfaces, the functionality related to these interfaces, the preferred decomposition of systems, and the representations used for sharing information can all be standardized to enable interoperability. For needs that are well understood and can be anticipated by standardization bodies (such as industrial consortiums or governmental standardization institutions), standards can be forged in advance of needs and later implemented by multiple vendors. This process has unfortunately not worked well in the software industry because of the rapid advances made possible by software's inherent flexibility and rapid distribution mechanisms, with the result that new products are often exploring new technology and applications territory. Thus, this industry has relied heavily on de facto standardization.
Another approach has been to emphasize future extensibility in standards that are developed. This is a natural inclination for software, which emphasizes elaboration and specialization of what already exists (e.g., through layering; see section 7.1.3). For example, it is often feasible to elaborate an existing API rather than to define a new one. This can be accomplished by following the open-closed principle, which requires that interfaces be open to extension but closed to change. As long as existing actions are unchanged, the interface can be elaborated by adding new actions without affecting modules previously using the interface.
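The open-closed principle can be sketched in code. In this minimal, hypothetical illustration (the names KeyValueApiV1/V2 and the keys action are invented, not drawn from any actual product), version 2 of an API adds a new action but leaves every existing action untouched, so clients written against version 1 keep working:

```python
# Sketch of the open-closed principle applied to an API: open to
# extension (new actions may be added), closed to change (existing
# actions keep their signatures and behavior).

class KeyValueApiV1:
    """Version 1 of a hypothetical published API."""
    def __init__(self) -> None:
        self._store = {}
    def put(self, key, value):
        self._store[key] = value
    def get(self, key):
        return self._store[key]

class KeyValueApiV2(KeyValueApiV1):
    """Version 2: put and get are inherited unchanged; only a new
    action (keys) is added, for the benefit of new clients."""
    def keys(self):
        return sorted(self._store)

def legacy_client(api: KeyValueApiV1) -> str:
    # Written against version 1; runs unmodified against version 2.
    api.put("greeting", "hello")
    return api.get("greeting")

print(legacy_client(KeyValueApiV2()))  # hello
```

The legacy client neither knows nor cares that the API has gained a new action, which is exactly the compatibility property the principle is meant to preserve.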
Example Successive generations of an operating system try to maintain compatibility with existing applications by not changing the actions available in its API. The new version may be new or improved "under the hood," for example, improving its stability or performance without changing the API. The new version may add new capabilities (added actions) to benefit future applications without changing those aspects of the API used by old applications.
The Internet has increased the importance of standards because of the new dependence (and direct network effects) it creates across different platforms. In an attempt to satisfy this thirst for standards, but without introducing untoward delay and friction in the market, industry has experimented with more facile and rapid standardization processes.
One trend is standardization processes well integrated with a research or experimental endeavor, in which the standard is deliberately allowed to evolve and expand in scope over time based on continual feedback from research outcomes and real-world experience. In fact, this type of standardization activity shares important characteristics (like flexibility and user involvement) with agile software development processes (see section 4.2.5).
Example IETF has always recognized that its standards are a work in progress. The mechanism is to publish standards but to allow newer versions to make older ones obsolete. Most IETF standards arise directly from a research activity, and there is a requirement that any additions to the suite of standards be based on working experimental code. One approach used by the IETF and others is to rely initially on a single implementation that offers open-world extension hooks. Once it is better understood, a standard may be lifted off the initial implementation, enabling a wider variety of interoperable implementations.
In contrast to this continual refinement, a traditional top-down process is less chaotic and allows greater reflection on the overall structure and goals. It attempts to provide a lasting solution to the whole problem, all at once. A refinement process acknowledges that technologies are dynamic; whereas a reference architecture must be reasonably well defined to begin with, the details of functionality and interfaces can evolve over time.
Much depends on the maturity of an industry. For the time being, the de facto and continual refinement standardization processes are appropriate for many aspects of software because they allow innovation and evolution of solutions, reflecting market realities. When a stage of maturity is reached where functionality is better defined and stable, traditional top-down standardization processes can take over.
Layering is important because it allows standards to be built up incrementally rather than defined all at once (see section 7.1.3). The bottom layer (called wiring or plumbing standards) is concerned with simple connection-level standards. Functionality can then be extended one layer at a time, establishing ever more elaborate rules of interoperation and composability.
Example Early Internet research, and more recently the IETF, used layering. The bottom layer consisted of existing local-area networking technologies and displayed horizontal heterogeneity because there were numerous local-area and access networking technologies. The Internet standard added an IP layer interconnecting these existing technologies, and today it provides a spanning layer supporting a number of layering alternatives above. The IETF has systematically added layers above for various specific purposes. Sometimes lower layers need to be modified. For example, version four of the IP layer is widely deployed today, and the next version (version six) has been standardized. Because IP is widely used, any new version should satisfy two key constraints if at all possible. First, it should coexist with the older version, since it is impractical to upgrade the entire network at once. Second, it should support existing layer implementations above while offering new services or capabilities to new implementations of those layers or to newly defined layers.
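The incremental character of layered standards can be sketched with a toy encapsulation example (the layer names and header formats here are purely illustrative, not actual protocol formats): each layer prepends its own header to the payload handed down from above, so each layer's standard can be defined and extended independently.

```python
# Toy sketch of protocol layering: each layer encapsulates the payload
# of the layer above by prepending its own (illustrative) header.

def link_send(payload: bytes) -> bytes:
    return b"LINK|" + payload            # bottom "wiring" layer

def ip_send(payload: bytes) -> bytes:
    return link_send(b"IP|" + payload)   # spanning layer over the link

def tcp_send(payload: bytes) -> bytes:
    return ip_send(b"TCP|" + payload)    # transport layered on IP

frame = tcp_send(b"hello")
print(frame)  # b'LINK|IP|TCP|hello'
```

Because each layer sees only the layer directly beneath it, a new layer (or a new version of one layer) can be standardized without reopening the standards below, which is how the IETF has built up its protocol suite one layer at a time.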
Another trend is standardization processes that mimic the benefits of de facto standards but reduce or eliminate the time required for the marketplace to sort out a preferred solution. A popular approach is for a group of companies to form a consortium (often called a forum) that tries to arrive at good technical solutions by pooling expertise; the resulting solutions do not have the weight of a formal standard but rather serve as recommendations to the industry. Often such a consortium will request proposals from participants and then choose a preferred solution, combine the best features of different submissions, or ask that contributors work to combine their submissions. Member companies follow these standards voluntarily, but the existence of these recommendations allows the market to arrive at a de facto standard more quickly.
Example The Object Management Group, the developer of the CORBA standards, was formed to develop voluntary standards or best practices for infrastructure software supporting distributed object-oriented programs. It now has about 800 member organizations. W3C was formed by member companies at the Massachusetts Institute of Technology to develop voluntary standards for the Web; it now has more than 500 member organizations. ECMA was formed to reach de facto standards among European companies but has evolved into a standards body that offers a fast track to the International Organization for Standardization.
While standards have many benefits, they have disadvantages as well. In an industry that is changing rapidly with robust innovation, the existence of standards and the standardization process can impede technical progress. Sometimes standards come along too late to be useful.
Example The Open Systems Interconnect (OSI) model was a layered network protocol architecture providing capabilities similar to those of the Internet technologies. It was an outgrowth of a slow international standardization process and, because it attempted to develop a complete standard all at once, was expensive and slow to be implemented as well. By the time it arrived, the Internet had been widely deployed and was benefiting from positive feedback from network effects. OSI was never able to gain traction in the market.
Where standards are widely adopted, they can become a barrier to progress. This is an example of lock-in of the entire industry resulting from the difficulty and expense of widely deploying a new solution.
Example Version six of IP has been much slower to deploy than expected. While it will likely gain acceptance eventually, version four has been incrementally upgraded to provide many of the capabilities emphasized in version six, and the substantial trouble and expense of deploying version six is an obstacle.
Another disadvantage of standards is that they may inhibit suppliers from differentiating themselves in the market. A way to mitigate this, as well as to allow greater innovation and faster evolution of the technology, is to define flexible or extensible standards.
Example XML is a W3C standard for representing documents. Originally defined as a replacement for HTML in the Web, XML is gaining momentum as a basis for exchanging information of various types among departmental, enterprise, and commerce applications, and is one underpinning of Web services (see section 7.3.7). One advantage is that unlike HTML, it separates the document meaning from screen formatting, making it useful to exchange meaningful business documents whose content can be automatically extracted and displayed according to local formatting conventions. Another advantage is its extensibility, allowing new industry- or context-specific representations to be defined. XML and its associated tools support a variety of context-specific standards or proprietary representations.
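The separation of meaning from formatting can be made concrete with a small sketch. The purchase-order vocabulary below is hypothetical (invented for illustration, not an actual industry schema), but the point is general: because the XML markup describes content rather than presentation, a program can extract and process the business data directly, then format it according to local conventions.

```python
# Parsing a hypothetical XML business document with Python's standard
# library: the content (items, quantities, prices) is machine-readable,
# independent of any presentation or screen formatting.
import xml.etree.ElementTree as ET

doc = """<purchaseOrder>
  <item sku="A-100"><qty>3</qty><price currency="USD">9.50</price></item>
  <item sku="B-200"><qty>1</qty><price currency="USD">24.00</price></item>
</purchaseOrder>"""

root = ET.fromstring(doc)
total = sum(int(item.findtext("qty")) * float(item.findtext("price"))
            for item in root.iter("item"))
print(f"{len(root.findall('item'))} items, total {total:.2f} USD")
```

A receiving application in another company could consume the same document and render or process it entirely differently; the shared standard is the representation of meaning, not the presentation.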
Where reasonable to do so, it is appropriate to minimize or eliminate the role of standards altogether. Although standards are always necessary at some level, modern software technologies and programmability offer opportunities to reduce their role, especially within applications (as opposed to infrastructure).
Example The device driver shown in figure 7.8 is used in connecting a peripheral (like a printer) to a personal computer. The idea is to exploit the programmability of the computer by installing a program (the driver) that communicates with complementary embedded software in the printer. This allows the operating system to focus on defining standard high-level representations for printed documents (such as PostScript), while the device driver encapsulates low-level protocols for interoperation between the computer and the printer. Since the device driver is supplied by the printer manufacturer, it can do whatever it chooses (like differentiating one printer from another) without requiring an interoperability standard. Of course, the device driver and printer build on a standard for exchanging content-blind messages, such as the computer serial or parallel port.
Figure 7.8: The device driver can allow interoperability while moving standardization to a higher level of abstraction.
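The architectural idea can be sketched as follows. This is a minimal illustration, not a real driver API: the operating system standardizes only a high-level call that accepts a document in a standard representation, while each vendor's driver (the class names here are invented) is free to speak any proprietary low-level protocol to its own hardware:

```python
from abc import ABC, abstractmethod

class PrinterDriver(ABC):
    """Hypothetical OS-facing interface: the only thing the operating
    system standardizes is this high-level call, which accepts a
    document in a standard representation (here, a PostScript string)."""

    @abstractmethod
    def print_document(self, postscript: str) -> bytes:
        """Translate the document into the printer's wire protocol."""

class AcmeLaserDriver(PrinterDriver):
    # Supplied by one printer vendor; free to use any proprietary
    # low-level framing without an interoperability standard.
    def print_document(self, postscript: str) -> bytes:
        return b"ACME1\x02" + postscript.encode() + b"\x03"

class ZetaInkjetDriver(PrinterDriver):
    # A second vendor's driver uses an entirely different protocol.
    def print_document(self, postscript: str) -> bytes:
        return b"ZJ:" + postscript.encode()

def os_print(driver: PrinterDriver, document: str) -> bytes:
    # The operating system sees only the standard interface.
    return driver.print_document(document)

wire = os_print(AcmeLaserDriver(), "%!PS hello")
```

The interoperability standard has thus moved up a level: it governs the document representation and the driver interface, not the bytes on the cable.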
Mobile code can realize the same idea dynamically (see figure 7.9). Interoperability issues suggest the need for standardization when two modules on different hosts may originate with different suppliers. However, if both modules originate with the same supplier, they may be interoperable by construction with no need for standardization. Their interfaces can even be changed in new versions, as long as both modules are upgraded simultaneously.
Figure 7.9: Direct network effects can be eliminated by mobile code.
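"Interoperable by construction" can be made concrete with a sketch. Suppose a supplier's client and server are both built from one private message definition (the format below is invented for illustration). Because both endpoints derive from the same definition, they need no published standard, and the supplier can change the format in a new version as long as both sides ship together:

```python
import struct

# Hypothetical private wire format shared by one supplier's client and
# server code. Both endpoints are built from this single definition,
# so they interoperate by construction; the supplier may change _FMT
# in a new version provided both modules are upgraded simultaneously.
_FMT = "!HI"  # version (16-bit) and payload length (32-bit), then payload

def encode(version: int, payload: bytes) -> bytes:
    """Used by the sending module."""
    return struct.pack(_FMT, version, len(payload)) + payload

def decode(message: bytes) -> tuple[int, bytes]:
    """Used by the receiving module, built from the same _FMT."""
    header = struct.calcsize(_FMT)
    version, length = struct.unpack(_FMT, message[:header])
    return version, message[header:header + length]

version, payload = decode(encode(2, b"hello"))
```

With mobile code, the supplier can go further and ship the decoding module itself to the other host, so even simultaneous upgrade is unnecessary.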
Example RealNetworks supplies a streaming audio-video server (RealServer) and a complementary client for the desktop (RealPlayer). Over time, RealNetworks has been relatively free to upgrade its RealServer capabilities, even at the expense of compatibility with the RealPlayer, because it is reasonable to expect users to upgrade the client to the latest available version over the network. (Users with no Internet connection would not be candidates for streaming audio-video in the first place.)
Downloaded software or mobile code is a particularly powerful way to bypass direct network effects, as evident in a peer-to-peer architecture.
Example Successful peer-to-peer applications like Napster (music file sharing) and Groove (file sharing and collaborative tools) have benefited from the ability to download the peer software from a central server. To join the network, a new user can easily download and install the necessary software (or with mobile code it can even be made transparent). Were it necessary to purchase equipment or software in a store, these sorts of applications would find it dramatically more difficult to reach critical mass and benefit from positive feedback.
Another approach to mitigating some problems with standardization is to standardize languages that can describe application elements, such as the interaction between modules, the functionality of modules, or the representation of information elements. We call this a meta-standard because it standardizes a way of describing something rather than standardizing that something directly. This can substantially increase the ability of suppliers to innovate and differentiate.
Example For communicating images from one host to another, a representation that includes a way of digitizing the image and compressing it must be shared by the transmitter and receiver. To avoid standardization, a meta-standard might take the form of a language capable of describing a large collection of decompression algorithms. A typical description of a compression algorithm would be something like "use an n by n discrete cosine transform with n = 8 followed by a quantization algorithm of the following form…." Constrained only by the linguistic expressiveness of the meta-standard language, a transmitter is free to choose any compression algorithm and convey it (this is a form of mobile code) along with the image representation to the receiver. An early example is self-extracting archives, which are compressed collections of computer files arriving as an executable bundle that, upon execution, unpacks itself, yielding the collection of files.
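A toy version of such a meta-standard can be sketched in a few lines. Here the "language" is just an ordered list of named decoding steps (the vocabulary and message layout are invented for illustration); the transmitter freely picks any encoding it can describe, and the receiver interprets the description rather than hard-coding a single standardized algorithm:

```python
import base64
import zlib

# Vocabulary of the toy description language: each name denotes one
# decoding step the receiver knows how to interpret.
DECODERS = {
    "base64": base64.b64decode,
    "zlib": zlib.decompress,
}

def transmit(data: bytes) -> dict:
    # The transmitter chooses any encoding it likes and conveys a
    # description of the decoding pipeline along with the payload.
    return {
        "pipeline": ["base64", "zlib"],  # decode steps, in order
        "payload": base64.b64encode(zlib.compress(data)).decode(),
    }

def receive(message: dict) -> bytes:
    # The receiver interprets the description, applying each named
    # step in turn; only the description language is standardized.
    data = message["payload"].encode()
    for step in message["pipeline"]:
        data = DECODERS[step](data)
    return data

round_trip = receive(transmit(b"an image bitmap"))
```

A self-extracting archive takes this one step further: instead of a declarative description, the decoding program itself travels with the data.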
Rudimentary forms of meta-standards already exist.
Example The interface definition language (IDL) of CORBA allows modules to disclose their capabilities by describing the actions that are available. XML for documents provides a language for describing, in effect, new markup languages.
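The flavor of an IDL-style meta-standard can be sketched as follows. This is not CORBA IDL; it is a hypothetical stand-in in which an interface description is plain data (operation names and their parameters), and a generic checker verifies that a module offers every described action. What is standardized is the way capabilities are described, not the capabilities themselves:

```python
import inspect

# Hypothetical IDL-like description: operation name -> parameter names.
THERMOSTAT_IDL = {
    "get_temperature": [],
    "set_target": ["celsius"],
}

class Thermostat:
    # One supplier's module implementing the described actions.
    def get_temperature(self):
        return 21.0

    def set_target(self, celsius):
        self.target = celsius

def conforms(module, idl: dict) -> bool:
    """Check that `module` offers every described action with the
    declared parameters -- a generic check driven entirely by the
    description, not by knowledge of any particular interface."""
    for op, params in idl.items():
        fn = getattr(module, op, None)
        if fn is None:
            return False
        if list(inspect.signature(fn).parameters) != params:
            return False
    return True

ok = conforms(Thermostat(), THERMOSTAT_IDL)
```

Any number of differentiated implementations can be checked against the same description, and new descriptions can be written without touching the checker.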