A quick orientation to how emerging technologies are affecting industries and lifestyles highlights the importance of understanding the principles of telecommunications and should, hopefully, inspire you to "think telecom." The changes discussed here are ultimately very important to how telecommunications networks will evolve and to where the growth areas will be.
An enormous amount of the activity driving telecommunications has to do with the emergence of advanced applications; likewise, advances in telecommunications capabilities spur developments in computing platforms and capabilities. The two are intimately and forever intertwined. The following sections discuss some of the changes that are occurring in both telecommunications and in computing platforms and applications, as well as some of the changes expected in the next several years.
Telecommunications has allowed a virtual world to emerge, one in which time and distance no longer represent a barrier to doing business or communicating, but we're still lacking something that is a critical part of the human information-processing realm. The human mind acts on physical sensations in the course of its information processing; the senses of sight, sound, touch, and motion are key to our perception and decision making. Developments in sensory technologies and networks will allow a new genre of sensory reality to emerge, bridging the gap between humans and machines. One of the most significant evolutions occurring in computing and communications is the introduction of the human senses into electronic information streams. The following are a few of the key developments in support of this more intuitive, collaborative human-machine environment:
Computers are now capable of hearing and speaking, as demonstrated by Tellme, a popular U.S. voice-activated service that responds to defined voice prompts and provides free stock quotes, weather information, and entertainment guides to 35,000 U.S. cities.
The capability to produce three-dimensional sound through digital mastery, a technology called "virtual miking," is being developed at the University of Southern California's Integrated Media Systems Center.
Virtual touch, or haptics, enables a user to reach in and physically interact with simulated computer content, such as feeling the weight of the Hope diamond in your hand or feeling the fur of a lion. Two companies producing technology in this area are SensAble Technologies and Immersion Corporation. They are producing state-of-the-art force feedback, whole-hand sensing, and real-time 3D interaction technologies, and these hardware and software products have a wide range of applications for the manufacturing and consumer markets, including virtual-reality job training, computer-aided design, remote handling of hazardous materials, and "touch" museums.
The seduction of smell is also beginning to find its way into computers, allowing marketers to capitalize on the many subtle psychological states that smell can induce. Studies show that aromas can be used to trigger fear, excitement, and many other emotions. Smell can be used to attract visitors to Web sites, to make them linger longer and buy more, to help them assimilate and retain information, or to instill the most satisfying or terrifying of emotional states (now that's an interactive game!). Three companies providing this technology today are Aromajet, DigiScents, and TriSenx. Aromajet, for example, creates products that address video games, entertainment, medical, market research, personal and home products, and marketing and point of sales applications.
The visual information stream provides the most rapid infusion of information, and a large portion of the human brain is devoted to processing visual information. To help humans process visual information, computers today can see; equipped with video cameras, computers can capture and send images, and can display high-quality entertainment programming. The visual stream is incredibly demanding in terms of network performance; thus, networks today are rapidly preparing to enable this most meaningful of information streams to be easily distributed.
How we engage in computing and communications will change dramatically in the next decade. Portable computing devices have changed our notion of what and where a workplace is and emphasized our desire for mobility and wireless communication; they are beginning to redefine the phrase dressed for success. But the portable devices we know today are just a stepping stone on the way to wearables. Context-aware wearable computing will be the ultimate in light, ergonomic, reliable, flexible, and scalable platforms. Products that are available for use in industrial environments today will soon lead to inexpensive, easy-to-use wearables appearing at your neighborhood electronics store:
Xybernaut's Mobile Assistant IV (MA-IV), a wearable computer, provides its wearer with a full-fledged PC that has a 233MHz Pentium chip, 32MB memory, and upward of 3GB storage. A wrist keyboard sports 60 keys. Headgear suspended in front of the eye provides a full-color VGA screen, the size of a postage stamp but so close to the eye that images appear as on a 15-inch monitor. A miniature video camera fits snugly in a shirt pocket. Bell Canada workers use MA-IVs in the field; the devices eliminate the need to carry manuals and allow workers to send back images and video to confer with supervisors. The MA-IV is rather bulky, weighing in at 4.4 pounds (2 kilograms), but the soon-to-be-released MA-V will be the first mass-market version, and it promises to be lightweight.
MIThril is the next-generation wearables research platform currently in development at MIT's Media Lab. It is a functional, operational body-worn computing architecture for context-aware human-computer interaction research and general-purpose wearable computing applications. The MIThril architecture combines a multiprotocol body bus and body network, integrating a range of sensors, interfaces, and computing cores. It is designed to be integrated into everyday clothing, and it is both ergonomic and flexible. It combines small, lightweight RISC processors (including the StrongARM), a single-cable power/data "body bus," and high-bandwidth wireless networking in a package that is nearly as light, comfortable, and unobtrusive as ordinary street clothing.
Bandwidth

A term that you hear often when discussing telecommunications is bandwidth. Bandwidth is a critical commodity. Historically, bandwidth has been very expensive because it was based on the sharing of limited physical resources, such as twisted-pair copper cables and coax. Today, bandwidth is largely used to refer to the capacity of a network or a telecom link, and it is generally measured in bits per second (bps). Strictly speaking, bandwidth refers to the range of frequencies involved, that is, the difference between the lowest and highest frequencies supported. The greater the range of frequencies, the greater the bandwidth, and hence the greater the number of bits per second, or information, carried. The analogy of a hose is often used to describe bandwidth: the wider the hose, the more water it can carry per second; likewise, the greater the bandwidth, the more bits a link can carry per second.
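The sidebar's claim that a greater range of frequencies means a greater number of bits per second is made precise by Shannon's channel-capacity formula, a standard result not covered in this excerpt; the following is a minimal sketch, with the 4KHz voice channel and 30dB signal-to-noise ratio chosen as illustrative assumptions:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum error-free bit rate for a channel of the given
    frequency range (bandwidth) and signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A traditional 4KHz-wide voice channel with a 30dB SNR (SNR = 1,000):
voice = shannon_capacity_bps(4_000, 1_000)
print(f"{voice:,.0f} bps")  # roughly 40 kbps

# Doubling the frequency range doubles the achievable bit rate:
assert shannon_capacity_bps(8_000, 1_000) == 2 * voice
```

Note that doubling the frequency range doubles the achievable bit rate, which is exactly the hose-width intuition described above.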
To be truly useful, wearables will need to be aware of where you are and what you're doing. Armed with this information, they will be able to respond accordingly. (Location-based services are discussed in Chapter 14, "Wireless Communications.")
As we distribute intelligence across a wider range of devices, we are experiencing pervasive computing, also called ubiquitous computing. We are taking computers out of stand-alone boxes to which we are tied and putting them into ordinary things, in everyday objects around us. These new things, because they are smart, have a sense of self-awareness and are able to take care of themselves. When we embed intelligence into a device, we create an interesting new opportunity for business. That device has to have a reason for being, and it has to have a reason to continue evolving so that you will spend more money and time on it. To address this challenge, device manufacturers are beginning to bundle content and applications with their products. The result is smart refrigerators, smart washing machines, smart ovens, smart cabinets, smart furniture, smart beds, smart televisions, smart toothbrushes, and an endless list of other smart devices. (These smart devices are discussed in detail in Chapter 15, "The Broadband Home and HANs.")
Devices are becoming smaller and more powerful all the time, and they're getting physically closer to our bodies, as well. The growing amount of intelligence distributed throughout the network is causing changes in user profiles.
We are moving away from human-to-human communications to an era of machine-to-machine communications. Today, there are just over 6 billion human beings on the planet, yet the number of microprocessors is reported to be more than 15 billion. Devices have become increasingly intelligent, and one characteristic of an intelligent system is that it can communicate. As the universe of communications-enabled devices grows, so does the traffic volume between them. As these smart things begin to take on many of the tasks and communications that humans traditionally exchanged, they will change the very fabric of our society. For example, your smart washing machine will initiate a call to the service center to report a problem and schedule resolution with the help of an intelligent Web agent long before you even realize that something is wrong! These developments are predicted to result in the majority of traffic (up to 95% of it) being exchanged between machines, with traditional human-to-human communications representing only 5% of network traffic by 2010.
Sharing of information can occur in a number of ways: via smoke signals, by letters sent through the postal service, or as transmissions through electrical or optical media, for example. Before we get into the technical details of the technologies in the industry, it's important to understand the driving forces behind computing and communications. You need to understand the impact these forces have on network traffic and therefore on network infrastructure. In today's environment, telecommunications embodies four main traffic types, each of which has different requirements in terms of network capacity, tolerance for delays (particularly variations in delay), and tolerance for potential congestion and therefore losses in the network:
Voice: Voice traffic has been strong in the developed world for years, and more subscriber lines are being deployed all the time. However, some three billion people in the world have never used even a basic telephone, so there is still a huge market to be served. Voice communications are typically referred to as narrowband, meaning that they don't require a large amount of network capacity. For voice services to be intelligible and easy to use, however, delays must be kept to a minimum, so the delay factors in moving information from Point A to Point B have to be tightly controlled in order to support real-time voice streams. (Concepts such as delay, latency, and error control are discussed in Chapter 6, "Data Communications Basics.")
Data: Data communications refers to the exchange of digitized information between two machines. Depending on the application supported, the bandwidth or capacity requirements can range from medium to high. As more objects that are visual in nature (such as images and video) are included with the data, that capacity demand increases. Depending again on the type of application, data may be more or less tolerant of delays: text-based exchanges are generally quite tolerant, but the more real-time the information type (as with video), the tighter the control needed over latency. Data traffic is growing much faster than voice traffic; it has grown at an average rate of about 30% to 40% per year for the past decade. To accommodate data communication, network services have been developed to address the need for greater capacity, cleaner transmission facilities, and smarter network management tools. Data encompasses many different information types. In the past, we saw these types as separate entities (for example, the video and voice in a videoconference), but in the future, we must be careful not to separate things this way because, after all, in the digital age, all data is represented as ones and zeros.
Image: Image communications requires medium to high bandwidth; the greater the resolution required, the greater the bandwidth required. For example, many of the images taken in medical diagnostics require very high resolution. Image traffic tolerates some delay because it includes no motion artifacts that would be affected by distortions in the network.
Video: Video communications, which are becoming increasingly popular and require ever-greater bandwidth, are extremely sensitive to delay. The future is about visual communications. We need to figure out how to make video available over a network infrastructure that can support it, and at a price point consumers are willing to pay. When our infrastructures can support the capacities and the delay limitations required by real-time applications, video will grow by leaps and bounds.
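The capacity arithmetic behind these traffic types can be made concrete with a small sketch. The image and video dimensions below are illustrative assumptions, not figures from the text:

```python
def transfer_seconds(size_bits: float, link_bps: float) -> float:
    """Time to move a file across a link, ignoring protocol overhead."""
    return size_bits / link_bps

# A hypothetical 2048x2048 medical image at 16 bits per pixel:
image_bits = 2048 * 2048 * 16          # = 67,108,864 bits (~67 Mb)

# Hypothetical uncompressed 640x480 video, 24-bit color, 30 frames per second:
video_bps = 640 * 480 * 24 * 30        # = 221,184,000 bps (~221 Mbps)

print(transfer_seconds(image_bits, 1_500_000))    # ~44.7 s over a 1.5Mbps link
print(transfer_seconds(image_bits, 100_000_000))  # ~0.67 s over a 100Mbps link
```

The image simply takes longer on a slower link, which is why image traffic tolerates delay, whereas the video stream needs its full rate sustained continuously in real-time, matching the distinction drawn above.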
All this new voice, data, and video traffic means that there is growth in backbone traffic levels as well. This is discussed further later in the chapter, in the section "Increasing Backbone Bandwidth."
The telecommunications revolution has spawned great growth in the amount and types of traffic, and we'll see even more types of traffic as we begin to incorporate human senses as part of the network. The coming chapters talk in detail about what a network needs in order to handle the various traffic types.
The new traffic patterns imply that the network will also be host to a new set of applications: not just simple voice or text-based data applications, but new genres that combine the various media types.
The ability to handle digital entertainment applications in a network is crucial. In some parts of the world, such as Asia, education may be the primary focus, and that should tell us where we can expect greater success going forward. But throughout much of the world, entertainment is where people are willing to spend the limited dollars they have for electronic goods and services. The digital entertainment realm will include video editing, digital content creation, digital imaging, 3D gaming, and virtual reality applications, and all of these will drive the evolution of the network. It's the classic chicken-and-egg problem: which comes first, the network or the applications? Why would you want a fiber-optic broadband connection if there's nothing good to deliver over that connection? Why would you want to create a 3D virtual reality application when there's no way to distribute it? The bottom line is that the applications and the infrastructures have to evolve hand-in-hand to realize the benefits, and the dollars, we associate with their future.
Another increasingly important class of applications is streaming media. Great focus is placed on the real-time delivery of information, as in entertainment, education, training, customer presentations, IPO road shows, and telemedicine consultations. (Streaming media is discussed in detail in Chapter 11, "Next-Generation Network Services.")
E-commerce (electronic commerce) and m-commerce (mobile commerce) introduce several new requirements for content management, transaction platforms, and privacy and security tools, so they affect the types of information that have to be encoded into the basic data stream and how the network deals with knowledge of what's contained within those packets. (Security is discussed in detail in Chapter 11.)
Many of the changes discussed so far, but primarily the changes in traffic patterns and applications, will require immense amounts of backbone bandwidth. Table 1.1 lists a number of the requirements that emerging applications are likely to make on backbone bandwidth.
Table 1.1. Backbone Bandwidth Requirements for Advanced Applications

Application | Bandwidth Needed | Examples
Online virtual reality | 1,000 to 70,000 terabits per second | Life-size 3D holography; telepresence
Machine communications | 50,000 to 200,000 terabits per second | Smart things; Web agents; robots
Meta-computing | 50,000 to 200,000 terabits per second | Weather prediction; warfare modeling
In addition, advances in broadband access technologies will drive a demand for additional capacity in network backbones. Once 100Gbps broadband residential access becomes available (and there are developments on the horizon), the core networks will require capacities measured in exabits per second (that is, 1 billion Gbps). These backbone bandwidth demands make the revolutionary forces of optical networking critical to our future. (Optical networking is discussed in detail in Chapter 12, "Optical Networking.")
New developments always bring with them politics. Different groups vie for money, power, the ability to bring new products to market first and alone, and the right to squash others' new ideas. A prominent characteristic of the telecommunications sector is the extent to which it is influenced by government policy and regulation. The forces these exert on the sector are inextricably tied to technological and market forces.
Metric Prefixes and Equivalents

The following table defines commonly used metric prefixes:

Prefix | Symbol | Multiplier
kilo | K | 10^3
mega | M | 10^6
giga | G | 10^9
tera | T | 10^12
peta | P | 10^15
exa | E | 10^18

For example, 10Gbps = 10,000,000,000bps, and 4KHz = 4,000Hz (that is, 4,000 cycles per second). The following shows the relationships of commonly used units of measure to one another:

1Kbps = 1,000bps
1Mbps = 1,000Kbps
1Gbps = 1,000Mbps
1Tbps = 1,000Gbps
1Pbps = 1,000Tbps
1Ebps = 1,000Pbps
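As a quick sanity check, the prefix relationships above can be expressed in a few lines of code; the decimal, powers-of-ten convention shown here matches the sidebar's usage (K = 1,000, not 1,024):

```python
# Powers of ten for the metric prefixes used with bps and Hz:
PREFIX = {"K": 1e3, "M": 1e6, "G": 1e9, "T": 1e12, "P": 1e15, "E": 1e18}

def to_base_units(value: float, prefix: str) -> float:
    """Convert e.g. (10, 'G') -> 10Gbps expressed in plain bps."""
    return value * PREFIX[prefix]

assert to_base_units(10, "G") == 10_000_000_000        # 10Gbps in bps
assert to_base_units(4, "K") == 4_000                  # 4KHz in Hz
assert to_base_units(1, "E") == to_base_units(1_000_000_000, "G")  # 1Ebps = 1 billion Gbps
```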
Because of the pervasive nature of information and communication technologies and the services that derive from them, coupled with the large prizes to be won, the telecommunications sector is subjected to a lot of attention from policymakers. Particularly over the past 20 years or so, telecommunications policy and regulation have been prominent on the agendas of governments around the world. This reflects the global trend toward liberalization, including, in many countries, privatization of the former monopoly telcos. However, interest from policymakers in telecommunications goes much deeper than this. A great deal of this interest stems from the extended reach and wide impact that information and communication technologies have. Here are some examples:
Telephony, e-mail, and information services permit contact between friends and families and offer convenience to people in running their day-to-day lives. Thus, they have major economic and social implications.
In the business arena, information and communication technologies offer business efficiency and enable the creation of new business activities. Thus, they have major employment and economic implications.
Multimedia and the Internet offer new audio, video, and data services that affect entertainment and education, among other areas. These new services are overlapping with traditional radio and television broadcasting, and major cultural implications are appearing.
News delivery influences people's perceptions of governments and their own well-being, thereby influencing voter attitudes. Telecommunications also brings attention to cultural trends. Therefore, telecommunications has major political as well as cultural implications.
Government applications of information and communication technologies affect the efficiency of government. Defense, national security, and crime-fighting applications are bringing with them major political implications.
Given this background of the pervasive impact that information and communication technologies have, it is hardly surprising that they get heavy policy attention.
Although many national regulatory authorities today are separate from central government, they are, nevertheless, built on foundations of government policy. Indeed, the very act of creating an independent regulatory body is a key policy decision. Historically, before telecommunications privatization and liberalization came to the fore, regulation was often carried out within central government, which also controlled the state-run telcos. That has changed in recent years in many, but not all, countries.
Given their policy foundation, and the fact that government policies vary from country to country and from time to time, it is not surprising that regulatory environments evolve and differ from country to country. These evolutions and international variations sometimes pose planning problems for the industry, and these problems can lead to frustrations and tensions between companies and regulatory agencies. They can also lead to disagreements between countries (for example, over trade issues). Although moves to encourage international harmonization of regulatory regimes (for example, by the International Telecommunication Union [ITU] and by the European Commission) have been partially successful, differences remain in the ways in which countries interpret laws and recommendations. Moreover, given that regulations need to reflect changing market conditions and changing technological capabilities, it is inevitable that over time regulatory environments will change, too. So regulation is best viewed as another of the variables, such as technological change, that the telecommunications industry needs to take into account.
At the global level, there are a number of international bodies that govern or make recommendations about telecommunications policy and regulation. In addition to the ITU and the European Commission, there are various standards bodies (for example, Institute of Electrical and Electronics Engineers [IEEE], European Telecommunications Standards Institute [ETSI], American National Standards Institute [ANSI], the Telecommunication Technology Committee [TTC]) and industry associations (for example, the European Competitive Telecommunications Association [ECTA], the Telecommunications Industry Association [TIA]). Representatives of national governments and regulatory authorities meet formally (for example, ITU World Radio Conferences, where many countries are represented) and informally (for example, Europe's National Regulatory Authorities [NRAs] exchange views at Independent Regulators Group [IRG] meetings). Other organizations, such as the World Trade Organization (WTO) and regional bodies, also influence telecommunications policy and regulation at the international level.
At the national level, several parts of central government are generally involved, and there can sometimes be more than one regulatory body for a nation. Some of these organizations are major players; others play less prominent, but nevertheless influential, roles. In the United States, for example, the Federal Communications Commission (FCC) is the national regulatory body, and public utility commissions regulate at the state level. The U.S. State Department coordinates policy regarding international bodies such as the ITU. The White House, the Department of Commerce (largely through the National Telecommunications and Information Administration [NTIA]), the Justice Department, the U.S. Trade Representative, and the Department of Defense are among the various parts of the administration that set or contribute to telecommunications policy. The U.S. Congress, as the legislative branch, also plays an important role. In addition, industry associations, policy "think tanks," regulatory affairs departments within companies, telecommunications lawyers, and lobbyists all contribute to policy debates and influence the shape of the regulatory environment.
Other countries organize their policy and regulatory activities differently from the United States. For example, in the United Kingdom, the Office of Telecommunications (OFTEL) mainly regulates what in the United States would be known as "common carrier" matters, whereas the Radiocommunications Agency (RA) deals with radio and spectrum matters. However, at the time of writing, it has been proposed that OFTEL and RA be combined into a new Office of Communications (OFCOM). In Hong Kong, telecommunications regulation was previously dealt with by the post office, but now the Office of the Telecommunications Authority (OFTA) is the regulatory body. So, not only do regulatory environments change, but so, too, do the regulatory players.
Let's look briefly at what regulators do. Again, this varies somewhat from country to country and over time. In the early years of liberalization, much time would typically be spent in licensing new entrants and in putting in place regulations designed to keep a former monopoly telco from abusing its position by, for example, stifling its new competitors or by charging inappropriately high prices to its customers. Here the regulator is acting as a proxy for market forces. As effective competition takes root, the role of the regulator changes somewhat. Much of the work then typically involves ensuring that all licensed operators or service providers meet their license obligations and taking steps to encourage the development of the market such that consumers benefit.
The focus of most regulatory bodies is, or should be, primarily on looking after the interests of the various end users of telecommunications. However, most regulators would recognize that this can be achieved only if there is a healthy and vibrant industry to deliver the products and services. So while there are often natural tensions between a regulator and the companies being regulated, it is at the same time important for cooperation between the regulator and the industry to take place. In Ireland, for example, the role of the regulator is encapsulated by the following mission statement: "The purpose of the Office of the Director of Telecommunications Regulation is to regulate with integrity, impartiality, and expertise to facilitate rapid development of a competitive leading-edge telecommunications sector that provides the best in price, choice, and quality to the end user, attracts business investment, and supports ongoing social and economic growth."
Flowing from regulators' high-level objectives are a range of activities such as licensing, price control, service-level agreements, interconnection, radio spectrum management, and access to infrastructure. Often, regulatory bodies consult formally with the industry, consumers, and other interested parties on major issues before introducing regulatory changes. A more detailed appreciation of what telecommunications regulators do and what their priorities are can be obtained by looking at the various reports, consultation papers, and speeches at regulatory bodies' Web sites.