Multimedia Networking Requirements

Today's networks simply were not built for multimedia and, in particular, for applications that involve video communications, multimedia collaboration, and/or interactive rich media. Curiously, it is through the use of sophisticated computer applications and devices that we have been able to determine what the human information-processing model comprises: There is a very strong tendency for us to rely on the visual information stream for rapid absorption and longer retention. More than 50% of a human's brain cells are devoted to processing visual information, and combined with the delights of sound, smell, and touch (despite our enormous dependence on the written word), we're very active in processing the cues from the physical world. By changing the cues, we can change the world. Digital rich media, in every conceivable sort of format (including audio, animation, graphics, full-motion video, applications, whiteboards, and communities), will increasingly depend on multimedia. Video and multimedia applications require substantial bandwidth, as well as minimal latencies and losses. The following table is a snapshot of the per-user bandwidth requirements for various services.

Application                                                         Bandwidth Requirement
E-mail and Web (not optimum support)                                56Kbps
Web as an always-on utility, crude hosted applications,
  15-second video e-mail                                            500Kbps
Hosted applications, reasonable videophone                          5Mbps
Massive multiplayer/multimedia communities                          10Mbps
Scalable NTSC/PAL-quality video                                     100Mbps
Digital high-definition video-on-demand (uncompressed)              1Gbps
Innovation applications (3D environments, holography, and so on)    10Gbps

Digital video and digital audio also require minimal, predictable delays in packet transmission, which conventional shared-bandwidth, connectionless networks do not offer. (Chapter 3, "Establishing Communications Channels," discusses connectionless networks in detail.) They also require tight controls over losses, and again, connectionless networks do not account for this. As more people simultaneously access files from a server, bandwidth becomes a significant issue. Correct timing, synchronization, and video picture quality are compromised if the bandwidth is not sufficient. As discussed in the following sections, two key issues relate to multimedia communications: the nature of digital video and the role of television.

Digital Video

One of the fascinating areas driving and motivating the need for broadband access is television. Although TV has a tremendous following throughout the world (greater than that of computing and even telecommunications), it remained untouched by the digital revolution until recently. Despite major advances in computing, video, and communications technologies, TV has continued to rely on standards that are more than 55 years old. The biggest shortcoming of the existing TV standards, namely the National Television System Committee standard (NTSC; used in North America and Japan), Phase Alternating Line (PAL; used throughout the majority of the world), and Systeme Electronique Couleur Avec Memoire (SECAM; used in France and French territories), is that they are analog systems, in which video signals degrade quickly under adverse conditions. Most of this signal degradation occurs along the path the picture travels from the studio to a TV.

Digital TV (DTV) offers numerous advantages over the old analog TV signal, among which is the fact that it is nearly immune to interference and degradation. Another advantage of DTV is the ability to display a much better range of colors. The human eye can discriminate more than 16 million colors, and sophisticated computer monitors and DTVs can display those 16 million colors and more. DTV can transmit more data in the same amount of bandwidth, and it can also transmit more types of data. Combined with high-definition TV (HDTV) and digital sound, what this means to the end user is a better picture, better sound, and digital data. However, digital broadcasters are not restricted to sending just a high-definition picture; they can still broadcast a standard-definition picture over DTV, referred to as standard-definition TV (SDTV). But why would they want to do that? The answer is simple: In the same amount of bandwidth, they can deliver four standard-definition programs instead of only one high-definition program. Most importantly, digital technology is converting television from a mechanism that supports passive viewing into an interactive experience: an environment in which you choose when, where, and how you engage with the world at your disposal. Of course, you can still passively watch TV, but you can also customize the experience and make it your own. DTV is already offering us more choices, and it's going to make our viewing experience even more interactive.
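The four-to-one tradeoff between standard- and high-definition programs is simple division. The sketch below uses the standard ATSC payload of roughly 19.39Mbps for a 6MHz channel; the per-program MPEG-2 rates are illustrative mid-range assumptions, not fixed broadcaster settings:

```python
# Rough multiplexing arithmetic: how many SDTV programs fit in the
# digital channel that could otherwise carry one HDTV program.

CHANNEL_PAYLOAD_MBPS = 19.39   # ATSC 8-VSB payload of a 6 MHz channel
HDTV_MPEG2_MBPS = 18.0         # one high-definition program (illustrative)
SDTV_MPEG2_MBPS = 4.5          # one standard-definition program (illustrative)

# One HD program consumes essentially the whole channel...
hd_programs = int(CHANNEL_PAYLOAD_MBPS // HDTV_MPEG2_MBPS)
# ...while several SD programs can share it.
sd_programs = int(CHANNEL_PAYLOAD_MBPS // SDTV_MPEG2_MBPS)

print(hd_programs)  # 1
print(sd_programs)  # 4
```

With these assumed rates, the arithmetic reproduces the "four standard-definition programs instead of one high-definition program" tradeoff described above.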

People have only so much time and money to spend on electronic goods and services. In many parts of the world, the first thing people seem willing to spend their time and money on is entertainment. Therefore, the television industry, as well as the content, entertainment, and application worlds, will be increasingly important to how the local loop develops and how this further demands the introduction of home area networking. Of course, TV and networks will deliver more than entertainment. They will deliver edutainment and infotainment, too, presenting the information and knowledge you need in a format that is palatable and that promotes rapid, effective assimilation and retention. Video and multimedia facilitate our ability to understand and retain information and therefore will become the basis of much information delivery. This will drive the need for more bandwidth not just to the home but also within the home, to network the growing variety of computing and entertainment systems. Very importantly, it will also drive the need to deliver programming and content on a mobile basis (yet another argument for fixed-mobile convergence [FMC]) and fuel the broadband wireless arena to new generations of wireless technology and spectrum utilization.

What is required to carry a digitized stream to today's TVs? In the North American system, a 6MHz NTSC channel requires approximately 160Mbps; a digitized PAL stream, used throughout Europe, requires about 190Mbps (the PAL system uses an 8MHz channel); and HDTV requires 1.5Gbps. Videoconferencing needs much less bandwidth than TV, but it still requires a substantial amount; the H.323 standard from the ITU allows videoconferencing to be carried at bandwidths ranging from 384Kbps to 1.5Mbps. Streaming video requirements vary, depending on the quality: Low quality requires 3Mbps, medium quality requires 5Mbps, and high quality requires 7Mbps.

An important driver behind broadband access is content, and much of the content for which people are willing to pay is entertainment oriented. The television industry is now beginning to undergo the revolution that digital technology has caused in other communications-related industries, and it is now starting to capitalize on the potential new revenue-generating services that personal digital manipulation may allow. One example is digital video recording (DVR), also called personal video recording (PVR), in which television programs are digitally recorded onto a hard disk, letting viewers pause live TV, watch programs on their own schedule, and even skip commercials. With the introduction of DTV and the mandate by spectrum management agencies to phase out or decommission analog broadcasting, we will need a much greater amount of bandwidth in our homes to feed the new generations of televisions.

In terms of information transfer, television has generally been associated with the concept of broadcast, terrestrial or satellite, or cable delivery of someone else's programming on someone else's timetable. Video is associated with the ability to record, edit, or view programming on demand, according to your own timetable and needs. Multimedia promises to expand the role of video-enabled communications, ultimately effecting a telecultural shift, with the introduction of interactive television.

Before we begin our detailed discussion of video compression and DTV standards, it makes sense to provide a brief explanation of the key parameters that determine not only the viewing experience but also the bandwidth required:

  • Number of pixels on a screen: A pixel (which is short for picture element) is one of the thousands of small rectangular dots that make up television and computer screen images. Basically, the more pixels per screen, the greater, or better, the resolution; that is, the more defined, detailed, and crisp the image appears.
  • Frame rate: The frame rate is a measure of how fluid or natural the motion onscreen appears. As a quick reference, motion pictures use 24 frames per second (fps), the North American NTSC television standard uses 30fps, and the European PAL standard uses 25fps. (Television standards are discussed later in this chapter.)
  • Number of bits per pixel: The number of bits per pixel is a measure of the color depth; the more bits per pixel, the more colors can be represented. Remember that the human eye can perceive more than 16 million colors. A digitally encoded image using 24 bits per pixel can display more than 16 million colors, providing a rich and natural experience.

As we talk about compression techniques and digital television standards, you will notice that most of the standards define the number of pixels and frames per second that can be supported.
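These three parameters multiply directly into an uncompressed bit rate. A quick sketch, using the common figures cited in this chapter (the 16 bits/pixel value assumes 4:2:2 studio sampling for standard definition; actual sampling formats vary):

```python
# Uncompressed bit rate = pixels per frame x bits per pixel x frames per second.
def raw_bitrate_bps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps

# HDTV: 1,920 x 1,080 pixels, 24-bit color, 30 frames per second
hd = raw_bitrate_bps(1920, 1080, 24, 30)
print(f"{hd / 1e9:.2f} Gbps")   # ~1.49 Gbps, matching the 1.5Gbps HDTV figure

# NTSC: 720 x 486 pixels at ~16 bits/pixel (assumed 4:2:2 sampling), 30fps
sd = raw_bitrate_bps(720, 486, 16, 30)
print(f"{sd / 1e6:.0f} Mbps")   # ~168 Mbps, in line with the ~160Mbps cited
```

The same formula explains why compression is indispensable: even standard definition, uncompressed, exceeds a 100Mbps pipe.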

Video Compression

To make the most of bandwidth, it is necessary to apply compression to video. Full-motion digital video needs as much compression as possible in order to fit into the precious spectrum allocated to television and wireless communications, not to mention to fit on most standard storage devices. Moving Picture Experts Group (MPEG; www.chiariglione.org/mpeg) is a working group of the International Organization for Standardization (ISO; www.iso.ch) and the International Electrotechnical Commission (IEC; www.iec.ch) that is in charge of developing standards for coded representation of digital audio and video. It has created the MPEG compression algorithm, which reduces redundant information in images. One distinguishing characteristic of MPEG compression is that it is asymmetric: A lot of work occurs on the compression side, and very little occurs on the decompression side; compression is performed offline rather than in real-time. Offline compression allows ratios of 80:1 or even 400:1, which means it takes 80 or 400 times longer to compress than to decompress. Currently, MPEG-2 generally involves a compression ratio of 55:1, which means it can take almost an hour to compress 1 minute of video. The advantage of this asymmetrical approach is that digital movies compressed using MPEG run faster and take up less space.

There are several MPEG standards, in various stages of development and completion, and with different targeted uses. The following are some of the most common MPEG standards:

  • MPEG-1 MPEG-1 is a standard for storage and retrieval of moving pictures and audio on storage media. MPEG-1 is the standard on which such products as Video CD and MP3 are based. MPEG-1 addresses VHS-quality images with a 1.5Mbps data rate. MPEG-1 can play back from a single-speed CD-ROM player (150KBps, or 1.2Mbps) at 352 x 240 (i.e., quarter-screen) resolution at 30fps.
  • MPEG-2 MPEG-2 is the standard on which such products as DTV set-top boxes and DVD are based, and at this point, it is the compression scheme of choice. It addresses DTV- or computer-quality images. MPEG-2 carries compressed broadcast NTSC at a 2Mbps to 3Mbps data rate, broadcast PAL at 4Mbps to 6Mbps, broadcast HDTV at 10Mbps to 12Mbps, and professional HDTV at 32Mbps to 40Mbps. MPEG-2 supports both interlaced and progressive-scan video streams. (Interlaced and progressive-scan techniques are discussed later in this chapter.) MPEG-2 on DVD and Digital Video Broadcasting (DVB) offers resolutions of 720 x 480 and 1,280 x 720 at up to 30fps, with full CD-quality audio. On MPEG-2 over Advanced Television Systems Committee (ATSC), MPEG-2 also supports resolutions of 1,920 x 1,080 and frame or field rates of up to 60fps.
  • MPEG-4 MPEG-4 is a standard for multimedia applications. MPEG-4, an evolution of MPEG-2, features audio, video, and systems layers and offers variable-bit-rate encoding for both narrowband and broadband delivery in a single file. It also uses an object-based compression method, rather than MPEG-2's frame-based compression. MPEG-4 enables objects (such as two-dimensional or three-dimensional video objects, text, graphics, and sound) to be manipulated and made interactive through Web-like hyperlinks and/or multimedia triggers. One of MPEG-4's greatest strengths is broad player support: the RealNetworks players, Microsoft Windows Media Player, and Apple QuickTime all support MPEG-4. MPEG-4 is intended to expand the scope of audio/visual content to include simultaneous use of both stored and real-time components, plus distribution from and to multiple endpoints, and also to enable the reuse of both content and processes.
  • MPEG-4 Advanced Video Coding (AVC) MPEG-4 AVC, also called Part 10 or ITU H.264, is a digital video codec standard noted for achieving very high data compression. It is the result of a collaborative partnership between the ITU Video Coding Experts Group (VCEG) and the ISO/IEC MPEG, known as the Joint Video Team (JVT). AVC contains a number of new features that allow it to compress video much more effectively than older standards and to provide more flexibility for application to a wide variety of network environments. H.264 can often perform radically better than MPEG-2 video compression, typically achieving the same quality at half the bit rate or less. It is planned to be included as a mandatory player feature in an enormous variety of implementations and standards.
  • MPEG-7 MPEG-7 is a multimedia content description standard for information searching. Thus, it is not a standard that deals with the actual encoding of moving pictures and audio, like MPEG-1, MPEG-2, and MPEG-4. It uses XML to store metadata and can be attached to timecodes in order to tag particular events, or, for example, to synchronize lyrics to a song.
  • MPEG-21 Today, many elements are involved in building an infrastructure for the delivery and consumption of multimedia content. However, there is no big picture to describe how these elements relate to each other. MPEG-21 was created to provide a framework for the all-electronic creation, production, delivery, and trade of content. Within the framework, we can use the other MPEG standards, where appropriate. The basic architectural concept in MPEG-21 is the digital item. Digital items are structured digital objects, including a standard representation and identification, as well as metadata. Basically, a digital item is a combination of resources (e.g., videos, audio tracks, images), metadata (such as MPEG-7 descriptors), and structure (describing the relationship between resources).

MPEG-1, MPEG-2, and MPEG-4 are primarily concerned with the coding of audio/visual content, whereas MPEG-7 is concerned with providing descriptions of multimedia content, and MPEG-21 enables content to be created, produced, delivered, and traded entirely electronically.

Faster compression techniques using fractal geometry and artificial intelligence are being developed and could theoretically achieve compression ratios of 2,500:1. Implemented in silicon, this would enable full-screen, NTSC-quality video deliverable not only over a LAN but also over the traditional PSTN as well as wireless networks. Until better compression schemes are developed, the industry has standardized on MPEG-2, which takes advantage of how the eye perceives color variations and motion. Inside each frame, an MPEG-2 encoder records just enough detail to make it look like nothing is missing. The encoder also compares adjacent frames and records only the sections of the picture that have moved or changed. If only a small section of the picture changes from one frame to the next, the MPEG-2 encoder updates only that area and leaves the rest of the picture untouched.
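The "record only what changed" idea can be illustrated with a toy sketch. This shows interframe differencing only; real MPEG-2 adds motion estimation, DCT transforms, and quantization on top of it, and the frames and block size here are made-up example data:

```python
# Toy interframe coder: compare the new frame to the previous one block
# by block and keep only the blocks that changed.

def changed_blocks(prev, curr, block=2):
    """Return {(row, col): new_block_pixels} for blocks that differ."""
    updates = {}
    for r in range(0, len(curr), block):
        for c in range(0, len(curr[0]), block):
            old = [row[c:c + block] for row in prev[r:r + block]]
            new = [row[c:c + block] for row in curr[r:r + block]]
            if old != new:
                updates[(r, c)] = new
    return updates

frame1 = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
frame2 = [[0, 0, 0, 0],
          [0, 0, 0, 0],
          [0, 0, 9, 9],
          [0, 0, 9, 9]]

# Only the bottom-right 2x2 block differs, so only it would be encoded.
print(changed_blocks(frame1, frame2))  # {(2, 2): [[9, 9], [9, 9]]}
```

The bandwidth saving comes from transmitting one block instead of four; on real video, where most of the scene is static from frame to frame, the saving is far larger.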

MPEG-2 does have some problems, but it is a good compression scheme, and it is already an industry standard for digital video for DVDs and some satellite television services. One problem with MPEG-2 is that it is a lossy compression method. This means that a higher compression rate results in a poorer picture. There's some loss in picture quality between a digital video camera and what you see on your TV. However, MPEG-2 quality is still a lot better than the average NTSC or PAL image. By applying MPEG-2 encoding to NTSC, we can reduce the bandwidth required.

Another important video compression technique is Windows Media 9 (WM9). The WM9 series codec standard, implemented by Microsoft as Windows Media Video (WMV) 9 Advanced Profile, is based on the VC-1 video codec specification currently being standardized by the Society of Motion Picture and Television Engineers (SMPTE; www.smpte.org) and provides for high-quality video for streaming and downloading. By making use of improved techniques, VC-1 decodes high-definition video twice as fast as the H.264 standard while offering two to three times better compression than MPEG-2. WM9 is supported by a wide variety of players and devices. It supports a wide range of bit rates, including high-definition at one-half to one-third the bit rate of MPEG-2, as well as low-bit-rate Internet video delivered over a dialup modem. (More detailed information is available at www.microsoft.com/windows/windowsmedia/9series/codecs/video.aspx.)

Even with the reduced data rates that MPEG-2 offers, how many of us have 20Mbps pipes coming into our homes? A 1.5Mbps connection over DSL or cable modem cannot come close to carrying a 20Mbps DTV signal. Therefore, broadband access alternatives will shift over time. We will need more fiber, we will need that fiber closer to the home, and we will need much more sophisticated compression techniques that allow us to make use of the even more limited wireless spectrum to carry information. We will also need to move forward with introducing new generations of wireless technologies geared toward the support of multimedia capacities (a combination of intelligent spectrum use and highly effective compression), with support for the requisite variable QoS environment and strong security features. Improvements in compression are on the way, as Table 10.1 illustrates.

Table 10.1. Improvements in Compression

                      2006        2006          2007 MPEG-4/VC-1   2009 MPEG-4/VC-1
                      MPEG-2      MPEG-4/VC-1   Enhancements       Improvements
Standard definition   2.5-3Mbps   1.5-2Mbps     <1-1.5Mbps
High definition       15-19Mbps   1-12Mbps      <7-10Mbps

Delay and Jitter

Along with demanding so much capacity, video and other real-time applications, such as audio and voice, also suffer from delay (i.e., latency) and bit errors (e.g., missing video elements, synchronization problems, complete loss of the picture). Delay in the network can wreak havoc with video traffic. The delay in a network increases as the number of switches and routers in the network increases. The ITU recommends a maximum delay of 150 milliseconds, and evolving service-level agreements promise packet loss of 1% or less per month and a round-trip latency guarantee of 80 milliseconds. However, the public Internet can see as much as 40% packet loss during peak traffic hours and average latencies of 800 to 1,000 milliseconds. Although we really can't control the delay in the public Internet, we can engineer private IP backbones to provide the levels we're seeking.

Jitter is another impairment that has a big impact on video, voice, and so on. Jitter is introduced when delay does not remain the same throughout a network, so packets arrive at the receiving node at different rates. Video can tolerate a small amount of delay, but when congestion points slow the buffering of images, jitter causes distortion and highly unstable images. Reducing jitter means reducing or avoiding the congestion that occurs in switches and routers, which in turn means having as many priority queues as the network QoS levels require.
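Jitter can be quantified as the variation in packet transit times. The sketch below uses the smoothed interarrival-jitter estimator defined in RFC 3550 (the RTP specification); the timestamps are made-up example data:

```python
# Interarrival jitter in the style of RFC 3550: a running, smoothed
# estimate of how much the transit delay varies from packet to packet.
# All timestamps are in milliseconds.

def interarrival_jitter(send_times, recv_times):
    jitter = 0.0
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            jitter += (d - jitter) / 16.0   # RFC 3550 smoothing gain of 1/16
        prev_transit = transit
    return jitter

send = [0, 20, 40, 60, 80]        # packets sent every 20 ms
recv = [50, 72, 95, 110, 135]     # arrivals with varying delay
print(f"{interarrival_jitter(send, recv):.2f} ms")  # ~0.87 ms for this data
```

A perfectly paced network (constant transit time) yields zero jitter; the further the arrivals drift from a steady cadence, the larger the estimate grows, which is exactly what the receiver's playout buffer must absorb.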

Television Standards

Given the importance of the new era in television, the following sections establish some reference points for television standards, both analog and digital.

Analog TV

In 1945, the U.S. Federal Communications Commission (FCC; www.fcc.gov) allocated 13 basic VHF television channels, thus standardizing the frequencies and allocating a broadcast bandwidth of 4.5MHz. The NTSC was formed in 1948 to define a national standard for the broadcast signal itself. The standard for black-and-white television was finally set in 1953 and ratified by the Electronic Industries Association (EIA; www.eia.org) as the RS-170 specification. Full-time network color broadcasting was introduced in 1964, with an episode of Bonanza.

The NTSC color TV specification determines the electronic signals that make up a color TV picture and establishes a method for broadcasting those pictures over the air. NTSC defines a 4:3 horizontal:vertical size ratio, called the aspect ratio. This ratio was selected in the 1940s and 1950s, when all picture tubes were round, because the almost-square 4:3 ratio made good use of round picture tubes. An NTSC color picture with sound occupies 6MHz of frequency spectrum, enough bandwidth for 2,222 voice-grade telephone lines. To transmit this signal digitally without compression requires about 160Mbps.

The English/German PAL system was developed after NTSC and adopted by the United Kingdom, Western Germany, and The Netherlands in 1967. The PAL system is used today in the United Kingdom, Western Europe (with the exception of France), Asia, Australia, New Zealand, the Middle East, Africa, and Latin America. Brazil uses a version of PAL called PAL-M. The PAL aspect ratio is also 4:3, and PAL channels occupy 8MHz of spectrum. Uncompressed PAL, digitally transported, requires approximately 200Mbps.

The SECAM system is used in France and the former French colonies, as well as in parts of the Middle East. Russia and the former Soviet-allied countries use a modified form of SECAM. There are two versions of SECAM: SECAM vertical and SECAM horizontal.

The PAL and SECAM standards provide a sharper picture than NTSC, but they display a bit of a flicker because they have a slower frame rate. Programs produced for one system must be converted in order to be viewed on one of the other systems. The conversion process detracts slightly from the image quality, and converted video often has a jerky, old-time-movie look.

DTV

DTV represents the ongoing convergence of broadcasting and computing. Simply put, DTV makes use of digital modulation and compression to broadcast audio, data, and video signals to TV sets. Thanks to MPEG-2, studio-quality images can be compressed and transformed into a digital stream. DTV is the next generation of television; its development has improved the audio and video quality of broadcast television, and it has in many cases replaced the film cameras used in movie production.

The difference between analog TV and DTV is profound in terms of picture quality as well as special screen effects, such as multiple-windowed pictures and interactive viewer options. The quality of DTV is almost six times better than what analog TV offers, delivering up to 1,080 lines of resolution and CD-quality sound. But the real promise of DTV lies in its huge capacity: the ability to deliver, during a single program, information equivalent to that contained on dozens of CDs. The capacity is so great that whole new industries are being created to use this digital potential for whole new revenue streams. Recognizing that the Web and other Internet services may grow to rival television, it is highly likely that new generations of television system infrastructure design will include this medium as part of the total system, making television a critical aspect of convergence on all fronts, including devices, applications, fixed networks, wireless infrastructure, and service providers.

The Move to DTV and HDTV

One main application of DTV is to carry more channels in the same amount of bandwidth, either 6MHz or 8MHz, depending on the standard in use. The other key application is to carry high-definition programming, known as HDTV. Because DTV makes use of a digital signal, many common analog broadcasting artifacts can be eliminated, including static in the audio, snow on the screen, and the presence of ghost images (called multipath distortion). However, digital signals can also suffer from artifacts. For example, when the data rate is too low, MPEG compression results in artifacts such as blocking, or blocky images. In addition, while analog TV may produce an impaired picture under some circumstances, it is still viewable, whereas DTV may not work at all in the same situation. Basically, depending on the level and sophistication of error correction defined by the standard and chosen by the provider, DTV may work either perfectly or not at all.

The move to DTV systems is generally associated with a switch in picture format, going from the aspect ratio of 4:3 to one of 16:9, although both HDTV and SDTV are available in both formats. However, the aspect ratio is only part of HDTV.

The History of Aspect Ratio

The 4:3 aspect ratio was originally developed by W. K. L. Dickson in 1889 while he was working at Thomas Edison's laboratories. Dickson was experimenting with a motion-picture camera called the Kinetoscope, and he made his film 1 inch wide with frames 0.75 inch high. This film size, and its aspect ratio, became the standard for the film and motion-picture industry because there was no apparent reason to change. In 1941, when the NTSC proposed standards for television broadcasting, it adopted the same ratio as the film industry.

In the 1950s, Hollywood wanted to give the public a reason to buy a ticket to attend the theater rather than sit at home watching the TV. Because our two eyes give us a wider view, a wider movie makes more sense. Widescreen formats are formatted much closer to the way we see. Our field of vision is more rectangular than square. When we view movies in widescreen format, the image fills more of our field of vision and has a stronger visual impact. Wider screens gave the theater audience a more visually engulfing experience. The 16:9 aspect ratio allows TV to move closer to the movie experience.

Besides being formatted for a wider screen, an HDTV picture has more detail and crisper images. With the bigger picture comes a finer resolution. TV images are made up of pixels, each of which is a tiny sample of video information, one of the little squares that make up an overall picture. Each pixel is composed of three closely spaced dots of color: red, green, and blue. Combined on the phosphor screen, the three separate colors appear to blend into a single color. Each phosphor emits light in proportion to the intensity of the electron beam hitting it. On a standard TV screen, the electron beam has about 256 levels of intensity for each of the three colored phosphors. Therefore, each pixel has a spectral range of about 16.8 million colors. From a distance, each pixel ends up looking like a single dot of color, but up close, you can see that each pixel is really a rectangular trio of red, green, and blue; this is most visible on projection televisions, where the colors separate a little more. HDTV uses smaller pixels that are closer together, and they are square, just like on most computer monitors; HDTV packs about 4.5 pixels into the area taken up by a single pixel on a standard NTSC TV. The more pixels in a given area, the more detailed and better the picture. A quick comparison will help you appreciate the vastly improved resolution offered by HDTV. The maximum resolution of an NTSC TV is a display that is 720 pixels wide by 486 active lines, resulting in a total of 349,920 pixels. A high-end HDTV display is 1,920 pixels wide by 1,080 active lines, resulting in 2,073,600 pixels, six times more pixels than the older NTSC resolution.
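The resolution comparison is easy to verify with the figures just given:

```python
# Total pixels for the two display formats cited above.
ntsc_pixels = 720 * 486      # max NTSC display: 720 wide x 486 active lines
hdtv_pixels = 1920 * 1080    # high-end HDTV display

print(ntsc_pixels)                           # 349920
print(hdtv_pixels)                           # 2073600
print(round(hdtv_pixels / ntsc_pixels, 1))   # 5.9, i.e., about six times more
```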

DTV not only improves the visual experience, it also improves the sound quality, using advances in digital sound. HDTV broadcasts sound by using the Dolby Digital/AC-3 audio encoding system, which is the same digital sound used in most movie theaters, in DVDs, and in many home theater systems. It can include up to 5.1 channels of sound: 3 in front (left, center, and right), 2 in back (left and right), and a subwoofer bass for a sound you can feel (the .1 channel). Sound on DTV is CD quality, with a range of frequencies lower and higher than most people can hear.

DTV Implementations and Distribution

Service providers are increasingly interested in providing service bundlesa strategy often referred to as triple play (voice, data, and video) or quadruple play (voice, data, video, and wireless/mobile). DTV, which is a big part of this service strategy, can be implemented and distributed in a number of ways. The following are some of the possible implementations:

  • Terrestrial DTV A number of countries are in the process of deploying digital terrestrial television (DTT), which offers many advantages. Governments see DTT as an opportunity to free up existing TV frequencies for resale, as well as a technology that ensures that the country is on the forefront of the digital revolution. Broadcasters see DTT as a means by which they can fight the growing competition from cable DTV, satellite providers, and telcos, not to mention emerging digital program distribution technologies such as PVR/DVR and video-on-demand (VOD). Manufacturers see DTT as an opportunity to sell new equipment, ranging from digital set-top boxes to new DTV sets. Consumers view the move to DTT as a way to get new and exciting programming. At the moment, however, HDTV sets are still very expensive, and there is not much HDTV programming available.
  • Satellite DTV In the satellite TV market, DTV is mainly used to multiplex large numbers of channels, including pay TV, onto the available bandwidth. Because satellite operators have much more bandwidth available to them, they often can compete very effectively with terrestrial DTV providers in terms of both number of channels and picture quality.
  • Cable DTV For cable TV providers, the main advantage of replacing their analog systems with digital cable was initially the ability to offer users more channels and better picture quality. Of course, in today's era of convergence and service bundles, a digital two-way system is absolutely required to support the emerging modern services such as VoIP, IPTV, VOD, and interactive HDTV. In addition, an expanding range of set-top boxes and middleware software also makes many new features possible. Depending on the choices an operator makes in hardware and software, features such as TV guides, program reminders, content censorship, interactive Web-style content viewing, gaming, voting, and on-demand services such as VOD can add significantly more value and ultimately revenues.
  • IP television (IPTV) The Internet is starting to be adapted for use with DTV deployments as part of triple play. When combined with advances in new compression standards, such as MPEG-4 H.264 or WM9, and in the picture quality supported by HDTV, IPTV represents a big step forward as a new approach to distributing television programming. (IPTV is discussed in Chapter 9, "IP Services.")

Telcos of all sorts, far and wide, are helping to lead the way into the video space. Many are using new IP-based technologies, while others are tapping older, but known and reliable, techniques such as radio frequency (RF) broadcasting to deliver a full-service menu of high-definition programming, VOD, DVR, music channels, interactive gaming, and more. Combined with voice and data services, telco TV is expected to become serious business.

Some analysts suggest that telcos' success will hinge on video, which means telcos face many challenges ahead. Becoming a video provider means spending billions on network upgrades, rolling out services on unproven IPTV platforms, and navigating the difficult content acquisition process. Telcos are in transition: the basic building blocks are in place for the packet networks and transport needed to move video around, but telcos still need to develop content relationships, provision bandwidth appropriate for high-definition channels, and support interactivity over in-home wiring. According to the Consumer Electronics Association (www.ce.org), HDTV is the compelling device at retail and the roadmap to the future, with IP driving it; without it, telcos won't be able to compete. Other key issues also need attention, including the integration of billing and operational support systems. In addition, set-top manufacturers and other vendors must support the service, and new compression technologies and a new generation of set-top boxes will make a big difference. The transformation is under way, but telcos must have the right software, content streams, and security provisions in place, and that will take a while. More and more small telcos, cities, and electric companies are deploying fiber delivery systems to serve smaller communities, so we're likely to see many variations on telco TV.

Mobile TV constitutes another new and fascinating approach to distributing television programming and entertainment content. Mobile is the fourth screen, after movies, TV, and the PC. Production costs for a big-budget film run about US$1 million per minute, while production costs for mobile content range from US$2,000 to US$8,000 per minute. Set-top boxes are not only becoming more intelligent but will also interact with other devices, such as PDAs, mobile phones, and the Internet, to provide a truly flexible solution that allows local information (weather, traffic, news, and so on) to be tailored to specific regions. Industry analysts predict that broadcast mobile TV has undoubted potential, with interactive TV and the extension of advertising at the forefront of that success. These beliefs are based on the central role TV plays in the lives of people worldwide, the fact that mobile subscriber penetration has reached (and sometimes exceeded) saturation in many markets, and the resulting conviction that convergence of the broadcast and mobile industries is inevitable. But it is important to keep in mind that not everyone is enthusiastic about watching TV on a tiny screen; the younger generation is the most likely to embrace this new viewing experience.

Some feel the mobile phone is the most exciting software platform in history; it has become an essential part of just about everyone's lifestyle, and it is truly global. As a result, the simple mobile phone is morphing into a futuristic entertainment system and the most exciting new technology platform since the Internet. There are more than 1.5 billion mobile phones in use worldwide, compared to just 690 million PCs and laptops. Needless to say, entertainment giants and newly inspired entrepreneurs are rushing to develop songs, graphics, games, and videos to populate millions of tiny screens. (Mobile TV is discussed in more detail in Chapter 16, "Emerging Wireless Applications.")

DTV Standards

Initially, an attempt was made to prevent the fragmentation of the global DTV market into different standards, as was the case with the NTSC, PAL, and SECAM analog standards. However, as usually seems to be the case, the world could not reach agreement on one standard, and as a result, several major standards exist today. These standards fall into two categories:

  • Fixed reception The fixed-reception digital broadcasting standards include the U.S. Advanced Television Systems Committee (ATSC; www.atsc.org) system, the European DVB-Terrestrial (DVB-T; www.dvb.org) system, the Japanese Integrated Services Digital Broadcasting (ISDB) system, and the Korean Terrestrial Digital Media Broadcasting system. The most widely adopted standard worldwide is DVB-T. Argentina, Canada, Mexico, and South Korea have followed the United States in adopting ATSC. Digital Multimedia Broadcasting-Terrestrial (DMB-T) is the youngest major broadcast standard and provides the best reception quality for the power required. The DMB standard is derived from the Digital Audio Broadcast (DAB) standard that enjoys wide use in Europe for radio broadcasts. DAB and DMB-T are the preferred Chinese standards. Korea has since renamed its preferred standard T-DMB to differentiate it from the Chinese standard DMB-T. T-DMB is currently used in Korea but will also go into trial in 2006 in Germany, France, Switzerland, and the United Kingdom. A related Korean standard, S-DMB, exists for satellite television services, allowing for TV reception over larger areas than can be served with T-DMB. There could also be additional high-resolution digital formats for markets other than home entertainment introduced in the future. One such format being proposed by Japan's public broadcaster, NHK, is Ultra High Definition Video (UHDV). UHDV provides a resolution that is 16 times greater than that of HDTV.
  • Mobile reception As far as mobile standards go, DVB-Handheld (DVB-H) is the selected standard in Europe, India, Australia, and southeast Asia. North America also uses DVB-H, as well as the MediaFLO standard proposed by Qualcomm. MediaFLO, used only in North America at this time, supports relatively fast channel switching and uses its own broadcast towers as well as available bandwidth in the cellular network. Japan is adopting the ISDB-T Mobile Segment standard, and Korea is embracing T-DMB. China may follow DVB-H or another standard, and for the time being, it is unknown which standards South America and Africa will follow. The mobile broadcast market is nascent, and many developments are in store before a winner emerges in this arena.

As far as the broadband evolution goes, the importance of entertainment content and DTV is significant, so being aware of what constitutes DTV is mandatory to understanding the requirements of the next-generation network. The following sections describe the most commonly followed DTV fixed-reception standards, and Chapter 16 discusses the details of mobile TV standards. A comprehensive list of DTV deployments around the world is available at http://en.wikipedia.org/wiki/List_of_digital_television_deployments_by_country.

ATSC Standards

The ATSC, an international, nonprofit organization, develops voluntary standards for DTV. It was formed in 1982 by the member organizations of the Joint Committee on InterSociety Coordination (JCIC): the Electronic Industries Association (EIA), the Institute of Electrical and Electronics Engineers (IEEE), the National Association of Broadcasters (NAB), the National Cable Television Association (NCTA), and the Society of Motion Picture and Television Engineers (SMPTE). Today, approximately 140 ATSC members represent the broadcast, broadcast equipment, motion picture, consumer electronics, computer, cable, satellite, and semiconductor industries.

The ATSC's DTV standards include high-definition TV (HDTV), enhanced-definition TV (EDTV), standard-definition TV (SDTV), data broadcasting, multichannel surround-sound audio, direct-to-home satellite broadcast, and interactive television. On December 24, 1996, the U.S. FCC adopted the major elements of the ATSC DTV standard (Standard A/53). The ATSC DTV standard has since been adopted by the governments of Argentina, Canada, Mexico, and South Korea.

Types of Scanning

One of the biggest issues in TV standards involves how DTV images are drawn to the screen. There are two perspectives: those of the broadcast TV world and those of the computer environment. The broadcasters would rather initiate DTV with interlaced scanning, which is used by today's TV sets; computer companies want progressive-scanning DTV signals, similar to those used by computer monitors. The source of the conflict is different historical bandwidth limits.

Originally, the NTSC decided that the best way to fit a 525-line video signal into a 6MHz broadcast channel was to break each video frame into two fields, each holding half of the picture. Interlacing is a technique cameras use to take two snapshots of a scene within a frame time. During the first scan, the camera creates one field of video, containing even-numbered lines, and during the second, it creates another, containing the odd-numbered lines. The fields are transmitted sequentially, and the receiver reassembles them. This technique makes for reduced flicker and therefore greater brightness on the TV receiver for the given frame rate (and bandwidth). Interlacing is rough on small text, but moving images look fine.
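The even/odd field split described above can be sketched in a few lines of Python. The helper names here are hypothetical, and a real system operates on analog scan lines or sensor readouts rather than lists, but the interleaving logic is the same:

```python
# Sketch of interlaced scanning: a frame is split into two fields
# (even-numbered lines, then odd-numbered lines), transmitted sequentially,
# and reassembled at the receiver.

def split_into_fields(frame):
    """Return (field_1, field_2): the even- and odd-numbered lines."""
    return frame[0::2], frame[1::2]

def reassemble(field_1, field_2):
    """Interleave the two fields back into a full frame."""
    frame = []
    for even_line, odd_line in zip(field_1, field_2):
        frame.extend([even_line, odd_line])
    return frame

frame = [f"scan line {n}" for n in range(6)]
field_1, field_2 = split_into_fields(frame)
assert reassemble(field_1, field_2) == frame  # lossless round trip
```

Each field thus refreshes the screen at the full field rate while only half the lines are transmitted at a time, which is how interlacing buys reduced flicker within the same bandwidth.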

Progressive, or noninterlaced, scanning is a method for displaying, storing, or transmitting moving images in which the lines of each frame are drawn in sequence. This type of scanning is used in most computer monitors. Progressive scanning offers a number of advantages, such as a subjective perception of increased vertical resolution. With interlaced images, the perceived vertical resolution is usually equivalent to only about 60% of the active lines. This is why HDTV formats such as 1080i (1,920 x 1,080, interlaced) are generally perceived as having poorer quality than 720p (1,280 x 720, progressive). Additional benefits include the absence of flickering on narrow horizontal patterns, easier compression, and simpler video-processing equipment.
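The 1080i-versus-720p comparison above is simple arithmetic. Using the roughly 60% perceptual factor cited for interlaced images (an approximation, not an exact standardized figure):

```python
# Perceived vertical resolution: interlaced formats are discounted to ~60%
# of their active lines; progressive formats keep all of them.
INTERLACE_FACTOR = 0.6  # approximate perceptual factor from the text

perceived_1080i = round(1080 * INTERLACE_FACTOR)  # effective lines for 1080i
perceived_720p = 720                              # all 720 lines count

print(perceived_1080i)                   # 648
print(perceived_1080i < perceived_720p)  # True
```

With 1080i discounted to roughly 648 effective lines, 720p comes out ahead despite its smaller nominal line count.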

The ATSC standard includes three basic format categories: HDTV, EDTV, and SDTV. (EDTV is largely a marketing term, referring to lower-resolution sets with minor enhancements, such as a 16:9 widescreen format and the ability to display progressive-scan content.) Each of these formats is defined by the number of lines per video frame, the number of pixels per line, the aspect ratio, the frame repetition rate, and the frame structure (i.e., interlaced scan or progressive scan). The ATSC standard recommends that the receiver display all these formats, seamlessly and without loss of video, in the receiver's native format.

ATSC signals are designed to work on the same bandwidth as NTSC (6MHz) or PAL (8MHz) channels. The video signals are compressed using MPEG-2, and the data stream is then modulated. The modulation technique varies, depending on the transmission method. Because any terrestrial TV system must overcome numerous channel impairments such as ghosts, noise bursts, signal fades, and interference in order to reach the home viewer, the selection of the right RF modulation format is critical. In the case of terrestrial broadcasters, the technique used is 8-VSB (Vestigial Sideband), with a maximum transfer rate of 19.39Mbps. This is sufficient to carry several video channels and metadata. Because cable TV operators usually have a higher signal-to-noise ratio (SNR), they can use 16-VSB or 256-QAM to achieve a throughput of 38.78Mbps using the same size channel. (Modulation techniques are discussed in Chapter 5, "Data Communication Basics.") Table 10.2 shows the details of the various ATSC DTV standards.
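As a back-of-the-envelope check on the 19.39Mbps figure, the 8-VSB payload rate can be reproduced from the A/53 framing structure (symbol rate, data segments, and field layout). The constants below reflect that structure as commonly described; treat this as an illustrative sketch rather than reference code:

```python
# 8-VSB payload throughput: each 832-symbol data segment carries one 188-byte
# MPEG-2 transport packet, and each 313-segment field contains 312 data
# segments (the remaining segment is field sync).

SYMBOL_RATE = 10_762_238       # symbols/s in a 6MHz channel (~10.76 Msym/s)
SYMBOLS_PER_SEGMENT = 832      # 828 data symbols + 4 segment-sync symbols
SEGMENTS_PER_FIELD = 313       # 312 data segments + 1 field-sync segment
DATA_SEGMENTS_PER_FIELD = 312
TS_PACKET_BITS = 188 * 8       # one transport packet per data segment

fields_per_sec = SYMBOL_RATE / (SYMBOLS_PER_SEGMENT * SEGMENTS_PER_FIELD)
payload_bps = fields_per_sec * DATA_SEGMENTS_PER_FIELD * TS_PACKET_BITS

print(round(payload_bps / 1e6, 2))  # 19.39 (Mbps)
```

Doubling the payload bits per symbol, as 16-VSB does by dropping trellis coding and using 4 bits per symbol, doubles this to roughly the 38.78Mbps cable figure cited above.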

Table 10.2. ATSC DTV Standard

Format             Lines of Resolution   Pixels per Line   Aspect Ratio   Frame Rate    Scanning Sequence
HDTV (NTSC)        1,080                 1,920             16:9           30, 24        Progressive
                   1,080                 1,920             16:9           60            Interlaced
                   1,080                 1,440             4:3            30, 24        Progressive
                   1,080                 1,440             4:3            60            Interlaced
                   720                   1,280             16:9           60, 30, 24    Progressive
                   720                   1,280             16:9           60            Interlaced
                   720                   960               4:3            60, 30, 24    Progressive
                   720                   960               4:3            60            Interlaced
HDTV (PAL)         1,080                 1,920             16:9           25            Progressive
                   1,080                 1,920             16:9           50            Interlaced
                   1,080                 1,440             4:3            25            Progressive
                   1,080                 1,440             4:3            50            Interlaced
                   720                   1,280             16:9           50, 25        Progressive
                   720                   1,280             16:9           50            Interlaced
                   720                   960               4:3            50, 25        Progressive
                   720                   960               4:3            50            Interlaced
EDTV (NTSC)        480                   720               16:9, 4:3      60            Progressive
EDTV (PAL)         576                   720               16:9, 4:3      50            Progressive
SDTV (NTSC)        480                   704               16:9           60, 30, 24    Progressive
                   480                   704               16:9           60            Interlaced
                   480                   640               4:3            60, 30, 24    Progressive
                   480                   640               4:3            60            Interlaced
SDTV (PAL, SECAM)  576                   1,024             16:9           50, 25        Progressive
                   576                   1,024             16:9           50            Interlaced
                   576                   768               4:3            50, 25        Progressive
                   576                   768               4:3            50            Interlaced

For the same reception quality in the absence of errors, ATSC requires about half the power of the more widely used DVB-T standard, but it is more susceptible to errors. One recognized limitation of ATSC is that, unlike DVB-T and ISDB-T, which can dynamically change their error correction modes, code rates, interleaver mode, and randomizer, ATSC cannot adapt to changes in propagation conditions. Despite its fixed transmission mode, however, ATSC is still a very robust waveform under normal conditions.

DVB Standards

DVB is a suite of internationally accepted, open standards for DTV maintained by the DVB Project (www.dvb.org). Services using the DVB standards are available on every continent, with more than 110 million DVB receivers deployed. Formed in 1993, the DVB Project is responsible for designing open standards for the global delivery of DTV and data services. The DVB Project's 270+ membership includes broadcasters, manufacturers, regulatory bodies, software developers, network operators, and others from more than 35 countries. DVB standards are published by the European Telecommunications Standards Institute (ETSI; www.etsi.org), and there is considerable day-to-day cooperation between the two organizations. ETSI, the European Committee for Electrotechnical Standardization (CENELEC; www.cenelec.org), and the European Broadcasting Union (EBU; www.ebu.ch) have formed a joint technical committee to handle the DVB family of standards.

DVB standards are very similar to ATSC standards (including MPEG-2 video compression, packetized transport, and guidelines for a 1,080-line-by-1,920-pixel HDTV format), but they provide for different audio compression and transmission schemes. DVB embraces four main standards that define the physical and data link layers of a distribution system:

  • DVB-S and DVB-S2 (satellite TV) DVB-S is an open standard for digital video broadcast over satellites, defined by ETSI and ratified in 1994. DVB-S supports only MPEG-2 encoded video streams. DVB-S2 is an open standard for digital video broadcast over satellites, defined by ETSI and ratified in 2005. It has improved quality over DVB-S and allows for coded video in H.264 (AVC) or VC-1 bitstreams.
  • DVB-C (cable TV) DVB-C is an open standard for digital video transmission over cable that was defined by ETSI and ratified in 1994.
  • DVB-T (terrestrial TV) DVB-T, an open standard defined by ETSI and ratified in 1997, is used as the de facto standard for terrestrial TV broadcasts in many nations, particularly those in Europe. It supports only MPEG-2 compression.
  • DVB-H (terrestrial TV for handhelds) The DVB-H standard is an adaptation of DVB optimized for mobile handheld devices. It is widely used in Europe and is starting to see adoption in North America.

These four distribution systems vary in their modulation schemes, based on the technical constraints associated with the different operating environments. DVB-T and DVB-H use Coded Orthogonal Frequency Division Multiplexing (COFDM), DVB-S uses Quadrature Phase-Shift Keying (QPSK), and DVB-C uses QAM, especially 64-QAM. (Modulation schemes are covered in Chapter 5.)
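The trade-off driving these choices is that a constellation with M points carries log2(M) bits per symbol: denser constellations move more data through the same channel but demand the higher SNR of, say, a cable plant. A minimal sketch:

```python
# Bits per symbol for the modulation schemes mentioned above, computed as
# log2 of the constellation size.
from math import log2

constellations = {
    "QPSK (DVB-S)": 4,
    "16-QAM": 16,
    "64-QAM (DVB-C)": 64,
    "256-QAM": 256,
}

for name, points in constellations.items():
    print(f"{name}: {int(log2(points))} bits/symbol")
# QPSK carries 2 bits/symbol, 64-QAM carries 6, and 256-QAM carries 8.
```

This is why satellite links, with their low SNR, settle for QPSK, while cable systems can afford 64-QAM or denser.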

The DVB Project has also designed an open middleware system for DTV, called the DVB-Multimedia Home Platform (DVB-MHP, www.mhp.org), which is being used to support interactive applications in many countries. DVB-MHP enables the reception and execution of interactive, Java-based applications on a TV set, including applications such as e-mail, SMS, information services, shopping, and games. Although the European Commission has not mandated EU-wide standards for interactive DTV middleware, it has clearly signaled support for the continued development of DVB-MHP through its commitment to promote open and interoperable standards and to monitor the use of proprietary technologies. As of mid-2005, the largest deployments of DVB-MHP were in Italy (DVB-T) and Korea (DVB-S), with other small deployments or trials taking place in Australia, Finland, Germany, and Spain. In the United States, CableLabs has specified its own middleware system called OpenCable Applications Platform (OCAP), which is based on DVB-MHP.

ISDB Standards

ISDB is the DTV and DAB format that Japan has created to allow radio and television stations there to convert to digital. ISDB is maintained by the Association of Radio Industries and Businesses (ARIB; www.arib.or.jp), a Japanese standards organization designated as the center for promoting the efficient use of the radio spectrum and for supporting frequency changes.

ISDB incorporates several standards:

  • ISDB-S (digital satellite TV) ARIB developed the ISDB-S standards to meet a number of requirements Japanese broadcasters were asking for, including HDTV capability, interactive services, network access, and effective frequency utilization. ISDB-S, operating in the 12GHz band, uses PSK modulation. ISDB-S allows 51Mbps to be transmitted through a single transponder, making it 1.5 times as efficient as DVB-S, which can handle a bitstream of only approximately 34Mbps. This means the ISDB-S system can carry two HDTV channels using one transponder, along with other independent audio and data. ISDB-S was commercially launched in 2000 and today is used by several service providers.
  • ISDB-T and ISDB-Tsb (terrestrial) In the 1980s, Japan started research on and development of a completely digital system that led to the introduction of ISDB. ISDB-T began commercial operation in Japan in December 2003. ISDB-T specifies OFDM transmission with one of four modulation schemes: QPSK, DQPSK, 16-QAM, or 64-QAM. With ISDB-T, an audio program and TV for both fixed and mobile reception can be carried in the same multiplex. For example, ISDB-T can transmit three SDTV streams in one channel or carry an HDTV and a mobile phone channel in the same 6MHz usually reserved for TV transmission. The combination of services can be changed at any time, as can the modulation schemes. ISDB-T can support HDTV on moving vehicles at over 62 mph (100 kph), and it can be received on mobile phones moving at over 250 mph (400 kph). (DVB-T can only support SDTV on moving vehicles, and ATSC cannot be used on moving vehicles at all.) ISDB-T is applicable to all 6MHz, 7MHz, and 8MHz bandwidth systems, so it could be adopted worldwide.

    ISDB-Tsb refers to the terrestrial digital sound broadcasting specification and is the same technical specification as ISDB-T. ISDB-Tsb can also be used for mobile reception.

  • ISDB-C (digital cable TV) ISDB-C is the cable digital broadcasting specification. It supports terrestrial digital broadcasting services over cable using the OFDM scheme with a 6MHz channel. It employs 64-QAM modulation.
  • 2.6GHz band (mobile broadcasting) The mobile broadcasting 2.6GHz band uses Code Division Multiplexing (CDM). A Japanese company named MobaHO! began using mobile broadcasting in October 2004, constituting the world's first satellite digital multimedia broadcasting for personal and mobile device use. Users throughout Japan can listen to and view the same programs from 30 audio channels (including overseas FM radio and genre-specific music programming) and from 7 video channels (including news, sports, and entertainment programming).
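The "1.5 times" efficiency claim in the ISDB-S bullet above is just the ratio of the two per-transponder payload rates quoted in the text:

```python
# Per-transponder payload rates as quoted above (Mbps).
isdb_s_mbps = 51
dvb_s_mbps = 34

print(isdb_s_mbps / dvb_s_mbps)  # 1.5
```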

All these standards are based on MPEG-2 video and audio coding and are capable of HDTV.

Brazil is currently the only country considering adopting ISDB-T for its DTV standard.


Telecommunications Essentials, Second Edition: The Complete Global Source (2nd Edition)
ISBN: 0321427610
Year: 2007