Television and Video Recording


In 1940, the first TV station began broadcasting in Chicago. The first television sets showed images in black and white only. Throughout the 1940s, different systems for color television were tried, and finally, in 1953, the Federal Communications Commission (FCC) approved a system that was backward-compatible with the existing black-and-white system, so that even though the signal was broadcast in color, a black-and-white TV would still display the picture. This meant that people who already owned black-and-white TV sets would not have to buy new ones. The system is called the National Television System Committee (NTSC) color video standard, and it is the same system we use now for broadcast television. Figure 1.5 shows a camera from the early days of TV. Note the tubes used inside.

Figure 1.5. An early television camera from the late 1950s, a Blonder-Tongue TVC-1-C Vidicon Camera Head


Also during the early 1950s, teams were developing techniques to record the images from television cameras onto magnetic tape. In 1952, Ampex played back the first recognizable image from a video tape recorder. Throughout the rest of the 1950s and 1960s, many video recording, editing, and synchronization systems were developed, creating a maze of incompatible standards. In 1969, the Society of Motion Picture and Television Engineers (SMPTE) created the standard time code that we use today to synchronize videotape machines and identify tape locations. Chapter 3, "Synchronization and the SMPTE Time Code," is dedicated to SMPTE time code, its uses, and its variations.

Sound in Star Wars and Apocalypse Now

The debut of Star Wars in 1977 changed the world of film sound with Ben Burtt's groundbreaking sound effects. The manipulated sounds in Star Wars were highly distinctive and identifiable; for example, Burtt created an entire language of mechanical and electrical sounds for the small robot R2-D2. The use of cable vibrations to create the TIE fighter sounds was another innovation. Star Wars marked a departure from the traditional method of simply capturing the performance of musicians or actors toward creating new sounds by manipulating recordings of other sounds. The simplest way to do this is to change the speed and pitch of a recording. When a sound is slowed down greatly, it can begin to sound like something completely new and different. This way, the sound engineer is not limited to realistic sounds. Movies such as Star Wars enlarged the palette sound engineers could use.

As sound engineers became more creative, their role evolved into that of a sonic artist rather than just a technician. In response to this change, the title "Sound Designer" came into use, and it accurately described what these people were doing. The first person to be credited as a sound designer in a film was Walter Murch, for Apocalypse Now. During the 1970s, Dolby Laboratories created new sound formats, including a low-frequency effects (LFE) channel, in response to the wide sonic palette of movies like Star Wars, Apocalypse Now, and Close Encounters of the Third Kind. Apocalypse Now was released in Dolby Split Surround. With six discrete channels for left, center, right, left surround, right surround, and LFE, Dolby Split Surround is the predecessor of Dolby Digital, the 5.1 surround format we hear on DVDs and in home theaters today. Movie theater sound has evolved even further, using 6.1 and even 7.1 systems. Other standards include extremely large-format screens such as IMAX and its wraparound cousin Omnimax, both of which have high-fidelity, multi-channel sound systems.

Internet Video

With the advent of the Internet, video technology entered a new era. The ability to transmit moving images across Internet connections has been a continual motivating force behind the advancement of video technology. In just a few short years, Internet video has evolved from early herky-jerky clips to entire movies streaming over broadband connections. Figure 1.6 shows a frame from a Flash movie, a popular format for Internet media. A slew of audio and video codecs have been created to improve the quality of movies and music transmitted via the Internet and other digital media.

Figure 1.6. A frame from a Flash movie called "The Fly Family," for which I helped create the audio.


NOTE

WHAT IS A CODEC?

Codec is short for compressor/decompressor (or coder/decoder). When audio/visual data is sent over the Internet or put on a CD-ROM, it is typically compressed (using a specific compression scheme, or codec) in order to fit more information in the same amount of space. Some examples of AV codecs and the file formats that carry them are MPEG, AVI, QuickTime, AVR77 (Avid Media Composer's standard), and MP3 (for audio only).
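To make the compress/decompress round trip concrete, here is a toy run-length "codec" sketch in Python. Real AV codecs such as MPEG are vastly more sophisticated (they exploit perceptual models and motion between frames), but the basic contract is the same: the decompressor must reconstruct usable data from the smaller compressed stream. All names here are illustrative, not from any real codec library.

```python
def rle_compress(data: bytes) -> bytes:
    """Toy run-length codec: store the data as (run length, value) pairs."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        # Count how many times the current byte repeats (max 255 per pair).
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decompress(packed: bytes) -> bytes:
    """Reverse the compression: expand each (run length, value) pair."""
    out = bytearray()
    for i in range(0, len(packed), 2):
        out += bytes([packed[i + 1]]) * packed[i]
    return bytes(out)

# A "frame" with large flat areas compresses very well with this scheme.
frame = b"\x00" * 100 + b"\xff" * 20
packed = rle_compress(frame)
assert rle_decompress(packed) == frame
print(len(frame), "->", len(packed))  # 120 -> 4
```

Runs of identical bytes shrink dramatically, while noisy data would actually grow; that trade-off is why different codecs suit different kinds of material.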

HDTV

In the mid-1970s, the Japanese company NHK started developing a new broadcast technology that would be much higher in quality than the current television system. The results of the research and development were astounding, and were the foundation of our present-day digital TV. In 1975, Francis Ford Coppola was introduced to an early HDTV system in the basement of NHK while visiting Japan. He envisioned that films would eventually be made entirely digitally and that the creative process would benefit greatly from this use of technology. Today, with HDTV already being broadcast in many areas, Mr. Coppola's vision is becoming a reality.

HDTV is capable of generating images that are wider than a traditional television picture. The dimensions of these images are expressed as a ratio of the width to the height, called the aspect ratio. Normal television has an aspect ratio of 4:3. HDTV is capable of a 16:9 aspect ratio, as seen in Figure 1.7. This advancement in television technology reflects the desire to emulate the film industry: HDTV is trying to compete with an experience that until now has been available only in theaters. The image quality of HDTV is amazing, and with its ability to deliver high-fidelity sound, many feel that the home theater sight and sound experience now rivals that of commercial theaters. The HDTV broadcast standard includes specifications for the delivery of six channels of surround sound in the form of Dolby AC3. AC3 is a surround format that is currently used on DVDs to provide discrete 5.1-channel sound. This compressed format can also be used for transmission of HDTV via satellite or local broadcast.
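The width-to-height arithmetic behind aspect ratios can be sketched in a few lines of Python. The function name is illustrative, but the math follows directly from the ratios given above: a 16:9 frame that is 1280 pixels wide needs 1280 × 9 ÷ 16 = 720 lines, while a 4:3 frame 640 pixels wide needs 480.

```python
def height_for_width(width_px: int, aspect_w: int, aspect_h: int) -> int:
    """Return the frame height that matches the given aspect ratio.

    The aspect ratio is width:height, so height = width * aspect_h / aspect_w.
    """
    return width_px * aspect_h // aspect_w

print(height_for_width(1280, 16, 9))  # 720 (widescreen HDTV)
print(height_for_width(640, 4, 3))    # 480 (traditional television)
```

The same relationship works in reverse (width = height × aspect_w ÷ aspect_h), which is handy when converting between standard-definition and widescreen frame sizes.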

Figure 1.7. The Samsung LTM405W LCD screen is capable of displaying HDTV signals with a resolution of 1280x768 pixels. It retains the 16:9 aspect ratio common to widescreen theaters. This screen is 40 inches wide.


An increasing number of films today are created with HD cameras and editing equipment and then transferred to film for presentation in theaters. Some theaters have already installed digital projection systems capable of projecting large, high-quality images. Star Wars: Attack of the Clones was shot, edited, and, in some theaters, projected entirely in the digital domain.

From Thomas Edison's first experiments with films such as The Sneeze to George Lucas and Attack of the Clones, audio-visual technology has evolved at a breakneck pace for more than a century, and it shows no signs of slowing down.



Pro Tools for Video, Film, and Multimedia
Year: 2003