5.4 Client-Design Issues

The design of the client will, as always, be dominated by questions of cost, functionality, and convenience. The range of available client devices is wide, extending from a full-fledged personal computer or workstation, through an optimized personal computer and an upgraded television set with internal or external set-top capability, to a mobile phone with video/graphical capabilities. For instance, the IST SAMBITS[113] project, driven by Siemens Corporate Technology, Heinrich-Hertz Institute, Philips France, and Fraunhofer Gesellschaft, aimed to provide digital video broadcasting (DVB) with complementary Internet services to set-top boxes at home.

SAMBITS is a push application: in contrast to a pull application, where users issue requests to an MMDBMS, media and metadata are continuously streamed to the end user. The aim of SAMBITS and related projects is to develop real-time technology for interactive multimedia services in such a push scenario.

Metadata play an important role in the future of interactive television. Interactive broadcasting will combine traditional television with additional services, including Internet access, a content environment based on DVB-MHP (DVB multimedia home platform), interactivity through the use of MPEG-4, and content navigation and search through the use of MPEG-7. [114] Television viewers will be able to call up background information related to the program. For an advertisement, one may receive additional information on the product features and prices and may display the homepage of the product. In a music contest, the viewer can watch the show and, in addition, see backstage footage, look at a video clip showing the artist at home, and retrieve metadata on the song and the artist, such as his or her birthday, favorite meal, and so on.

In view of the emerging scalable coding technologies, the synchronization of the different streams of an audiovisual document is of importance, as is the synchronization of the media documents with the exchanged descriptive information. While a solution has been proposed for synchronizing MPEG-2 with metadata (e.g., MPEG-7), synchronization mechanisms for MPEG-7 and MPEG-4 access units and the cross-referencing of MPEG-7 and MPEG-4 data are still under development.

Apart from the above hardware considerations, decisions about the software interfaces are of importance. Most related works have proposed highly proprietary solutions, reflected in the hardware prerequisites and the software tools employed. [115], [116], [117] Moreover, the possibly limited availability of client resources raises not only the question of which functions should be executed locally and which at the server, but also the question of how to adapt the media content so that the client resources are used effectively. This is accentuated by the growing need for personalized environments.

Several multimedia software frameworks have been defined to help the programmer define and implement multimedia applications; for instance, the Java Media Framework (JMF)[118] for Java and MET++[119] for C++. Let us consider JMF as a very popular use case.

5.4.1 Software Media Framework Use Case: JMF

JMF (the current version can be downloaded from http://java.sun.com/products/java-media/jmf/) is a large and versatile API (application program interface) for creating Java programs that play back a wide variety of time-based media formats such as video and audio.

The popularity of JMF comes from its broad media support, including AVI (Audio Video Interleave), QuickTime, AIFF (Audio Interchange File Format), AU (Audio File Format), WAV (Windows Wave File Format), RealMedia, MPEG-1, MPEG-2, and MPEG-4, and its protocol support for UDP, RTP, TCP, and so on. In addition, JMF incorporates media capture capabilities as well as playback. By programming to the JMF API, developers can create platform-independent applications and applets that synchronize media playback. JMF can be combined with other software frameworks, such as JavaBeans, allowing media playback to be incorporated into component-based programs. JMF can be used to:

  • Play multimedia files in a Java application or applet and play streaming media format from the Internet

  • Capture audio and video with a microphone and video camera, then store the data in a format supported by JMF

  • Transmit audio and video over the Internet and broadcast live radio and television programs

  • Process time-based media; for example, build a chain of analysis tools that perform content adaptation and video segmentation, render the video for display, and so on (a sketch of such content adaptation follows this list)
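
The following is a minimal sketch of the content-adaptation idea using a JMF Processor: the processor is opened on a media file, asked to produce QuickTime output, and its output is written to a file through a DataSink. The file names pisa.mpg and adapted.mov, the class name, and the choice of QuickTime as the target format are our own assumptions, not part of JMF, and whether a particular conversion succeeds depends on the codecs installed with JMF.

import javax.media.*;
import javax.media.protocol.*;

public class AdaptToQuickTime {
    public static void main(String[] args) throws Exception {
        // Open a Processor on the input file (hypothetical example file).
        Processor processor = Manager.createProcessor(
                new MediaLocator(new java.io.File("pisa.mpg").toURI().toURL()));

        processor.configure();                         // move toward the Configured state
        waitForState(processor, Processor.Configured);

        // Request QuickTime as the output content type (the adaptation step).
        processor.setContentDescriptor(
                new ContentDescriptor(FileTypeDescriptor.QUICKTIME));

        processor.realize();
        waitForState(processor, Controller.Realized);

        // Feed the processor's output into a DataSink for storage.
        DataSource output = processor.getDataOutput();
        DataSink sink = Manager.createDataSink(
                output, new MediaLocator("file:adapted.mov"));
        sink.open();
        sink.start();
        processor.start();
    }

    // Simple polling helper; production code would use a ControllerListener.
    private static void waitForState(Processor p, int state) throws InterruptedException {
        while (p.getState() < state) {
            Thread.sleep(50);
        }
    }
}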

Implementing a JMF media player will demonstrate the programming paradigm of JMF. A media player simply takes a stream of audio or video as input and renders it to the speakers or the screen.

The JMF media player has to prepare itself and its data source before it can start playing the media. JMF defines six states in a player (e.g., Realizing, Prefetching, Started). Contained within the com.sun.media.content.video and com.sun.media.content.audio subpackages are classes capable of parsing and handling the variety of content types supported by JMF. All of these classes implement the javax.media.Player interface, and an object of the correct type is automatically instantiated to match the type of media being loaded, so the developer only needs to address an object of type Player.
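
The player's progress through these states is typically observed with a ControllerListener. The following minimal sketch (our own, not the book's example code; the class name StateWatcher is hypothetical) advances a player through realizing, prefetching, and starting.

import javax.media.*;

// Register with player.addControllerListener(new StateWatcher()) and then
// call player.realize(); the listener drives the remaining transitions.
public class StateWatcher implements ControllerListener {
    public void controllerUpdate(ControllerEvent event) {
        Player player = (Player) event.getSourceController();
        if (event instanceof RealizeCompleteEvent) {
            // Realized: the media format is known; UI components can now be obtained.
            player.prefetch();
        } else if (event instanceof PrefetchCompleteEvent) {
            // Prefetched: buffers are filled; playback may begin.
            player.start();
        } else if (event instanceof EndOfMediaEvent) {
            // Playback finished: release the player's resources.
            player.stop();
            player.close();
        }
    }
}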

For example, assume that a player will display our MPEG-1 video file pisa.mpg. When this video is loaded using the JMF API, an object of type com.sun.media.content.video.mpeg.Handler is created to deal with this specific content type. Then, a Player object is created using a reference to either a local or a remote media resource. Player objects provide methods that return the user-interface components needed to render the media content, as well as a default set of controls for that content, both of which can be added to a standard Java user interface.
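
A minimal sketch of this pattern is given below. It assumes a local file pisa.mpg in the working directory and uses Manager.createRealizedPlayer, which blocks until the Realized state is reached, so the user-interface components are immediately available; the class name SimplePlayerFrame and the frame layout are our own choices, not part of JMF.

import java.awt.*;
import java.io.File;
import javax.media.*;

public class SimplePlayerFrame {
    public static void main(String[] args) throws Exception {
        // Create a player for the local file and block until it is Realized.
        Player player = Manager.createRealizedPlayer(
                new MediaLocator(new File("pisa.mpg").toURI().toURL()));

        Frame frame = new Frame("JMF Player");
        frame.setLayout(new BorderLayout());

        Component video = player.getVisualComponent();           // null for audio-only media
        Component controls = player.getControlPanelComponent();  // default transport controls
        if (video != null) {
            frame.add(video, BorderLayout.CENTER);
        }
        if (controls != null) {
            frame.add(controls, BorderLayout.SOUTH);
        }

        frame.pack();
        frame.setVisible(true);
        player.start();  // prefetches if necessary, then begins playback
    }
}

Using createRealizedPlayer avoids the explicit state handling shown earlier; when finer control over the state transitions is needed, a ControllerListener such as the StateWatcher sketch can be used instead.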

Exhibit 5.7 shows the standard player application using the default controls and the media property panel for our player object.

Exhibit 5.7: JMF Standard Player application showing our example video, pisa.mpg.


[113]SAMBITS is an acronym for System for Advanced Multimedia Broadcast and Information Technology Services. More information on SAMBITS may be found at http://www.irt.de/sambits/.

[114]Crysandt, H. and Wellhausen, J., Music classification with MPEG-7, in Proceedings of the SPIE International Conference on Electronic Imaging—Storage and Retrieval for Media Databases, Santa Clara, January 2003.

[115]Lauff, M. and Gellersen, H.-W., Multimedia client implementation on personal digital assistants, in Proceedings of the Interactive Distributed Multimedia Systems and Telecommunication Services Workshop, Darmstadt, September 1997, LNCS 1309, Springer-Verlag, New York.

[116]Hu, M.J., Wang, T.F., Boon, T.C., and Lian, C.W., Distributed multimedia database: configuration and application, in Proceedings of the International Conference on Information, Communications and Signal Processing (ICICS), Sydney, November 1999, LNCS 1726, Springer-Verlag, New York.

[117]Frankewitsch, T. and Prokosch, H.U., Image database, image proxy-server and search engine, Proc. Int. Am. Med. Inf. Assn., pp. 765–769, 1999.

[118]http://java.sun.com/products/java-media/jmf/.

[119]http://www.ifi.unizh.ch/groups/mml/projects/met++/met++.html.


