5. Related Work

Oomoto and Tanaka [25] have defined a video-based object-oriented data model, OVID. They take pieces of video, identify meaningful features in them, and link these features. They also outline a language called VideoSQL for querying such data.

Adali et al. [26] developed the AVIS video database system, which introduced the notions of objects, activities, roles, and players used here. They developed index structures to query such videos and algorithms to traverse these index structures to answer queries. Hwang et al. [27] developed a version of SQL to query video databases. However, neither of these efforts provided an extension of the relational model of data to process video queries.
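To make the flavor of such index structures concrete, the following is a minimal sketch of our own (not AVIS's actual data structures): a toy index from object and activity names to frame intervals, with a query answered by intersecting intervals. The names, intervals, and API are hypothetical.

```python
# Illustrative only: a toy index mapping objects and activities to frame
# intervals, with a query that intersects intervals to find where a given
# object appears during a given activity.
from collections import defaultdict

class FrameIndex:
    def __init__(self):
        self.objects = defaultdict(list)     # object name -> [(start, end)]
        self.activities = defaultdict(list)  # activity name -> [(start, end)]

    def add_object(self, name, start, end):
        self.objects[name].append((start, end))

    def add_activity(self, name, start, end):
        self.activities[name].append((start, end))

    def frames_with(self, obj, activity):
        """Frame intervals in which `obj` appears during `activity`."""
        hits = []
        for s1, e1 in self.objects[obj]:
            for s2, e2 in self.activities[activity]:
                s, e = max(s1, s2), min(e1, e2)
                if s <= e:                   # the two intervals overlap
                    hits.append((s, e))
        return hits

idx = FrameIndex()
idx.add_object("Dennis", 10, 200)            # hypothetical annotations
idx.add_activity("party", 150, 400)
print(idx.frames_with("Dennis", "party"))    # [(150, 200)]
```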

Gibbs et al. [28] study how stream-based temporal multimedia data may be modeled using object-based methods. However, concepts such as roles and players, the distinction between activities and events, and the integration of such video systems with traditional database systems are not addressed.

Hjelsvold and Midtstraum [29] develop a "generic" data model for capturing video content and structure. Their idea is that video should be included as a data type in relational databases, i.e., that relational systems such as PARADOX and INGRES should be augmented to handle video data. In particular, they study temporal queries.
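As a rough illustration of the kind of temporal query such an approach supports (this is our sketch, not their schema; the table and values are hypothetical), segment annotations stored as ordinary relational tuples can be queried for temporal overlap directly in SQL:

```python
# Hypothetical example: annotated video segments stored in a relational
# table, queried for temporal overlap with a given interval.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE segment (
        video_id   TEXT,
        start_sec  REAL,
        end_sec    REAL,
        annotation TEXT
    )
""")
conn.executemany(
    "INSERT INTO segment VALUES (?, ?, ?, ?)",
    [("news.mpg", 0.0, 45.0, "anchor intro"),
     ("news.mpg", 45.0, 120.0, "field report"),
     ("news.mpg", 120.0, 180.0, "weather")],
)

# Temporal query: which annotated segments overlap the interval [40, 60)?
rows = conn.execute(
    "SELECT annotation, start_sec, end_sec FROM segment "
    "WHERE start_sec < ? AND end_sec > ?",
    (60.0, 40.0),
).fetchall()
print(rows)  # [('anchor intro', 0.0, 45.0), ('field report', 45.0, 120.0)]
```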

Arman et al. [30] develop algorithms that operate directly on compressed video; they identify scene changes by performing computations on the DCT coefficients of JPEG- and MPEG-encoded video. Their effort complements ours neatly: their algorithms can identify, from compressed video, frame sequences that are of interest, and the objects/roles/events of these frame sequences can then be queried using our algebra.
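The sketch below conveys the general flavor of DCT-based cut detection (it is our simplification, not Arman et al.'s algorithm): compare low-frequency DCT coefficients of consecutive frames and flag a scene change when they differ sharply. The threshold and frame data are hypothetical.

```python
# Rough sketch: flag a scene change when the low-frequency DCT coefficients
# of consecutive (grayscale) frames differ by more than a threshold.
import numpy as np
from scipy.fft import dctn

def dct_signature(frame, k=8):
    """Low-frequency DCT coefficients of a frame, used as a cheap signature."""
    coeffs = dctn(frame.astype(float), norm="ortho")
    return coeffs[:k, :k].ravel()

def detect_scene_changes(frames, threshold=0.3):
    """Return indices i such that frames[i] appears to start a new scene."""
    changes = []
    prev = dct_signature(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        sig = dct_signature(frame)
        # normalized difference between consecutive DCT signatures
        diff = np.linalg.norm(sig - prev) / (np.linalg.norm(prev) + 1e-9)
        if diff > threshold:
            changes.append(i)
        prev = sig
    return changes

# Example: two flat "scenes" with an abrupt change at frame 5
frames = [np.full((64, 64), 10.0)] * 5 + [np.full((64, 64), 200.0)] * 5
print(detect_scene_changes(frames))  # [5]
```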



