Flavor (Formal Language for Audio-Visual Object Representation) is an object-oriented media representation language designed to simplify the development of applications with a significant media processing component (encoding, decoding, editing, manipulation, etc.). It provides a formal way to describe any coded audio-visual or general multimedia bitstream, and it comes with a translator that can automatically generate C++/Java code from the Flavor description. The generated code can readily be used as a bitstream parser, generator, or tracing tool.
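To give a flavor of the language, the sketch below shows what a description might look like: a C++/Java-style class whose fields carry explicit bit lengths, which the translator turns into parsing/generation code. The class and field names here are purely illustrative, not taken from any standard description.

```
// A hypothetical header: a sync pattern, a version, and a length field.
class SimpleHeader {
    bit(16) sync = 0xFFF0;      // parse 16 bits; value must equal 0xFFF0
    unsigned int(8) version;    // 8-bit version number
    unsigned int(16) length;    // payload length in bytes
    if (version > 1)
        unsigned int(8) flags;  // present only in newer versions
}
```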

Since Version 5.0, the translator has been enhanced to support XML. With the enhanced translator, the generated C++/Java code can also include a method for producing XML documents that correspond to the bitstreams described by Flavor. The description can also be used to generate a corresponding XML schema. Additionally, a software tool (bitgen) is provided for converting the XML representation of multimedia data back into bitstream form. The enhanced translator and bitgen comprise XFlavor, a framework for providing XML features in media representation. Using XFlavor, multimedia data can be converted back and forth between the binary and XML representations.
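The binary-to-XML round trip can be illustrated with a hand-rolled sketch in Python. This is not XFlavor's actual API or output format; the element names and field layout are invented here to show the kind of lossless conversion that the generated code and bitgen automate.

```python
import struct
import xml.etree.ElementTree as ET

# Illustrative only: round-trip a tiny fixed-layout binary header
# (1-byte version, 2-byte big-endian length) through XML.

def header_to_xml(data: bytes) -> str:
    """Decode the binary header and emit it as an XML document."""
    version, length = struct.unpack(">BH", data)
    root = ET.Element("SimpleHeader")
    ET.SubElement(root, "version").text = str(version)
    ET.SubElement(root, "length").text = str(length)
    return ET.tostring(root, encoding="unicode")

def xml_to_header(xml_text: str) -> bytes:
    """Rebuild the original binary header from its XML form."""
    root = ET.fromstring(xml_text)
    version = int(root.findtext("version"))
    length = int(root.findtext("length"))
    return struct.pack(">BH", version, length)

original = struct.pack(">BH", 2, 1024)
assert xml_to_header(header_to_xml(original)) == original
```

Once the data is in XML form, generic XML tools (XSLT, schema validators, editors) can operate on it before it is converted back to a compact bitstream.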

Flavor is used in the Systems and Structured Audio parts of the ISO/IEC MPEG-4 standard for the representation of the bitstream syntax. The software is available for the Win32 platform as part of the Flavor package; since the complete source code is included, it can be recompiled for any other platform. Instructions for building the source code in Win32/Unix/Linux environments are included in the package as well.

For more information, see the Overview and Papers pages. The specification and the manual are also available on-line. We welcome your feedback on both this web site and the Flavor design and software; see the Contact Us page on how to get in touch.

This material is based upon work supported by the National Science Foundation under Grant No. 0313116. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.