Silvia, all,

We're working with multitrack MPEG transport streams, and we have an 
implementation of the TimedTrack interface that integrates with in-band 
metadata tracks.  Our prototype uses Metadata Cues to synchronize a 
JavaScript application with a video stream using the stream's embedded EISS 
signaling.  This approach is working very well so far.

The biggest issue we've faced is that there isn't an obvious way to tell the 
browser application what type of information is contained in the metadata 
track/cues.  The Cues can contain arbitrary text, but neither the Cue nor the 
associated TimedTrack has a mechanism for specifying the format or meaning of 
that text.

Our current implementation uses the Cue's @identifier to carry a MIME type 
and puts the associated metadata into the Cue's text field as XML.  This 
works, but it requires the JavaScript application to examine each cue to see 
whether it contains information it understands.  It also requires the video 
player to follow this convention for Metadata TimedTracks.
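To make the cost of that convention concrete, here is a minimal sketch of the 
per-cue dispatch it forces on the application.  The cue objects and the 
"application/x-eiss+xml" MIME type are illustrative stand-ins, not part of 
any spec:

```javascript
// Hypothetical sketch: under the @identifier-as-MIME-type convention, the
// application must inspect every metadata cue to decide whether it
// understands the payload.  Cue objects here are simplified stand-ins for
// the real TimedTrack Cue interface.
const handlers = {
  // Assumed MIME type for EISS-derived metadata; illustrative only.
  "application/x-eiss+xml": (xmlText) => {
    console.log("EISS cue payload:", xmlText);
  },
};

function handleMetadataCue(cue) {
  // Convention: cue.identifier carries the MIME type,
  // cue.text carries the XML payload.
  const handler = handlers[cue.identifier];
  if (!handler) {
    return false; // unrecognized format: skip this cue
  }
  handler(cue.text);
  return true;
}
```

Note that every cue on every metadata track pays this lookup, even when the 
application cares about none of them.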

Adding a @type attribute to the Cues would certainly help, though it would 
still require the browser application to examine individual cues to see 
whether they were useful.

An alternate approach would be to add a @type attribute to the <track> 
tag/TimedTrack that would specify the MIME type for the associated cues.  
This would allow a browser application to determine from the TimedTrack alone 
whether it needed to process the associated cues.
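Under that alternative (the @type attribute is a proposal, not part of any 
current spec), the check moves from individual cues up to the track level.  
A sketch, with simplified track objects standing in for TimedTracks:

```javascript
// Hypothetical sketch of the proposed track-level @type attribute, e.g.:
//   <track kind="metadata" type="application/x-eiss+xml" src="...">
// The track objects below are simplified stand-ins for TimedTrack.
function findUsefulTracks(tracks, supportedTypes) {
  // With @type on the track, the application can select whole tracks
  // up front, without ever looking inside their cues.
  return tracks.filter(
    (track) => track.kind === "metadata" && supportedTypes.includes(track.type)
  );
}
```

The application would then attach cue handlers only to the tracks this 
returns, and could ignore all other metadata tracks entirely.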

Eric
---
Eric Winkelman
CableLabs

> -----Original Message-----
> From: [email protected] [mailto:whatwg-
> [email protected]] On Behalf Of Silvia Pfeiffer
> Sent: Wednesday, February 09, 2011 5:41 PM
> To: WHAT Working Group
> Subject: [whatwg] How to handle multitrack media resources in HTML
> 
> Hi all,
> 
> One particular issue that hasn't had much discussion here yet is the issue of
> how to deal with multitrack media resources or media resources that have
> associated synchronized audio and video resources.
> I'm concretely referring to such things as audio descriptions, sign language
> video, and dubbed audio tracks.
> 
> We require an API that can expose such extra tracks to the user and to
> JavaScript. This should be independent of whether the tracks are actually
> inside the media resource or are given as separate resources, but should be
> linked to the main media resource through markup.
> 
> I am bringing this up now because solutions may have an influence on the
> inner workings of TimedTrack and the <track> element, so before we have
> any implementations of <track>, we should be very certain that we are
> happy with the way in which it works - in particular that <track> continues to
> stay an empty element.
> 
> We've had some preliminary discussions about this in the W3C Accessibility
> Task Force and the alternatives that we could think about are captured in
> http://www.w3.org/WAI/PF/HTML/wiki/Media_Multitrack_Media_API . This
> may not be the complete list of possible solutions, but it provides ideas for
> the different approaches that can be taken.
> 
> I'd like to see what people's opinions are about them.
> 
> Note there are also discussion threads about this at the W3C both in the
> Accessibility TF [1] and the HTML Working Group [2], but I am curious about
> input from the wider community.
> 
> So check out
> http://www.w3.org/WAI/PF/HTML/wiki/Media_Multitrack_Media_API
> and share your opinions.
> 
> Cheers,
> Silvia.
> 
> [1] http://lists.w3.org/Archives/Public/public-html-a11y/2011Feb/0057.html
> [2] http://lists.w3.org/Archives/Public/public-html/2011Feb/0205.html