On 11 Jan 2011, at 23:00, Chris Pearce wrote:

> On 12/01/2011 11:20 a.m., Rob Coenen wrote:
>> I can imagine there are 'virtual' frames, where say frame 1 to 10 is
>> actually the same frame and internally encoded as frame 1 with a duration of
>> 10 frames?
> 
> Yes, as I understand it, this is a legal encoding.

Though keep in mind that for most decoders the output frame buffer(s) will be 
polled/read/pushed at a regular rate - if only to help with audio sync and 
flicker - and that this rate is driven/inspired/set by metadata in the encoded 
stream.

>> Even then I'd like the 'virtual' FPS of the WebM file exposed to the
>> webbrowser- similar to how my other utilities report a FPS.
> 
> If the 'virtual' FPS value isn't provided by the container, and given that 
> the frame durations could potentially have any distribution and that the 
> media may not be fully downloaded, how can this be effectively calculated?

I cannot think of a format where this would in fact be the case - except for a 
few arcane ones like an animated push GIF without a loop.
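
For what it's worth, an 'effective' rate could at least be approximated from the 
frame timestamps seen so far, even on a partial download. A rough TypeScript 
sketch - the function name estimateFps and the median-of-durations heuristic are 
just mine, not anything a container or browser exposes:

// Rough sketch, assuming we only have the presentation timestamps (in
// seconds) of whatever frames have been decoded so far.
function estimateFps(frameTimes: number[]): number | null {
  if (frameTimes.length < 2) {
    return null; // not enough samples to estimate anything
  }
  // Durations between consecutive frames, in seconds.
  const durations: number[] = [];
  for (let i = 1; i < frameTimes.length; i++) {
    durations.push(frameTimes[i] - frameTimes[i - 1]);
  }
  // Use the median duration so the occasional long "virtual" frame
  // (one frame held for many frame periods) doesn't skew the result.
  durations.sort((a, b) => a - b);
  const mid = Math.floor(durations.length / 2);
  const median = durations.length % 2 === 0
    ? (durations[mid - 1] + durations[mid]) / 2
    : durations[mid];
  return median > 0 ? 1 / median : null;
}

// Example: 40 ms spacing except for one frame held for 400 ms.
console.log(estimateFps([0, 0.04, 0.08, 0.12, 0.52, 0.56])); // ~25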

>> This way one could build web-tools in HTML5 that allow to access each
>> individual frame and do other things than simply playing back the movie in a
>> linear fashion from beginning to end.
> 
> I think we've discussed this sort of thing before, and roughly agreed that we'd 
> look at processing frame data in a per-frame callback which runs in a web 
> worker thread, if we could agree on some use cases which couldn't be achieved 
> using SVG filters.

That sounds like a good path - so we should focus on pinning down a few such use 
cases.
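
To give that use-case hunt something concrete to beat, here is a rough sketch of 
what can already be approximated with <video>, <canvas> and a Worker - stepping 
through the movie by seeking and posting raw pixels off to a worker. The 
worker.js script, the movie.webm URL and the hard-coded 1/25 step are all 
placeholders of mine; the guessed step is exactly the gap an exposed FPS (or a 
proper per-frame callback) would close:

const video = document.createElement("video");
const canvas = document.createElement("canvas");
const worker = new Worker("worker.js"); // placeholder worker script

video.src = "movie.webm"; // placeholder URL
video.addEventListener("loadedmetadata", () => {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  video.currentTime = 0; // seek to (roughly) the first frame
});

// Step through the movie by seeking; each 'seeked' event gives us a
// frame we can copy into the canvas and hand to the worker as pixels.
video.addEventListener("seeked", () => {
  const ctx = canvas.getContext("2d");
  if (!ctx) {
    return;
  }
  ctx.drawImage(video, 0, 0);
  const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
  worker.postMessage({ time: video.currentTime, frame: frame });

  const step = 1 / 25; // guessed frame period - the real FPS isn't exposed
  if (video.currentTime + step < video.duration) {
    video.currentTime += step; // triggers the next 'seeked'
  }
});

The use cases worth writing down would be the ones where this pipeline, plus SVG 
filters, clearly isn't enough.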

> That discussion was in this thread:
> http://www.mail-archive.com/[email protected]/msg23533.html

That is very useful. Thanks!

Dw.
