On Mon, Mar 18, 2013 at 4:54 PM, Robert O'Callahan <[email protected]> wrote:
> I've been thinking about how to fix this better and permanently. It's
> tricky because there are a lot of competing requirements.
>
> We already have a plan with ongoing work for reducing MediaStreamGraph
> latency. We just landed libcubeb support for all platforms, Paul Adenot
> is working on reducing its latency, and we're going to drive MSG off the
> libcubeb callbacks.
>
> As far as I know there are two major problems with the way MSG video
> works right now:
>
> 1) In WebRTC we don't want to hold up playing audio for a time interval
> [T1, T2] until all video frames up to and including T2 have been decoded
> (MSG currently requires this). We'd rather just go ahead and play the
> audio and, if video decoding has fallen behind audio, render the latest
> video frame as soon as it's available (preferably without waiting for an
> MSG iteration). Of course, if video decoding is far enough ahead we
> should sync video frames to the audio instead (and then MSG needs to be
> involved, since it has the audio track(s)).

It's probably worth mentioning at this point that the current WebRTC
video implementation (like the gUM one) just returns the latest video
frame upon request. So if, say, two video frames come in during the
period between NotifyPull()s, we just deliver the most recent one.
Obviously we could buffer them and deliver them as two segments, but if
we moved to a model where we pushed video onto the MSG (which is what
GIPS expects), we wouldn't bother. Note that in GIPS, video frames have
times of arrival but no durations, so there is a difficult match there
as well.

-Ekr

> 2) Various devices implement stream capture using ring buffers and
> therefore don't really want to give away references to image buffers
> that can live indefinitely ... so these image buffers aren't a good fit
> for the Image object, which allows Gecko code to keep an Image alive
> indefinitely ... unless we make copies of images, which of course we
> want to avoid.
> So we'd really like a SourceMediaStream to be able to manage the
> lifetimes of its frames most of the time, and make frame copies (if
> necessary) only in exceptional cases.
>
> Let me know if there are important issues I've overlooked. And share
> your ideas if you have a solution. I'm still thinking :-).
>
> Rob
> --
> Wrfhf pnyyrq gurz gbtrgure naq fnvq, “Lbh xabj gung gur ehyref bs gur
> Tragvyrf ybeq vg bire gurz, naq gurve uvtu bssvpvnyf rkrepvfr nhgubevgl
> bire gurz. Abg fb jvgu lbh. Vafgrnq, jubrire jnagf gb orpbzr terng nzbat
> lbh zhfg or lbhe freinag, naq jubrire jnagf gb or svefg zhfg or lbhe fynir
> — whfg nf gur Fba bs Zna qvq abg pbzr gb or freirq, ohg gb freir, naq gb
> tvir uvf yvsr nf n enafbz sbe znal.” [Znggurj 20:25-28]
> _______________________________________________
> dev-media mailing list
> [email protected]
> https://lists.mozilla.org/listinfo/dev-media
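[A sketch of the "latest frame wins" behavior described above, for readers following along: a holder that keeps only the most recent frame between NotifyPull() calls. This is an illustrative assumption, not actual Gecko code — Frame, VideoFrameHolder, and the method names are made up for the example.]

```cpp
#include <cassert>
#include <cstdint>
#include <optional>

// A frame as GIPS hands them over: a time of arrival, but no duration.
struct Frame {
  int64_t arrivalTimeUs;
  int id;
};

class VideoFrameHolder {
 public:
  // Called whenever a frame arrives from the capture/decode pipeline.
  // A newer frame simply overwrites any frame not yet pulled, so if two
  // frames arrive between NotifyPull()s, only the most recent survives.
  void OnFrame(const Frame& aFrame) { mLatest = aFrame; }

  // Called from NotifyPull(): hand out the most recent frame, if any,
  // and clear it so the same frame is not delivered twice.
  std::optional<Frame> TakeLatest() {
    std::optional<Frame> f = mLatest;
    mLatest.reset();
    return f;
  }

 private:
  std::optional<Frame> mLatest;
};
```

Buffering and delivering both frames as two segments would instead mean keeping a queue here; under the push model the graph would just receive each frame as it arrives.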
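[For point 2, one possible shape of "manage frame lifetimes, copy only in exceptional cases" is copy-on-recycle: the ring buffer keeps ownership of its slots and exports zero-copy handles, and a frame is copied only if a consumer still holds it when the producer must reuse that slot. This is a hypothetical sketch under those assumptions — CaptureRing, FrameHandle, and all method names are invented for illustration, not Gecko/MSG APIs.]

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <cstring>
#include <memory>
#include <vector>

constexpr size_t kSlots = 2;       // tiny ring for illustration
constexpr size_t kFrameBytes = 4;  // pretend frame payload size

struct FrameHandle {
  const uint8_t* data;         // points into the ring slot while it is live
  std::vector<uint8_t> owned;  // holds a private copy after recycling
};

class CaptureRing {
 public:
  // Producer side: claim the next slot. If a consumer still holds the old
  // frame in that slot (the exceptional case), snapshot it into the
  // handle's own storage before the slot is overwritten.
  uint8_t* BeginWrite() {
    Slot& s = mSlots[mNext];
    if (auto h = s.exported.lock()) {
      h->owned.assign(s.bytes.begin(), s.bytes.end());
      h->data = h->owned.data();  // repoint the handle at its own copy
    }
    s.exported.reset();
    return s.bytes.data();
  }
  void EndWrite() { mNext = (mNext + 1) % kSlots; }

  // Consumer side: export the most recently written frame without copying.
  std::shared_ptr<FrameHandle> ExportLatest() {
    Slot& s = mSlots[(mNext + kSlots - 1) % kSlots];
    auto h = std::make_shared<FrameHandle>();
    h->data = s.bytes.data();
    s.exported = h;
    return h;
  }

 private:
  struct Slot {
    std::array<uint8_t, kFrameBytes> bytes{};
    std::weak_ptr<FrameHandle> exported;  // alive while a consumer holds it
  };
  std::array<Slot, kSlots> mSlots;
  size_t mNext = 0;
};
```

The common path (consumer drops the handle before the ring wraps) never copies; only a long-lived reference, like an Image kept alive indefinitely by Gecko code, pays for a copy.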

