How is the presentation time of two streams synchronised?
I have to synchronise an MPEG-4 ES stream and a WAVE file. I am able to send the two
streams together by creating a single ServerMediaSession and adding two separate
ServerMediaSubsessions, but they are not synchronised.
In the case of MPEG-4 ES video, gettimeofday() is called when the constructor of
MPEGVideoStreamFramer runs; in the case of WAVE, it is called in
WAVAudioFileSource::doGetNextFrame(). I think this is why the video and audio
are not getting synchronised. In this case, how should I synchronise the audio
and video?
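(For illustration, here is a rough sketch of the stamping pattern that keeps a
source aligned with 'wall clock' time: anchor the first frame with
gettimeofday(), then advance by each frame's duration. "MyAVSource" and
"stampFrame" are hypothetical names, not live555 API; only fPresentationTime
and fDurationInMicroseconds are real FramedSource members.)

#include <sys/time.h>
#include "FramedSource.hh" // live555

// Hypothetical FramedSource subclass, shown only to illustrate the
// presentation-time stamping pattern.
class MyAVSource: public FramedSource {
protected:
  MyAVSource(UsageEnvironment& env): FramedSource(env) {}

  void stampFrame(unsigned frameDurationUs) {
    if (fPresentationTime.tv_sec == 0 && fPresentationTime.tv_usec == 0) {
      // First frame: anchor this stream to 'wall clock' time, so that
      // the audio and video sources share the same time base.
      gettimeofday(&fPresentationTime, NULL);
    } else {
      // Later frames: advance by the frame's duration, carrying
      // microsecond overflow into the seconds field.
      fPresentationTime.tv_usec += frameDurationUs;
      fPresentationTime.tv_sec += fPresentationTime.tv_usec / 1000000;
      fPresentationTime.tv_usec %= 1000000;
    }
    fDurationInMicroseconds = frameDurationUs;
  }

private:
  virtual void doGetNextFrame() {} // elided; would call stampFrame()
};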

Regards,
Nisha
 
On Sun, 25 Jul 2010 15:14:30 +0530 wrote:
>>We successfully combined the two streams into one stream and it works great.

Good. As you figured out, you can do this just by creating a single "ServerMediaSession" object, and adding two separate "ServerMediaSubsessions" to it.

>The Audio and Video are on the same URL address. As it seems to us, the Audio and Video are synchronized, but we are not sure if we need to handle it in some way (other than setting the presentation time) or whether it is all handled in your library. The only thing we are currently doing is to update the presentation time for the audio and for the video. We appreciate your input on this matter.

Yes, if the presentation times of the two streams are in sync, and aligned with 'wall clock' time (i.e., the time that you'd get by calling "gettimeofday()"), and you are using RTCP (which is implemented by default in "OnDemandServerMediaSubsession"), then you will see A/V synchronization in standards-compliant clients.
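(One way to observe this from the client side: live555's RTPSource has a
hasBeenSynchronizedUsingRTCP() method, so a client can tell when a stream's
presentation times have been trued up by an RTCP "SR" from the server. A small
sketch, assuming a MediaSubsession* obtained from a standard RTSP client
setup:)

#include "liveMedia.hh"

// Returns True once this stream's presentation times have been
// RTCP-synchronized; until then, the two streams' presentation
// times may not be directly comparable.
Boolean isRTCPSynced(MediaSubsession* subsession) {
  return subsession->rtpSource() != NULL
      && subsession->rtpSource()->hasBeenSynchronizedUsingRTCP();
}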

-- 
Ross Finlayson
Live Networks, Inc.
http://www.live555.com/

_______________________________________________
live-devel mailing list
[email protected]
http://lists.live555.com/mailman/listinfo/live-devel
