We are dealing with a synchronization problem known as multipoint or
inter-destination synchronization in multiparty conferencing systems: the
playout processes of the same streams at different receivers are
synchronized so that playout happens at the same time, to achieve fairness
among the receivers. A typical example is a tele-teaching application in
which a teacher multicasts a video sequence (a documentary or film, i.e. a
stored-content stream) and occasionally makes comments about the video
during the session (a live-content stream). Network quizzes are another
example: the same multimedia question must appear at the same time at all
participants to guarantee fair play. In the first example, simultaneous
playout of the streams is important for both the stored-content and the
live-content stream. Even if we only send the video stream (documentary or
film), each video MDU (frame) should be played simultaneously at all the
receivers (students), so that the students can then discuss the video
content with the other students.

When the source sends more than one simultaneous stream and the streams
have temporal relationships among them, one of them serves as the master
stream on which multipoint synchronization is based. In our case the source
sends audio and video streams, and audio is the master stream. At each
receiver we must then also ensure inter-stream synchronization, i.e.
synchronization between the audio and video streams: the temporal
relationships between the media data units (MDUs) of audio and video must
be maintained.
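
For instance, once the playout instant of a master audio frame has been
fixed, the playout instant of a video frame follows from the
capture-timestamp offset between the two. A minimal sketch in C (our own
illustration, not ffmpeg API; all names are hypothetical and timestamps
are in seconds):

/* all names hypothetical; timestamps in seconds */
double video_playout_instant(double audio_playout, /* computed instant of the master audio frame */
                             double audio_pts,     /* capture timestamp of that audio frame      */
                             double video_pts)     /* capture timestamp of the video frame       */
{
    /* preserve the original audio/video temporal offset */
    return audio_playout + (video_pts - audio_pts);
}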

Our Proposed Approach
1) We adaptively calculate the expected playout instant of each audio
frame, which is the same for all the receivers. Since different receivers
receive the same frame at different times, playing out the same frame at
the same time everywhere requires shrinking or extending silent zones while
keeping the audio data itself unchanged (see the sketch after this list).

2) Based on the expected playout instant of the audio frame, we calculate
the expected playout instants of the video frames with the help of their
inter-media relationships.

3) We play the audio and video frames at their expected playout instants.
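
A minimal sketch of steps 1 and 3, again our own illustration rather than
ffmpeg code (frame_t, the silence flag and the agreed target_delay are
hypothetical): every receiver derives the same expected instant from the
sender capture timestamp plus an agreed target delay, and arrival-time
differences are absorbed by stretching or dropping silence only.

#include <stdio.h>
#include <time.h>

typedef struct {
    double capture_ts;  /* sender capture timestamp, seconds */
    int    silent;      /* 1 if this is a silence frame      */
} frame_t;

static double now_sec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

static void play_audio_frame(const frame_t *f, double target_delay)
{
    double expected = f->capture_ts + target_delay; /* identical at every receiver */
    double now = now_sec();

    if (now < expected) {
        /* frame is early: wait, i.e. extend the current silent zone */
        double wait = expected - now;
        struct timespec d = { (time_t)wait, (long)((wait - (time_t)wait) * 1e9) };
        nanosleep(&d, NULL);
    } else if (f->silent) {
        /* frame is late and silent: drop it, i.e. shrink the silent zone */
        return;
    }
    printf("play frame ts=%.3f\n", f->capture_ts); /* stand-in for actual audio output */
}

int main(void)
{
    frame_t f = { now_sec() - 0.050, 0 }; /* frame that arrived 50 ms after capture */
    play_audio_frame(&f, 0.100);          /* agreed target delay of 100 ms          */
    return 0;
}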

In response to your mail, I am sending you an attached document in which I
have explained our algorithm. We are dealing with real-time multiparty
conferencing, where we try to achieve multipoint and inter-stream
synchronization. We want to use the ffmpeg libraries to manipulate the
playout instant of individual audio and video frames, but I do not know
which part of the ffmpeg libraries I should target to manipulate individual
audio and video frames, nor the structures and functions used in the
libraries. It would be great if you could help us in this matter.
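
To make the question concrete: the closest we have found so far is a plain
demuxing loop with libavformat's av_read_frame(), which exposes each
packet's pts together with the stream time_base. A minimal sketch, assuming
FFmpeg 3.1 or newer (error handling mostly omitted):

#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavutil/rational.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    AVPacket pkt;

    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    while (av_read_frame(fmt, &pkt) >= 0) {
        AVStream *st = fmt->streams[pkt.stream_index];
        if (pkt.pts != AV_NOPTS_VALUE)
            printf("stream %d (type %d) pts %.3f s\n",
                   pkt.stream_index,
                   (int)st->codecpar->codec_type,
                   pkt.pts * av_q2d(st->time_base));
        /* our computed expected playout instant would be applied here */
        av_packet_unref(&pkt);
    }
    avformat_close_input(&fmt);
    return 0;
}

What we cannot tell is whether adjusting timestamps at this packet level is
the right place, or whether we should instead work on the decoded AVFrame
side after avcodec_send_packet()/avcodec_receive_frame().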

prajna