Hello everyone.
First, I just want to say thank you for answering my questions whenever I run into trouble with this great library.
I have implemented a very basic client-server application: the server renders a scene with OpenGL, and the client lets the user view and manipulate that scene as a video stream.
My problem now is that I want to reduce the delay between the server and the client; there is about a 5-second lag between the two machines.
I checked the AVCodecContext struct looking for parameters that might reduce this time, and found things like hurry_up, max_b_frames and the delay field. I have tried them, but the delay remains.
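For reference, this is roughly how I set up the encoder context at the moment (just a sketch of my own code; the function name and the numeric values are my own choices, not anything taken from the documentation):

#include <libavcodec/avcodec.h>

/* Rough sketch of my current encoder setup.  enc_ctx is the
 * AVCodecContext I pass to avcodec_open(); the values below are my own
 * guesses for low delay, not recommendations from the docs. */
static void configure_low_delay(AVCodecContext *enc_ctx)
{
    enc_ctx->max_b_frames = 0;            /* no B-frames, so no reordering delay */
    enc_ctx->gop_size     = 12;           /* short GOP so the client resyncs quickly */
    enc_ctx->flags       |= CODEC_FLAG_LOW_DELAY;  /* ask the codec to minimise internal delay */
    /* enc_ctx->delay is set by the encoder itself, so I only read it
     * after avcodec_open() to see how many frames of latency it adds. */
}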
I read in the ffserver manual:
"When you connect to a live stream, most players (WMP, RA, etc) want to buffer
a certain number of seconds of material so that they can display the signal
continuously. However, ffserver (by default) starts sending data in realtime.
This means that there is a pause of a few seconds while the buffering is being
done by the player. The good news is that this can be cured by adding a
'?buffer=5' to the end of the URL. This means that the stream should start 5
seconds in the past -- and so the first 5 seconds of the stream are sent as
fast as the network will allow. It will then slow down to real time. This
noticeably improves the startup experience."
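If I understand the quoted text correctly, the idea is simply to send everything inside the first few seconds without pacing, and only then start sleeping to match real time. The sketch below is how I imagine it (this is only my reading of the manual, not ffserver's actual code; next_packet(), get_time() and send_packet() are placeholders for my own queue, clock and network code):

#include <unistd.h>

#define BUFFER_SECONDS 5.0

typedef struct {
    double pts_sec;        /* presentation time of the packet, in seconds */
    /* ... encoded data ... */
} MyPacket;

/* Sketch: packets whose timestamps fall inside the first BUFFER_SECONDS
 * go out as fast as the network allows; everything after that is paced
 * to real time, as the manual describes for '?buffer=5'. */
void stream_loop(double stream_start_wallclock)
{
    MyPacket *pkt;
    while ((pkt = next_packet()) != NULL) {            /* my own queue */
        double now = get_time();                       /* wall clock, seconds */
        double due = stream_start_wallclock + pkt->pts_sec;

        if (pkt->pts_sec > BUFFER_SECONDS && due > now)
            usleep((useconds_t)((due - now) * 1e6));   /* pace only past the buffer window */

        send_packet(pkt);                              /* my own network code */
    }
}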
I have searched the source code, but I cannot work out how this is implemented, or how to reproduce this capability in my own code.
Any help will be very much appreciated.
Thanks.
PS: I am sorry for my poor English.