Steven Gong wrote:
> Jerky video is better than no video at all. If you are in a video 
> conference
> and your bandwidth is limited, I think the best result is you still 
> can hear
> others talking and see some of the motions of others.
>
Our application requirements go beyond video conferencing, unfortunately; it's more about the end-user live/VOD experience. We haven't looked into chat applications, which can obviously get away with much smaller video windows and jerky, freeze-frame video.

The buffering and freeze-frame problems I've been going nuts over are definitely latency problems, which FMS somehow seems to fix on the fly.

I.e. if the server streaming the video is only a few kilometres away, or in the same country as you, it's blazing quick; going across continents, however, is a problem. If it's unfixable we may have to look at geo-targeted load balancing across streaming servers, which becomes expensive because we'd have to move gigabytes of video data to each location.

Testing FMS from those same servers, one keeps a buffer length of 20 for a 400k stream while the other keeps it at 6. It's a strange difference, but FMS holds out by doing some odd dynamic video-data push that keeps the buffer length at twice the buffer time once the buffer length reaches the buffer time :)
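The doubling behaviour observed above can be sketched as a simple client-side heuristic. This is my reading of what FMS appears to do, not a documented FMS API; the function and parameter names are mine:

```python
def adjust_buffer(buffer_time, buffer_length, max_buffer=60):
    """Grow the playback buffer target once buffered data catches up to it.

    Mimics the observed FMS behaviour: when the buffered length reaches
    the current buffer time, push the target out to twice the buffer
    time (capped), so playback keeps a safety margin on high-latency links.
    """
    if buffer_length >= buffer_time:
        return min(buffer_time * 2, max_buffer)
    return buffer_time

# A 400k stream starting with a 6-second target:
target = 6
target = adjust_buffer(target, buffer_length=6)   # caught up -> target becomes 12
target = adjust_buffer(target, buffer_length=8)   # still below 12 -> stays 12
target = adjust_buffer(target, buffer_length=12)  # caught up again -> 24
```

On a low-latency link the buffer rarely catches up, so the target stays small (like the server holding 6); on a laggy one it keeps doubling until the cap (like the one holding 20).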

It could work OK for live streaming; however, it would be impossible for a hardware load balancer doing the geo targeting to target only one application, since it works on ports, and we obviously run more than one app, unless the geo targeting happened from within Red5 :D

I think what may work is a proxying solution like the FMS edge setup. I.e. someone publishes to a server located near them; that server proxies or pushes the stream to the rest of the servers, and people can subscribe to the one closest to them.
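The subscribe side of that setup boils down to picking the edge with the lowest measured latency. A minimal sketch, assuming the client measures round-trip times itself before connecting (the hostnames and numbers here are hypothetical):

```python
def closest_edge(rtts):
    """Pick the edge server with the lowest round-trip time.

    `rtts` maps edge hostname -> measured RTT in milliseconds, e.g.
    gathered by timing a small request to each edge before connecting.
    """
    return min(rtts, key=rtts.get)

# Hypothetical measurements from a viewer in Germany:
rtts = {"edge-sydney": 320, "edge-ny": 110, "edge-frankfurt": 15}
print(closest_edge(rtts))  # -> edge-frankfurt
```

Doing the measurement client-side sidesteps the port problem with a hardware balancer: the client simply connects straight to the winning edge for whichever application it wants.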

A similar thing would be nice for VOD streaming; however, the content would have to come from the master servers where the NAS is. So the geo-targeted, slaved "FMS edge" machines would collect the video stream from the master server and proxy it to the client; maybe the server could pre-buffer better than the client could :)

If the filename generator can generate a file path to load from external drives and network shared drives, what if it could do this:

file:rtmp://red5masterurl/application/thefile.flv

Would that be how the proxy would work?
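The decision the filename generator would have to make could look roughly like this. This is only a sketch of the logic, not Red5's actual generator hook; the directory, function name, and local-cache assumption are all mine, and only the origin URL comes from the example above:

```python
import os

ORIGIN = "rtmp://red5masterurl/application"  # master server from the example above

def resolve_stream(name, local_dir="/opt/red5/webapps/vod/streams"):
    """Return a local path when the file is cached on this edge machine,
    otherwise an rtmp:// URL pointing at the origin, so the edge can
    relay (and pre-buffer) the stream from the master server's NAS.
    """
    local = os.path.join(local_dir, name)
    if os.path.exists(local):
        return local
    return f"{ORIGIN}/{name}"

print(resolve_stream("thefile.flv"))
```

Whether Red5 would then treat the returned rtmp:// URL as "connect upstream and relay" rather than "open local file" is exactly the open question, i.e. whether a `file:rtmp://...` scheme could be taught to the server side.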

Let me know your thoughts/ideas; I'd really like to nut out this problem. I.e. I had a listee from Germany play a stream from my machine located in a data centre in Sydney, Australia; at 8 seconds of buffer time the latency forced it to rebuffer, whereas the stream coming from a data centre in NY rebuffered a few times and then played OK.

On the other hand, the streams coming from our data centre in NY are unplayable here, so I can't test things properly! :D

PS. If the devs need to test latency, maybe they could use my server for testing? I'm going to have to assume things are only tested on dev machines a few footsteps away :D

_______________________________________________
Red5 mailing list
[email protected]
http://osflash.org/mailman/listinfo/red5_osflash.org