Hi,
for my project, I've successfully done the handshake and connection from a
Flash movie to my C# application.

Video packets correctly follow the 128-byte chunk size rule, until the
body size declared in the header is consumed.
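As a minimal sketch of the chunking rule described above (my own illustration, not from any spec: a message body is carried as successive payload slices of at most the chunk size, until the declared body size is spent), the function name and sizes here are assumptions for the example:

```python
def split_into_chunks(body: bytes, chunk_size: int = 128):
    """Split a message body into payload slices of at most chunk_size bytes,
    the way the sniffed video packets appear to be segmented."""
    return [body[i:i + chunk_size] for i in range(0, len(body), chunk_size)]

# Example: a message with a declared body size of 300 bytes
chunks = split_into_chunks(bytes(300))
print([len(c) for c in chunks])  # [128, 128, 44]
```

On the wire, each continuation chunk is also preceded by a small header, which this sketch omits for clarity.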

So I can grab a video stream in my app.

But the problem begins when the stream contains audio. I've noticed that
audio packets do not follow the 128-byte chunk rule, and certainly not
the 64-byte chunk rule described in the osflash.org wiki document.

I've searched the entire web and found only copies of your document
describing RTMP, plus messages from people who, like me, are cursed by
the strange audio packet protocol, but absolutely NO ANSWERS to my
question.

I've spent at least 20 nights with a hex editor and a calculator trying
to find the logic in the sniffed audio packet data: I'm going crazy.

So please, could someone kindly explain to me HOW audio packets are sent
from Flash client applications?

Thanks for your patience.
Vincenzo.

_______________________________________________
Red5 mailing list
[email protected]
http://osflash.org/mailman/listinfo/red5_osflash.org
