Hi, I'm developing a mini Flash server in C++. It supports a minimal set of actions: so far I can connect, create streams, and handle a publish.
The problem is that when I do the publish, the server starts consuming data from the Flash client. So far so good... but the Flash client stops sending data after a period of time (I'm unable to determine the length of this working period: one time it stops after 3 packets, another time after 3000 packets...). The connection stays up, and the Ethereal dump shows me that I am sending the BytesReceived message. I'm 100% sure that I'm making a mistake in the bandwidth control, but I don't know where. Could you help me with any hint, idea, anything? I'm kind of running out of time here... :((

P.S. As far as I understand it, the BytesReceived message must be sent after X or more bytes have been received, and only after a complete message or message chunk has been received. X is set by the client through a Server BW message; if the client doesn't send a Server BW message, X defaults to 125000 bytes. This is what red5 is doing (or should be...). You start counting the received bytes from the very beginning of the connection, including the handshake.

Thank you!!!

--
Andrei Gavriloaie
Programmer, SQSG (www.sqsg.ro)
e-Mail: [EMAIL PROTECTED]
Mobile: +40722537658
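For what it's worth, the bookkeeping rules from the P.S. can be sketched like this. This is only a minimal illustration of that description, not code from the server in question; the names (`AckTracker`, `onBytesReceived`, `ackDue`) are hypothetical, and the caller is assumed to invoke `ackDue()` only at complete-message/chunk boundaries and to send the actual BytesReceived message itself.

```cpp
#include <cassert>
#include <cstdint>

// Sketch of the acknowledgement ("BytesReceived") bookkeeping described
// in the P.S. above. All names here are made up for illustration.
class AckTracker {
public:
    // Count every byte read from the socket, starting from the very
    // beginning of the connection -- handshake bytes included.
    void onBytesReceived(uint32_t n) { totalReceived_ += n; }

    // Window size ("Server BW" message) sent by the peer.
    void setWindow(uint32_t w) { window_ = w; }

    // Call only after a complete message or message chunk has been
    // received. Returns true when X or more bytes have arrived since the
    // last acknowledgement; the caller then sends BytesReceived(total()).
    bool ackDue() {
        if (totalReceived_ - lastAcked_ >= window_) {
            lastAcked_ = totalReceived_;
            return true;
        }
        return false;
    }

    uint32_t total() const { return totalReceived_; }

private:
    uint32_t totalReceived_ = 0;
    uint32_t lastAcked_ = 0;
    uint32_t window_ = 125000; // default when no Server BW message arrives
};
```

One design point that matters here: the comparison is against bytes received since the *last* acknowledgement, so a client that never sends Server BW still gets an ack every 125000 bytes rather than only once.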
_______________________________________________
osflash mailing list
[email protected]
http://osflash.org/mailman/listinfo/osflash_osflash.org
