I have a small sockets stack that listens on a UDP port (accept) and sends datagrams to that port at an IP address. If the address is a broadcast address, the behavior is not consistent across the platforms I have tested. This applies whether the broadcast is general (255.255.255.255) or subnet-specific (for example, 10.99.255.255).

If the local computer is included in the broadcast specification (which it always is for the general broadcast), the datagram is received on Windows XP and on Mac OS 9.2, but it is not received on OS X.

To my thinking, the OS X behavior is goofy. But it occurs to me that maybe that behavior, goofy or not, is traditional on Unix. (Some folks might think it very irreverent to describe TCP/IP behavior as goofy on a descendant of BSD Unix.)

Does anybody know what is up here? Does this happen on Linux, too?

Dar Scott

_______________________________________________
use-revolution mailing list
[EMAIL PROTECTED]
http://lists.runrev.com/mailman/listinfo/use-revolution