If the local computer is included in the broadcast specification (which it always is in the general broadcast), the datagram is received on Windows XP and on Mac OS 9.2. But it is not received on OS X.
To my thinking, the OS X behavior is goofy. It now occurs to me that maybe that behavior, goofy or not, is traditional on unix. (Maybe some folks would think it very irreverent to describe TCP/IP behavior as goofy on a descendant of BSD unix.)
Anybody know what is up here? Does this happen on linux, too?
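In case anyone wants to try reproducing this, here is a minimal sketch using Python's standard socket module (the port number and payload are arbitrary choices of mine, not anything from Revolution). It sends a UDP datagram and checks whether a socket on the same machine receives a copy; the broadcast-address case is exactly the one that seems to differ across platforms, so I've left it commented out rather than claim what it will print on your system.

```python
import socket

def send_and_listen(dest, port, payload, timeout=2.0):
    """Send a UDP datagram to dest:port and report whether a socket on
    this same machine receives a copy. Returns the payload or None."""
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    recv.bind(("", port))        # bind to all interfaces on this port
    recv.settimeout(timeout)

    send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # SO_BROADCAST must be enabled before sending to a broadcast address.
    send.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    send.sendto(payload, (dest, port))

    try:
        data, _ = recv.recvfrom(1024)
        return data
    except socket.timeout:
        return None              # no local copy was delivered
    finally:
        recv.close()
        send.close()

# Plain loopback delivery should work everywhere:
print(send_and_listen("127.0.0.1", 41234, b"hello"))
# The platform-dependent case from the question -- whether the sender's
# own machine sees its broadcast -- can be tried like this:
# print(send_and_listen("255.255.255.255", 41234, b"hello"))
```

On systems showing the behavior described above, the commented-out broadcast call would return None even though the same datagram reaches other hosts on the subnet.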
Dar Scott
_______________________________________________
use-revolution mailing list
[EMAIL PROTECTED]
http://lists.runrev.com/mailman/listinfo/use-revolution
