Note: This is not about the networking code in the JDK itself, so it's somewhat off-topic.

I have a test (test/sun/security/krb5/auto/BadKdc*.java, run in @run/othervm mode) that starts a UDP server listening for requests from the same machine. The server is started in a daemon thread, and 3 requests are sent at least 6 seconds later, with an interval of 2 seconds between them. About 20% of the time, the server does not see the 1st request at all; the 2nd and 3rd requests are always received. A rough sketch of the setup follows.
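The shape of the harness is roughly this (a minimal standalone sketch, not the actual BadKdc test code; the class name, message contents, and buffer sizes are all made up for illustration):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    public class UdpProbe {
        public static void main(String[] args) throws Exception {
            // Server side: bind a UDP socket on loopback with an
            // ephemeral port and log every datagram received.
            DatagramSocket server = new DatagramSocket(
                    0, InetAddress.getByName("127.0.0.1"));
            int port = server.getLocalPort();
            Thread t = new Thread(() -> {
                byte[] buf = new byte[1024];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                try {
                    while (true) {
                        server.receive(p);
                        System.out.println("server saw " + p.getLength() + " bytes");
                    }
                } catch (Exception e) {
                    // socket closed; let the daemon thread exit
                }
            });
            t.setDaemon(true);
            t.start();

            // Give the daemon thread plenty of time to start listening.
            Thread.sleep(6000);

            // Client side: 3 requests, 2 seconds apart, to the same machine.
            try (DatagramSocket client = new DatagramSocket()) {
                byte[] msg = "ping".getBytes();
                for (int i = 0; i < 3; i++) {
                    client.send(new DatagramPacket(msg, msg.length,
                            InetAddress.getByName("127.0.0.1"), port));
                    Thread.sleep(2000);
                }
            }
        }
    }

In this sketch it's the first client.send() that intermittently never shows up on the server side.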

I use JPRT to run the test. This only happens on the solaris-i586 and solaris-x64 platforms; the failure never occurs on Linux or Windows.

Does anyone know what the reason might be? I've doubled the time intervals, but that doesn't seem to help.

BTW, the test actually sends 3x3 UDP requests to localhost; only the first 6 are sent to random ports (roughly as sketched below).
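If it helps to picture that part, the "random ports" behavior is essentially the following (a hypothetical illustration, not the test's actual code; the port range and message are assumptions):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.util.Random;

    public class RandomPortSender {
        public static void main(String[] args) throws Exception {
            // The first requests go to random ports where (most likely)
            // nothing is listening, simulating unreachable KDCs.
            InetAddress localhost = InetAddress.getByName("127.0.0.1");
            Random r = new Random();
            byte[] msg = "ping".getBytes();
            try (DatagramSocket s = new DatagramSocket()) {
                for (int i = 0; i < 6; i++) {
                    int port = 1024 + r.nextInt(64511); // random non-privileged port
                    s.send(new DatagramPacket(msg, msg.length, localhost, port));
                }
            }
        }
    }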

Thanks
Max
