Dear Gromacs users and developers,

I have encountered a segmentation fault with Gromacs 4.0.3. It happens only in a parallel run (particle decomposition, open boundary conditions); the same setup runs fine in serial. I can provide a small test case that should reproduce it, if anyone is interested. The stack trace is:
starting mdrun 'Neat water' 500000 steps, 500.0 ps.

[isis:23154] *** Process received signal ***
[isis:23154] Signal: Segmentation fault (11)
[isis:23154] Signal code: Address not mapped (1)
[isis:23154] Failing at address: (nil)
[isis:23154] [ 0] /lib/libpthread.so.0 [0x7f41e64050f0]
[isis:23154] [ 1] /opt/gromacs-4.0.3/double/lib/libmd_mpi_d.so.5(vrescale_tcoupl+0x184) [0x7f41e8b3048c]
[isis:23154] [ 2] /opt/gromacs-4.0.3/double/lib/libmd_mpi_d.so.5(update+0x1fa) [0x7f41e8bad03c]
[isis:23154] [ 3] mdrun_d(do_md+0x25f9) [0x416ed1]
[isis:23154] [ 4] mdrun_d(mdrunner+0xf35) [0x419bd3]
[isis:23154] [ 5] mdrun_d(main+0x55f) [0x41a35f]
[isis:23154] [ 6] /lib/libc.so.6(__libc_start_main+0xe6) [0x7f41e60a2466]
[isis:23154] [ 7] mdrun_d [0x4062d9]
[isis:23154] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 23154 on node isis exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------

Since the crash occurs in vrescale_tcoupl, I tried switching to the Berendsen thermostat; with that, the calculation runs fine even in parallel. This looks like a bug to me. Has anyone else seen this?

Best regards,
Ondrej Marsalek

_______________________________________________
gmx-users mailing list
[email protected]
http://www.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to [email protected].
Can't post? Read http://www.gromacs.org/mailing_lists/users.php
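For anyone wanting to try this, the only setting involved is the thermostat selection in the .mdp file. A minimal sketch follows; the coupling group, tau_t, and ref_t values here are illustrative placeholders, not taken from the actual test case:

```
; triggers the segfault in parallel (particle decomposition) with 4.0.3:
tcoupl     = V-rescale
tc-grps    = System
tau_t      = 0.1
ref_t      = 300

; workaround: switching the thermostat avoids the crash
; tcoupl   = berendsen
```

With tcoupl = berendsen and everything else unchanged, the same parallel run completes normally.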

