Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5
Hi,

That shouldn't happen if your MPI library is working (have you tested it with other programs?) and configured properly. It's possible this is a known bug, so please let us know if you can reproduce it in the latest releases.

Mark

On Fri, Nov 8, 2013 at 6:55 AM, Qin Qiao qiaoqi...@gmail.com wrote:
> Dear all,
>
> I'm trying to continue a REMD simulation with GROMACS 4.5.5 under the NPT
> ensemble, and I got the following errors when I tried to use 2 cores per
> replica:
>
>   [node-ib-4.local:mpi_rank_25][error_sighandler] Caught error: Segmentation fault (signal 11)
>   [node-ib-13.local:mpi_rank_63][error_sighandler] Caught error: Segmentation fault (signal 11)
>   ...
>
> Surprisingly, it worked fine when I used only 1 core per replica. I have
> no idea what is causing the problem. Could you give me some advice?
>
> P.S. The command I used was:
>
>   srun .../gromacs-4.5.5-mpi-slurm/bin/mdrun_infiniband -s remd_.tpr -multi 48 -replex 1000 -deffnm remd_ -cpi remd_.cpt -append
>
> Best,
> Qin
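For reference, a minimal sketch of how one might check the job launch independently of mdrun and then reproduce the failing 2-cores-per-replica run. The srun options and the trivial test job are assumptions about the local SLURM/MVAPICH2 setup, not taken from the original post; the rank count follows from 48 replicas x 2 cores = 96 MPI ranks.

  # Sanity-check the launcher with a non-GROMACS job across the same allocation
  # (any MPI hello-world or benchmark built against the same MVAPICH2 also works).
  srun -n 96 hostname

  # Reproduce the failing continuation: 48 replicas x 2 cores per replica = 96 ranks.
  srun -n 96 .../gromacs-4.5.5-mpi-slurm/bin/mdrun_infiniband \
      -s remd_.tpr -multi 48 -replex 1000 \
      -deffnm remd_ -cpi remd_.cpt -append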
Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5
On Fri, Nov 8, 2013 at 7:18 PM, Mark Abraham mark.j.abra...@gmail.com wrote:
> Hi,
>
> That shouldn't happen if your MPI library is working (have you tested it
> with other programs?) and configured properly. It's possible this is a
> known bug, so please let us know if you can reproduce it in the latest
> releases.
>
> Mark

Hi,

I installed different versions of GROMACS with the same MPI library. Surprisingly, the problem does not occur in GROMACS 4.5.1, but it is still present in GROMACS 4.6.3. The MPI library is MVAPICH2-1.9a for InfiniBand.

Best,
Qin
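A quick way to confirm which build and MPI library each binary actually uses is to ask mdrun for its build information. This is a sketch only: the 4.6.3 install path below is a guess modelled on the 4.5.5 path above, and the amount of detail printed differs between versions (4.6.x prints the full build configuration, including the MPI and FFT libraries; if an older 4.5.x build does not accept -version, the header mdrun prints at startup shows the version instead).

  .../gromacs-4.6.3-mpi-slurm/bin/mdrun_infiniband -version
  .../gromacs-4.5.5-mpi-slurm/bin/mdrun_infiniband -version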
Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5
OK, thanks. Please open a new issue at redmine.gromacs.org, describe your observations as above, and upload a tarball of your input files.

Mark

On Fri, Nov 8, 2013 at 2:14 PM, Qin Qiao qiaoqi...@gmail.com wrote:
> I installed different versions of GROMACS with the same MPI library.
> Surprisingly, the problem does not occur in GROMACS 4.5.1, but it is still
> present in GROMACS 4.6.3. The MPI library is MVAPICH2-1.9a for InfiniBand.
>
> Best,
> Qin
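For the issue report, a minimal sketch of collecting the inputs into a tarball. The file names and globs are guesses based on -deffnm remd_ and -multi 48 (mdrun -multi numbers the per-replica files), so adjust them to whatever your input files are actually called:

  # Collect the per-replica run inputs and checkpoints for upload.
  tar czf remd_inputs.tar.gz remd_*.tpr remd_*.cpt
  # Attach remd_inputs.tar.gz to the new issue at redmine.gromacs.org.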