Dear all,
I am having some difficulties running mdrun_mpi for a large number of
iterations, typically > 40000. The code hangs, not necessarily at a
fixed iteration, for every configuration I run, and this behavior shows
up under both Linux and Mac OS. The simulations range from 10^4 to 10^6
coarse-grained atoms, and I try to keep the amount of data generated
per run much smaller than 1 GB. I have worked around this issue by running
25000-iteration simulations with tpbconv restarts, as follows:
mpiexec -np 256 mdrun_mpi -nosum -v -s dppc0_1.tpr -o dppc0_1.trr -c dppc0_1.gro -e dppc0_1.edr -x dppc0_1.xtc
tpbconv -s dppc0_1.tpr -f dppc0_1.trr -e dppc0_1.edr -o dppc0_2.tpr -extend 250.0
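A sketch of what such a restart script looks like (only a sketch: the segment count of 10 and the dppc0_N naming are illustrative, and each .tpr after the first is produced by the previous tpbconv step):

#!/bin/bash
# Chain 25000-iteration segments; each tpbconv call extends the run
# by 250 ps and writes the .tpr for the next segment.
for i in $(seq 1 10); do
    next=$((i + 1))
    mpiexec -np 256 mdrun_mpi -nosum -v -s dppc0_${i}.tpr -o dppc0_${i}.trr \
        -c dppc0_${i}.gro -e dppc0_${i}.edr -x dppc0_${i}.xtc
    tpbconv -s dppc0_${i}.tpr -f dppc0_${i}.trr -e dppc0_${i}.edr \
        -o dppc0_${next}.tpr -extend 250.0
done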
With such scripts, I have been able to run 10^5 to 10^6
iterations without any problem. I was wondering whether anyone has
experienced similar problems and whether I am missing something.
I have pretty much ruled out a problem with MPI, since I have
thoroughly tested these computers with other MPI codes. I am now
wondering if there might be a problem with the output files.
I run GROMACS 4.0.2 on a Linux cluster (quad-core processors, Myrinet
and MPICH, compiled with gcc, single precision) on up to 256 processors,
on a Mac Pro with two quad-core processors, and on a dual-core MacBook
using the Fink package for Open MPI. All these computers have enough
available disk space for any simulation I run. A typical simulation is
a coarse-grained MD run using the MARTINI force field with .mdp files
obtained from Marrink's website.
Best,
Yves
--
Yves Dubief, Ph.D., Assistant Professor
Graduate program coordinator
University of Vermont, School of Engineering
Mechanical Engineering Program
201 D Votey Bldg, 33 Colchester Ave, Burlington, VT 05405
Tel: (1) 802 656 1930 Fax: (1) 802 656 3358
Also:
Vermont Advanced Computing Center
206 Farrell Hall, 210 Colchester Ave, Burlington, VT 05405
Tel: (1) 802 656 9830 Fax: (1) 802 656 9892
email: [email protected]
web: http://www.uvm.edu/~ydubief/