On Thu, Feb 13, 2014 at 6:39 PM, Thomas Schlesier wrote:
Hi,
I'm no expert on this stuff, but could it be that you generate about 40
of the #my_mol.log.$n# files (probably only 39)?
It could be that the 'mpirun' starts 40 'mdrun' jobs and each generates
its own output.
For GROMACS 4.6.x I always used
mdrun -nt X ...
to start a parallel run (where X is the number of threads).
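As a rough sketch of the difference (the core count of 8, the 40 ranks, and
-deffnm my_mol below are placeholders, not values taken from this thread):

    # Thread-parallel run on a single node with GROMACS 4.6.x;
    # X (here 8) is the total number of threads.
    mdrun -nt 8 -deffnm my_mol

    # If mdrun itself is not MPI-enabled, a launch like this starts 40
    # independent copies of the same run; each copy backs up the existing
    # log, which is where files like #my_mol.log.1# ... #my_mol.log.39#
    # would come from.
    mpirun -np 40 mdrun -deffnm my_mol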
Hi Mousumi,
from the fact that you get lots of backup files directly at the beginning
I suspect that your mdrun is not MPI-enabled. This behavior is exactly what
one would get when launching a number of serial mdrun processes on the same input file.
Maybe you need to look for an mdrun_mpi executable.
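One way to check which flavor you have (a sketch only; binary names differ
between installations, and mdrun_mpi is just a common naming convention):

    # The version header reports how the binary was built;
    # "MPI library: thread_mpi" means no external MPI support.
    mdrun -version | grep -i "MPI library"

    # With a real MPI build, the run is launched through mpirun, e.g.:
    mpirun -np 40 mdrun_mpi -deffnm my_mol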
Dear GROMACS users,
I am facing a strange situation when running GROMACS (v4.6.3) on our local
CPU cluster using MPI/OpenMP parallelization. I am trying to simulate a
large heterogeneous aqueous polymer system in an octahedral box.
I use the following command to run my simulations and use SGE