I'm not sure that PD has any advantage here. From memory, it has to
create a 128x1x1 grid, and you can direct that with DD as well.
See mdrun -h -hidden for -dd.
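For example, a sketch of forcing an explicit 128x1x1 DD grid (the binary path and .tpr name are assumed from the command earlier in this thread; -dd takes three integers, one cell count per box dimension):

```shell
# Force a 128x1x1 domain-decomposition grid instead of particle decomposition.
# Path and input file are illustrative, copied from the earlier -pd command.
mpiexec -np 128 /../mdrun_mpi -dd 128 1 1 -s full031K_mdrun_ions.tpr
```

This is a cluster-only invocation and depends on your MPI setup, so treat it as a starting point, not a tested recipe.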
Mark
The contents of your .log file will be far more helpful than stdout in
diagnosing what condition led to the problem.
Mark
On 24/12/2010 9:59 PM, Wojtyczka, André wrote:
On 23/12/2010 10:01 PM, Wojtyczka, André wrote:
Dear Gromacs Enthusiasts.
I am experiencing problems with mdrun_mpi (4.5.3) on a Nehalem cluster.
Problem description below.
This runs fine:
mpiexec -np 72 /../mdrun_mpi -pd -s full031K_mdrun_ions.tpr
This produces a segmentation fault:
mpiexec -np 128