Hi, Mark,

I've noted the minimum cell diameter restriction, but I still had no idea how to adjust the related parameters after reading the manual section mentioned by the error message. I don't understand the algorithm well, so I fell back on the '-pd' option. :)
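Just to be concrete, what I ended up running was roughly the following (the file names and the MPI-enabled binary name are only placeholders from my setup):

  grompp -f md.mdp -c conf.gro -p topol.top -o topol.tpr
  mpirun -np 6 mdrun_mpi -s topol.tpr -pd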
About choosing double precision: I noticed that normal mode analysis needs the double-precision version of some programs, and I didn't know whether I should also use the double-precision version of mdrun for the covariance analysis, so I just chose the double one!
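Concretely, I built the double-precision tools and used them for everything, roughly like this (the _d suffix is just the usual double-precision build name, and the file names are placeholders):

  mdrun_d -deffnm md
  g_covar_d -f md.trr -s md.tpr -o eigenval.xvg -v eigenvec.trr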
I'm also not sure which ensemble to use for the covariance analysis. I've noticed that temperature coupling will 'correct' the motion of the atoms, and I think a trajectory that is as 'natural' as possible, with the fewest artifacts, should be generated for covariance analysis. Any comments on this?
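For what it's worth, I was thinking of collecting the trajectory for the analysis with settings along these lines (just a sketch of the idea, and I'm not at all sure that switching the coupling off for the production part is the right call):

  integrator  = md
  tcoupl      = no        ; or keep a weak thermostat such as v-rescale?
  pcoupl      = no
  constraints = hbonds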
----- Original Message -----
From: [email protected]
Sent: Tuesday, June 1, 2010 21:59
Subject: Re: [gmx-users] “Fatal error in PMPI_Bcast: Other MPI error, ….." occurs when using the 'particle decomposition' option.
To: Discussion list for GROMACS users <[email protected]>

> Hi, Mark,
> Thanks for the reply!
> It seemed that I got something messed up. At the beginning, I used 'constraints = all-bonds' and 'domain decomposition'.
> When the simulation scaled to more than 2 processes, an error like the one below occurred:

The "domain_decomposition" .mdp flag is an artefact of pre-GROMACS-4 development of DD. It does nothing. Forget about it. DD is enabled by default unless you use mdrun -pd.

> ####################
> Fatal error: There is no domain decomposition for 6 nodes that is compatible with the given box and a minimum cell size of 2.06375 nm
> Change the number of nodes or mdrun option -rcon or -dds or your LINCS settings
> Look in the log file for details on the domain decomposition
> ####################

With DD and all-bonds, the coupled constraints create a minimum cell diameter that must be satisfied on all processors. Your system is too small for this to be true. The manual sections on DD mention this, though perhaps you wouldn't pick that up on a first reading.
> I referred to the manual and found no answer. Then I turned to 'particle decomposition' and tried all kinds of things, including changing mpich to lammpi, changing GROMACS from 4.0.5 to 4.0.7, and adjusting the .mdp file (e.g. 'constraints = hbonds' or no PME), and none of these took effect!

I thought I had tried 'constraints = hbonds' with 'domain decomposition', at least with lammpi. PD might fail for a similar reason, I suppose.

> However, when I tried 'constraints = hbonds' and 'domain decomposition' under mpich today, it scaled to more than 2 processes well! And now it also scales well under lammpi using 'constraints = hbonds' and 'domain decomposition'!

Yep. Your constraints are not so tightly coupled now.

> So, it seemed the key is 'constraints = hbonds' for 'domain decomposition'.

Knowing how your tools work is key :-) The problem with complex tools like GROMACS is knowing what's worth knowing :-)

> Of course, the simulation still crashed when using 'particle decomposition' with 'constraints = hbonds or all-bonds', and I don't know why.

Again, your system is probably too small to be bothered with parallelising with constraints.

> I use a double precision version and the NTP ensemble to perform a PCA!

I doubt that you need to collect data in double precision. Any supposed extra accuracy of integration is probably getting swamped by noise from temperature coupling. I suppose you may wish to run the analysis tool in double, but it'll read a single-precision trajectory just fine. Using single precision will make things more than a factor of two faster.
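(Something like

  mdrun -deffnm md
  g_covar_d -f md.xtc -s md.tpr

would do: the dynamics in single precision, and only the analysis tool in double if you really want it. The _d suffix is just the conventional double-precision build name, and the file names are placeholders.)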
Mark