Makoto Yoneya wrote:
Dear Mark:
Thank you for your quick comment.
Well, this looks like a classic problem of incompatibility between your
hardware and these LAM versions, one that does not arise with MPICH. You
should look for help there, since the problem is not caused by
GROMACS - unless you can demonstrate that LAM works correctly on your
hardware with other parallel software.
OK, I should check our installation of lam itself as suggested.
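(A quick way to test the LAM installation independently of GROMACS is a
minimal MPI program. This is just a generic sketch; it assumes LAM's
mpicc is on the path and that the LAM universe has been booted with
lamboot:

/* mpi_hello.c - minimal check that MPI itself works across the nodes.
 * Build: mpicc -o mpi_hello mpi_hello.c
 * Run:   lamboot && mpirun -np 4 ./mpi_hello
 */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int  rank, size, len;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_processor_name(host, &len);
    printf("rank %d of %d running on %s\n", rank, size, host);
    MPI_Finalize();
    return 0;
}

If every rank reports in from the expected hosts, LAM itself is probably
fine and the problem lies elsewhere.)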
However, the test simulation results (e.g. densities of the system)
looked different between LAM and MPICH2 (I should check this further).
How can I resolve this situation?
It sounds like GROMACS+MPICH is working fine...
Then is MPICH (the same as MPICH2?) the safer choice for new hardware,
and LAM the risky one?
MPICH version 1.x has always given trouble with GROMACS, whereas LAM
usually works fine. MPICH v2.x should be OK, though. The difference in
load balancing between LAM and MPICH could also be related to the
problem. MPICH2 is somewhat better optimized.
Nevertheless, quantities like the average density should be independent
of the MPI library and the number of processors. So please, if you find
inconsistencies there, report them back to the list or, preferably, file
a bugzilla.
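(One way to check quantitatively: extract the density time series from
each run's energy file, e.g. with g_energy selecting the Density term,
and compare the averages. Below is a minimal sketch in C that averages
the second column of a two-column .xvg file; the file names in the
usage line are placeholders for whatever your LAM and MPICH2 runs
produced:

/* avg_xvg.c - mean and standard deviation of column 2 of a .xvg file,
 * skipping the '#' comment and '@' command lines that g_energy writes.
 * Build: cc -O2 -o avg_xvg avg_xvg.c -lm
 * Usage: ./avg_xvg density_lam.xvg density_mpich2.xvg
 */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static int average_file(const char *name, double *mean, double *sd)
{
    FILE  *fp = fopen(name, "r");
    char   line[1024];
    double t, v, sum = 0.0, sum2 = 0.0;
    long   n = 0;

    if (!fp)
        return -1;
    while (fgets(line, sizeof line, fp)) {
        if (line[0] == '#' || line[0] == '@')
            continue;                     /* skip xvg header lines */
        if (sscanf(line, "%lf %lf", &t, &v) == 2) {
            sum  += v;
            sum2 += v * v;
            n++;
        }
    }
    fclose(fp);
    if (n == 0)
        return -1;
    *mean = sum / n;
    *sd   = sqrt(sum2 / n - (*mean) * (*mean));
    return 0;
}

int main(int argc, char *argv[])
{
    int i;

    for (i = 1; i < argc; i++) {
        double mean, sd;
        if (average_file(argv[i], &mean, &sd) == 0)
            printf("%s: mean = %g, sd = %g\n", argv[i], mean, sd);
        else
            fprintf(stderr, "%s: no data\n", argv[i]);
    }
    return 0;
}

If the means differ by much more than the fluctuations suggest, that is
worth reporting.)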
--
David.
________________________________________________________________________
David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
Dept. of Cell and Molecular Biology, Uppsala University.
Husargatan 3, Box 596, 75124 Uppsala, Sweden
phone: 46 18 471 4205 fax: 46 18 511 755
[EMAIL PROTECTED] [EMAIL PROTECTED] http://folding.bmc.uu.se
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
_______________________________________________
gmx-users mailing list gmx-users@gromacs.org
http://www.gromacs.org/mailman/listinfo/gmx-users
Please don't post (un)subscribe requests to the list. Use the
www interface or send it to [EMAIL PROTECTED]
Can't post? Read http://www.gromacs.org/mailing_lists/users.php