A second attempt - hope this time it will go through...

Dear Mark:

      Thanks a lot for the quick response.

On Mon, 28 Jun 2010, Mark Abraham wrote:

>
>
> ----- Original Message -----
> From: Pinchas Aped <[email protected]>
> Date: Monday, June 28, 2010 3:43
> Subject: [gmx-users] Problems with installing mpi-version of Gromacs
> To: Discussion list for GROMACS users <[email protected]>
>
>
> > Dear All:
> >
> > I have installed Gromacs on a Linux (RedHat) cluster, using the
> > following csh script -
> >
> > -------------------------------------------------------------------------------------
> > #! /bin/csh
> >
> > set DIR=/private/gnss/Gromacs
> >
> > setenv SOFT ${DIR}/software
> > setenv CPPFLAGS "-I$SOFT/include"
> > setenv LDFLAGS "-L$SOFT/lib"
> > setenv NCPU 4
> > setenv PATH "$PATH":$SOFT/bin
> >
> > cd openmpi-1.2.8; ./configure --prefix=$SOFT; make -j $NCPU; make install
> > cd ../fftw-3.1.3; ./configure --prefix=$SOFT --enable-float; make -j $NCPU; make install
> > cd ../gsl-1.11; ./configure --prefix=$SOFT; make -j $NCPU; make install
> > cd ../gromacs-4.0.7; ./configure --prefix=$SOFT --with-gsl; make -j $NCPU; make install
> > make distclean; ./configure --prefix=$SOFT --program-suffix=_mpi --enable-mpi --with-gsl; make mdrun -j $NCPU; make install-mdrun
> > -------------------------------------------------------------------------------------
> >
> > It seemed to have worked OK, and we could run Gromacs on a single
> > processor.
> >
> > When I tried to create a parallel version with the script -
> >
> > -------------------------------------------------------------------------------------
> > #! /bin/csh
> >
> > set DIR=/private/gnss/Gromacs
> >
> > setenv SOFT ${DIR}/software
> > setenv CPPFLAGS "-I$SOFT/include"
> > setenv LDFLAGS "-L$SOFT/lib"
> > setenv NCPU 4
> > setenv PATH "$PATH":$SOFT/bin
> >
> > cd gromacs-4.0.7; ./configure --prefix=$SOFT --with-gsl --enable-mpi; make -j $NCPU mdrun; make install-mdrun
> > -------------------------------------------------------------------------------------
> >
> > - the installation log ended with -
> >
> > .........
> > checking whether the MPI cc command works... configure: error: Cannot
> > compile and link MPI code with mpicc
> > make: *** No rule to make target `mdrun'.  Stop.
> > make: *** No rule to make target `install-mdrun'.  Stop.
> >
> > I can't figure out from this message what is wrong or missing with my MPI.
>
>  It looks like you included the scripts the wrong way around, or something. 
> Both scripts should build MPI-enabled mdrun, with the second not naming it 
> with _mpi. See 
> http://www.gromacs.org/index.php?title=Download_%26_Installation/Installation_Instructions
>  for the usual procedure.
>

     This is the site from which I took the above script content.

>
> You can inspect the last 100 or so lines of config.log to see the actual 
> error.
>

     The second installation log file (with MPI) is only 28 lines long, and
the lines I have quoted are the first that seem to indicate a problem.
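
     For the record, this is roughly how I looked through the log (a sketch;
it assumes configure was run in the gromacs-4.0.7 build directory, so
config.log sits there):

```shell
# Sketch: pull the real failure out of configure's log.
# Assumes the current directory is where ./configure was run.
if [ -f config.log ]; then
    tail -n 100 config.log                      # failing compile/link commands sit near the end
    grep -n -i 'mpicc' config.log | tail -n 20  # lines mentioning the MPI compiler wrapper
else
    echo "config.log not found"
fi
```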

>
> The issue has probably got nothing to do with GROMACS. Try compiling and 
> running some MPI test program to establish this.
>

     I will.
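
     In case it helps anyone searching the archive later: a minimal
standalone check along the lines Mark suggests might look like the sketch
below. The file name hello.c and the process count are my own choices,
nothing GROMACS-specific.

```shell
#!/bin/sh
# Minimal MPI sanity check, independent of GROMACS:
# write a tiny MPI program, build it with mpicc, run it on 2 processes.
cat > hello.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("Hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
if command -v mpicc >/dev/null 2>&1; then
    mpicc hello.c -o hello && mpirun -np 2 ./hello
else
    echo 'mpicc not on PATH (is $SOFT/bin in PATH?)'
fi
```

If this fails in the same way, the problem is in the MPI installation
itself rather than in the GROMACS configure step.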

     By the way, our cluster has 8-core nodes (2 quad-core CPUs each). Is it
possible to run Gromacs in "shared-memory" parallel mode, thus avoiding
the need for MPI?

>
> Mark
>

------------------------------------------------------------------------
Dr. Pinchas Aped                      Tel.:   (+972-3) 531-7683
Department of Chemistry               FAX :   (+972-3) 738-4053
Bar-Ilan University                   E-Mail: [email protected]
52900 Ramat-Gan, ISRAEL               WWW:    http://www.biu.ac.il/~aped
------------------------------------------------------------------------



-- 
gmx-users mailing list    [email protected]
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to [email protected].
Can't post? Read http://www.gromacs.org/mailing_lists/users.php
