Good to know. This suggests that building VASP against Open MPI
should work; perhaps there's some secret sauce in the
Makefile somewhere...? Off list, someone cited the following to me:
-----
VASP also has a forum for things like this:
http://cms.mpi.univie.ac.at/vasp-forum/forum.php
From there it looks like people have been having problems with
ifort 9.1.043 with VASP,
and from this post it looks like I'm not the only one to use Open MPI
with VASP:
http://cms.mpi.univie.ac.at/vasp-forum/forum_viewtopic.php?2.550
-----
I have not received a reply from the VASP author yet.
On May 10, 2007, at 8:52 AM, Terry Frankcombe wrote:
I have previously been running parallel VASP happily with an old,
prerelease version of Open MPI:
[terry@nocona Vasp.4.6-OpenMPI]$
head /home/terry/Install_trees/OpenMPI-1.0rc6/config.log
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
It was created by Open MPI configure 1.0rc6, which was
generated by GNU Autoconf 2.59. Invocation command line was
$ ./configure --enable-static --disable-shared
--prefix=/home/terry/bin/Local --enable-picky --disable-heterogeneous
--without-libnuma --without-slurm --without-tm F77=ifort
In my VASP makefile:
FC=/home/terry/bin/Local/bin/mpif90
OFLAG= -O3 -xP -tpp7
CPP = $(CPP_) -DMPI -DHOST=\"LinuxIFC\" -DIFC -Dkind8 -DNGZhalf \
      -DCACHE_SIZE=12000 -DPGF90 -Davoidalloc -DMPI_BLOCK=500 \
      -DRPROMU_DGEMV -DRACCMU_DGEMV
FFLAGS = -FR -lowercase -assume byterecl
As far as I can see (it was a long time ago!) I didn't use the BLACS or
ScaLAPACK libraries; I used ATLAS.
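For anyone trying to reproduce this setup: linking VASP against plain ATLAS
(rather than BLACS/ScaLAPACK) usually amounts to a makefile fragment along
these lines. The install path and exact library names below are assumptions
about a typical ATLAS build, not taken from my actual makefile; adjust them
to your installation:

```make
# Hypothetical ATLAS install prefix -- adjust to your system
ATLASDIR = /usr/local/atlas/lib

# ATLAS supplies optimized BLAS; no BLACS/ScaLAPACK link line is needed
# for a serial-linear-algebra VASP build like this one
BLAS = -L$(ATLASDIR) -lf77blas -lcblas -latlas
SCA  =
LIB  = $(BLAS)
```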
Maybe this will help.
--
Dr Terry Frankcombe
Physical Chemistry, Department of Chemistry
Göteborgs Universitet
SE-412 96 Göteborg Sweden
Ph: +46 76 224 0887 Skype: terry.frankcombe
<te...@chem.gu.se>
_______________________________________________
users mailing list
us...@open-mpi.org
http://www.open-mpi.org/mailman/listinfo.cgi/users
--
Jeff Squyres
Cisco Systems