Re: [Wien] MPI error

2021-05-06 Thread Laurence Marks
Peter beat me to the response -- please do as he says and move stepwise forward, posting single steps if they fail. On Thu, May 6, 2021 at 10:38 AM Peter Blaha wrote:
> Once the blacs problem has been fixed, the next step is to run lapw0 in
> sequential and parallel mode.
>
> Add:
>
> x lapw0

Re: [Wien] MPI error

2021-05-06 Thread Peter Blaha
Once the blacs problem has been fixed, the next step is to run lapw0 in sequential and parallel mode.

Add:

x lapw0

and check the case.output0 and case.scf0 files (copy them to a different name) as well as the message from the queuing system.

Add:

mpirun -np 4 $WIENROOT/lapw0_mpi
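A minimal sketch of that serial-vs-parallel check, assuming the usual WIEN2k conventions (the files are named case.output0 / case.scf0 after the case directory, and x lapw0 writes the lapw0.def definition file); the copy names are only illustrative:

  # serial lapw0; keep copies of the outputs for later comparison
  x lapw0
  cp case.output0 case.output0_serial
  cp case.scf0 case.scf0_serial

  # 4-process mpi run; lapw0.def was written by the x lapw0 call above
  mpirun -np 4 $WIENROOT/lapw0_mpi lapw0.def

  # the parallel case.scf0 should agree with the serial one
  diff case.scf0_serial case.scf0

If the mpi run crashes, post the exact error message from the queuing system together with the two output files.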

Re: [Wien] MPI error

2021-05-06 Thread Peter Blaha
One thing is clear: lapw1_mpi cannot work. You are linking with -lmkl_blacs_intelmpi_lp64 but you are using openmpi. You need to link with the blacs library for openmpi. It is mentioned in the usersguide. On 06.05.2021 at 15:09, leila mollabashi wrote: Dear all wien2k users, >I
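As a hedged illustration only (the exact variable names and library list come from your own siteconfig setup and MKL version), the fix amounts to swapping the Intel-MPI BLACS interface library in the parallel link options for the openmpi one that MKL also ships:

  # before: BLACS interface built against Intel MPI (will not work under openmpi)
  RP_LIBS = -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 $(R_LIBS)

  # after: MKL's BLACS interface for openmpi
  RP_LIBS = -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64 $(R_LIBS)

After changing this in siteconfig_lapw, recompile the mpi programs so that lapw1_mpi and the other *_mpi binaries are linked against the correct blacs.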

Re: [Wien] MPI error

2021-05-06 Thread leila mollabashi
Dear all wien2k users,
> I suggest that you focus on the PATH first, using
I followed your suggestion. The script and results are at https://files.fm/u/m2qak574g. The compile.msg_lapw0 file is at https://files.fm/u/pvdn52zpw . Sincerely yours, Leila On Wed, May 5,