Re: [Wien] Bader charge

2022-11-11 Thread leila mollabashi
…that you have the values in your case.inaim correct. Maybe there is something wrong with the clmsum/*.in* files, etc., that you used? I cannot find the source of the error. Would you please guide me? Leila Mollabashi On Fri, Nov 4, 2022 at 11:53 PM Laurence Marks wrote: > I cannot reproduce your …
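A minimal sketch of the AIM (Bader) workflow this thread refers to, assuming a converged SCF run and an existing case.inaim; the output-file name follows the usual case.output* convention:

  x aim                 # integrates the electron density inside each Bader basin
                        # for the atoms selected in case.inaim
  less case.outputaim   # inspect the integrated basin charges; the Bader charge
                        # is the nuclear charge Z minus the integrated electrons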

[Wien] Bader charge

2022-11-04 Thread leila mollabashi
…: 2.08, Cr: 1.65, O: -1.24 in Ref. [Energy Environ. Sci., 2011, 4, 4933]. These results also correctly sum to approximately zero: 2.08 + 1.65 + 3*(-1.24) ≈ 0.01. Would you please have a look at this issue and let us know the source of the above discrepancy? Sincerely yours, Leila Mollabashi

Re: [Wien] MPI Error

2021-06-19 Thread leila mollabashi
…to zero. The mpirun command will be issued on the original > node, but the lapw1_mpi executables will run as given in .machines. > > This should solve your problem. > > On 29.05.2021 at 08:39, leila mollabashi wrote: > > Dear all WIEN2k users, > > Following the previous …
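The setting described above lives in $WIENROOT/parallel_options; a minimal sketch with MPI_REMOTE set to zero (the WIEN_MPIRUN template is typical but cluster-specific):

  setenv MPI_REMOTE 0
  setenv WIEN_GRANULARITY 1
  setenv WIEN_MPIRUN "mpirun -np _NP_ -machinefile _HOSTS_ _EXEC_"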

Re: [Wien] MPI Error

2021-05-30 Thread leila mollabashi
…er the environment with ssh. > > The recommended option for mpi version 2 (all modern MPIs) is to set > MPI_REMOTE to zero. The mpirun command will be issued on the original > node, but the lapw1_mpi executables will run as given in .machines. > > This should solve your problem. …

[Wien] MPI Error

2021-05-29 Thread leila mollabashi
Dear all WIEN2k users, Following the previous comment referring me to the admin, I contacted the cluster admin. Following the admin's advice, I recompiled WIEN2k successfully using the cluster modules. > Once the blacs problem has been fixed, … For example, is the following correct? …
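The example itself is cut off here; for reference, a sketch of a typical .machines file for mpi-parallel lapw0/lapw1, assuming two 4-core nodes (node1 and node2 are placeholders):

  lapw0: node1:4
  1:node1:4
  1:node2:4
  granularity:1
  extrafine:1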

Re: [Wien] MPI error

2021-05-19 Thread leila mollabashi
Dear all WIEN2k users, Thank you for your reply and guidance. > You need to link with the blacs library for openmpi. I tried to recompile WIEN2k, linking the blacs library for openmpi as “mkl_blacs_openmpi_lp64”, but it failed due to gfortran errors. The video of this recompile is uploaded to a …
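For reference, the BLACS choice enters the SCALAPACK/RP_LIBS line set in siteconfig_lapw; a sketch of the MKL library string for OpenMPI, using Intel's link-line conventions (with gfortran, mkl_intel_lp64/mkl_intel_thread would have to be replaced by mkl_gf_lp64/mkl_gnu_thread, which may be the source of the gfortran errors mentioned above):

  -lmkl_scalapack_lp64 -lmkl_blacs_openmpi_lp64 -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread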

Re: [Wien] MPI error

2021-05-06 Thread leila mollabashi
…do not, repeat do not, use mpirun or mpiexec to start run_lapw. It has to be >> started by simply "run_lapw -p ..." by itself. >> >> I suggest that you create a very simple job which has the commands: >> >> which mpirun >> which lapw1_mpi >> echo $WIE…
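A sketch of that diagnostic job as a csh script; the truncated variable at the end of the quote is presumably $WIENROOT, but that is an assumption:

  #!/bin/csh
  # report which binaries and which WIEN2k root the batch environment sees
  which mpirun
  which lapw1_mpi
  echo $WIENROOT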

Re: [Wien] MPI error

2021-05-04 Thread leila mollabashi
…er's Guide to the Galaxy: > "I think the problem, to be quite honest with you, > is that you have never actually known what the question is." > > ==== > Dr. Gerhard H. Fecher > Institute of Physics > Johannes Gutenberg - University > 55099 Mainz…

Re: [Wien] MPI error

2021-05-02 Thread leila mollabashi
…think what nobody > else has thought", Albert Szent-Györgyi > www.numis.northwestern.edu > > On Sun, May 2, 2021, 17:12 leila mollabashi > wrote: > >> >You have an error in the LD_LIBRARY_PATH def you sent -- it needs to be >> "...:$LD_LIB..." …
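The fix being pointed out is to append to, rather than overwrite, the existing variable; a csh sketch with a placeholder MKL path:

  # wrong: replaces everything already on the path
  setenv LD_LIBRARY_PATH /opt/intel/mkl/lib/intel64
  # right: keeps the existing entries, as the "...:$LD_LIB..." hint suggests
  setenv LD_LIBRARY_PATH /opt/intel/mkl/lib/intel64:$LD_LIBRARY_PATH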

Re: [Wien] MPI error

2021-05-02 Thread leila mollabashi
t; "Research is to see what everyone else has seen, and to think what nobody > else has thought", Albert Szent-Györgyi > www.numis.northwestern.edu > > On Sun, May 2, 2021, 16:35 leila mollabashi > wrote: > >> Dear all WIEN2k users, >> >> Thank you for

Re: [Wien] MPI error

2021-05-02 Thread leila mollabashi
…“run_lapw -p” instead of my_mpi_app? I don’t know what I should do instead of input1 and output1. Sincerely yours, Leila On Mon, May 3, 2021 at 2:04 AM leila mollabashi wrote: > Dear all WIEN2k users, > > Thank you for your reply. > > > The error is exactly what it says -- mpirun n…
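The answer implied throughout this thread: nothing replaces the mpirun template, because run_lapw is started directly and launches the *_mpi binaries itself according to .machines; a sketch of the relevant batch-script line (flags illustrative):

  # inside the job script: no mpirun wrapper, no input/output redirection needed
  run_lapw -p -ec 0.0001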

Re: [Wien] MPI error

2021-05-02 Thread leila mollabashi
… > From: Wien [wien-boun...@zeus.theochem.tuwien.ac.at] on behalf of > Laurence Marks [laurence.ma...@gmail.com] > Sent: Sunday, May 2, 2021, 21:32 > To: A Mailing list for WIEN2k users > Subject: Re: [Wien] MPI error > > Inlined response and questions > > O…

Re: [Wien] MPI error

2021-05-02 Thread leila mollabashi
…sure, for smaller > cases it is a severe limitation to have only ONE mpi job with many > k-points, small matrix size and many mpi cores. > > On 23.04.2021 at 16:04, leila mollabashi wrote: > > Dear Prof. Peter Blaha and WIEN2k users, > > > > Thank you for your …

Re: [Wien] MPI error

2021-04-23 Thread leila mollabashi
…(WIEN2k) using openmpi/4.1.0_icc19. Now should I compile WIEN2k with SL or LI? Sincerely yours, Leila Mollabashi On Wed, Apr 14, 2021 at 10:34 AM Peter Blaha wrote: > It cannot initialize an mpi job, because it is missing the interface > software. > > You need to ask t…

Re: [Wien] MPI error

2021-04-13 Thread leila mollabashi
…est case it comes down to issuing the command below: srun --pty /bin/bash Sincerely yours, Leila Mollabashi On Wed, Apr 14, 2021 at 12:03 AM leila mollabashi wrote: > Dear Prof. Peter Blaha and WIEN2k users, > > Thank you for your assistance. > > > At least now the…

[Wien] MPI error

2021-04-13 Thread leila mollabashi
…(ignored) LAPW0 END [1] Done mpirun -np 4 -machinefile .machine0 /home/users/mollabashi/v19.2/lapw0_mpi lapw0.def >> .time00 Sincerely yours, Leila Mollabashi

Re: [Wien] MPI error

2021-04-12 Thread leila mollabashi
…llabashi/v19.2/lapw0_mpi lapw0.def >> .time00 0.067u 0.091s 0:02.97 5.0% 0+0k 52272+144io 54pf+0w mollabashi@eagle:~/test1/cein$ cat .machines Sincerely yours, Leila Mollabashi On Sun, Apr 11, 2021 at 9:40 PM Peter Blaha wrote: > Your script is still wrong. > The .ma…

[Wien] MPI error

2021-04-11 Thread leila mollabashi
The SLURM_NTASKS_PER_NODE: Undefined variable error happened when I used your scripts without changing them. I have tried several times, even in a new directory, with no positive effect. > SLURM_NTASKS_PER_NODE: Undefined variable. Sincerely yours, Leila Mollabashi
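One plausible cause, stated here as an assumption: SLURM exports SLURM_NTASKS_PER_NODE only when --ntasks-per-node is requested in the job, so a csh script referencing it otherwise aborts with exactly this error; a sketch:

  #!/bin/csh
  #SBATCH --nodes=2
  #SBATCH --ntasks-per-node=4
  # without the line above, $SLURM_NTASKS_PER_NODE is unset and csh
  # stops with "Undefined variable"
  echo $SLURM_NTASKS_PER_NODE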

[Wien] test

2021-04-10 Thread leila mollabashi
This is a test e-mail to check whether my e-mail can be sent to the mailing list.

[Wien] Error in MPI run

2021-03-29 Thread leila mollabashi
…Leila Mollabashi Dear Prof. Laurence Marks, Thank you for your kind reply. > Presumably you have not exported WIENROOT when you started your job, and/or it is not exported by openmpi. Check how to use mpi on your system, including exporting PATH. Since I have configured WIEN2k correc…
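A sketch of the exports being suggested, in csh syntax, reusing the installation path that appears elsewhere in this thread; the exact paths are assumptions:

  # make the WIEN2k binaries visible inside the batch job
  setenv WIENROOT /home/users/mollabashi/v19.2
  setenv PATH ${WIENROOT}:${PATH}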

[Wien] Error in MPI run

2021-03-28 Thread leila mollabashi
…No such file or directory LAPW2 - Error. Check file lapw2.error cp: cannot stat ‘.in.tmp’: No such file or directory grep: *scf1*: No such file or directory > stop error Would you please kindly guide me? Sincerely yours, Leila Mollabashi

[Wien] about cfp

2014-12-07 Thread leila mollabashi
Dear WIEN2k users, I am interested in the cfp software by Pavel Novak. The unsupported-software goodies page says that this software calculates crystal-field parameters in rare-earth systems. I would like to know whether I can use it for other systems, such as 5f or d electrons. Thank you,