Re: [Wien] System configuration

2019-05-31 Thread Gavin Abo
Keep in mind that with 12 total cores [1], you might see little to no benefit from using mpi parallel with the computer (single node) that you have. You probably saw the siteconfig message:
Do you have MPI, ScaLAPACK, ELPA, or FFTW installed and intend to run finegrained parallel? This

Re: [Wien] System configuration

2019-05-30 Thread Gavin Abo
> Also unable to install ELPA
The following might be helpful for those interested in ELPA. ELPA seemed to compile without any problems with the Intel compilers. However, the "make check" tests do not pass on my AMD processor laptop. My hardware resources seem to be lacking what is needed for
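
For those trying it, ELPA builds with standard autotools; a minimal sketch with the Intel toolchain (the version number and install prefix here are assumptions; check the INSTALL notes of your ELPA release for the full option set):

    cd elpa-2018.11.001
    ./configure FC=mpiifort CC=mpiicc --prefix=$HOME/elpa
    make
    make check      # the step reported above as failing on the AMD laptop
    make install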

Re: [Wien] System configuration

2019-05-30 Thread Indranil mal
After following the references, I am now getting the following error:
> stop error
grep: *scf1*: No such file or directory
cp: cannot stat '.in.tmp': No such file or directory
FERMI - Error
grep: *scf1*: No such file or directory
InBi.scf1_1: No such file or directory.
[1] + Done

Re: [Wien] System configuration

2019-05-29 Thread Gavin Abo
Referring to [1,2], you may need to install the operating system's ssh-askpass package, or set up passwordless login with ssh-keygen [3] and ssh-copy-id [4]. [1] https://stackoverflow.com/questions/10050556/setting-up-ssh-for-jenkins-to-use-at-runtime [2]
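
A minimal sketch of the passwordless-login setup (run as the user that executes WIEN2k; username and nodename are placeholders):

    ssh-keygen -t rsa              # accept the defaults; leave the passphrase empty
    ssh-copy-id username@nodename  # appends the public key to ~/.ssh/authorized_keys
    ssh username@nodename          # should now log in without a password prompt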

Re: [Wien] System configuration

2019-05-29 Thread Indranil mal
I compiled fftw with Intel MPI and it compiled successfully without any error. After running a job in parallel scf, I got the following error:
grep: *scf1*: No such file or directory
cp: cannot stat '.in.tmp': No such file or directory
FERMI - Error
grep: *scf1*: No such file or directory
InBi.scf1_1: No

Re: [Wien] System configuration

2019-05-29 Thread Gavin Abo
As mentioned in a previous post [1], the ompi_mpi in the error messages indicates that your fftw3 was compiled with Open MPI instead of Intel MPI. If you have both Open MPI and Intel MPI on your system, you have to take care with the double i's: Intel MPI's wrappers for the Intel compilers are mpiicc and mpiifort, while plain mpicc is typically Open MPI's (or a gcc-based) wrapper. Perhaps you used mpicc for Open MPI
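
A sketch of rebuilding fftw3 against Intel MPI using the double-i wrappers (--enable-mpi and MPICC are fftw's documented configure options; the install prefix matches the FFTW_OPT path quoted elsewhere in this thread):

    ./configure --enable-mpi CC=icc F77=ifort MPICC=mpiicc --prefix=/opt/fftw
    make
    make install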

Re: [Wien] System configuration

2019-05-29 Thread Indranil mal
/home/dps/fftw3/lib/libfftw3_mpi.a(api.o): In function `bogosity_hook':
api.c:(.text+0x20): undefined reference to `ompi_mpi_comm_null'
api.c:(.text+0x44): undefined reference to `ompi_mpi_comm_null'
api.c:(.text+0x73): undefined reference to `ompi_mpi_comm_null'
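
Before recompiling, it is worth confirming which MPI each wrapper and the library itself came from; a diagnostic sketch (Open MPI wrappers answer to --showme, MPICH-derived ones such as Intel MPI use -show):

    which mpicc mpiicc                                        # which wrappers are on the PATH
    mpicc --showme 2>/dev/null || mpicc -show                 # print the underlying compiler line
    nm /home/dps/fftw3/lib/libfftw3_mpi.a | grep -m1 ompi_    # ompi_* symbols indicate Open MPI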

Re: [Wien] System configuration

2019-05-29 Thread Gavin Abo
Look inside those compile.msg files, as they likely contain messages showing why they failed to compile. On 5/28/2019 11:46 AM, Indranil mal wrote: Thank you for your kind response. After following all the instructions given by you, I have installed WIEN2k with the Intel parallel compiler. After
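
A quick way to pull all the error lines out of the logs at once, assuming the standard $WIENROOT layout:

    cd $WIENROOT
    grep -i error SRC_*/compile.msg    # list every reported error with its source file
    less SRC_lapw0/compile.msg         # then read the full log of a failing target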

Re: [Wien] System configuration

2019-05-28 Thread Indranil mal
Thank you for your kind response. After following all the instructions given by you, I have installed WIEN2k with the Intel parallel compiler. After compiling I got:
Compile time errors (if any) were:
SRC_lapw0/compile.msg:make[1]: *** [lapw0_mpi] Error 1
SRC_lapw0/compile.msg:make: *** [para] Error 2

Re: [Wien] System configuration

2019-05-28 Thread Gavin Abo
Continuing from post [1], I did a parallel mpi compile of WIEN2k 18.2 with fftw 3.3.8 (without ELPA), where -gcc-sys had to be added to CFLAGS [2], and siteconfig completed with no compile errors, as seen below.
username@computername:~$ cd ~
username@computername:~$ wget
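
Assuming the CFLAGS in question are fftw's, the -gcc-sys addition would look something like this at the configure step (flags other than -gcc-sys are illustrative, and the prefix is a placeholder):

    ./configure --enable-mpi CC=icc MPICC=mpiicc F77=ifort \
        CFLAGS="-O3 -gcc-sys" --prefix=$HOME/fftw-3.3.8
    make && make install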

Re: [Wien] System configuration

2019-05-27 Thread Gavin Abo
At [1], it says: "Questions about the library can be sent to the Libxc mailing list." So, if you are having issues with libxc on your system, you may have to contact the libxc mailing list. However, I downloaded the 30-day trial of Intel's

Re: [Wien] System configuration

2019-05-26 Thread Indranil mal
Sir, I am unable to compile any 4.x libxc with ./configure FC= ifort CC=icc, even though they are supposed to compile easily with autoconfiguration. Configuring with ifort and icc, I get the following error:
configure: error: in `/home/dps/libxc-4.1.1':
configure: error: C compiler cannot create executables
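
Two things are worth checking here. First, FC= ifort (with a space) does not actually assign ifort to FC; it should be written FC=ifort. Second, "C compiler cannot create executables" usually means icc itself cannot run in that shell, often because the Intel environment script has not been sourced; config.log records the underlying failure. A diagnostic sketch (the compilervars.sh path is an assumption for your install):

    source /opt/intel/bin/compilervars.sh intel64   # puts icc and ifort on the PATH
    icc -v                                          # verify the compiler runs at all
    less /home/dps/libxc-4.1.1/config.log           # the real error is logged here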

Re: [Wien] System configuration

2019-05-25 Thread Gavin Abo
As the error message says, the libxc library was compiled with a different compiler.  Perhaps you compiled it with gfortran (FC=gfortran).  Recompile it with ifort [ https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg17716.html ]. WIEN2k versions after 17 can use the 4.x version
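
A minimal sketch of that recompile, using the source directory from the error message above (the install prefix is an assumption):

    cd /home/dps/libxc-4.1.1
    make clean
    ./configure FC=ifort CC=icc --prefix=$HOME/libxc
    make
    make check
    make install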

Re: [Wien] System configuration

2019-05-25 Thread Indranil mal
The corresponding compiler and linker options are as follows. Recommended options for system linuxifc are:
Compiler options: -O1 -FR -mp1 -w -prec_div -pc80 -pad -ip -DINTEL_VML -traceback -assume buffered_io -I$(MKLROOT)/include
Linker Flags: $(FOPT)

Re: [Wien] System configuration

2019-05-25 Thread Indranil mal
After setting the compilers to ifort and icc, and setting the parallel config as follows:
Current settings:
  Parallel compiler : mpiifort
  SCALAPACK_LIBS : -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64
  FFTW_OPT : -DFFTW3 -I/opt/fftw/include

Re: [Wien] System configuration

2019-05-24 Thread Gavin Abo
For WIEN2k 18.2, to disable parallel after siteconfig has been run once, you can try:
./siteconfig
  P   Configure Parallel Execution
  Shared Memory Architecture? (y/N) y
  Enter N / your_specific_command: N
  Do you ... intend to run finegrained parallel? ... (y/N) N
To change parallel settings after

Re: [Wien] System configuration

2019-05-23 Thread Pavel Ondračka
Right, as was written in the previous email, the provided config is a weird mix of ifort and gfortran options; also, at some point in siteconfig you chose the parallel build, which now fails.
> SRC_dstart/compile.msg:make: *** [para] Error 2
All the errors which I have seen up to

Re: [Wien] System configuration

2019-05-23 Thread Gavin Abo
The -mp1, -pad, -traceback, and so on look like ifort-specific compiler flags. If you are using gfortran, compiler flags for gfortran need to be used for the Compiling Options in siteconfig. A good starting point is to use the "Recommended options" given by siteconfig for linuxgfortran, which
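
In siteconfig that means replacing the ifort flags under Compiling Options; a sketch of the kind of change (the exact recommended gfortran option list depends on the WIEN2k version, so treat these flags as illustrative, not definitive):

    ./siteconfig
      O   Compiling Options
      # drop ifort-only flags such as -mp1, -prec_div, -pad, -assume buffered_io
      # and use gfortran-style options instead, for example:
      #   FOPT = -ffree-form -O2 -ffree-line-length-none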

Re: [Wien] System configuration

2019-05-23 Thread Indranil mal
I did the patching, but after compiling I am getting the following:
SRC_dstart/compile.msg:gfortran: error: buffered_io: No such file or directory
SRC_dstart/compile.msg:gfortran: error: unrecognized command line option ‘-mp1’
SRC_dstart/compile.msg:gfortran: error: unrecognized command line option

Re: [Wien] System configuration

2019-05-23 Thread Pavel Ondračka
I'm putting this back on the list as well, after I received several private emails. Your timing and the ldd output show that you are linking against the reference lapack and blas. You need to replace -llapack -lblas in R_LIBS with -lopenblas (this was discussed before in this thread:
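
You can confirm what a binary actually links against with ldd; a quick check (assuming $WIENROOT is set, as in a standard WIEN2k install):

    ldd $WIENROOT/lapw1 | grep -iE 'blas|lapack'
    # the reference libraries appear as libblas.so/liblapack.so; after changing
    # R_LIBS to -lopenblas and recompiling, libopenblas.so should show up instead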

Re: [Wien] System configuration

2019-05-23 Thread Pavel Ondračka
Hi Indranil, I'm sending this again, this time also to the list (I hadn't noticed you removed it), in the hope it might be useful for someone optimizing with gfortran as well... Pavel "Well, first we need to figure out why your serial lapw is so slow... You definitely don't have the

Re: [Wien] System configuration

2019-05-22 Thread Pavel Ondračka
Hi Indranil, While k-point parallelization is usually the most efficient (provided you have a sufficient number of k-points) and does not need any extra libraries, for a 100-atom case it might be problematic to fit 12 processes into 32GB of memory. I assume you are already using it, since you
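
For reference, k-point parallelism in WIEN2k is steered by the .machines file in the case directory; a minimal sketch for running, say, 4 k-point jobs at once on this single node (the number of lines is an assumption to tune against the 32GB memory limit):

    granularity:1
    1:localhost
    1:localhost
    1:localhost
    1:localhost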

Re: [Wien] System configuration

2019-05-22 Thread Dr. K. C. Bhamu
Hi, If you are doing a k-point parallel calculation (having more than 12 k-points in the IBZ), then use the script below in the terminal where you want to run the calculation, or use it in your job script with the -p option of run(sp)_lapw (-so). If anyone knows how to repeat the nth line of a file m times
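
On the side question of repeating the nth line of a file m times (handy for generating the repeated 1:localhost lines of a .machines file), an awk sketch (n, m, and FILE are parameters to adjust):

    # print line n of FILE m times and every other line once (here n=3, m=12)
    awk -v n=3 -v m=12 'NR==n {for (i = 0; i < m; i++) print; next} {print}' FILE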

[Wien] System configuration

2019-05-22 Thread Indranil mal
Respected Sir/Users, I am using a PC with an Intel i7 8th gen (with 12 cores), 32GB RAM, and a 2TB HDD, with UBUNTU 18.04 LTS. I have installed OpenBLAS-0.2.20 and am using the GNU Fortran and C compilers. I am trying to run a system with 100 atoms; only two cores are being used, the rest of them