Re: [Wien] Installation with MPI and GNU compilers

2018-05-03 Thread Pavel Ondračka
f the authors are > > accepting patches anyway... > > > > Best regards > > Pavel > > > > > Ciao > > > Gerhard > > > > > > DEEP THOUGHT in D. Adams; Hitchhiker's Guide to the Galaxy: > > > "I think the problem, to be quite hon

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Rui Costa
Changing "#if defined (INTEL_VML)" to "#if defined (INTEL_VML_HAMILT)" in SRC_lapw1/hamilt.F really improved HAMILT, but DIAG seems a little slower. On my PC (Intel(R) Core(TM) i7-2630QM CPU @ 2.00GHz, 4 cores, 8 GB RAM) the benchmark tests went from: Simulation Total (CPU/Wall)
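The guard rename described above can be applied mechanically with sed. A minimal sketch, run here against a stand-in snippet rather than the real SRC_lapw1/hamilt.F (the snippet contents and the `call` line are illustrative, not the actual WIEN2k source):

```shell
# Create a stand-in snippet containing the guard discussed above.
printf '#if defined (INTEL_VML)\n      call some_vml_path\n#endif\n' > hamilt_snippet.F
# Rename the preprocessor guard so the VML branch is only compiled
# when -DINTEL_VML_HAMILT is passed at build time.
sed -i 's/#if defined (INTEL_VML)/#if defined (INTEL_VML_HAMILT)/' hamilt_snippet.F
grep -c 'INTEL_VML_HAMILT' hamilt_snippet.F   # prints 1
```

On the real tree the same sed line, pointed at SRC_lapw1/hamilt.F, should make the edit in one step, provided the guard appears with exactly this spelling and spacing.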

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Rui Costa
I did the benchmark test with -DINTEL_VML_HAMILT, but since my e-mail was too big it was held for confirmation, so I'll split it. I added the print statement to the inilpw.f file and I get the same results, i.e., it prints only: iunit = 4, iunit = 5, iunit = 6. Even when I run the
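The actual print statement is not shown in the preview; a hypothetical sketch of the kind of debug print that would produce the output quoted above (the loop bounds, variable name, and surrounding structure are assumptions for illustration, not copied from the real SRC_lapw1/inilpw.f):

```fortran
! Hypothetical sketch only: a debug print inside a unit-handling loop,
! of the kind described above. Names and bounds are illustrative.
      PROGRAM INIDBG
      INTEGER IUNIT
      DO IUNIT = 4, 6
         WRITE (*,*) 'iunit = ', IUNIT
!        the real inilpw.f would OPEN(UNIT=IUNIT, ...) around here
      END DO
      END PROGRAM INIDBG
```

A print of this shape would emit exactly the three lines reported (iunit = 4, 5, 6), which is consistent with the loop stopping early or only three units being reached.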

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Laurence Marks
roblem, to be quite honest with you, > is that you have never actually known what the question is." > > > Dr. Gerhard H. Fecher > Institute of Inorganic and Analytical Chemistry > Johannes Gutenberg University > 55099 Mainz > and >

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Pavel Ondračka
dra...@email.cz] Sent: Wednesday, 2 May 2018 12:05 To: Fecher, Gerhard Subject: Re: [Wien] Installation with MPI and GNU compilers I'm replying privately since this might be getting too technical for the list and in fact not interesting for the majority of users... Fecher, Gerhard wrote on

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Laurence Marks
> 55099 Mainz > and > Max Planck Institute for Chemical Physics of Solids > 01187 Dresden > ________________ > From: Pavel Ondračka [pavel.ondra...@email.cz] > Sent: Wednesday, 2 May 2018 12:05 > To: Fecher, Gerhard > Subject: Re: [Wien] Install

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Fecher, Gerhard
y Johannes Gutenberg University 55099 Mainz and Max Planck Institute for Chemical Physics of Solids 01187 Dresden From: Pavel Ondračka [pavel.ondra...@email.cz] Sent: Wednesday, 2 May 2018 12:05 To: Fecher, Gerhard Subject: Re: [Wien] Installation with M

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Fecher, Gerhard
[wien-boun...@zeus.theochem.tuwien.ac.at] on behalf of Pavel Ondračka [pavel.ondra...@email.cz] Sent: Wednesday, 2 May 2018 10:30 To: A Mailing list for WIEN2k users Subject: Re: [Wien] Installation with MPI and GNU compilers Rui Costa wrote on Mon 30. 04. 2018 at 22:24 +0100: > I have the

Re: [Wien] Installation with MPI and GNU compilers

2018-05-02 Thread Pavel Ondračka
users <w...@zeus.theochem.tuwien.ac.at> > > Date: 30. 4. 2018 19:39:44 > > Subject: Re: [Wien] Installation with MPI and GNU compilers > > > > > I was able to install wien2k with gfortran+MKL. Apparently the > > > MKL libraries are free [https://software.

Re: [Wien] Installation with MPI and GNU compilers

2018-05-01 Thread Gavin Abo
Using 64-bit Ubuntu 16.04.4 LTS, WIEN2k 17.1 (with the siteconfig, libxc, and gfortran patches [https://github.com/gsabo/WIEN2k-Patches/tree/master/17.1]): username@computername:~$ gfortran --version GNU Fortran (Ubuntu 5.4.0-6ubuntu1~16.04.9) 5.4.0 20160609 In SRC_lapw1/inilpw.f, I added a

Re: [Wien] Installation with MPI and GNU compilers

2018-04-30 Thread Rui Costa
From: Rui Costa <ruicosta@gmail.com> > To: A Mailing list for WIEN2k users <wien@zeus.theochem.tuwien.ac.at> > Date: 30. 4. 2018 19:39:44 > Subject: Re: [Wien] Installation with MPI and GNU compilers > > I was able to install wien2k with gfortran+MKL. Apparently the MKL >

Re: [Wien] Installation with MPI and GNU compilers

2018-04-30 Thread Pavel Ondračka
-- Original e-mail -- From: Rui Costa <ruicosta@gmail.com> To: A Mailing list for WIEN2k users <wien@zeus.theochem.tuwien.ac.at> Date: 30. 4. 2018 19:39:44 Subject: Re: [Wien] Installation with MPI and GNU compilers " I was able to install wien2k with gfortra

Re: [Wien] Installation with MPI and GNU compilers

2018-04-30 Thread Rui Costa
I was able to install wien2k with gfortran+MKL. Apparently the MKL libraries are free [https://software.intel.com/en-us/performance-libraries] but not the compilers. While doing the benchmark tests we noticed that during HAMILT there was a huge difference between this and an ifort+MKL
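The exact build options are not shown in the preview; a typical sequential MKL link line for gfortran (the flags follow Intel's published link-line advisor pattern for the GNU Fortran interface layer; the source file name and MKLROOT path are assumptions, and MKLROOT must be set by MKL's environment script):

```shell
# Hypothetical example: link a Fortran source with gfortran against
# sequential MKL. Requires MKLROOT to be set, e.g. by sourcing the
# mklvars.sh script shipped with MKL.
gfortran -O2 myprog.f90 \
    -L"${MKLROOT}/lib/intel64" \
    -lmkl_gf_lp64 -lmkl_sequential -lmkl_core \
    -lpthread -lm -ldl
```

Note the `mkl_gf_lp64` interface library rather than `mkl_intel_lp64`: the GNU Fortran calling conventions differ from ifort's, and using the Intel interface layer with gfortran is a common source of wrong results in routines such as the complex BLAS.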

Re: [Wien] Installation with MPI and GNU compilers

2018-04-05 Thread Pavel Ondračka
Laurence Marks wrote on Wed 04. 04. 2018 at 16:01: > I confess to being rather doubtful that gfortran+... is comparable to > ifort+... for Intel CPUs, it might be for AMD. While the MKL vector > libraries are useful in a few codes such as aim, they are minor for > the main lapw[0-2]. Well, some

Re: [Wien] Installation with MPI and GNU compilers

2018-04-04 Thread Laurence Marks
I confess to being rather doubtful that gfortran+... is comparable to ifort+... for Intel CPUs, it might be for AMD. While the MKL vector libraries are useful in a few codes such as aim, they are minor for the main lapw[0-2]. On Wed, Apr 4, 2018, 10:55 Pavel Ondračka

Re: [Wien] Installation with MPI and GNU compilers

2018-04-04 Thread Pavel Ondračka
Rui Costa wrote on Wed 04. 04. 2018 at 14:21 +0100: > I will see what I can do about the Intel compilers. I've had a > question about this: supposedly the Intel compilers are the fastest > [https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg13021.html], but how > much faster are they

Re: [Wien] Installation with MPI and GNU compilers

2018-04-04 Thread Rui Costa
I will see what I can do about the Intel compilers. I've had a question about this: supposedly the Intel compilers are the fastest [https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg13021.html], but how much faster are they than the others? I expect this to vary from case to case

Re: [Wien] Installation with MPI and GNU compilers

2018-04-03 Thread Gavin Abo
Some comments: I haven't seen many mailing list posts about using a gfortran-based MPI. That is probably because the clusters used for MPI are likely systems that cost something like $100k to $1 million. Those systems usually seem to be running Intel MPI. So companies, computing centers,