Re: [Wien] MPI Problem

2013-05-04 Thread Laurence Marks
It looks as if your .machines file is OK; I assume that you added the A*** in front for emailing, but Wien2k does not use a hosts file itself. I guess that you are using a server at IBM in Almaden. Unfortunately, very few people that I know of are running Wien2k on IBM/AIX machines, which is going

Re: [Wien] MPI Problem

2013-05-03 Thread Oliver Albertini
Thanks to you both for the suggestions. The OS was recently updated beyond the versions mentioned in the link (now 6100-08). Adding the iostat specifier to the I/O statements in all the errclr.f files prevents the program from stopping altogether, although error messages still appear in the output: STOP LAPW0 END

Re: [Wien] MPI Problem

2013-05-03 Thread Laurence Marks
Please have a look at the end of case.outputup_*, which gives the real CPU and wall times, and post those. It may be that the times being reported are misleading. In addition, I do not understand why you are seeing an error yet the script continues - it should not. Maybe some of the tasks are
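For example, something like this should pull out the timing lines (a sketch, assuming the usual TIME strings appear in your version's output files):

    tail -n 5 case.outputup_*
    grep -i time case.outputup_*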

[Wien] MPI Problem

2013-05-02 Thread Oliver Albertini
Dear W2K, On an AIX 560 server with 16 processors, I have been running the SCF cycle for a NiO supercell (2x2x2) in serial as well as in MPI parallel (one k-point). The serial version runs fine. When running in parallel, the following error appears: STOP LAPW2 - FERMI; weighs written errclr.f, line 64: 1525-014
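For reference, a sketch of how the two runs are launched with the standard Wien2k scripts (using the spin-polarized script is an assumption here, based on the case.outputup_* files discussed in this thread; a non-spin-polarized case would use run_lapw):

    runsp_lapw          # serial SCF cycle
    runsp_lapw -p       # MPI parallel, driven by .machines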

Re: [Wien] MPI Problem

2013-05-02 Thread Laurence Marks
I think these are semi-harmless, and you can add ,iostat=i to the relevant lines. You may need to add the same to any write statements to unit 99 in errclr.f. However, your timing seems strange: 6.5 serial versus 9.5 parallel. Is this CPU time? The wall time may be more reliable.
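As a sketch, the patched statements in errclr.f would look something like this (fixed-form Fortran; the variable names, format label, and message are illustrative, not the actual source):

      CHARACTER*80 errmsg
      INTEGER i
C     adding iostat=i makes a failed write/close non-fatal;
C     the error code lands in i instead of aborting the run
      WRITE (99, 9000, IOSTAT=i) errmsg
 9000 FORMAT (A)
      CLOSE (99, IOSTAT=i)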

Re: [Wien] MPI Problem

2013-05-02 Thread Gavin Abo
STOP LAPW0 END
inilpw.f, line 233: 1525-142 The CLOSE statement on unit 200 cannot be completed because an errno value of 2 (A file or directory in the path name does not exist.) was received while closing the file. The program will stop.
STOP LAPW1 END
If this is on operating system AIX

[Wien] MPI Problem

2012-01-23 Thread Paul Fons
Hi, I have Wien2k running on a cluster of Linux boxes, each with 32 cores and connected by 10 Gb Ethernet. I have compiled Wien2k with the 3.174 version of the compiler (I learned the hard way that bugs in the newer versions of the Intel compiler lead to crashes in Wien2k). I have also

[Wien] MPI Problem

2012-01-23 Thread Paul Fons
Thank you very much for your suggestion. I actually managed to figure this out by myself an hour or so ago. At the same time (usually not a good idea) I also compiled the MKL interface for fftw2 rather than using the standalone version I had compiled myself earlier. Thus the RP library
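For anyone searching the archive later: the MKL-supplied fftw2 wrappers are built from the sources shipped with MKL, roughly like this (paths and make targets vary between MKL versions, so treat this as a sketch and check the makefile):

    cd $MKLROOT/interfaces/fftw2xf
    make libintel64          # produces libfftw2xf_intel.a

The resulting library is then linked with -lfftw2xf_intel instead of the standalone FFTW2.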

[Wien] MPI Problem

2012-01-23 Thread Peter Blaha
Read the UG about mpi-parallelization. It is not supposed to give you any performance gain for a case like TiC; it is useful ONLY for larger cases. Using 5 mpi processes is particularly bad. One should divide the matrices into 2x2 or 4x4 grids (or, for your 32-core machines, into 4x8), but not into 1x5, 1x7,
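For example, with one k-point on a 16-core node, a .machines line along these lines (node name hypothetical) gives a square 4x4 ScaLAPACK process grid:

    1:node1:16

whereas a line like 1:node1:5 would force a degenerate 1x5 grid.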

[Wien] MPI Problem

2012-01-22 Thread Laurence Marks
A guess: you are using the wrong version of BLACS. You need -lmkl_blacs_intelmpi_XX, where XX is the one for your system. I have seen this give the same error. Use http://software.intel.com/en-us/articles/intel-mkl-link-line-advisor/ For reference, with openmpi it is _openmpi_ instead of
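For a 64-bit (lp64) Intel MPI build, the relevant part of the link line typically ends up looking like this (a sketch only; let the advisor generate the exact line for your MKL version):

    -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 \
    -lmkl_intel_lp64 -lmkl_sequential -lmkl_core

with -lmkl_blacs_openmpi_lp64 substituted when using OpenMPI.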

[Wien] MPI problem for LAPW2

2009-09-30 Thread Robert Laskowski
Hi, we would need more information to help you - for example .machines, number of atoms, matrix size ... regards Robert On Wednesday 30 September 2009 03:09:22 Duy Le wrote: Dear Wien2kers, I have managed k-point parallelization successfully. I now have to deal with a very big system that definitely needs

[Wien] MPI problem for LAPW2

2009-09-30 Thread Duy Le
Thank you for all your inputs. I am running a test on a system of 21 atoms, with a spin-polarized calculation, with 2 k-points, without inversion symmetry. Of course, this is only a test with a small system, so there should be no problem with the matrix size. The .machines file I have provided in my previous

[Wien] MPI problem for LAPW2

2009-09-30 Thread Duy Le
Your comment sounds reasonable. However, our machines are pretty new and they do have 4 GB RAM/core. I can handle this job with one single core, so I am not sure you are correct about a memory problem. I will check memory in more detail if I hit the same problem again. Anyway, it works now

[Wien] MPI problem for LAPW2

2009-09-29 Thread Duy Le
Dear Wien2kers, I have managed k-point parallelization successfully. I now have to deal with a very big system that definitely needs the fully MPI-parallel Wien2k. The compilation was successful, and I can run lapw0, lapw1, and lapw2 under MPI. However, lapw2 only runs without problems for certain numbers of PROCESSORS PER MPI JOB (in