Re: [Wien] problem with parallel execution

2016-07-04 Thread Peter Blaha

The error is in lapw0.

This points to a problem with the fftw3 MPI installation.
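
To verify this, a small stand-alone test of the fftw3 MPI interface can help. Below is a minimal sketch in C; the file name, compile line and mpirun call are only illustrative and assume the fftw3_mpi library and the mpicc wrapper come from the same OpenMPI installation used to build WIEN2k:

/* fftw_mpi_check.c -- minimal check that fftw3_mpi and the MPI runtime
 * were built against the same MPI implementation (illustrative only).
 *
 * Assumed compile/run commands:
 *   mpicc fftw_mpi_check.c -lfftw3_mpi -lfftw3 -lm -o fftw_mpi_check
 *   mpirun -np 2 ./fftw_mpi_check
 */
#include <stdio.h>
#include <mpi.h>
#include <fftw3-mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    fftw_mpi_init();   /* a mismatched MPI often shows up already here */

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* plan and execute a tiny distributed transform to exercise
       the fftw3 MPI interface */
    const ptrdiff_t N0 = 16, N1 = 16;
    ptrdiff_t local_n0, local_0_start;
    ptrdiff_t alloc = fftw_mpi_local_size_2d(N0, N1, MPI_COMM_WORLD,
                                             &local_n0, &local_0_start);
    fftw_complex *data = fftw_alloc_complex(alloc);
    for (ptrdiff_t i = 0; i < alloc; ++i) {   /* initialize the local block */
        data[i][0] = 1.0;
        data[i][1] = 0.0;
    }
    fftw_plan plan = fftw_mpi_plan_dft_2d(N0, N1, data, data, MPI_COMM_WORLD,
                                          FFTW_FORWARD, FFTW_ESTIMATE);
    fftw_execute(plan);

    printf("rank %d of %d: fftw3-mpi test passed\n", rank, size);

    fftw_destroy_plan(plan);
    fftw_free(data);
    fftw_mpi_cleanup();
    MPI_Finalize();
    return 0;
}

If this test aborts with the same "MPI_ERR_COMM: invalid communicator" message, the fftw3_mpi library was most likely built against a different MPI implementation (or a different OpenMPI installation) than the one used at run time; rebuilding fftw3 with the same mpicc/mpif90 wrappers and relinking lapw0_mpi is the usual cure.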


On 07/01/2016 08:56 AM, remi marchal wrote:

Dear Wien-Users,

I have successfully compiled (at least without any error during the
compilation) WIEN2K_14 with openmpi-1.10, intel16 with MKL support, and
gcc on Linux (Debian 8). OpenMPI was compiled with ifort16.

Everything works fine in serial for the TiC example, but when I move to
the parallel version (each k-point computed with 2 CPUs), the calculation
crashes with the following error message:

lapw0para -up lapw0.def

starting parallel lapw0 at vendredi 1 juillet 2016, 08:52:32 (UTC+0200)
 .machine0 : 2 processors
[*:13545] *** An error occurred in MPI_Comm_rank
[*:13545] *** reported by process [3506765825,1]
[*:13545] *** on communicator MPI_COMM_WORLD
[*:13545] *** MPI_ERR_COMM: invalid communicator
[*:13545] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[*:13545] ***    and potentially your MPI job)
[*:13542] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[*:13542] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

Below is my .machines file:
lapw0:localhost localhost
1:localhost localhost
granularity:1
extrafine:1
lapw2_vector_split:1

Do you have any idea how to solve this problem?

Regards

Rémi









--

  P.Blaha
--
Peter BLAHA, Inst.f. Materials Chemistry, TU Vienna, A-1060 Vienna
Phone: +43-1-58801-165300 FAX: +43-1-58801-165982
Email: bl...@theochem.tuwien.ac.at    WIEN2k: http://www.wien2k.at
WWW:   http://www.imc.tuwien.ac.at/staff/tc_group_e.php
--
___
Wien mailing list
Wien@zeus.theochem.tuwien.ac.at
http://zeus.theochem.tuwien.ac.at/mailman/listinfo/wien
SEARCH the MAILING-LIST at:  
http://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/index.html

