Dear Abdulla and Quantum ESPRESSO friends,

It is great that you have sorted it out. However, to learn from this, please
consider the following questions:

If you change -nk 4 to -nk 1, how does QE then distribute the work over 24 processors?

or slightly differently phrased:

Why did you use 24 processors and not 16 or 23 or any other number?

I assumed that -np 24 means 24 MPI processes, yet your script requests only
--ntasks-per-node=8 on a single node.
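
As a side note, here is a minimal sketch of how the launch could be tied to the
allocation instead of a hard-coded count (SLURM_NTASKS is the standard SLURM
variable holding the number of granted tasks; the binary path is taken from your
script below):

/////////////////////////////////////////////////////////
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8

pwscf=/share/apps/software/QuantumESPRESSO/6.4.1-intel-2019a/bin/pw.x
input=TMA.pw.in

# SLURM_NTASKS = nodes x ntasks-per-node (here 8), so mpiexec
# launches exactly as many processes as the scheduler granted
mpiexec -np $SLURM_NTASKS $pwscf -nk 1 < $input > TMA.pw.out
/////////////////////////////////////////////////////////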


On 5 May 2020, at 09:42, Abdulla Bin Afif <[email protected]> wrote:

Hi Michal and QE friends,

Thanks for the helpful suggestions; the advice worked.

When -nk (on the second-to-last line of the submit script) was changed from 4 to 1,
the SCF calculation ran. I guess this is the way to change npool.

/////////////////////////////////////////////////////////
#!/bin/bash
#SBATCH --partition=CPUQ
#SBATCH --account=iv-mtp
#SBATCH --time=99:00:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --mem=48000
#SBATCH --job-name="hello_test"
#SBATCH --output=job_%j.out
#SBATCH --error=job_%j.err
#SBATCH [email protected]
#SBATCH --mail-type=NONE

echo "The working directory is $SLURM_SUBMIT_DIR"
echo "Running QE with: $cmd on $SLURM_JOB_NODELIST in directory "`pwd`
pwscf=/share/apps/software/QuantumESPRESSO/6.4.1-intel-2019a/bin/pw.x
input=TMA.pw.in

cd $SLURM_SUBMIT_DIR

echo "Job started at "`date`
mpiexec -np 24 $pwscf -nk 1 < $input > $SLURM_SUBMIT_DIR/TMA.pw.out
echo "Job finished at "`date`

/////////////////////////////////////////////////////////

Thanks and Regards,
Abdulla
-----Original Message-----
From: Michal Krompiec <[email protected]>
Sent: Monday, May 4, 2020 1:05 PM
To: Quantum ESPRESSO users Forum <[email protected]>;
Abdulla Bin Afif <[email protected]>
Subject: Re: [QE-users] Error in routine divide_et_impera (1):

Dear Abdulla,
No need to run on one processor. The error occurs because the number of
k-points (1) is smaller than the number of MPI pools (4 in your case), so some
pools receive no k-points. Run with -npool 1 instead of -npool 4.
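For example, something along these lines (pw.x accepts -nk, -npool, and
-npools as synonyms; binary and file names as in your script):

/////////////////////////////////////////////////////////
mpiexec -np 24 pw.x -nk 1 < TMA.pw.in > TMA.pw.out
/////////////////////////////////////////////////////////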
Best,
Michal Krompiec
Merck KGaA

On Mon, 4 May 2020 at 11:59, Offermans Willem <[email protected]> wrote:

Dear Abdulla and Quantum ESPRESSO friends,

From the error message you showed, I got the impression that something is wrong
in the way you run MPI. If you parallelize over k-points with a 1 1 1 grid, some
nodes will have no work.

Run on 1 processor (not parallel) and the error message might disappear.

If I recall correctly, you can also parallelize over the bands. That might be an
alternative if you want to keep running in parallel; see the sketch below.
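
A sketch of what that could look like, assuming pw.x's -nb flag for band
groups (8 MPI processes, one k-point pool, bands split over 4 groups; whether
it actually pays off depends on the calculation):

/////////////////////////////////////////////////////////
# one pool (-nk 1) since there is only one k-point;
# -nb 4 splits the bands over 4 band groups instead
mpiexec -np 8 pw.x -nk 1 -nb 4 < TMA.pw.in > TMA.pw.out
/////////////////////////////////////////////////////////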



On 3 May 2020, at 23:21, Abdulla Bin Afif <[email protected]> wrote:

Hi QE community members,


When I run an SCF calculation on the organometallic compound TMA
(trimethylaluminium), it fails with an error.

For molecules we use a 1 1 1 0 0 0 k-point grid. With this setting I get the
error below, but when the k-points are changed to 3 3 3 0 0 0 there is no error.
I am not sure why it is not working with 1 1 1 0 0 0.

The TMA molecule is placed inside a cubic unit cell of 15 Å.
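
For reference, the two k-point cards in question:

/////////////////////////////////////////////////////////
! fails with npool=4: the grid yields a single k-point,
! so three of the four pools would receive nothing
K_POINTS (automatic)
1 1 1 0 0 0

! works: the 3x3x3 mesh yields enough irreducible
! k-points to give every pool at least one
K_POINTS (automatic)
3 3 3 0 0 0
/////////////////////////////////////////////////////////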


%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    Error in routine divide_et_impera (1):
    some nodes have no k-points
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
&CONTROL
 calculation = 'scf',
 outdir = '.',
 prefix = 'calc',
 pseudo_dir = '.',
 tprnfor = .true.,
 tstress = .true.,
/
&SYSTEM
 degauss =   0.00734986475817d0,
 ecutrho =   367.493237909d0,
 ecutwfc =   36.7493237909d0,
 ibrav=1,
 celldm(1)=28.3459,
 nat = 13,
 ntyp = 3,
 occupations = 'smearing',
 smearing = 'cold',
 input_dft='PBE',
/
&ELECTRONS
 diagonalization='david',
 conv_thr=7.34986475817e-07,
 mixing_mode='plain',
 electron_maxstep=100,
 mixing_beta=0.7d0,
/
ATOMIC_SPECIES
Al     26.98154 Al.UPF
C      12.011 C.UPF
H      1.00794 H.UPF
ATOMIC_POSITIONS {crystal}
Al           0.5344800000       0.5466800000       0.5342700000
C            0.4242100000       0.6103500000       0.5344500000
C            0.6447600000       0.6103500000       0.5344500000
C            0.5344800000       0.4193500000       0.5344500000
H            0.4032300000       0.6226500000       0.4659700000
H            0.3735600000       0.5709800000       0.5685900000
H            0.4329000000       0.6737500000       0.5688800000
H            0.6657400000       0.6222700000       0.6029900000
H            0.6954000000       0.5711800000       0.5000900000
H            0.6360700000       0.6739400000       0.5003700000
H            0.5343200000       0.3950300000       0.4659700000
H            0.5939000000       0.3951800000       0.5685900000
H            0.4752400000       0.3951800000       0.5688800000
K_POINTS (automatic)
1 1 1 0 0 0


Output

This program is part of the open-source Quantum ESPRESSO suite
    for quantum simulation of materials; please cite
        "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
        "P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
         URL http://www.quantum-espresso.org",
    in publications or presentations arising from this work. More details at
    http://www.quantum-espresso.org/quote

    Parallel version (MPI & OpenMP), running on      24 processor cores
    Number of MPI processes:                24
    Threads/MPI process:                     1

    MPI processes distributed on     1 nodes
    K-points division:     npool     =       4
    R & G space division:  proc/nbgrp/npool/nimage =       6
    Waiting for input...
    Reading input from standard input

    Current dimensions of program PWSCF are:
    Max number of different atomic species (ntypx) = 10
    Max number of k-points (npk) =  40000
    Max angular momentum in pseudopotentials (lmaxx) =  3

    IMPORTANT: XC functional enforced from input :
    Exchange-correlation      = PBE ( 1  4  3  4 0 0)
    Using LIBXC version       =    4   3   4
    Any further DFT definition will be discarded
    Please, verify this is what you really want


    Subspace diagonalization in iterative solution of the eigenvalue problem:
    a serial algorithm will be used



%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    Error in routine divide_et_impera (1):
    some nodes have no k-points
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

    stopping ...


Thanks and Regards,

Abdulla Bin Afif
Ph.D. Candidate
Norwegian University of Science and Technology (NTNU)
MTP - Department of Mechanical and Industrial Engineering
Richard Birkelandsvei 2b
NO-7491 Trondheim, Norway

Email: [email protected]
Mobile: +47 41348358


_______________________________________________
Quantum ESPRESSO is supported by MaX (www.max-centre.eu/quantum-espresso)
users mailing list [email protected]
https://lists.quantum-espresso.org/mailman/listinfo/users
