Dear Iurii,

Yes, it runs successfully after setting d0psi_rs = .false. Thanks for 
your help.
Kind regards
Weijie Zhou




---------------------------
University of Leeds
PhD student
Weijie Zhou

------------------ Original Message ------------------
From: "Iurii TIMROV" <[email protected]>
Sent: Monday, January 17, 2022, 6:13 PM
To: "Weijie Zhou" <[email protected]>; "Quantum ESPRESSO users Forum" <[email protected]>
Subject: Re: [QE-users] error on running turbo_lanczos.x with MPI



  
Dear Weijie,

If you examine the output file of the turbo_lanczos.x calculation, you will see 
the following message:

 Calculation of the dipole in real space
 Real space dipole + USPP is not supported

So either use norm-conserving pseudopotentials with the option 
d0psi_rs = .true., or use ultrasoft pseudopotentials with the option 
d0psi_rs = .false.
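
For reference, the second choice corresponds to an input along these lines (a minimal sketch only; the prefix and outdir are placeholders taken from the log below, and I am assuming d0psi_rs sits in the lr_control namelist as in the TDDFPT examples):

```fortran
&lr_input
   ! Must match the preceding pw.x run (placeholder values)
   prefix = 'Mo_h_p'
   outdir = './tmp/'
/
&lr_control
   itermax = 500      ! example value: number of Lanczos iterations
   ipol    = 4        ! example value: all three polarization directions
   ! With ultrasoft pseudopotentials, keep the G-space dipole:
   d0psi_rs = .false.
/
```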

HTH

Iurii

P.S.: Next time please share also the input and output files of the pw.x 
program.

--
Dr. Iurii TIMROV
Senior Research Scientist
Theory and Simulation of Materials (THEOS)
Swiss Federal Institute of Technology Lausanne (EPFL)
CH-1015 Lausanne, Switzerland
+41 21 69 34 881
http://people.epfl.ch/265334

From: Weijie Zhou <[email protected]>
Sent: Monday, January 17, 2022 6:35:54 AM
To: Iurii TIMROV; Quantum ESPRESSO users Forum
Subject: Re: [QE-users] error on running turbo_lanczos.x with MPI
 
   Dear Iurii,
 
 
Following your suggestion, I tried QE 7.0, but the error is still there. You can 
access the relevant input & output files at

https://drive.google.com/drive/folders/1BObhh63QFBB-oYX1su9aFFGWWTqfYhyn?usp=sharing

I hope this helps. Thanks.
 
 
 Kind regards
 Weijie Zhou
 
 
 
 
  ---------------------------
 University of Leeds
 PhD student
 Weijie Zhou
 
 
 
  
 
 
 
------------------ Original Message ------------------
From: "Iurii TIMROV" <[email protected]>
Sent: Thursday, January 13, 2022, 8:53 PM
To: "Weijie Zhou" <[email protected]>; "Quantum ESPRESSO users Forum" <[email protected]>
Subject: Re: [QE-users] error on running turbo_lanczos.x with MPI
 

Dear Weijie Zhou,

Can you try QE 7.0? If you still have the same problem, share your input and 
output files (via e.g. Google Drive).

HTH

Iurii

--
Dr. Iurii TIMROV
Senior Research Scientist
Theory and Simulation of Materials (THEOS)
Swiss Federal Institute of Technology Lausanne (EPFL)
CH-1015 Lausanne, Switzerland
+41 21 69 34 881
http://people.epfl.ch/265334

 
From: users <[email protected]> on behalf of Weijie Zhou via 
users <[email protected]>
Sent: Thursday, January 13, 2022 12:51:40 PM
To: users
Subject: [QE-users] error on running turbo_lanczos.x with MPI
 
Dear QE users,

I am using QE 6.5 to run turbo_lanczos.x with MPI. The calculation finishes 
fine when using norm-conserving or optimized norm-conserving Vanderbilt 
pseudopotentials, but the following error occurs when an ultrasoft 
pseudopotential is used:
 
 
     Program turboTDDFT v.6.5 starts on 12Jan2022 at 13:48:29

     This program is part of the open-source Quantum ESPRESSO suite
     for quantum simulation of materials; please cite
         "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
         "P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
          URL http://www.quantum-espresso.org",
     in publications or presentations arising from this work. More details at
     http://www.quantum-espresso.org/quote

     Parallel version (MPI), running on    16 processors

     MPI processes distributed on    16 nodes
     R & G space division:  proc/nbgrp/npool/nimage =      16

     Reading xml data from directory:

     ../../tmp_Mo_h_p_scf_lda_USPP/Mo_h_p.save/

     IMPORTANT: XC functional enforced from input :
     Exchange-correlation= PZ
                           (   1   1   0   0   0   0   0)
     Any further DFT definition will be discarded
     Please, verify this is what you really want

     Parallelization info
     --------------------
     sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
     Min        2610    2610    651          401385   401385   50172
     Max        2612    2612    654          401396   401396   50178
     Sum       41777   41777  10437         6422239  6422239  802807

     Check: negative core charge=   -0.000003

     negative rho (up, down):  9.303E-01 0.000E+00
     Reading collected, re-writing distributed wavefunctions
     Symmetries are disabled for the gamma_only case

     Subspace diagonalization in iterative solution of the eigenvalue problem:
     a serial algorithm will be used

     =-----------------------------------------------------------------=

     Please cite the TDDFPT project as:
       O. B. Malcioglu, R. Gebauer, D. Rocca, and S. Baroni,
       Comput. Phys. Commun. 182, 1744 (2011)
     and
       X. Ge, S. J. Binnie, D. Rocca, R. Gebauer, and S. Baroni,
       Comput. Phys. Commun. 185, 2080 (2014)
     in publications and presentations arising from this work.

     =-----------------------------------------------------------------=

     Ultrasoft (Vanderbilt) Pseudopotentials

     Normal read

     Gamma point algorithm

     Calculation of the dipole in real space
          Real space dipole + USPP is not supported

--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 0 on
node dc1s2b3c exiting improperly. There are three reasons this could occur:

1. this process did not call "init" before exiting, but others in
the job did. This can cause a job to hang indefinitely while it waits
for all processes to call "init". By rule, if one process calls "init",
then ALL processes must call "init" prior to termination.

2. this process called "init", but exited without calling "finalize".
By rule, all processes that call "init" MUST call "finalize" prior to
exiting or it will be considered an "abnormal termination"

3. this process called "MPI_Abort" or "orte_abort" and the mca parameter
orte_create_session_dirs is set to false. In this case, the run-time cannot
detect that the abort call was an abnormal termination. Hence, the only
error message you will receive is this one.

This may have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).

You can avoid this message by specifying -quiet on the mpirun command line.
--------------------------------------------------------------------------
 
 
 
 If you have any idea what causes this error, please help me. Thank you.
 
 
 
 
 Best wishes,
 Weijie Zhou
 ---------------------------
 University of Leeds
 PhD student
 Weijie Zhou
_______________________________________________
Quantum ESPRESSO is supported by MaX (www.max-centre.eu)
users mailing list [email protected]
https://lists.quantum-espresso.org/mailman/listinfo/users
