Dear all,

I tried to use full paths, but it didn't give a positive result. It wrote this error message:
application called MPI_Abort(MPI_COMM_WORLD, 0) - process 0

On 7 March 2011 10:30, Alexander Kvashnin <agkvashnin at gmail.com> wrote:
> Thanks, I tried to use "<" instead of "-in"; it also didn't work.
> OK, I will try to use full paths for input and output, and will report the
> result.
>
> ----- Original Message -----
> From: Omololu Akin-Ojo <prayerz.omo at gmail.com>
> Sent: 7 March 2011 9:56
> To: PWSCF Forum <pw_forum at pwscf.org>
> Subject: Re: [Pw_forum] Re: problem in MPI running of QE (16 processors)
>
> Try to see if specifying the full paths helps.
> E.g., try something like:
>
> mpiexec /home/MyDir/bin/pw.x -in /scratch/MyDir/graph.inp > /scratch/MyDir/graph.out
>
> (where /home/MyDir/bin is the full path to your pw.x and
> /scratch/MyDir/graph.inp is the full path to your input ....)
>
> (I see you use "-in" instead of "<" to indicate the input. I don't
> know too much, but _perhaps_ you could also _try_ using "<" instead of
> "-in".)
>
> o.
>
> On Mon, Mar 7, 2011 at 7:31 AM, Alexander Kvashnin <agkvashnin at gmail.com>
> wrote:
> > Yes, I wrote
> >
> > #PBS -l nodes=16:ppn=4
> >
> > And the userguide of MIPT-60 says that mpiexec must choose the number of
> > processors automatically, that's why I didn't write anything else.
> >
> > ________________________________
> > From: Huiqun Zhou <hqzhou at nju.edu.cn>
> > Sent: 7 March 2011 7:52
> > To: PWSCF Forum <pw_forum at pwscf.org>
> > Subject: Re: [Pw_forum] problem in MPI running of QE (16 processors)
> >
> > How did you specify the number of nodes and procs per node in your job
> > script?
> >
> > #PBS -l nodes=?:ppn=?
> >
> > zhou huiqun
> > @earth sciences, nanjing university, china
> >
> > ----- Original Message -----
> > From: Alexander G. Kvashnin
> > To: PWSCF Forum
> > Sent: Saturday, March 05, 2011 2:53 AM
> > Subject: Re: [Pw_forum] problem in MPI running of QE (16 processors)
> >
> > I created a PBS task on the supercomputer MIPT-60 where I write
> >
> > mpiexec ../../espresso-4.2.1/bin/pw.x -in graph.inp > output.opt
> >
> > all other
> [the rest of the quoted message was clipped]

--
Sincerely yours
Alexander G. Kvashnin
----------------------------------------------------------------
Student
Moscow Institute of Physics and Technology    http://mipt.ru/
141700, Institutsky lane 9, Dolgoprudny, Moscow Region, Russia

Junior research scientist
Technological Institute for Superhard and Novel Carbon Materials
http://www.ntcstm.troitsk.ru/
142190, Central'naya St. 7a, Troitsk, Moscow Region, Russia
================================================================
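For anyone who finds this thread later, here is a minimal sketch of the kind of PBS script being discussed. The job name, walltime, and the /home and /scratch paths are placeholders borrowed from Omololu's example rather than the real MIPT-60 layout, so adjust them to your own directories; the script only illustrates calling pw.x with absolute paths and the two ways of passing the input ("-in" and "<").

  #!/bin/bash
  #PBS -N graph_scf                 # job name (placeholder)
  #PBS -l nodes=16:ppn=4            # 16 nodes x 4 cores per node, as in the thread
  #PBS -l walltime=24:00:00         # placeholder walltime
  #PBS -j oe                        # merge stdout and stderr of the job script

  cd $PBS_O_WORKDIR                 # start in the directory the job was submitted from

  # absolute paths (placeholders, adjust to your own layout)
  PW=/home/MyDir/bin/pw.x
  INP=/scratch/MyDir/graph.inp
  OUT=/scratch/MyDir/graph.out

  # variant 1: pass the input file with -in
  mpiexec $PW -in $INP > $OUT

  # variant 2: feed the input on standard input instead of -in
  # mpiexec $PW < $INP > $OUT

No -np is given here because, as mentioned above, the MIPT-60 mpiexec is said to take the processor count from PBS; on a cluster where mpiexec is not PBS-aware you would typically add something like -np 64 (nodes x ppn) yourself.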
