Hi,

GROMACS is compute-bound when it is not network-bound, but the output of ps
is barely informative. Looking inside the md.log file for the helpful
diagnostics mdrun prints is a fine start. Also, do check out
http://manual.gromacs.org/documentation/2016/user-guide/mdrun-performance.html
for the basics. To know how much hardware it makes sense to use for a given
simulation, you need to look at performance on a single core and on all
cores of a node before worrying about adding a second node.
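
For example, something along these lines (a sketch; the step count is
illustrative, and -ntomp 1 assumes an OpenMP-enabled build) compares a
single core against a full node:

# one core, then all ten cores of one node; -resethway and -noconfout
# keep the timings of short benchmark runs clean
mpirun -np 1 gromacs-5.1/bin/mdrun_mpi -ntomp 1 -nsteps 5000 -resethway -noconfout
mpirun -np 10 gromacs-5.1/bin/mdrun_mpi -ntomp 1 -nsteps 5000 -resethway -noconfout

The ns/day numbers at the end of each md.log then tell you whether a second
node can pay off.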

Mark

On Sun, Oct 16, 2016 at 8:24 PM Parvez Mh <parvezm...@gmail.com> wrote:

> Not sure; I do not use OpenMPI. You could try compiling the following
> simple MPI program and running it, to see whether you get the proper node
> allocation.
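>
> Something like this minimal test (a sketch of what such a program looks
> like) has every rank print the node it landed on:
>
> /* minimal MPI check: each rank reports its host name */
> #include <mpi.h>
> #include <stdio.h>
>
> int main(int argc, char **argv) {
>     int rank, size, len;
>     char host[MPI_MAX_PROCESSOR_NAME];
>     MPI_Init(&argc, &argv);
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     MPI_Comm_size(MPI_COMM_WORLD, &size);
>     MPI_Get_processor_name(host, &len);
>     printf("rank %d of %d on %s\n", rank, size, host);
>     MPI_Finalize();
>     return 0;
> }
>
> Compile it with mpicc and submit it through the same PBS script; with
> nodes=2:ppn=10 you should see 20 ranks spread across the two hostnames.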
>
> --Masrul
>
> On Sun, Oct 16, 2016 at 1:15 PM, Mahmood Naderan <mahmood...@gmail.com> wrote:
>
> > Well, that is provided by nodes=2:ppn=10 in the PBS script.
> >
> > Regards,
> > Mahmood
> >
> >
> >
> > On Sun, Oct 16, 2016 at 9:26 PM, Parvez Mh <parvezm...@gmail.com> wrote:
> >
> > > Hi,
> > >
> > > Where is the -np option in mpirun?
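> > >
> > > For example (a sketch; it assumes your Torque/PBS setup exports
> > > $PBS_NODEFILE with one line per allocated slot):
> > >
> > > mpirun -np $(wc -l < $PBS_NODEFILE) gromacs-5.1/bin/mdrun_mpi -v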
> > >
> > > --Masrul
> > >
> > > On Sun, Oct 16, 2016 at 12:45 PM, Mahmood Naderan <mahmood...@gmail.com> wrote:
> > >
> > > > Hi,
> > > > A PBS script for a GROMACS job has been submitted with the following
> > > > content:
> > > >
> > > > #!/bin/bash
> > > > #PBS -V
> > > > #PBS -q default
> > > > #PBS -j oe
> > > > #PBS -l nodes=2:ppn=10
> > > > #PBS -N LPN
> > > > #PBS -o /home/dayer/LPN/mdout.out
> > > > cd $PBS_O_WORKDIR
> > > > mpirun gromacs-5.1/bin/mdrun_mpi -v
> > > >
> > > >
> > > > When I ssh'ed to the nodes and looked at the mdrun_mpi processes, I
> > > > noticed that the CPU utilization is not good enough!
> > > >
> > > >
> > > > [root@compute-0-1 ~]# ps aux | grep mdrun_mpi
> > > > dayer     7552 64.1  0.0 199224 21300 ?        RNl  Oct15 1213:39 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7553 56.8  0.0 201524 23044 ?        RNl  Oct15 1074:47 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7554 64.1  0.0 201112 22364 ?        RNl  Oct15 1213:25 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7555 56.5  0.0 198336 20408 ?        RNl  Oct15 1070:17 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7556 64.3  0.0 225796 48436 ?        RNl  Oct15 1217:35 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7557 56.1  0.0 198444 20404 ?        RNl  Oct15 1062:26 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7558 63.4  0.0 198996 20848 ?        RNl  Oct15 1199:05 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7562 56.2  0.0 197912 19736 ?        RNl  Oct15 1062:57 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7565 63.1  0.0 197008 19208 ?        RNl  Oct15 1194:51 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     7569 56.7  0.0 227904 50584 ?        RNl  Oct15 1072:33 gromacs-5.1/bin/mdrun_mpi -v
> > > >
> > > >
> > > >
> > > > [root@compute-0-3 ~]# ps aux | grep mdrun_mpi
> > > > dayer     1735  0.0  0.0 299192  4692 ?        Sl   Oct15   0:03 mpirun gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1740  9.5  0.0 209692 29224 ?        RNl  Oct15 180:09 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1741  9.6  0.0 200948 22784 ?        RNl  Oct15 183:21 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1742  9.3  0.0 200256 21980 ?        RNl  Oct15 177:28 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1743  9.5  0.0 197672 19100 ?        RNl  Oct15 180:01 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1744  9.6  0.0 228208 50920 ?        RNl  Oct15 183:07 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1746  9.3  0.0 199144 20588 ?        RNl  Oct15 176:24 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1749  9.5  0.0 201496 23156 ?        RNl  Oct15 180:25 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1751  9.1  0.0 200916 22884 ?        RNl  Oct15 173:13 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1755  9.3  0.0 198744 20616 ?        RNl  Oct15 176:49 gromacs-5.1/bin/mdrun_mpi -v
> > > > dayer     1758  9.2  0.0 226792 49460 ?        RNl  Oct15 174:12 gromacs-5.1/bin/mdrun_mpi -v
> > > >
> > > >
> > > >
> > > > Please note that the third column is the CPU utilization.
> > > > GROMACS is a compute-intensive application, so there is little I/O or
> > > > anything else to account for the low numbers.
> > > >
> > > >
> > > > Please also note that on compute-0-3 the first process is "mpirun
> > > > gromacs-5.1...." while the others are only "gromacs-5.1...."
> > > >
> > > >
> > > > Any ideas are welcome.
> > > >
> > > > Regards,
> > > > Mahmood