Re: [gmx-users] Problem with OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi

2012-08-29 Thread jesmin jahan
Thanks David and Szilárd.

I am attaching a log file that I got from my experiment. Please
have a look. It says GROMACS version 4.6-dev:
I am using  :-)  VERSION 4.6-dev-20120820-87e5bcf  (-: of Gromacs.

I used the following commands (as written on the GROMACS website) to download it:

git clone git://git.gromacs.org/gromacs.git
cd gromacs
git checkout --track -b release-4-6 origin/release-4-6

and

cmake .. -DGMX_MPI=ON -DGMX_OPENMP=ON

to configure it.
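
For completeness, a typical out-of-source build along those lines (the install prefix here is only an example) would be:

mkdir build
cd build
cmake .. -DGMX_MPI=ON -DGMX_OPENMP=ON -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-4.6
make
make install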

Is it the case that a later version of 4.6 has this feature?

Please let me know.

Thanks,
Jesmin

On Wed, Aug 29, 2012 at 4:27 AM, Szilárd Páll szilard.p...@cbr.su.se wrote:
 On Wed, Aug 29, 2012 at 5:32 AM, jesmin jahan shraba...@gmail.com wrote:
 Dear All,

 I have installed gromacs VERSION 4.6-dev-20120820-87e5bcf with
 -DGMX_MPI=ON. I am assuming that, since OpenMP is on by default, it will be
 built automatically.

 My Compiler is
 /opt/apps/intel11_1/mvapich2/1.6/bin/mpicc Intel icc (ICC) 11.1 20101201

 And I am using OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi -s imd.tpr

 I was hoping this would run 16 processes, each with 12 threads.
 However, in the log file I saw something like this:

  R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

  Computing:         Nodes   Number   G-Cycles   Seconds     %
 -----------------------------------------------------------------------
  Domain decomp.        16        1      0.027       0.0     1.8
  Comm. coord.          16        1      0.002       0.0     0.1
  Neighbor search       16        1      0.113       0.1     7.7
  Force                 16        1      1.236       0.8    83.4
  Wait + Comm. F        16        1      0.015       0.0     1.0
  Update                16        1      0.005       0.0     0.4
  Comm. energies        16        1      0.008       0.0     0.5
  Rest                  16               0.076       0.0     5.1
 -----------------------------------------------------------------------
  Total                 16               1.481       0.9   100.0
 -----------------------------------------------------------------------


 It's not clear whether each of the 16 nodes runs 12 threads internally or not.

 No, it's not. That output is not from 4.6; you should have an extra
 column with the number of threads.
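 
 For example, with a sufficiently recent 4.6 build you should be able to ask
 for the OpenMP thread count explicitly (via mdrun's -ntomp option), e.g.:
 
 OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi -ntomp 12 -s imd.tpr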

 --
 Szilárd


 If anyone knows about this, please let me know.

 Thanks for help.

 Best Regards,
 Jesmin



 --
 Jesmin Jahan Tithi
 PhD Student, CS
 Stony Brook University, NY-11790.



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.

Re: [gmx-users] Problem with incorrect GB-Polarization Energy Value

2012-08-29 Thread jesmin jahan
   GB Polarization        LJ (SR)   Coulomb (SR)      Potential    Kinetic En.
      -1.65116e+04    5.74908e+08   -2.37699e+05    5.74654e+08    6.36009e+11
       Total Energy    Temperature Pressure (bar)
        6.36584e+11    1.60465e+10        0.0e+00

==  ###  ==
  A V E R A G E S  
==  ###  ==

Statistics over 1 steps using 1 frames

   Energies (kJ/mol)
   GB Polarization        LJ (SR)   Coulomb (SR)      Potential    Kinetic En.
      -1.65116e+04    5.74908e+08   -2.37699e+05    5.74654e+08    6.36009e+11
       Total Energy    Temperature Pressure (bar)
        6.36584e+11    1.60465e+10        0.0e+00

   Total Virial (kJ/mol)
       -1.13687e+09    1.14300e+07   -1.23884e+07
        1.14273e+07   -1.15125e+09   -5.31658e+06
       -1.23830e+07   -5.31326e+06   -1.16512e+09

   Pressure (bar)
        0.0e+00        0.0e+00        0.0e+00
        0.0e+00        0.0e+00        0.0e+00
        0.0e+00        0.0e+00        0.0e+00

   Total Dipole (D)
    1.35524e+03   -4.39059e+01    2.16985e+03


M E G A - F L O P S   A C C O U N T I N G

   RF=Reaction-Field  FE=Free Energy  SCFE=Soft-Core/Free Energy
   T=TabulatedW3=SPC/TIP3pW4=TIP4p (single or pairs)
   NF=No Forces

 Computing:                         M-Number         M-Flops  % Flops
-----------------------------------------------------------------------
 Generalized Born Coulomb           0.006162           0.296     0.2
 GB Coulomb + LJ                    0.446368          27.228    19.8
 Outer nonbonded loop               0.015554           0.156     0.1
 Born radii (HCT/OBC)               0.452530          82.813    60.3
 Born force chain rule              0.452530           6.788     4.9
 NS-Pairs                           0.940291          19.746    14.4
 Reset In Box                       0.003179           0.010     0.0
 CG-CoM                             0.006358           0.019     0.0
 Virial                             0.003899           0.070     0.1
 Stop-CM                            0.006358           0.064     0.0
 Calc-Ekin                          0.006358           0.172     0.1
-----------------------------------------------------------------------
 Total                                               137.361   100.0
-----------------------------------------------------------------------
-


D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S

 av. #atoms communicated per step for force:  2 x 7369.0


 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number   G-Cycles   Seconds     %
-----------------------------------------------------------------------
 Domain decomp.        16        1      0.210       0.1    11.4
 Comm. coord.          16        1      0.006       0.0     0.3
 Neighbor search       16        1      0.118       0.1     6.4
 Force                 16        1      1.319       0.8    71.4
 Wait + Comm. F        16        1      0.016       0.0     0.9
 Update                16        1      0.003       0.0     0.2
 Comm. energies        16        1      0.093       0.1     5.0
 Rest                  16               0.082       0.1     4.4
-----------------------------------------------------------------------
 Total                 16               1.847       1.1   100.0
-----------------------------------------------------------------------

NOTE: 5 % of the run time was spent communicating energies,
  you might want to use the -gcom option of mdrun


Parallel run - timing based on wallclock.

               NODE (s)   Real (s)      (%)
       Time:      0.036      0.036    100.0
               (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
Performance:     12.702      3.856      2.425      9.896
Finished mdrun on node 0 Wed Aug 29 02:32:21 2012



The GB energy value reported is half of that reported by Amber 11 and
the Octree-based molecular dynamics package.

I guess the difference could be due to the different algorithms they
use, but there could be some other reason.
If anyone knows the possible reasons behind this, please let me know.
Maybe fixing them will give me the same value across the different
molecular dynamics packages.

Best Regards,
Jesmin

On Mon, Aug 27, 2012 at 6:25 PM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 28/08/2012 2:33 AM, jesmin jahan wrote:

 Dear All,

 I am using Gromacs 4.5.3 for GB-Polarization Energy Calculation. As I
 am not interested in any other energy terms right now, I have set all
 the non-bonded parameters to 0.

 I am also calculating GB polarization energy using other available
 Molecular Dynamic Packages and doing a comparative study between them
 (say: Accuracy Vs. Speed Up).
 I have already used Gromacs for calculating GB-energy for 168
 different Protein molecules and the energy values reported

Re: [gmx-users] Problem with incorrect GB-Polarization Energy Value

2012-08-29 Thread jesmin jahan
Oops!

Thanks, Justin, for your quick reply.
Sorry, I attached a log file from a previous run. I am attaching
the correct log file here. Please have a look.

Actually, I am a Computer Science student. I do not have much
background in molecular dynamics.
I am using these three commands:

pdb2gmx -f 1F15-full.pdb -ter -ignh -ff amber99 -water none
grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
OMP_NUM_THREADS=12 mdrun -nt 16 -s imd.tpr

and my .mdp file is like this:

constraints =  none
integrator  =  md
pbc =  no
dt  =  0.001   ; ps
nsteps  =  0 ; 10 ps = 100 ns
rcoulomb= 1
rvdw= 1
rlist   =1
nstgbradii  = 1
rgbradii= 1
implicit_solvent=  GBSA
gb_algorithm=  HCT ; OBC ; Still
sa_algorithm=  None


What else might go wrong?

Thanks,
Jesmin

On Wed, Aug 29, 2012 at 11:14 AM, Justin Lemkul jalem...@vt.edu wrote:


 On 8/29/12 11:11 AM, jesmin jahan wrote:

 Thanks Mark for your reply.

 For the time being, I accept your point that I am comparing apples with
 oranges.
 So, to investigate further, I ran the simulation without any modification
 to the parameters or the force field I am using. My test data is a CMV
 virus shell.
 I am using the following commands.

 pdb2gmx -f 1F15-full.pdb -ter -ignh -ff amber99 -water none
 grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
 OMP_NUM_THREADS=12 mdrun -nt 16 -s imd.tpr


 The log file looks like this:
   :-)  G  R  O  M  A  C  S  (-:

 GROningen MAchine for Chemical Simulation

 :-)  VERSION 4.6-dev-20120820-87e5bcf  (-:

  Written by Emile Apol, Rossen Apostolov, Herman J.C. Berendsen,
Aldert van Buuren, Pär Bjelkmar, Rudi van Drunen, Anton Feenstra,
  Gerrit Groenhof, Peter Kasson, Per Larsson, Pieter Meulenhoff,
 Teemu Murtola, Szilard Pall, Sander Pronk, Roland Schulz,
  Michael Shirts, Alfons Sijbers, Peter Tieleman,

 Berk Hess, David van der Spoel, and Erik Lindahl.

 Copyright (c) 1991-2000, University of Groningen, The Netherlands.
  Copyright (c) 2001-2010, The GROMACS development team at
  Uppsala University  The Royal Institute of Technology, Sweden.
  check out http://www.gromacs.org for more information.

   This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
   as published by the Free Software Foundation; either version 2
   of the License, or (at your option) any later version.

:-)  mdrun_mpi  (-:


  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
 B. Hess and C. Kutzner and D. van der Spoel and E. Lindahl
 GROMACS 4: Algorithms for highly efficient, load-balanced, and scalable
 molecular simulation
 J. Chem. Theory Comput. 4 (2008) pp. 435-447
   --- Thank You ---  


  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
 D. van der Spoel, E. Lindahl, B. Hess, G. Groenhof, A. E. Mark and H. J.
 C.
 Berendsen
 GROMACS: Fast, Flexible and Free
 J. Comp. Chem. 26 (2005) pp. 1701-1719
   --- Thank You ---  


  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
 E. Lindahl and B. Hess and D. van der Spoel
 GROMACS 3.0: A package for molecular simulation and trajectory analysis
 J. Mol. Mod. 7 (2001) pp. 306-317
   --- Thank You ---  


  PLEASE READ AND CITE THE FOLLOWING REFERENCE 
 H. J. C. Berendsen, D. van der Spoel and R. van Drunen
 GROMACS: A message-passing parallel molecular dynamics implementation
 Comp. Phys. Comm. 91 (1995) pp. 43-56
   --- Thank You ---  

 Input Parameters:
 integrator   = md
 nsteps   = 0
 init-step= 0
 ns-type  = Grid
 nstlist  = 10
 ndelta   = 2
 nstcomm  = 10
 comm-mode= Linear
 nstlog   = 1000
 nstxout  = 0
 nstvout  = 0
 nstfout  = 0
 nstcalcenergy= 10
 nstenergy= 100
 nstxtcout= 0
 init-t   = 0
 delta-t  = 0.001
 xtcprec  = 1000
 nkx  = 0
 nky  = 0
 nkz  = 0
 pme-order= 4
 ewald-rtol   = 1e-05
 ewald-geometry   = 0
 epsilon-surface  = 0
 optimize-fft = FALSE
 ePBC = no
 bPeriodicMols= FALSE
 bContinuation= FALSE
 bShakeSOR= FALSE
 etc  = No
 bPrintNHChains   = FALSE
 nsttcouple   = -1
 epc  = No
 epctype

[gmx-users] Problem with incorrect GB-Polarization Energy Value

2012-08-29 Thread jesmin jahan
 type 'Protein_chain_C20'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A21'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B21'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C21'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A22'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B22'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C22'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A23'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B23'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C23'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A24'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B24'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C24'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A25'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B25'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C25'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A26'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B26'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C26'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A27'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B27'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C27'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A28'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B28'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C28'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A29'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B29'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C29'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A30'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B30'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C30'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A31'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B31'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C31'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A32'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B32'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C32'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A33'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B33'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C33'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A34'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B34'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C34'
Excluding 3 bonded neighbours molecule type 'Protein_chain_A35'
Excluding 3 bonded neighbours molecule type 'Protein_chain_B35'
Excluding 3 bonded neighbours molecule type 'Protein_chain_C35'
... so on.

NOTE 5 [file topol.top, line 388]:
  System has non-zero total charge: 780.04
  Total charge should normally be an integer. See
  http://www.gromacs.org/Documentation/Floating_Point_Arithmetic
  for discussion on how close it should be to an integer.



Analysing residue names:
There are: 32280Protein residues
Analysing Protein...
Number of degrees of freedom in T-Coupling group rest is 1529097.00
This run will generate roughly 39 Mb of data

There were 5 notes

Back Off! I just backed up imd.tpr to ./#imd.tpr.1#

gcq#97: The Universe is Somewhere In Here (J.G.E.M. Fraaije)


I was only interested in the non-bonded terms (especially the GB energy), so I
guess the exclusion of bonded terms is not a problem.

Thanks,
Jesmin

On Wed, Aug 29, 2012 at 12:09 PM, Justin Lemkul jalem...@vt.edu wrote:


 On 8/29/12 11:27 AM, jesmin jahan wrote:

 Ops!

 Thanks Justin for you quick reply.
 Sorry, I have attached a log file from previous run. I am attaching
 the correct log file here. Please have a look.


 I don't see a new .log file attached anywhere.


 Actually, I am a Computer Science student. I do not have enough
 background of Molecular Dynamics.
 I am using these three commands and

 pdb2gmx -f 1F15-full.pdb -ter -ignh -ff amber99 -water none
 grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
 OMP_NUM_THREADS=12 mdrun -nt 16 -s imd.tpr

 and my .mdp file is like this:

 constraints =  none
 integrator  =  md
 pbc =  no
 dt  =  0.001   ; ps
 nsteps  =  0 ; 10 ps = 100 ns
 rcoulomb= 1
 rvdw= 1
 rlist   =1
 nstgbradii  = 1
 rgbradii= 1
 implicit_solvent=  GBSA
 gb_algorithm=  HCT ; OBC ; Still
 sa_algorithm=  None


 What else might go wrong?


 The normal workflow includes energy minimization before running MD.  Basic
 tutorial material covers this.  Without EM, you assume that whatever
 structure you're using is suitable for MD, which may or may not be true.
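 
 As a rough sketch (not a tested protocol; the parameters would need checking
 for an implicit-solvent, non-periodic system), a minimal minimization step
 before the single-point run could look like:
 
 ; em.mdp
 integrator       =  steep
 emtol            =  1000.0
 nsteps           =  500
 pbc              =  no
 implicit_solvent =  GBSA
 
 grompp -f em.mdp -c conf.gro -p topol.top -o em.tpr
 mdrun -deffnm em
 grompp -f mdr.mdp -c em.gro -p topol.top -o imd.tpr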

 -Justin


 Thanks,
 Jesmin

 On Wed, Aug 29, 2012

Re: [gmx-users] Problem with incorrect GB-Polarization Energy Value

2012-08-29 Thread jesmin jahan
    5.62500e-01   -2.12500e+00
    3.28125e-01   -1.57464e+07    3.25000e+00
   -3.67188e+00   -2.68750e+00   -1.57464e+07

   Pressure (bar)
        0.0e+00        0.0e+00        0.0e+00
        0.0e+00        0.0e+00        0.0e+00
        0.0e+00        0.0e+00        0.0e+00

   Total Dipole (D)
    9.76562e-04        0.0e+00    1.95312e-03


M E G A - F L O P S   A C C O U N T I N G

   RF=Reaction-Field  FE=Free Energy  SCFE=Soft-Core/Free Energy
   T=TabulatedW3=SPC/TIP3pW4=TIP4p (single or pairs)
   NF=No Forces

 Computing:                         M-Number         M-Flops  % Flops
-----------------------------------------------------------------------
 Generalized Born Coulomb           1.626204          78.058     0.3
 GB Coulomb + LJ                   73.629096        4491.375    17.3
 Outer nonbonded loop               1.962706          19.627     0.1
 1,4 nonbonded interactions         1.348860         121.397     0.5
 Born radii (HCT/OBC)              78.053220       14283.739    55.0
 Born force chain rule             78.053220        1170.798     4.5
 NS-Pairs                         245.058526        5146.229    19.8
 Reset In Box                       0.509700           1.529     0.0
 CG-CoM                             1.019400           3.058     0.0
 Bonds                              0.514800          30.373     0.1
 Angles                             0.934260         156.956     0.6
 Propers                            1.742760         399.092     1.5
 Virial                             0.510420           9.188     0.0
 Stop-CM                            1.019400          10.194     0.0
 Calc-Ekin                          1.019400          27.524     0.1
-----------------------------------------------------------------------
 Total                                             25949.137   100.0
-----------------------------------------------------------------------


D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S

 av. #atoms communicated per step for force:  2 x 103046.0


 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number   G-Cycles   Seconds     %
-----------------------------------------------------------------------
 Domain decomp.        16        1      2.870       2.0     0.7
 Comm. coord.          16        1      0.943       0.6     0.2
 Neighbor search       16        1     20.102      13.7     5.0
 Force                 16        1    132.542      90.4    32.7
 Wait + Comm. F        16        1      2.315       1.6     0.6
 Update                16        1      0.130       0.1     0.0
 Comm. energies        16        1      0.090       0.1     0.0
 Rest                  16             246.272     167.9    60.8
-----------------------------------------------------------------------
 Total                 16             405.265     276.3   100.0
-----------------------------------------------------------------------

Parallel run - timing based on wallclock.

               NODE (s)   Real (s)      (%)
       Time:      8.635      8.635    100.0
               (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
Performance:      8.715      3.005      0.010   2398.708
Finished mdrun on node 0 Wed Aug 29 09:58:22 2012




Thanks,
Jesmin

On Wed, Aug 29, 2012 at 1:11 PM, Justin Lemkul jalem...@vt.edu wrote:


 On 8/29/12 1:06 PM, jesmin jahan wrote:

 Dear Justin,

 Thanks for your reply.
 Here is the CMV.log file . Please check it.


 What you've posted is output from grompp.  Note that if you're trying to
 send attachments, the list rejects them.



 Actually, the .pdb file I am using is already minimized and we are
 using the same file for amber 11 and Octree based molecular dynamic
 package.


 Something doesn't add up.  The energy values were indicative of a completely
 unphysical system.


 I will also do the minimization step to see what happens.

  One thing I also want to mention is that when I run
    grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
  I get the following notes in the log.

 NOTE 1 [file mdr.mdp]:
Tumbling and or flying ice-cubes: We are not removing rotation around
center of mass in a non-periodic system. You should probably set
comm_mode = ANGULAR.


 For a single-point energy evaluation this probably isn't significant.



 NOTE 2 [file mdr.mdp]:
You are using a cut-off for VdW interactions with NVE, for good energy
conservation use vdwtype = Shift (possibly with DispCorr)


 NOTE 3 [file mdr.mdp]:
You are using a cut-off for electrostatics with NVE, for good energy
conservation use coulombtype = PME-Switch or Reaction-Field-zero



 Finite cutoffs do have a significant effect on implicit-solvent calculations, but
 if you're doing this to remain consistent with other software, I suppose you

[gmx-users] Problem with OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi

2012-08-28 Thread jesmin jahan
Dear All,

I have installed gromacs VERSION 4.6-dev-20120820-87e5bcf with
-DGMX_MPI=ON. I am assuming that, since OpenMP is on by default, it will be
built automatically.

My Compiler is
/opt/apps/intel11_1/mvapich2/1.6/bin/mpicc Intel icc (ICC) 11.1 20101201

And I am using OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi -s imd.tpr

I was hoping this would run 16 processes, each with 12 threads.
However, in the log file I saw something like this:

 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number   G-Cycles   Seconds     %
-----------------------------------------------------------------------
 Domain decomp.        16        1      0.027       0.0     1.8
 Comm. coord.          16        1      0.002       0.0     0.1
 Neighbor search       16        1      0.113       0.1     7.7
 Force                 16        1      1.236       0.8    83.4
 Wait + Comm. F        16        1      0.015       0.0     1.0
 Update                16        1      0.005       0.0     0.4
 Comm. energies        16        1      0.008       0.0     0.5
 Rest                  16               0.076       0.0     5.1
-----------------------------------------------------------------------
 Total                 16               1.481       0.9   100.0
-----------------------------------------------------------------------


It's not clear whether each of the 16 nodes runs 12 threads internally or not.

If anyone knows about this, please let me know.

Thanks for help.

Best Regards,
Jesmin



--
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Problem with incorrect GB-Polarization Energy Value

2012-08-27 Thread jesmin jahan
Dear All,

I am using Gromacs 4.5.3 for GB polarization energy calculation. As I
am not interested in any other energy terms right now, I have set all
the non-bonded parameters to 0.

I am also calculating the GB polarization energy using other available
molecular dynamics packages and doing a comparative study between them
(say, accuracy vs. speed-up).
I have already used Gromacs to calculate the GB energy for 168
different protein molecules, and the energy values reported were more
or less the same as reported by others.

Now, I am using a virus shell as input in this process. It contains
1.5 million atoms. Unfortunately, this time, the energy reported is
almost half of the value reported by others.
So, I am a little bit confused. Am I doing something wrong? I have
heard previously that there is no max size for Gromacs.


If anyone has encountered a similar kind of problem or has knowledge
about this, please let me know.

Thanks,
Jesmin



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Question about Multi-level parallelization: MPI and OpenMP

2012-08-27 Thread jesmin jahan
Dear Gromacs Users,

I have two questions about the multi-level parallelism of Gromacs.
http://www.gromacs.org/Documentation/Acceleration_and_parallelization?highlight=verlet#GPU_acceleration

1. Is this feature only supported by Gromacs 4.6, or can we get it
in 4.5.3 also?

2. In Gromacs 4.5.3, I have used the command OMP_NUM_THREADS=12 mpirun -np 16
mdrun_mpi -s imd.tpr to run 16 GROMACS processes, each with 12
threads.

In the log file, I can see

 nodeid: 0  nnodes:  16

It's clear that 16 nodes are being used in the workload distribution. But
it's not clear whether 12 threads are being used in each of those
processes/nodes, because there is no mention of the number of
threads.

Does anyone know about this?

Sincerely.

Jesmin

-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


Re: [gmx-users] Question about Multi-level parallelization: MPI and OpenMP

2012-08-27 Thread jesmin jahan
Okay Mark, thanks!

Best Regards,
Jesmin

On Mon, Aug 27, 2012 at 6:15 PM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 28/08/2012 2:40 AM, jesmin jahan wrote:

 Dear Gromacs Users,

 I have two questions about the multi-level parallelism of Gromacs.

 http://www.gromacs.org/Documentation/Acceleration_and_parallelization?highlight=verlet#GPU_acceleration

 1. Is this feature is only supported by Gromacs 4.6? Or we can get it
 in 4.5.3 also?


 That section of that page is pretty specific about versions. The shiny new
 stuff is due in 4.6.


 2. In Gromacs 4.5.3, I have used OMP_NUM_THREADS=12 mpirun -np 16
 mdrun_mpi -s imd.tpr command to run 16 gromacs processes each with 12
 threads.

 In the log file, I can see

   nodeid: 0  nnodes:  16

 Its clear that 16 nodes are being used in work load distribution. But
 its not clear whether 12 thread is being used in each of those
 processes /nodes because there is no mention about the number of
 threads.

 Does anyone know about this?


 Again from that page: With GROMACS 4.6 OpenMP multithreading is
 supported...

 Mark



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Error:The selected GPU (#0, Tesla M2090) is not supported by Gromacs!

2012-08-25 Thread jesmin jahan
Dear all,
While running mdrun-gpu, I got the following error:

The selected GPU (#0, Tesla M2090) is not supported by Gromacs!
However, the Gromacs site says that the Tesla M2090 is supported.

Then I used the command mdrun-gpu -device
OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=yes -s
imd.tpr.
I got the following warning. I am using this for calculating the
GB polarization energy, but the result does not contain the
GB polarization energy at all.
WARNING: Non-supported GPU selected (#0, Tesla M2090), forced
continuing.Note, that the simulation can be slow or it migth even
crash.

Started mdrun on node 0 Sat Aug 25 00:04:18 2012

           Step           Time         Lambda
              0            0.0            0.0

   Energies (kJ/mol)
      Potential    Kinetic En.   Total Energy    Temperature
   -2.22432e+05        0.0e+00   -2.22432e+05        0.0e+00

If anyone has ever experienced this kind of problem, or knows the
solution, please let me know. Thanks in advance.
Thanks,
Jesmin


-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


Re: [gmx-users] Error:The selected GPU (#0, Tesla M2090) is not supported by Gromacs!

2012-08-25 Thread jesmin jahan
Hi Justin,

Thanks for your reply.

Do you mean that Gromacs does not support the Tesla M2090?

I have used the force-device=yes option. But the problem is that it runs
but does not give me the GB polarization energy; it only gives the
potential energy.
My intention was to calculate the GB energy. In the .mdp file I added:

constraints =  none
integrator  =  md
pbc =  no
dt  =  0.001   ; ps
nsteps  =  0 ; 10 ps = 100 ns
rcoulomb= 1
rvdw= 1
rlist   =1
nstgbradii  = 1
rgbradii= 1
implicit_solvent=  GBSA
gb_algorithm=  OBC ; Still
sa_algorithm=  None


Thanks,
Jesmin

On Sat, Aug 25, 2012 at 11:25 AM, Justin Lemkul jalem...@vt.edu wrote:


 On 8/25/12 10:57 AM, jesmin jahan wrote:

 Dear all,
 I got the following error while running mdrun-gpu, I got the following
 error:

 The selected GPU (#0, Tesla M2090) is not supported by Gromacs!
 Although in the Gromacs site , it says that Tesla M2090 is supported.

 Then,  I have used mdrun-gpu -device
 OpenMM:platform=Cuda,memtest=15,deviceid=0,force-device=yes -s
 imd.tpr command.
 I got the following warning and I am using this for calculating
 GB-polarization energy, but the result does not contains
 GB-polarization energy at all.
 WARNING: Non-supported GPU selected (#0, Tesla M2090), forced
 continuing.Note, that the simulation can be slow or it migth even
 crash.

 Started mdrun on node 0 Sat Aug 25 00:04:18 2012

 Step   Time Lambda
00.00.0

 Energies (kJ/mol)
PotentialKinetic En.   Total EnergyTemperature
 -2.22432e+050.0e+00   -2.22432e+050.0e+00

 If anyone has ever experienced this kind of problem, or knows the
 solution, please let me know. Thanks in advance.


 The list is simply outdated and will be updated in the next version.  Your
 command line (using force-device=yes) is the only workaround.

 -Justin

 --
 

 Justin A. Lemkul, Ph.D.
 Research Scientist
 Department of Biochemistry
 Virginia Tech
 Blacksburg, VA
 jalemkul[at]vt.edu | (540) 231-9080
 http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

 



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Problem with The requested platform CUDA could not be found

2012-08-23 Thread jesmin jahan
Dear Friends,

I have downloaded the pre-compiled binary version of OpenMM 4.1.1 from
https://simtk.org/. It is the precompiled OpenMM library for 64-bit
Linux, supported on NVIDIA GPUs with CUDA Toolkit 4.1 and on AMD GPUs
with APP SDK 2.4.

I have compiled Gromacs 4.5.5 with CUDA 4.1 and linked it against this
pre-compiled library.

export PATH=$PATH:/home1//OpenMM4.1.1-Linux64/OpenMM4.1.1-Linux64
export OPENMM_ROOT_DIR=/home1//OpenMM4.1.1-Linux64/OpenMM4.1.1-Linux64
export 
OPENMM_PLUGIN_DIR=/home1//OpenMM4.1.1-Linux64/OpenMM4.1.1-Linux64/lib/plugins
export 
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home1//OpenMM4.1.1-Linux64/OpenMM4.1.1-Linux64/lib
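
As a sanity check (just a diagnostic sketch), one can list the plugin
directory and look for unresolved shared-library dependencies of the
plugins, e.g.:

ls $OPENMM_PLUGIN_DIR
ldd $OPENMM_PLUGIN_DIR/*.so | grep "not found"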

But if I run mdrun-gpu with some .tpr file, I get the following error:
--
Program mdrun-gpu, VERSION 4.5.5-dev
Source code file:
/home1/01945/jesmin/gromacs/src/kernel/openmm_wrapper.cpp, line: 1272

Fatal error:
The requested platform CUDA could not be found.
---

I saw in the gmx-users email archive that lots of people have faced a
similar kind of problem, and the most probable reason is a mismatched
CUDA version between OpenMM and Gromacs. But I think in my case I am
using the same version.

What else might go wrong? If anyone knows about this problem, please
reply.

Thanks,
Jesmin

-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Problem with Gromacs 4.6 installation

2012-08-22 Thread jesmin jahan
Dear All,

I am trying to install Gromacs 4.6 with -DGMX_OPENMM=ON

I am getting the following errors from make install-mdrun:

../mdlib/libmd_openmm.so.6: undefined reference to `omp_get_thread_num'
../mdlib/libmd_openmm.so.6: undefined reference to `omp_get_num_threads'
../mdlib/libmd_openmm.so.6: undefined reference to `GOMP_parallel_start'
../mdlib/libmd_openmm.so.6: undefined reference to `GOMP_loop_end_nowait'
../mdlib/libmd_openmm.so.6: undefined reference to
`GOMP_loop_ordered_static_start'
../mdlib/libmd_openmm.so.6: undefined reference to `GOMP_parallel_end'
../mdlib/libmd_openmm.so.6: undefined reference to `GOMP_barrier'
../mdlib/libmd_openmm.so.6: undefined reference to
`GOMP_loop_ordered_static_next'
collect2: ld returned 1 exit status
make[3]: *** [src/kernel/mdrun-openmm] Error 1
make[2]: *** [src/kernel/CMakeFiles/mdrun.dir/all] Error 2
make[1]: *** [src/kernel/CMakeFiles/install-mdrun.dir/rule] Error 2
make: *** [install-mdrun] Error 2
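
(Those undefined symbols — omp_* and GOMP_* — appear to come from the GNU
OpenMP runtime, libgomp, so perhaps the mdrun-openmm link line is not pulling
that library in. One untested thing to try might be passing the OpenMP flag
to the linker explicitly at configure time, e.g.:

cmake .. -DGMX_OPENMM=ON -DCMAKE_EXE_LINKER_FLAGS="-fopenmp"
)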


Is there anyone out there who has faced the same kind of problem and
knows the solution?
Any kind of help will be highly appreciated.

Thanks,
Jesmin

-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] Problem with Gromacs installation with GPU.

2012-08-21 Thread jesmin jahan
Dear All,

I have installed gromacs 4.5.3  on a cluster. I downloaded the
gromacs-4.5-GPU-beta2-X86_64 binaries and followed the following
instructions:

* INSTALLING FROM BINARY DISTRIBUTION:

0. Prerequisites:
- OpenMM (included in the binary release)
- NVIDIA CUDA libraries (version >= 3.0);
- NVIDIA driver (for details on compatibility consult
  http://www.nvidia.com/Download/index5.aspx);
- NVIDIA CUDA-enabled GPU (for compatibility list see
  http://www.gromacs.org/gpu).


1. Download and unpack the binary package for the respective
OS and architecture. Copy the content of the package to your
normal GROMACS installation directory (or to a custom location).
Note that the distributed Gromacs-GPU packages do not contain
the entire set of tools and utilities included in a full GROMACS
installation. Therefore, it is recommended to have a ≥v4.5
standard Gromacs installation alongside the GPU-accelerated one.
e.g. on unix:

tar -xvf gromacs-4.5-GPU.tar.gz
cp -R gromacs-4.5-GPU/* PATH_TO_GROMACS_INSTALLATION


2. Add the openmm/lib directory to your library path, e.g. in bash:

export LD_LIBRARY_PATH=PATH_TO_GROMACS/openmm/lib:$LD_LIBRARY_PATH

If there are other OpenMM versions installed, make sure that the
supplied libraries have preference when running mdrun-gpu.
Also, make sure that the CUDA libraries installed match the version
of CUDA with which GROMACS-GPU is compiled.


3. Set the OPENMM_PLUGIN_DIR environment variable to contain the path
to the openmm/lib/plugins directory, e.g. in bash:

export OPENMM_PLUGIN_DIR=PATH_TO_GROMACS/openmm/lib/plugins


4. At this point, running the command:

PATH_TO_GROMACS/bin/mdrun-gpu -h

should display the standard mdrun help which means that all the
necessary libraries are accessible.




But running the PATH_TO_GROMACS/bin/mdrun-gpu -h command gives me

/home1/01945/jesmin/Gromacs/4.5.3/bin/mdrun-gpu: error while loading
shared libraries: libcudart.so.3: cannot open shared object file: No
such file or directory

If I actually look into the CUDA library directory, I find the
following files:

libcublas.so        libcudart.so        libcufft.so        libcuinj.so
libcurand.so        libcusparse.so      libnpp.so
libcublas.so.4      libcudart.so.4      libcufft.so.4      libcuinj.so.4
libcurand.so.4      libcusparse.so.4    libnpp.so.4
libcublas.so.4.2.9  libcudart.so.4.2.9  libcufft.so.4.2.9  libcuinj.so.4.2.9
libcurand.so.4.2.9  libcusparse.so.4.2.9  libnpp.so.4.2.9
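
So it looks as if the binary was built against the CUDA 3.x runtime, while
only the 4.x libraries are installed here. A quick way to see every library
the binary fails to resolve (just a diagnostic) is:

ldd /home1/01945/jesmin/Gromacs/4.5.3/bin/mdrun-gpu | grep "not found"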

Does anyone know how to fix this problem?

Please let me know.


Then I installed the Gromacs-OpenMM precompiled version, which used
CUDA 4.1, following these instructions:


To install Gromacs-OpenMM, follow these steps.

1. Install Gromacs 4.

2. Install OpenMM.

3. Copy mdrun-openmm to gromacs/bin, where gromacs is the root
directory of your Gromacs installation (e.g. /usr/local/gromacs).

4. Copy params.agb to gromacs/share/gromacs/top.

5. Add the OpenMM lib directory to your library path (PATH on Windows,
DYLD_LIBRARY_PATH on Mac OS X, LD_LIBRARY_PATH on Linux).

6. If you have an Nvidia GPU, install CUDA
(http://www.nvidia.com/object/cuda_get.html).  Make sure the CUDA lib
directory is in your library path.

7. If you installed OpenMM in the default location (/usr/local/openmm
on Mac OS X or Linux, Program Files\OpenMM on Windows), no further
steps are required.  If you installed it in a different location, set
the environment variable OPENMM_PLUGIN_DIR to point to the OpenMM
plugin directory (e.g. /usr/local/openmm/lib/plugins).


Note that I do not have any /usr/local directory; my home is
/home1/01945/jesmin.

Now if I type

mdrun-openmm -h, I get the following error:


/home1/01945/jesmin/Gromacs/4.5.3/bin/mdrun-openmm:
/home1/01945/jesmin/cilk/lib64/libstdc++.so.6: version
`GLIBCXX_3.4.11' not found (required by
/home1/01945/jesmin/Gromacs/4.5.3/openmm/lib/libOpenMM.so)

Can anyone help me with how to solve this problem?

Thanks,
Jesmin
-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


Re: [gmx-users] Problem with Gromacs installation with GPU.

2012-08-21 Thread jesmin jahan
Dear  Szilárd

I have downloaded Gromacs 4.6 from git. But I saw that the implicit
solvent feature is still not supported:


Features currently not supported by the new GPU and SSE kernels:
Implicit solvent (but this will still be supported on the GPU through OpenMM)

But I need the implicit solvent feature for my experiments. I went to
https://simtk.org/project/ where I found GromacsOpenmm2.0. But that
one is also deprecated!
So it's not clear to me what is meant by "but this will still be
supported on the GPU through OpenMM".
How do I add this feature with Gromacs 4.6?

Or is there any version of Gromacs which supports implicit solvent
simulation on GPU? Please let me know.

Thanks,
Jesmin





On Tue, Aug 21, 2012 at 4:24 PM, Szilárd Páll szilard.p...@cbr.su.se wrote:

 Dear Jesmin,

 On Tue, Aug 21, 2012 at 7:21 PM, jesmin jahan shraba...@gmail.com wrote:
  Dear All,
 
  I have installed gromacs 4.5.3  on a cluster. I downloaded the
  gromacs-4.5-GPU-beta2-X86_64 binaries and followed the following
  instructions:

 Those binaries are extremely outdated. Please compile Gromacs from the
 latest source release or the latest git version from
 release-4-5-patches.

 Cheers,
 --
 Szilárd


  
  * INSTALLING FROM BINARY DISTRIBUTION:
 
  0. Prerequisites:
  - OpenMM (included in the binary release)
  - NVIDIA CUDA libraries (version =3.0);
  - NVIDIA driver (for details on compatiblity consult
http://www.nvidia.com/Download/index5.aspx);
  - NVIDIA CUDA-enabled GPU (for compatiblity list see
http://www.gromacs.org/gpu).
 
 
  1. Download and unpack the binary package for the respective
  OS and architecture. Copy the content of the package to your
  normal GROMACS installation directory (or to a custom location).
  Note that as the distributed Gromacs-GPU packages do not contain
  the entire set of tools and utilities included in a full GROAMCS
  installation. Therefore, it is recommended to have a ≥v4.5
  standard Gromacs installation along the GPU accelerated one.
  e.g. on unix:
 
  tar -xvf gromacs-4.5-GPU.tar.gz
  cp -R gromacs-4.5-GPU/* PATH_TO_GROMACS_INSTALLATION
 
 
  2. Add the openmm/lib directory to your library path, e.g. in bash:
 
  export LD_LIBRARY_PATH=PATH_TO_GROMACS/openmm/lib:$LD_LIBRARY_PATH
 
  If there are other OpenMM versions installed, make sure that the
  supplied libraries have preference when running mdrun-gpu.
  Also, make sure that the CUDA libraries installed match the version
  of CUDA with which GROMACS-GPU is compiled.
 
 
  3. Set the OPENMM_PLUGIN_DIR environment variable to contain the path
  to the openmm/lib/plugins directory, e.g. in bash:
 
  export OPENMM_PLUGIN_DIR=PATH_TO_GROMACS/openmm/lib/plugins
 
 
  4. At this point, running the command:
 
  PATH_TO_GROMACS/bin/mdrun-gpu -h
 
  should display the standard mdrun help which means that all the
  necessary libraries are accessible.
 
 
  
 
  But running the PATH_TO_GROMACS/bin/mdrun-gpu -h command gives me
 
  /home1/01945/jesmin/Gromacs/4.5.3/bin/mdrun-gpu: error while loading
  shared libraries: libcudart.so.3: cannot open shared object file: No
  such file or directory
 
  So, if I actually look into the cuda library directory I found the
  following files:
 
  libcublas.solibcudart.solibcufft.solibcuinj.so
 libcurand.solibcusparse.solibnpp.so
  libcublas.so.4  libcudart.so.4  libcufft.so.4
  libcuinj.so.4  libcurand.so.4  libcusparse.so.4
  libnpp.so.4
  libcublas.so.4.2.9  libcudart.so.4.2.9  libcufft.so.4.2.9
  libcuinj.so.4.2.9  libcurand.so.4.2.9  libcusparse.so.4.2.9
  libnpp.so.4.2.9
 
  Is there anyone who knows about how to fix this problem?
 
  Please let me know.
 
 
  Then, I installed the Gromacs-openmm precompiled version which used
  cuda 4.1  following the instructions
  
 
  To install Gromacs-OpenMM, follow these steps.
 
  1. Install Gromacs 4.
 
  2. Install OpenMM.
 
  3. Copy mdrun-openmm to gromacs/bin, where gromacs is the root
  directory of your Gromacs installation (e.g. /usr/local/gromacs).
 
  4. Copy params.agb to gromacs/share/gromacs/top.
 
  5. Add the OpenMM lib directory to your library path (PATH on Windows,
  DYLD_LIBRARY_PATH on Mac OS X, LD_LIBRARY_PATH on Linux).
 
  6. If you have an Nvidia GPU, install CUDA
  (http://www.nvidia.com/object/cuda_get.html).  Make sure the CUDA lib
  directory is in your library path.
 
  7. If you installed OpenMM in the default location (/usr/local/openmm
  on Mac OS X or Linux, Program Files\OpenMM on Windows), no further
  steps are required.  If you installed it in a different location, set
  the environment variable OPENMM_PLUGIN_DIR to point to the OpenMM
  plugin directory (e.g. /usr/local/openmm/lib/plugins).
 
  
  Note that I do not have any /usr/local directory. my home is
  /home1/01945/jesmin.
 
  Now if I type
 
  mdrun-openmm -h, I get the following error:
 
 
  /home1/01945/jesmin/Gromacs/4.5.3/bin/mdrun-openmm:
  /home1

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-17 Thread jesmin jahan
Hi Mark,

Thanks for your reply.

In the ffnonbonded.itp, I set all sigma and epsilon values to zero.
So the LJ energy is coming out as zero, but the Coulomb potential is
nonzero. What should I do to make it zero?
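
(For reference, the edited lines in ffnonbonded.itp look roughly like the
following, with the sigma and epsilon columns set to zero; the atom type
shown is just an example:)

[ atomtypes ]
; name  at.num   mass     charge  ptype        sigma        epsilon
  N      7       14.01    0.0000  A      0.00000e+00    0.00000e+00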

   Energies (kJ/mol)
   GB Polarization        LJ (SR)   Coulomb (SR)      Potential    Kinetic En.
      -2.23121e+03        0.0e+00   -3.47729e+04   -3.70041e+04        0.0e+00
       Total Energy    Temperature Pressure (bar)
       -3.70041e+04        0.0e+00        0.0e+00


One more point: I am not sure whether getting 0 for an energy
means that it is not being calculated at all.
It seems the energy value was calculated but the result was zero
because sigma and epsilon were zero. In that case, the time
reported by the program will also include the time of the extra
calculation (unless the program is smart enough to know beforehand
that the result is going to be zero and return from the top!). So,
when comparing with other molecular dynamics packages, it's not fair to
report that time for Gromacs, which also includes non-GB time (as that
can make Gromacs slower than others that do not include non-GB time).

Any suggestions about this? What is the fair approach for Gromacs?

Thanks,
Jesmin

On Fri, Aug 17, 2012 at 12:00 AM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 17/08/2012 1:14 PM, jesmin jahan wrote:

 Hi Mark,

 According to your advice   remove the  the bonded terms and zero the
 VDW parameters,
 I removed everything under [ bond] , [angles], [pairs] and [ dihedrals
 ],


 This only removes the bonded terms (in the sense of those atoms that
 interact because of the presence of bonds). The VDW parameters for
 non-bonded interactions are in ffnonbonded.itp for your force field. You
 should probably follow the advice here
 http://www.gromacs.org/Documentation/How-tos/Adding_a_Residue_to_a_Force_Field#Modifying_a_force_field
 to get a local copy you can change conveniently.


   and run the simulation mdrun rerun.

 I  got output something like the following:


 Energies (kJ/mol)
 GB PolarizationLJ (SR)   Coulomb (SR)  PotentialKinetic
 En.
 -2.23121e+037.54287e+07   -3.47729e+047.53917e+07
 0.0e+00
 Total EnergyTemperature Pressure (bar)
  7.53917e+070.0e+000.0e+00

 where the previous output was something like this:

 Energies (kJ/mol)
 Bond  AngleProper Dih.  Improper Dih.GB
 Polarization
  2.12480e+034.80088e+021.06648e+039.04861e+01
 -2.23122e+03
LJ-14 Coulomb-14LJ (SR)   Coulomb (SR)
 Potential
  7.05695e+025.47366e+03   -4.16856e+02   -8.74797e+03
 -1.45483e+03
  Kinetic En.   Total EnergyTemperature Pressure (bar)
  0.0e+00   -1.45483e+030.0e+000.0e+00



 Energies (kJ/mol)
 GB PolarizationLJ (SR)   Coulomb (SR)  PotentialKinetic
 En.
 -2.23121e+034.17621e+13   -3.47729e+044.17621e+13
 0.0e+00
 Total EnergyTemperature Pressure (bar)
  4.17621e+130.0e+000.0e+00


 So, you can see, although it has managed to remove some extra terms,
  the LJ and Coulomb potentials are still there. I searched for VDW
  parameters. Although I saw various options for VDW, it's not clear
  from the options how to turn it off. Could you kindly tell me more
 clearly about it?


 I was also looking into the forcefield.itp file. I set the gen-pairs
 to no , fudgeLJ 1 and fudgeQQ to 1 which were yes, .5 and .83
 respectively originally.

 [ defaults ]
 ; nbfunccomb-rule   gen-pairs   fudgeLJ fudgeQQ
 1   2  no 1 1

 Please let me know how to get rid of calculation of other energies
 (LJ, Culumb and Total Potential) and how to set the parameters for
 this properly.


 You can't get rid of the total. It's the total. You're trying to keep the
 (GB) Coulomb.

 Mark



 Thanks for your help.

 Sincerely,
 Jesmin
 On Thu, Aug 16, 2012 at 3:27 AM, Mark Abraham mark.abra...@anu.edu.au
 wrote:

 On 16/08/2012 5:08 PM, jesmin jahan wrote:

 Hi Mark,

 Thanks for your reply.
 If I open the .tpr file using notepad, it seems to be a binary file.
 Then, how to remove the  the bonded terms and zero the VDW parameters?


 In the .top file from which you made the .tpr. (And contributing .itp
 files)
 Parts of chapter 5 may help with this process.

 Mark


 I really need to compare how fast different well known package can
 compute GB-polarization energy and how good the energy values are?
 That's why time is an important factor me my experiments and I  really
 want to measure the time for GB energy in isolation !

 Thanks,
 Jesmin

 On Thu, Aug 16, 2012 at 2:44 AM, Mark Abraham mark.abra...@anu.edu.au
 wrote:

 On 16/08/2012 4:26 PM, jesmin jahan wrote:

 Hi Mark,

 Thanks for your previous reply.
 I tried to run single point energy simulation with some proteins.
 I got .log files with content like this:

 Energies (kJ/mol

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-17 Thread jesmin jahan
Okay thanks. I got it. :-)

Best Regards,
Jesmin

On Fri, Aug 17, 2012 at 11:17 PM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 18/08/12, *jesmin jahan * shraba...@gmail.com wrote:

 Hi Mark,

 Thanks for your reply.

 In the ffnonbonded.itp, I set all sigma and epsilon values to zero.
 So, the the LJ energy is coming as zero. But the  coulomb potential is
 non zero. What should I do to make it zero?


 Each charge-charge interaction contributes both a Coulomb term and a GB
 polarization term. They're computed at nearly the same time, because both
 need the distance, so both contribute to the same timing stats. They show up
 in the different energy terms that you see. There's no way to separate them
 - and any code that can do so is probably still slower than GROMACS doing
 both.


   Energies (kJ/mol)
 GB PolarizationLJ (SR)   Coulomb (SR)  Potential Kinetic En.
-2.23121e+030.0e+00   -3.47729e+04   -3.70041e+04 0.0e+00
Total EnergyTemperature Pressure (bar)
-3.70041e+040.0e+000.0e+00


 One more point: I am not sure whether getting the 0 for an energy
 does mean that it is not being calculated at all!


 Most of GROMACS is supposed to avoid computing zero interactions. It's not
 true for the all-vs-all GB loops, which you're not using, and currently not
 true for bonded interactions. Since nobody would be likely to compute
 without LJ in a real calculation, this is not a big deal from GROMACS' point
 of view. I'm still skeptical about what you're trying to measure.
 Performance of well-written code is generally not additive from its
 components. If you're later going to be computing on a real molecular
 system, you're going to have LJ and/or bonded terms, and how well the codes
 compute them *together with the Coulomb terms* is what you really want to
 measure. It's like measuring how long a top chef takes to prepare a meal,
 and then wondering how they manage to feed a whole restaurant on time. They
 do lots of time-slicing and get lots of efficiencies of scale that are not
 available if you're doing just one meal.


 It seems, the energy value was calculated but the result was  zero
 because the sigma and epsilon were zero. In that case, the time


 ...or that the LJ component of the energy is always present even if there
 are no interactions to compute for it, which I expect is the case.


 reported by the program will also include the time of the extra
 calculation (unless the program is smart enough to know before hand
 that the result is going to be zero and return from the top!).


 IIRC neighbour searching detects there's no LJ and so doesn't trigger loops
 that compute LJ.


 So,
 while comparing with other molecular dynamic packages, its not fair to
 report that time for gromacs which also includes non GB time (as that
 can make Gromacs slower than others who do not include non GB-time).

 Any suggestions about this? What is the fair approach for gromacs?


 Look at the flop break-down to see if GROMACS thinks it is computing LJ any
 more. Or do some real timing measurements and compare before and after.
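 
 For example, one quick check is to pull the flop-accounting table out of the
 run log:
 
 grep -A 20 "M E G A - F L O P S" md.log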

 Mark




 Thanks,
 Jesmin

 On Fri, Aug 17, 2012 at 12:00 AM, Mark Abraham mark.abra...@anu.edu.au
 wrote:
  On 17/08/2012 1:14 PM, jesmin jahan wrote:
 
  Hi Mark,
 
  According to your advice   remove the  the bonded terms and zero the
  VDW parameters,
  I removed everything under [ bond] , [angles], [pairs] and [ dihedrals
  ],
 
 
  This only removes the bonded terms (in the sense of those atoms that
  interact because of the presence of bonds). The VDW parameters for
  non-bonded interactions are in ffnonbonded.itp for your force field. You
  should probably follow the advice here
 
  http://www.gromacs.org/Documentation/How-tos/Adding_a_Residue_to_a_Force_Field#Modifying_a_force_field
  to get a local copy you can change conveniently.
 
 
and run the simulation mdrun rerun.
 
  I  got output something like the following:
 
 
  Energies (kJ/mol)
  GB PolarizationLJ (SR)   Coulomb (SR) PotentialKinetic
  En.
  -2.23121e+037.54287e+07   -3.47729e+04 7.53917e+07
  0.0e+00
  Total EnergyTemperature Pressure (bar)
   7.53917e+070.0e+000.0e+00
 
  where the previous output was something like this:
 
  Energies (kJ/mol)
  Bond  AngleProper Dih. Improper Dih.GB
  Polarization
   2.12480e+034.80088e+021.06648e+03 9.04861e+01
  -2.23122e+03
 LJ-14 Coulomb-14LJ (SR) Coulomb (SR)
  Potential
   7.05695e+025.47366e+03   -4.16856e+02 -8.74797e+03
  -1.45483e+03
   Kinetic En.   Total EnergyTemperature Pressure (bar)
   0.0e+00   -1.45483e+030.0e+00 0.0e+00
 
 
 
  Energies (kJ/mol)
  GB PolarizationLJ (SR)   Coulomb (SR) PotentialKinetic
  En.
  -2.23121e+034.17621e+13   -3.47729e+04 4.17621e+13
  0.0e+00
  Total EnergyTemperature Pressure (bar

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-16 Thread jesmin jahan
Hi Mark,

Thanks for your previous reply.
I tried to run single point energy simulation with some proteins.
I got .log files with content like this:

Energies (kJ/mol)
           Bond          Angle    Proper Dih.  Improper Dih. GB Polarization
    1.54109e+04    3.84351e+03    8.47152e+03    3.58425e+02   -1.69666e+04
          LJ-14     Coulomb-14        LJ (SR)   Coulomb (SR)      Potential
    4.29664e+03    3.63997e+04    2.22900e+05   -5.18818e+04    2.22832e+05
    Kinetic En.   Total Energy    Temperature Pressure (bar)
    1.08443e+09    1.08465e+09    2.73602e+07        0.0e+00
...

 Computing:                         M-Number         M-Flops  % Flops
-----------------------------------------------------------------------
 Generalized Born Coulomb           0.005711           0.274     0.2
 GB Coulomb + LJ                    0.416308          25.395    18.5
 Outer nonbonded loop               0.016367           0.164     0.1
 1,4 nonbonded interactions         0.008410           0.757     0.6
 Born radii (HCT/OBC)               0.439486          80.426    58.5
 Born force chain rule              0.439486           6.592     4.8
 NS-Pairs                           0.943653          19.817    14.4
 Reset In Box                       0.003179           0.010     0.0
 CG-CoM                             0.006358           0.019     0.0
 Bonds                              0.003219           0.190     0.1
 Angles                             0.005838           0.981     0.7
 Propers                            0.011273           2.582     1.9
 Virial                             0.003899           0.070     0.1
 Stop-CM                            0.003179           0.032     0.0
 Calc-Ekin                          0.006358           0.172     0.1
-----------------------------------------------------------------------
 Total                                               137.479   100.0
-----------------------------------------------------------------------


D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S

 av. #atoms communicated per step for force:  2 x 6859.0


 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number   G-Cycles   Seconds     %
-----------------------------------------------------------------------
 Domain decomp.        16        1      0.043       0.0     1.4
 Comm. coord.          16        1      0.003       0.0     0.1
 Neighbor search       16        1      0.103       0.0     3.5
 Force                 16        1      1.530       0.5    51.5
 Wait + Comm. F        16        1      0.264       0.1     8.9
 Write traj.           16        1      0.062       0.0     2.1
 Update                16        1      0.001       0.0     0.0
 Comm. energies        16        2      0.933       0.3    31.4
 Rest                  16               0.031       0.0     1.1
-----------------------------------------------------------------------
 Total                 16               2.970       0.9   100.0
-----------------------------------------------------------------------

NOTE: 31 % of the run time was spent communicating energies,
  you might want to use the -gcom option of mdrun


Parallel run - timing based on wallclock.

               NODE (s)   Real (s)      (%)
       Time:      0.056      0.056    100.0
               (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
Performance:      7.497      2.442      1.535     15.637


From the log file, it seems the time includes the time for the LJ and
Coulomb potential energy. But as I said before, I am only interested in
the GB-energy times. I am doing a comparative study of GB-energy
performance (values vs. time) for different molecular dynamics packages.

That's why I was trying to subtract the time for any other extra energy
computation from it.

Can anyone tell me how to get the exact time for the GB polarization energy
(including Born radii), excluding the times for any other
additional energy (like LJ and Coulomb) from a Gromacs simulation?


Thanks,
Jesmin



On Tue, Aug 14, 2012 at 10:16 AM, jesmin jahan shraba...@gmail.com wrote:
 Thanks Mark for your reply. I was trying to use Single-Point Energy
 Calculation as you advised in your first reply but for most of the
 files the simulation failed because I was using the original .pdb
 files in the mdrun command.

 Anyways. I really appreciate your help.
 Thanks again,
 Jesmin

 On Tue, Aug 14, 2012 at 1:26 AM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 14/08/2012 7:38 AM, jesmin jahan wrote:

 Dear Gromacs Users,

 I have some questions regarding GB-Polarization Energy Calculation
 with Gromacs. I will be grateful if someone can help me with the
 answers.

 I am trying to calculate GB-Polarization energy for different Protein
 molecules. I am

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-16 Thread jesmin jahan
Hi Mark,

Thanks for your reply.
If I open the .tpr file using notepad, it seems to be a binary file.
Then how do I remove the bonded terms and zero the VDW parameters?

I really need to compare how fast different well-known packages can
compute the GB polarization energy and how good the energy values are.
That's why time is an important factor in my experiments, and I really
want to measure the time for the GB energy in isolation!

Thanks,
Jesmin

 On Thu, Aug 16, 2012 at 2:44 AM, Mark Abraham mark.abra...@anu.edu.au wrote:

 On 16/08/2012 4:26 PM, jesmin jahan wrote:

 Hi Mark,

 Thanks for your previous reply.
 I tried to run single point energy simulation with some proteins.
 I got .log files with content like this:

 Energies (kJ/mol)
           Bond          Angle    Proper Dih.  Improper Dih.  GB Polarization
    1.54109e+04    3.84351e+03    8.47152e+03    3.58425e+02     -1.69666e+04
          LJ-14     Coulomb-14        LJ (SR)   Coulomb (SR)        Potential
    4.29664e+03    3.63997e+04    2.22900e+05   -5.18818e+04      2.22832e+05
    Kinetic En.   Total Energy    Temperature Pressure (bar)
    1.08443e+09    1.08465e+09    2.73602e+07        0.0e+00
 ...

  Computing:                         M-Number      M-Flops   % Flops
  ---------------------------------------------------------------------
  Generalized Born Coulomb           0.005711        0.274       0.2
  GB Coulomb + LJ                    0.416308       25.395      18.5
  Outer nonbonded loop               0.016367        0.164       0.1
  1,4 nonbonded interactions         0.008410        0.757       0.6
  Born radii (HCT/OBC)               0.439486       80.426      58.5
  Born force chain rule              0.439486        6.592       4.8
  NS-Pairs                           0.943653       19.817      14.4
  Reset In Box                       0.003179        0.010       0.0
  CG-CoM                             0.006358        0.019       0.0
  Bonds                              0.003219        0.190       0.1
  Angles                             0.005838        0.981       0.7
  Propers                            0.011273        2.582       1.9
  Virial                             0.003899        0.070       0.1
  Stop-CM                            0.003179        0.032       0.0
  Calc-Ekin                          0.006358        0.172       0.1
  ---------------------------------------------------------------------
  Total                                            137.479     100.0
  ---------------------------------------------------------------------


  D O M A I N   D E C O M P O S I T I O N   S T A T I S T I C S

   av. #atoms communicated per step for force:  2 x 6859.0


   R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

   Computing:         Nodes   Number   G-Cycles   Seconds      %
  ---------------------------------------------------------------------
   Domain decomp.        16        1      0.043       0.0     1.4
   Comm. coord.          16        1      0.003       0.0     0.1
   Neighbor search       16        1      0.103       0.0     3.5
   Force                 16        1      1.530       0.5    51.5
   Wait + Comm. F        16        1      0.264       0.1     8.9
   Write traj.           16        1      0.062       0.0     2.1
   Update                16        1      0.001       0.0     0.0
   Comm. energies        16        2      0.933       0.3    31.4
   Rest                  16               0.031       0.0     1.1
  ---------------------------------------------------------------------
   Total                 16               2.970       0.9   100.0
  ---------------------------------------------------------------------

 NOTE: 31 % of the run time was spent communicating energies,
you might want to use the -gcom option of mdrun


 Parallel run - timing based on wallclock.

                 NODE (s)   Real (s)      (%)
         Time:      0.056      0.056    100.0
                 (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
  Performance:      7.497      2.442      1.535     15.637


 From the log file, it seems the reported time includes the time for the LJ and
 Coulomb potential energies. But as I said before, I am only interested in the
 GB-energy times. I am doing a comparative study of GB-energy performance
 (values vs. time) for different molecular dynamics packages.


 Since the LJ calculation also needs the distances, GROMACS does them in the 
 same loops and makes no apology for being efficient. :-) If you're really 
 trying to measure the time for the GB energy in isolation, then you will 
 need to construct a different model physics that lacks LJ interactions. Or 
 perhaps you don't really want to measure the time for GB energy in 
 isolation. Depends what

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-16 Thread jesmin jahan
Hi Mark,

According to your advice to remove the bonded terms and zero the
VDW parameters, I removed everything under [ bonds ], [ angles ], [ pairs ]
and [ dihedrals ], and ran the simulation with mdrun -rerun.

I got output something like the following:


   Energies (kJ/mol)
   GB Polarization        LJ (SR)   Coulomb (SR)      Potential    Kinetic En.
      -2.23121e+03    7.54287e+07   -3.47729e+04    7.53917e+07        0.0e+00
      Total Energy    Temperature Pressure (bar)
       7.53917e+07        0.0e+00        0.0e+00

where the previous output was something like this:

Energies (kJ/mol)
           Bond          Angle    Proper Dih.  Improper Dih.  GB Polarization
    2.12480e+03    4.80088e+02    1.06648e+03    9.04861e+01     -2.23122e+03
          LJ-14     Coulomb-14        LJ (SR)   Coulomb (SR)        Potential
    7.05695e+02    5.47366e+03   -4.16856e+02   -8.74797e+03     -1.45483e+03
    Kinetic En.   Total Energy    Temperature Pressure (bar)
        0.0e+00   -1.45483e+03        0.0e+00        0.0e+00



   Energies (kJ/mol)
   GB Polarization        LJ (SR)   Coulomb (SR)      Potential    Kinetic En.
      -2.23121e+03    4.17621e+13   -3.47729e+04    4.17621e+13        0.0e+00
      Total Energy    Temperature Pressure (bar)
       4.17621e+13        0.0e+00        0.0e+00


So, as you can see, although this removed some of the extra terms,
the LJ and Coulomb potentials are still there. I searched for the VDW
parameters. Although I saw various options for VDW, it is not clear
from the options how to turn it off. Could you kindly tell me more
clearly about it?


I was also looking into the forcefield.itp file. I set gen-pairs
to no, fudgeLJ to 1 and fudgeQQ to 1; originally they were yes, 0.5
and 0.83 respectively.

[ defaults ]
; nbfunc    comb-rule    gen-pairs    fudgeLJ    fudgeQQ
  1         2            no           1          1

Please let me know how to get rid of the calculation of the other energies
(LJ, Coulomb and total potential) and how to set the parameters for
this properly.

Thanks for your help.

Sincerely,
Jesmin
On Thu, Aug 16, 2012 at 3:27 AM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 16/08/2012 5:08 PM, jesmin jahan wrote:

 Hi Mark,

 Thanks for your reply.
 If I open the .tpr file using Notepad, it appears to be a binary file.
 How, then, do I remove the bonded terms and zero the VDW parameters?


 In the .top file from which you made the .tpr. (And contributing .itp files)
 Parts of chapter 5 may help with this process.
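
 For example (a rough sketch only, not a complete force field file; the two
 atom types below are placeholders, and the amber99sb files list many more):
 with comb-rule 2 you can zero the epsilon column in [ atomtypes ] so the LJ
 term vanishes while the charges and GB parameters stay intact:

 [ atomtypes ]
 ; name  at.num     mass   charge  ptype      sigma    epsilon
   CT         6   12.011   0.0000      A    0.33997    0.00000  ; epsilon = 0 -> no LJ
   HC         1    1.008   0.0000      A    0.26495    0.00000  ; epsilon = 0 -> no LJ

 Removing the [ pairs ] section, as you already did, should take care of the
 1-4 LJ and Coulomb terms as well.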

 Mark



 I really need to compare how fast different well-known packages can
 compute the GB-polarization energy and how good the energy values are.
 That's why time is an important factor in my experiments, and I really
 want to measure the time for the GB energy in isolation!

 Thanks,
 Jesmin

 On Thu, Aug 16, 2012 at 2:44 AM, Mark Abraham mark.abra...@anu.edu.au
 wrote:

 On 16/08/2012 4:26 PM, jesmin jahan wrote:

 Hi Mark,

 Thanks for your previous reply.
 I tried to run single point energy simulation with some proteins.
 I got .log files with content like this:

 Energies (kJ/mol)
           Bond          Angle    Proper Dih.  Improper Dih.  GB Polarization
    1.54109e+04    3.84351e+03    8.47152e+03    3.58425e+02     -1.69666e+04
          LJ-14     Coulomb-14        LJ (SR)   Coulomb (SR)        Potential
    4.29664e+03    3.63997e+04    2.22900e+05   -5.18818e+04      2.22832e+05
    Kinetic En.   Total Energy    Temperature Pressure (bar)
    1.08443e+09    1.08465e+09    2.73602e+07        0.0e+00
 ...

  Computing:                         M-Number      M-Flops   % Flops
  ---------------------------------------------------------------------
  Generalized Born Coulomb           0.005711        0.274       0.2
  GB Coulomb + LJ                    0.416308       25.395      18.5
  Outer nonbonded loop               0.016367        0.164       0.1
  1,4 nonbonded interactions         0.008410        0.757       0.6
  Born radii (HCT/OBC)               0.439486       80.426      58.5
  Born force chain rule              0.439486        6.592       4.8
  NS-Pairs                           0.943653       19.817      14.4
  Reset In Box                       0.003179        0.010       0.0
  CG-CoM                             0.006358        0.019       0.0
  Bonds                              0.003219        0.190       0.1
  Angles                             0.005838        0.981       0.7
  Propers                            0.011273        2.582       1.9
  Virial                             0.003899        0.070       0.1
  Stop-CM                            0.003179        0.032       0.0
  Calc-Ekin                          0.006358        0.172       0.1
  ---------------------------------------------------------------------
  Total                                            137.479     100.0

Re: [gmx-users] Questions regarding Polarization Energy Calculation

2012-08-14 Thread jesmin jahan
Thanks Mark for your reply. I was trying to use Single-Point Energy
Calculation as you advised in your first reply but for most of the
files the simulation failed because I was using the original .pdb
files in the mdrun command.

Anyways. I really appreciate your help.
Thanks again,
Jesmin

On Tue, Aug 14, 2012 at 1:26 AM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 14/08/2012 7:38 AM, jesmin jahan wrote:

 Dear Gromacs Users,

 I have some questions regarding GB-Polarization Energy Calculation
 with Gromacs. I will be grateful if someone can help me with the
 answers.

 I am trying to calculate GB-Polarization energy for different protein
 molecules. I am interested both in the energy values and in the time
 required to calculate the Born radii and the polarization energy.
 I am not doing any energy minimization step as the files I am using as
 input are already minimized.

 Here is the content of my  mdrun.mdp file:

 constraints       =  none
 integrator        =  md
 pbc               =  no
 dt                =  0.001
 nsteps            =  0
 implicit_solvent  =  GBSA
 gb_algorithm      =  HCT
 sa_algorithm      =  None

 And I am using the following three steps for all the .pdb files I have:

 Let x be the name of the .pdb file.

 pdb2gmx -f x.pdb -ter -ignh -ff amber99sb -water none
 grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
 mpirun -np 8 mdrun_mpi  -deffnm imd -v -g x.log


 So you're not using the advice I gave you about how to calculate single
 point energies. OK.


 1. Now the running time reported by the log file also includes other
 times. It is also not clear to me whether the time includes the time for
 the Born radii calculations.


 The timing breakdown is printed at the end of the .log file. Likely your
 time is heavily dominated by the GB calculation and communication cost. Born
 radii calculations are part of the former, and are not reported separately. You
 should not bother with timing measurements unless your run goes for at least
 several minutes, else your time will be dominated by I/O and setup costs.
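
 If you do want wall-clock numbers you can trust, something along these lines
 is closer to a timing benchmark than a 0-step single point; the step count
 below is just an assumption, so pick whatever makes the run last a few
 minutes on your hardware:

 constraints       =  none
 integrator        =  md
 pbc               =  no
 dt                =  0.001
 nsteps            =  50000   ; long enough that I/O and setup no longer dominate
 implicit_solvent  =  GBSA
 gb_algorithm      =  HCT
 sa_algorithm      =  None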


 So, to get the GB-energy time I am doing the following: I am also
 running a simulation with implicit_solvent set to no, and I am
 taking the difference of these two (with GB and without GB). Is that the
 right approach?


 No, that measures the weight difference between an apple and an orange, not
 whether the apple's seeds are heavy.


 I also want to be sure that it includes the Born radii calculation time.


 It's part of the GB calculation, so it's included in its timing.



 Is there any other approach to do this?


 2. I was trying to run the simulations on 192 cores (16 nodes each
 with 12 cores). But I got a "There is no domain decomposition for 12
 nodes that is compatible with the given box and a minimum cell size of
 2.90226 nm" error for some pdb files. Can anyone explain what is
 happening? Is there any restriction on the number of nodes that can be used?


 Yes. See discussion linked from http://www.gromacs.org/Documentation/Errors
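
 In practice the usual workarounds are to use fewer MPI processes for a small
 box, or to choose the decomposition yourself; the values below are only
 illustrative and must still respect the minimum cell size:

 mpirun -np 8 mdrun_mpi -s imd.tpr -dd 2 2 2 -g x.log    # explicit 2x2x2 grid
 mpirun -np 12 mdrun_mpi -s imd.tpr -pd -g x.log         # particle decomposition, no cell-size limit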



 3. I ran the simulations 1-way 96 (8 nodes each with 12 cores).
 It's not clear to me from the log file whether GROMACS is able to
 utilize all 96 cores. It seems it is using only 8 nodes.
 Does GROMACS use both shared and distributed memory parallelism?


 Not at the moment. Look at the top of your .log file for clues about what
 your configuration is making available to GROMACS. It is likely that mpirun
 -np 8 makes only 8 MPI processes available to GROMACS. Using more will
 require you to use your MPI installation correctly (and we can't help with
 that).
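
 In other words, to actually occupy 8 nodes x 12 cores with an MPI-only build,
 the launcher itself has to start 96 ranks, along the lines of the command
 below; the exact launcher syntax and hostfile/queue setup depend on your site:

 mpirun -np 96 mdrun_mpi -s imd.tpr -deffnm imd -g x.log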


 4. In the single-point energy calculation mdrun -s input.tpr
 -rerun configuration.pdb, is the configuration.pdb mentioned the
 original .pdb file used with pdb2gmx's -f option, or is it a modified
 .pdb file? I am asking because when I use the original file it does not
 always work :-(


 It can be any configuration that matches the .top file you gave to grompp.
 That's the point - you only need one run input file to compute the energy of
 any such configuration you later want. The configuration you gave to grompp
 (or any other tool) doesn't matter.
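
 As a concrete sketch, reusing the file names from earlier in this thread
 (the log file name is arbitrary):

 grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
 mdrun -s imd.tpr -rerun conf.gro -g singlepoint.log    # or -rerun any other configuration matching topol.top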


 5. Is there any known speedup factor of Gromacs on multicores?


 That depends on your simulation system, hardware, network and algorithm.
 Don't bother with fewer than hundreds of atoms per core.

 Mark



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.

[gmx-users] Questions regarding Polarization Energy Calculation

2012-08-13 Thread jesmin jahan
Dear Gromacs Users,

I have some questions regarding GB-Polarization Energy Calculation
with Gromacs. I will be grateful if someone can help me with the
answers.

I am trying to calculate GB-Polarization energy for different protein
molecules. I am interested both in the energy values and in the time
required to calculate the Born radii and the polarization energy.
I am not doing any energy minimization step as the files I am using as
input are already minimized.

Here is the content of my  mdrun.mdp file:

constraints       =  none
integrator        =  md
pbc               =  no
dt                =  0.001
nsteps            =  0
implicit_solvent  =  GBSA
gb_algorithm      =  HCT
sa_algorithm      =  None

And I am using the following three steps for all the .pdb files I have:

Let x be the name of the .pdb file.

pdb2gmx -f x.pdb -ter -ignh -ff amber99sb -water none
grompp -f mdr.mdp -c conf.gro -p topol.top -o imd.tpr
mpirun -np 8 mdrun_mpi  -deffnm imd -v -g x.log



1. Now the running time reported by the log file also includes other
times. It is also not clear to me whether the time includes the time for
the Born radii calculations.
So, to get the GB-energy time I am doing the following: I am also
running a simulation with implicit_solvent set to no, and I am
taking the difference of these two (with GB and without GB). Is that the
right approach?
I also want to be sure that it includes the Born radii calculation time.

Is there any other approach to do this?


2. I was trying to run the simulations on 192 cores (16 nodes each
with 12 cores). But I got a "There is no domain decomposition for 12
nodes that is compatible with the given box and a minimum cell size of
2.90226 nm" error for some pdb files. Can anyone explain what is
happening? Is there any restriction on the number of nodes that can be used?

3. I ran the simulations 1-way 96 (8 nodes each with 12 cores).
It's not clear to me from the log file whether GROMACS is able to
utilize all 96 cores. It seems it is using only 8 nodes.
Does GROMACS use both shared and distributed memory parallelism?

4. In the single-point energy calculation mdrun -s input.tpr
-rerun configuration.pdb, is the configuration.pdb mentioned the
original .pdb file used with pdb2gmx's -f option, or is it a modified
.pdb file? I am asking because when I use the original file it does not
always work :-(

5. Is there any known speedup factor of Gromacs on multicores?



Thanks for your help.

Best Regards,
Jesmin

--
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


[gmx-users] About getting running time from a Gromacs Run

2012-08-09 Thread jesmin jahan
Dear Gromacs Users,

I am a new user of GROMACS, and I am using it to calculate
GB-polarization energy.

I have a couple of questions about it. It would be great if someone
can answer them. Thanks in advance.

Questions:

1: I got an output like this in log file from a protein simulation:

 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:         Nodes   Number   G-Cycles   Seconds      %
-----------------------------------------------------------------------
 Comm. coord.          24        1      0.235       0.1     1.0
 Force                 24        1     17.369       5.2    71.9
 Wait + Comm. F        24        1      0.258       0.1     1.1
 Write traj.           24        1      0.296       0.1     1.2
 Update                24        1      0.005       0.0     0.0
 Comm. energies        24        2      7.252       2.2    30.0
-----------------------------------------------------------------------
 Total                 24              24.157       7.3   100.0
-----------------------------------------------------------------------

NOTE: 30 % of the run time was spent communicating energies,
  you might want to use the -gcom option of mdrun


Parallel run - timing based on wallclock.

               NODE (s)   Real (s)      (%)
       Time:      0.303      0.303    100.0
               (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
Performance:      0.000     54.029      0.285     84.257

Now my question is: what is the time to calculate the GB-polarization
energy that is shown in the log? Or, how do I get that time?

2. I have a protein with 1.5 million atoms. I tried to run it with
GROMACS on 16 machines, each with 12 cores. It generated a segmentation
fault. Is there any size limit for runs in GROMACS?

3. Can I calculate only the GB-polarization energy, without calculating
any forces? How?

4. What are good tools to get the GB-energy values along with the
time to calculate the GB energy?

Thanks,
Jesmin

--
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.


Re: [gmx-users] About getting running time from a Gromacs Run

2012-08-09 Thread jesmin jahan
Hi Mark,

Thanks for your reply. I will try to follow your instructions.

Thanks,
Jesmin

On Thu, Aug 9, 2012 at 9:06 PM, Mark Abraham mark.abra...@anu.edu.au wrote:
 On 10/08/2012 3:06 AM, jesmin jahan wrote:

 Dear Gromacs Users,

 I am a new user of GROMACS, and I am using it to calculate
 GB-polarization energy.

 I have a couple of questions about it. It would be great if someone
 can answer them. Thanks in advance.

 Questions:

 1: I got an output like this in log file from a protein simulation:

   R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

   Computing:         Nodes   Number   G-Cycles   Seconds      %
  ---------------------------------------------------------------------
   Comm. coord.          24        1      0.235       0.1     1.0
   Force                 24        1     17.369       5.2    71.9
   Wait + Comm. F        24        1      0.258       0.1     1.1
   Write traj.           24        1      0.296       0.1     1.2
   Update                24        1      0.005       0.0     0.0
   Comm. energies        24        2      7.252       2.2    30.0
  ---------------------------------------------------------------------
   Total                 24              24.157       7.3   100.0
  ---------------------------------------------------------------------

 NOTE: 30 % of the run time was spent communicating energies,
you might want to use the -gcom option of mdrun


 Parallel run - timing based on wallclock.

 NODE (s)   Real (s)  (%)
 Time:  0.303  0.303 100.0
 (Mnbf/s)   (GFlops)   (ns/day)  (hour/ns)
 Performance:  0.000 54.029  0.285 84.257

 Now my question is: what is the time to calculate the GB-polarization
 energy that is shown in the log? Or, how do I get that time?


 It's a side-effect of the Force component of the timing breakdown.


 2. I have a protein with 1.5 million atoms. I tried to run it with
 GROMACS on 16 machines, each with 12 cores. It generated a segmentation
 fault. Is there any size limit for runs in GROMACS?


 No, but if you run yourself out of memory results are unpredictable. Consult
 your .log and stdout files for clues what is going on.


 3. Can I calculate only the GB-polarization energy, without calculating
 any forces? How?


 http://www.gromacs.org/Documentation/How-tos/Single-Point_Energy


 4. What are good tools to get the GB-energy values along with the
 time to calculate the GB energy?


 As above.

 Mark



-- 
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.