[gmx-users] Automatically assign the protonation states for pdb2gmx

2020-02-21 Thread ZHANG Cheng
I want to run more than 300 MD simulations, each with a different PDB (more 
precisely, variants derived from the same wild type). I need to assign the 
protonation states manually with the "-inter" option every time, which is 
impractical for more than 300 structures.


gmx pdb2gmx -f protein.pdb -o protein_processed.gro -water spce 
-inter -ignh -merge interactive


The protonation states come from the pdb2pqr website. Is there an alternative 
way to obtain the .gro file by providing the necessary inputs (e.g. the PDB 
file and the protonation states), so that I can obtain the 300 corresponding 
.gro files in a batch?


I know that piping answers in with "echo" does not work for the charge assignment.


It would ultimately be possible if I understood the code behind "pdb2gmx" and 
then wrote a batch script to process multiple PDB files at once. But is there 
a simpler route?
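
One possible route, sketched here as an untested example: pdb2gmx reads the
answers to its interactive prompts from standard input, so pre-computed
choices can be redirected in from a file. The per-variant answer files
(variantN.prot) and their format are assumptions; the answers must appear one
per line in exactly the order pdb2gmx asks, which may be why a single "echo"
did not work. Using the per-residue flags (-lys, -arg, -asp, -glu, -his)
instead of -inter limits the prompts to the titratable residues.

  #!/bin/bash
  # Sketch: batch-run pdb2gmx, feeding pre-computed protonation answers
  # on stdin instead of typing them at the interactive prompts.
  # Assumes variantN.prot holds the choices pdb2gmx expects (derived from
  # the pdb2pqr results), one per line, in the order the prompts appear;
  # -merge interactive will also prompt, so its answers belong in the file.
  for pdb in variant*.pdb; do
      base=${pdb%.pdb}
      gmx pdb2gmx -f "$pdb" -o "${base}_processed.gro" -p "${base}.top" \
          -water spce -ignh -merge interactive \
          -lys -arg -asp -glu -his < "${base}.prot"
  done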
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] How to avoid the infinite potential energy in simulations of dimers?

2020-02-21 Thread Qing Liu
Dear Justin Lemkul,

Thanks for your reply. I downloaded the cryo-EM structure of the dimer from 
the RCSB, added the missing residues using Modeller, and built the system with 
the following commands:



  gmx pdb2gmx -f dimer.pdb -o dimer.gro -p dimer.top -water tip3p -ff amber99sb-ildn -ignh
  gmx editconf -f dimer.gro -o vac-min-pbc.gro -bt cubic -d 1.5 -c
  gmx solvate -cp vac-min-pbc.gro -cs spc216.gro -p dimer.top -o vac-min-pbc-solv.gro
  gmx grompp -v -f sol-min.mdp -c vac-min-pbc-solv.gro -p dimer.top -o vac-min-pbc-solv.tpr
  gmx genion -s vac-min-pbc-solv.tpr -o vac-min-pbc-solv-salt.gro -conc 0.15 -neutral -pname NA -nname CL -p dimer.top
  gmx grompp -f sol-min.mdp -c vac-min-pbc-solv-salt.gro -p dimer.top -o vac-min-pbc-solv-salt-min.tpr
  gmx mdrun -v -deffnm vac-min-pbc-solv-salt-min
  gmx grompp -f pr-md.mdp -c vac-min-pbc-solv-salt-min.gro -p dimer.top -o pr-md.tpr -r vac-min-pbc-solv-salt-min.gro
  gmx mdrun -nb gpu -v -deffnm pr-md


The contents of the sol-min.mdp file are:

  ; Define can be used to control processes
  define       = -DFLEXIBLE
  ; Parameters describing what to do, when to stop and what to save
  integrator   = steep    ; Algorithm (steep = steepest descent minimization)
  emtol        = 1.0      ; Stop when the maximum force < 1.0 kJ/mol/nm
  nsteps       = 5000     ; Maximum number of (minimization) steps to perform
  nstenergy    = 1        ; Write energies to disk every nstenergy steps
  energygrps   = System   ; Which energy group(s) to write to disk
  ; Parameters describing how to find the neighbors of each atom
  ; and how to calculate the interactions
  ns_type      = grid     ; Method to determine neighbor list (simple, grid)
  coulombtype  = cut-off  ; Treatment of long range electrostatic interactions
  rcoulomb     = 1.0      ; Long range electrostatic cut-off
  rvdw         = 1.0      ; Long range Van der Waals cut-off
  constraints  = none     ; Bond types to replace by constraints
  pbc          = xyz      ; Periodic boundary conditions (xyz, no, xy)
  sc-coul      = yes



The log file of the energy minimization shows:

  Steepest Descents converged to machine precision in 2108 steps,
  but did not reach the requested Fmax < 1.
  Potential Energy  = -2.0181694e+07
  Maximum force     =  3.4778253e+02 on atom 114839
  Norm of force     =  3.9966155e+01


The contents of pr-md.mdp are:

  title                 = PR MD
  define                = -DPOSRES
  ; run control
  integrator            = md
  tinit                 = 0
  dt                    = 0.002
  nsteps                = 50
  comm_mode             = Linear
  nstcomm               = 10
  comm_grps             = System
  ; output control
  nstxout               = 0
  nstvout               = 0
  nstlog                = 5
  nstcalcenergy         = 1
  nstenergy             = 5
  nstxtcout             = 5
  xtc_grps              = System
  energygrps            = System
  ; neighbor searching
  nstlist               = 10
  ns_type               = grid
  pbc                   = xyz
  rlist                 = 1.4
  ; electrostatics
  coulombtype           = PME
  rcoulomb              = 1.4
  ; vdw
  vdwtype               = Cut-off
  rvdw                  = 1.4
  dispCorr              = EnerPres
  ; Ewald
  fourierspacing        = 0.1
  pme_order             = 4
  ewald_rtol            = 1e-5
  ; temperature coupling
  tcoupl                = v-rescale
  tc_grps               = System
  tau_t                 = 0.1
  ref_t                 = 200
  ; velocity generation
  gen-vel               = no
  ; bonds
  constraints           = all-bonds
  constraint_algorithm  = SHAKE
  shake-tol             = 0.0001
  morse                 = no
  continuation          = yes
  sc-coul               = yes



The pr-md.log shows:

  Fatal error:
  Step 2460: The total potential energy is nan, which is not finite. The LJ and
  electrostatic contributions to the energy are 3.25168e+06 and -1.99067e+07,
  respectively. A non-finite potential energy can be caused by overlapping
  interactions in bonded interactions or very large or Nan coordinate values.
  Usually this is caused by a badly- or non-equilibrated initial configuration,
  incorrect interactions or parameters in the topology.
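
As an aside (a hedged observation, not necessarily the cause of this crash):
pr-md.mdp starts 2 fs dynamics from a freshly minimized structure with
gen-vel = no and continuation = yes, i.e. with zero starting velocities and
without applying constraints to the first frame. A more conservative stanza
for the first restrained equilibration, offered only as a sketch, would be:

  gen-vel      = yes   ; generate Maxwell-Boltzmann velocities at start-up
  gen-temp     = 200   ; match ref_t
  gen-seed     = -1    ; pseudo-random seed
  continuation = no    ; apply constraints to the starting configuration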








--


Best wishes, 



Qing Liu

Fudan Univ.

Mobile: +86-13358129621

E-mail: 15110700...@fudan.edu.cn







Re: [gmx-users] REMD stall out

2020-02-21 Thread Daniel Burns
This was not actually the solution. I wanted to follow up in case
someone else is experiencing this problem. We are reinstalling the OpenMP
version.

On Thu, Feb 20, 2020 at 3:10 PM Daniel Burns  wrote:

> Hi again,
>
> It seems that loading our OpenMP module was responsible for the issue the
> whole time. When I submit the job loading only pmix and gromacs, replica
> exchange proceeds.
>
> Thank you,
>
> Dan
>
> On Mon, Feb 17, 2020 at 9:09 AM Mark Abraham 
> wrote:
>
>> Hi,
>>
>> That could be caused by configuration of the parallel file system or MPI
>> on
>> your cluster. If only one file descriptor is available per node to an MPI
>> job, then your symptoms are explained. Some kinds of compute jobs follow
>> such a model, so maybe someone optimized something for that.
>>
>> Mark
>>
>> On Mon, 17 Feb 2020 at 15:56, Daniel Burns  wrote:
>>
>> > Hi Szilárd,
>> >
>> > I've deleted all my output but all the writing to the log and console
>> stops
>> > around the step noting the domain decomposition (or other preliminary
>> > task).  It is the same with or without Plumed - the TREMD with Gromacs
>> only
>> > was the first thing to present this issue.
>> >
>> > I've discovered that if each replica is assigned its own node, the
>> > simulations proceed.  If I try to run several replicas on each node
>> > (divided evenly), the simulations stall out before any trajectories get
>> > written.
>> >
>> > I have tried many different -np and -ntomp options as well as several
>> slurm
>> > job submission scripts with node/ thread configurations but multiple
>> > simulations per node will not work.  I need to be able to run several
>> > replicas on the same node to get enough data since it's hard to get more
>> > than 8 nodes (and as a result, replicas).
>> >
>> > Thanks for your reply.
>> >
>> > -Dan
>> >
>> > On Tue, Feb 11, 2020 at 12:56 PM Daniel Burns 
>> wrote:
>> >
>> > > Hi,
>> > >
>> > > I continue to have trouble getting an REMD job to run.  It never
>> makes it
>> > > to the point that it generates trajectory files but it never gives any
>> > > error either.
>> > >
>> > > I have switched from a large TREMD with 72 replicas to the Plumed
>> > > Hamiltonian method with only 6 replicas.  Everything is now on one
>> node
>> > and
>> > > each replica has 6 cores.  I've turned off the dynamic load balancing
>> on
>> > > this attempt per the recommendation from the Plumed site.
>> > >
>> > > Any ideas on how to troubleshoot?
>> > >
>> > > Thank you,
>> > >
>> > > Dan
>> > >
>
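
For anyone hitting the same symptom, here is a hedged sketch of the
multiple-replicas-per-node invocation being discussed; the node size,
directory names, and replica count are assumptions:

  # 4 replicas sharing one node, 8 OpenMP threads per replica,
  # dynamic load balancing disabled per the PLUMED recommendation:
  mpirun -np 4 gmx_mpi mdrun -multidir rep0 rep1 rep2 rep3 \
         -replex 1000 -ntomp 8 -dlb no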


Re: [gmx-users] How to avoid the infinite potential energy in simulations of dimers?

2020-02-21 Thread Justin Lemkul




On 2/21/20 5:11 AM, Qing Liu wrote:

Dear Gromacs users,
   I run some simulations of dimers, a protein binding another protein.
When these simulations step into the equilibration stage, they crash with
the following:

   Step 2029: The total potential energy is nan, which is not finite. The LJ and
   electrostatic contributions to the energy are 746119 and -5.71229e+06,
   respectively. A non-finite potential energy can be caused by overlapping
   interactions in bonded interactions or very large or Nan coordinate values.
   Usually this is caused by a badly- or non-equilibrated initial configuration,
   incorrect interactions or parameters in the topology.

   I tried to turn on the soft-core potential, but it did not help. However,
the simulations of the monomers with the same mdp files run normally. What
should I do to solve the problem?



Your initial configuration is unreasonable as you have a massive LJ 
repulsion. How did you construct the coordinates of the system? Likely 
energy minimization will have reported unreasonable forces, as well.
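
A quick way to check this, sketched here with an assumed log file name
(em.log):

  # Show what the minimizer reported at the end of the run:
  grep -A 4 "converged" em.log
  # The "Maximum force ... on atom N" line points at the worst contact.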


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



[gmx-users] Regarding gpu acceleration

2020-02-21 Thread Faizan Abul Qais
Dear sir,
I am trying to use a GPU with GROMACS. It gives an error saying "cuda is
deprecated".

I am using:
Ubuntu 18.04.4
CUDA toolkit 10.2
GROMACS 2018.1
gcc version 7.4
NVIDIA driver version 435
NVIDIA GeForce GTX 1050 Ti

Do these versions support GPU acceleration?
Please help.
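
One thing to check (a hedged note, not a confirmed diagnosis): GROMACS 2018.1
predates the CUDA 10.2 toolkit, and a GROMACS release generally supports only
the CUDA versions that existed when it shipped, so a newer GROMACS release
(or an older CUDA toolkit) may resolve the message. For reference, a
CUDA-enabled GROMACS 2018.x configure step looks roughly like this; the paths
and compiler names are assumptions for this system:

  cd gromacs-2018.1/build
  cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=ON \
        -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda-10.2 \
        -DCMAKE_C_COMPILER=gcc-7 -DCMAKE_CXX_COMPILER=g++-7
  make -j 4 && sudo make install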


[gmx-users] How to avoid the infinite potential energy in simulations of dimers?

2020-02-21 Thread Qing Liu
Dear Gromacs users,
  I run some simulations of dimers, a protein binding another protein. When
these simulations step into the equilibration stage, they crash with the
following:

  Step 2029: The total potential energy is nan, which is not finite. The LJ and
  electrostatic contributions to the energy are 746119 and -5.71229e+06,
  respectively. A non-finite potential energy can be caused by overlapping
  interactions in bonded interactions or very large or Nan coordinate values.
  Usually this is caused by a badly- or non-equilibrated initial configuration,
  incorrect interactions or parameters in the topology.

  I tried to turn on the soft-core potential, but it did not help. However,
the simulations of the monomers with the same mdp files run normally. What
should I do to solve the problem?






--


Best wishes, 



Qing Liu

Fudan Univ.

Mobile: +86-13358129621

E-mail: 15110700...@fudan.edu.cn

[gmx-users] Thesis developed software; which Open-source License?

2020-02-21 Thread Henry Wittler
Greetings comrades. This question may be somewhat redundant, so I don't expect
a reply to every aspect, and I may also ask on ResearchGate. But if anyone has
any insight, please reply.

My thesis includes calculation software that is partly innovative and partly
basic calculation/plotting/graphing code, built on VMD/Tcl scripting, GROMACS
(GMX), and Python (Matplotlib/SciPy/NumPy, etc.). The solvated-protein data
(chapter 6 of the thesis) were obtained from GMX simulations, and the analysis
data via VMD/Tcl and the other tools.

So far I have uploaded only part of the code (for chapter 4.1); I intend to
upload the code for the whole thesis to the GitHub link
https://wittler-github.github.io/A_MD_Analysis_of_Insulin/
The following (downloadable) thesis link contains the calculated data along
with the graphs.

http://hdl.handle.net/1959.9/568798  (
https://www.researchgate.net/project/A-Molecular-Dynamics-Analysis-of-Insulin)


I mainly want to share the novel GMX-simulated data and the tools I have
developed from the above-mentioned software, as described in my thesis. I do
not expect to contribute to GMX directly from this repository; if I do, it
would be via a separately created repository.

Currently I have applied the BSD 3-Clause license, which I understand to be
the most relevant and simplest to use. The choice does not appear to be set in
stone; however, if I change it, I understand that I need to notify everyone
who has received or is using the code.
BSD 3-Clause appears to be one of the most popular licenses for open-source
code and is compatible with many other packages. I thought of using LGPL 2.1,
since GMX uses it, but BSD 3-Clause seems simpler to understand. The only
innovative scripts involving GMX are Linux bash/automation scripts that make
simulating replicas straightforward. Can I share these scripts under the
BSD 3-Clause license, even though they technically invoke standard gmx
commands (without altering the original GMX software, v5.0.4)? And can I share
some of the GMX input files (parameter files etc.) and some of the original
GMX-simulated solvated-protein data (in addition to the data calculated and
graphed with Matplotlib etc.) in my GitHub repository without any conflict
between the BSD 3-Clause license and the LGPL 2.1?

I do not see any issue with the other Python and VMD/Tcl software. Are there
any license clashes among the above-mentioned packages to be aware of?

Does anyone have any other insights about open-source licensing when
distributing code?

Kind regards,

Dr. Henry P.A. Wittler

Department of Chemistry and Physics, LIMS, La Trobe University, Melbourne
Working in Ludvika, Sweden
Skype: henry.wittler
ResearchGate & LinkedIn:
linkedin.com/in/henry-per-andreas-wittler-b03256191


Re: [gmx-users] Regarding to pme on gpu

2020-02-21 Thread Szilárd Páll
Hi.
On Tue, Feb 18, 2020 at 5:11 PM Jimmy Chen  wrote:
>
> Hi,
>
> When -pme gpu is set in mdrun, only one rank can be used for PME (-npme 1).
> What is the reason that only one PME rank is allowed when offloading to the
> GPU? Is it a limitation, or something else?

This is a limitation of the implementation: currently, PME decomposition is
not supported with PME offload. Hence, the PME work needs to be assigned to a
single GPU, be it the same one that does the PP computation or a separate one
(assigned to the PME rank).
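
As an illustrative sketch (the rank counts and run name are assumptions), a
run with a single dedicated PME rank offloaded to a GPU could look like:

  # 8 thread-MPI ranks in total: 7 PP ranks plus 1 separate PME rank,
  # with both short-range non-bondeds and PME running on GPUs.
  gmx mdrun -ntmpi 8 -npme 1 -nb gpu -pme gpu -deffnm run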

> I am interested in whether any performance improvement is still doable, and
> in any plans to improve the GPU kernels for PME and PP. There does not seem
> to be much difference in PME and PP between 2019.3 and 2020.

Can you please clarify what you mean?

The 2020 release has made some improvements to both PP (bonded
kernels) and PME performance, but these can indeed be minor.

Cheers,
--
Szilárd

>
> Thanks,
> Jimmy